Post AsY3ea2b3ywVL8Q1sO by bdf2121cc3334b35b6ecda66e471@mastodon.social
Post #AsWwnvURn7f7dGzlBY by lxo@snac.lx.oliva.nom.br
2025-03-29T00:42:07Z
3 likes, 2 repeats
new blog post: against proof of waste
https://blog.lx.oliva.nom.br/2025-03-28-against-proof-of-waste.en.html

As web servers get overwhelmed by LLM bots, some operators are resorting to programs that demand that visiting web clients perform some relatively expensive computation to be granted access to the website. This is called Proof of Work, but when that computation doesn't yield any useful results, it might as well be called Proof of Waste. Why not have clients compute something useful and valuable, so that LLM scrapers become an essentially infinite supply of computing power, and our servers pay for themselves and for the creative works we put on them?
Post #AsXoWwIquVjwkuv9Bw by tyil@fedi.tyil.nl
2025-03-29T10:47:50.854Z
0 likes, 0 repeats
@lxo@snac.lx.oliva.nom.br I've not seen a system that supports this, so for the moment I am running Anubis (which is a proof-of-waste system). If I can instead drop in a system that produces useful computation results, I would love to.

I generally hate that we need any system at all to deter LLM scrapers from destroying the services we host, and I hate that it introduces so much extra waste to combat them.
Post #AsXqDc1uGIQgHkuFoe by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T10:49:23Z
1 like, 0 repeats
@lxo because the expensive-to-compute result needs to be easy to verify.

If the correctness of the result isn't knowable by the server without doing all the work again, then it fails as a proof-of-work verification.

That's the reason one-way cryptographic functions are used, and not work such as, say, protein folding.
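The asymmetry being described can be sketched with a toy hash-based challenge. This is an illustrative scheme, not the actual protocol of Anubis or any deployed system: finding a nonce costs many hash attempts on average, while checking a claimed nonce costs exactly one.

```python
import hashlib
import itertools

def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) starts with
    `difficulty` hex zeros; expected cost grows as 16**difficulty."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def verify(challenge: str, difficulty: int, nonce: int) -> bool:
    """One hash call: the server never has to repeat the client's search."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve("example-challenge", 4)  # tens of thousands of attempts on average
assert verify("example-challenge", 4, nonce)  # a single hash on the server
```

The one-way property of the hash is what makes this work: there is no shortcut from the challenge to a valid nonce, so the only way to produce one is to burn CPU.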
Post #AsXqDckZaE9sWHnvRA by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T10:53:28Z
0 likes, 0 repeats
@lxo besides, if the work keeps the service available to humans while removing the cost of bots, then the work _is useful._
Post #AsXqDdWmgyisvoMQaG by lxo@snac.lx.oliva.nom.br
2025-03-29T11:04:59Z
0 likes, 0 repeats
what if it makes the service unavailable to humans like me (this is true), while making bots waste more computing without relieving the server of any load (this will be counterfactual, thus unknowable)? would you still consider it useful?
Post #AsXr2j9xVqppAQ2vpo by lxo@snac.lx.oliva.nom.br
2025-03-29T11:14:35Z
0 likes, 0 repeats
there are plenty of problems that are very expensive to solve and very easy to verify. all NP-complete problems are like that. all of these can serve as proof-of-useful-work; there's no need to demand proof-of-waste. plus, verification can also be made into proof-of-useful-work, if you're smart enough.

now, the token-granters need not really be concerned with any of that: it could be up to the customers paying for computing to be performed by other parties to figure out how to break their problems up into smaller verifiable pieces.

even if they're hard to break up, the architecture could have room for longer tasks that grant more tokens, which dedicated compute machines (as opposed to browser-based PoWs) could engage in.

I don't know enough about Folding@home to tell how verifiable its results are, but I know SETI@home had verification built into the architecture. those are inspiring examples of massively distributed computing architectures, though not necessarily systems that could be trivially integrated into an environment as hostile as the one proof-of-useful-work systems face.
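The solve/verify asymmetry claimed above can be shown with a toy example (an illustration, not a proposed scheme): subset sum is NP-complete, so finding a solution can take exponential time in the worst case, yet checking a claimed solution is a single linear pass.

```python
def verify_subset_sum(numbers, target, certificate):
    """Polynomial-time check of a claimed subset-sum solution: the
    certificate must index distinct elements that sum to target."""
    if len(set(certificate)) != len(certificate):
        return False  # repeated indices are not a valid subset
    if not all(0 <= i < len(numbers) for i in certificate):
        return False  # out-of-range index
    return sum(numbers[i] for i in certificate) == target

nums = [3, 34, 4, 12, 5, 2]
assert verify_subset_sum(nums, 9, [2, 4])      # 4 + 5 == 9: accepted
assert not verify_subset_sum(nums, 9, [0, 1])  # 3 + 34 != 9: rejected
```

Finding the certificate `[2, 4]` in the first place is what may require searching an exponential number of subsets; the server only ever runs the cheap check.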
Post #AsXsU6dIRpfRjzO6LY by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T11:22:39Z
0 likes, 0 repeats
@lxo no, not all NP-complete problems are like that.

You give me a map and tell me to solve the traveling salesman problem, and I give you a result. How do you verify that the result I give you _really is_ the shortest possible path, and not a lie?
Post #AsXsU7uLhutzhANEZ6 by lxo@snac.lx.oliva.nom.br
2025-03-29T11:30:50Z
0 likes, 0 repeats
maybe go back to your reference book and see how the traveling salesman problem that is NP-complete is defined.

while at it, look at the definition of NP-complete: problems that (apparently) cannot be solved in polynomial time, but whose solutions can be verified in polynomial time
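The distinction being pointed at: the NP-complete formulation of TSP is the decision version, "is there a tour of length at most k?", and for that question a claimed tour is a certificate checkable in polynomial time. A toy sketch:

```python
def verify_tour(dist, bound, tour):
    """O(n) check of a TSP decision certificate: the tour must visit
    every city exactly once and have total length <= bound.
    `dist` is a symmetric matrix of pairwise distances."""
    n = len(dist)
    if sorted(tour) != list(range(n)):
        return False  # not a permutation of all cities
    length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return length <= bound

# 4 cities; the tour 0-1-2-3-0 has length 1 + 2 + 3 + 4 = 10
d = [[0, 1, 9, 4],
     [1, 0, 2, 9],
     [9, 2, 0, 3],
     [4, 9, 3, 0]]
assert verify_tour(d, 10, [0, 1, 2, 3])      # within the bound
assert not verify_tour(d, 9, [0, 1, 2, 3])   # exceeds the bound
```

The optimization question raised in the previous post ("is this really the shortest tour?") has no such obvious short certificate, which is exactly why the two versions must not be conflated.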
Post #AsXso1og45aQavD82y by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T11:19:01Z
0 likes, 0 repeats
@lxo if the point is to not make the service available to humans, but to make neural networks do useful work, then what you should be using is... a neural network.
Post #AsXso2ZpEnIgx9GmXI by lxo@snac.lx.oliva.nom.br
2025-03-29T11:34:24Z
0 likes, 0 repeats
trolling much?

now, if you weren't trolling and this was an honest misunderstanding: the current browser-based proof-of-waste systems make the service unavailable to people with accessibility needs that rule out javascript, like me.

also, they do not stop bots; at best they demand that bots spend more (traditional) computing resources to train their neural nets, while adding more load on the server (because of the added Proof-of-Waste server side).

that makes it useless, per your own definition
Post #AsY3eVmUttgi82r7E8 by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T11:56:43Z
1 like, 0 repeats
@lxo except in practice, it has reduced traffic to the affected services and made them more available to most humans. Otherwise, the humans in question wouldn't be using them.

Even if the bots are programmed to solve these problems, the point is to make redundant, repeated visits prohibitively expensive, which matters when the LLM bots are performing thousands of times the visits that human actors are.
Post #AsY3eWuggvpjdjhAdE by lxo@snac.lx.oliva.nom.br
2025-03-29T13:35:59Z
0 likes, 0 repeats
that's exactly the race we're already losing

they have infinite computing power, we don't, and most of us are limited by the battery life of slow devices

that's the externalization of costs that server operators who adopt proof of waste are duplicating from crawlers, sending us a "screw you" like the one they got from crawlers
Post #AsY3ea2b3ywVL8Q1sO by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T12:30:07Z
0 likes, 0 repeats
@lxo what I consider to be much more viable is an offline verifier for people without JavaScript in the browser. Provide a form with a difficulty and an input box, and allow people to run some CLI tool or such to produce an output that can be pasted into the form.

The fact that crawlers don't do proof-of-work right now isn't really the point of PoW, and it can potentially be patched relatively quickly anyway.
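That workflow could look something like the sketch below. The tool name, challenge format, and difficulty convention are all hypothetical (not the protocol of Anubis or any deployed system); the point is only that a no-JavaScript path needs nothing beyond stdin/stdout.

```python
#!/usr/bin/env python3
# pow-solve: paste the challenge string and difficulty shown on the
# access form, and print a token to paste back into the input box.
# Hypothetical sketch of an offline proof-of-work solver.
import hashlib
import itertools
import sys

def solve(challenge: str, difficulty: int) -> str:
    """Find a nonce whose sha256(challenge + ':' + nonce) digest
    starts with `difficulty` hex zeros."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return str(nonce)

if __name__ == "__main__" and len(sys.argv) == 3:
    challenge, difficulty = sys.argv[1], int(sys.argv[2])
    print(solve(challenge, difficulty))  # paste this token into the form
```

The server-side check is a single hash of the submitted token against the same challenge, so the no-JavaScript path costs the server no more than the in-browser one.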
Post #AsY3eeMazZJgkJo3bU by bdf2121cc3334b35b6ecda66e471@mastodon.social
2025-03-29T12:31:06Z
0 likes, 0 repeats
@lxo but people have been talking about making PoW systems that do some useful scientific work since Bitcoin in 2009, and in that time nobody's actually built it. I don't think that's for lack of effort.
Post #AsY3wQ154YtbSiDR44 by lxo@snac.lx.oliva.nom.br
2025-03-29T13:38:09Z
0 likes, 0 repeats
sure, that was my first thought.

but why should I stop at something that will only repel crawler bots, if we can solve (or contribute to solving?) the problem of funding community servers and the production of cultural works?
Post #AsY4FG2YO84y1Vt6WG by lxo@snac.lx.oliva.nom.br
2025-03-29T13:39:39Z
0 likes, 0 repeats
any pointers to those discussions? I'm curious to see what has been tried (if anything has), and why it hasn't succeeded (if it really hasn't)