@mfed1122 yeah, that is my worry: what’s an acceptable wait time for users? A tenth of a second is usually not noticeable to a human, but is it useful in this context? What about half a second, etc.?
I don’t know that I want a web where everything is artificially slowed by a full second for each document
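Just to put rough numbers on it, here's a back-of-the-envelope sketch of how difficulty maps to expected solve time, assuming an Anubis-style proof of work where the client hunts for a nonce whose SHA-256 digest has a given number of leading zero bits; the hash rates are illustrative guesses on my part, not measurements:

```python
# Rough expected solve time for a hash-based proof of work, assuming the client
# must find a nonce whose SHA-256 digest starts with `bits` zero bits.
# The hash rates below are illustrative guesses, not measurements.
def expected_seconds(bits: int, hashes_per_second: float) -> float:
    # On average it takes about 2**bits attempts to hit the target.
    return (2 ** bits) / hashes_per_second

for bits in (12, 16, 20):
    browser = expected_seconds(bits, 200_000)     # assumed in-browser JS rate
    native = expected_seconds(bits, 5_000_000)    # assumed native scraper rate
    print(f"{bits:2d} bits: ~{browser:.3f}s in a browser, ~{native:.4f}s natively")
```

The awkward part is visible right there: a difficulty that costs a human's browser a noticeable fraction of a second costs a well-equipped scraper far less.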
@rtxn all right, that’s all you had to say initially, rather than try convincing me that the network client was out of the loop: it isn’t, that’s the whole point of Anubis
@Passerby6497 my stance is that the LLM might recognize that the best way to solve the problem is to run Chromium and get the answer from there, then pass it on?
This is a typical network exchange: the client asks for a resource, the server replies with a challenge, and the client either answers it correctly or doesn’t; either way, it has seen the challenge
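Something like this, as a minimal sketch of that exchange assuming an Anubis-style SHA-256 proof of work; the difficulty value and function names are made up for illustration, not Anubis's actual API:

```python
import hashlib
import secrets

DIFFICULTY_BITS = 16  # assumed difficulty; real deployments tune this

def issue_challenge() -> bytes:
    """Server side: hand the client a random challenge."""
    return secrets.token_bytes(16)

def solve(challenge: bytes, bits: int = DIFFICULTY_BITS) -> int:
    """Client side: brute-force a nonce so sha256(challenge || nonce)
    has `bits` leading zero bits."""
    target = 1 << (256 - bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, bits: int = DIFFICULTY_BITS) -> bool:
    """Server side: one hash to check what cost the client many."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - bits))

if __name__ == "__main__":
    c = issue_challenge()
    n = solve(c)
    print("client found nonce", n, "server accepts:", verify(c, n))
```

The asymmetry is the whole trick: the client burns many hashes to find the nonce, the server spends one hash to check it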
@mfed1122@tofu any client-side tech to keep (some of the) bots out is bound, as its popularity grows, to either be circumvented by the bots’ developers or be solved outright once the model behind the bot has picked up enough to do so
I don’t see how any of these are going to do better than a short-term patch
@rikudou@voxel
AFAIR it used to be even worse than that, because if you didn’t want SNI (for compatibility reasons or whatever) but still wanted a certificate, you had to have one server per hostname (because each needed its own IP), assuming you could afford the additional IP space
Granted, you didn’t need a physical server, but it was still a bigger cost
Some servers are more flexible on that front, but early SNI implementations didn’t have those options
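For contrast, this is roughly what SNI buys you today: one IP, one listening socket, and a certificate picked per hostname during the handshake. A minimal sketch using Python’s ssl module; the hostnames and cert/key paths are placeholders, not anything real:

```python
# One listening socket, one IP, multiple certificates: the server inspects the
# SNI hostname sent in the ClientHello and swaps in the matching context.
# Hostnames and cert/key paths below are placeholders.
import socket
import ssl

def make_ctx(cert: str, key: str) -> ssl.SSLContext:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(cert, key)
    return ctx

contexts = {
    "a.example.org": make_ctx("a.crt", "a.key"),
    "b.example.org": make_ctx("b.crt", "b.key"),
}
default_ctx = contexts["a.example.org"]

def pick_cert(sock, server_name, _ctx):
    # Called during the handshake; server_name is what the client asked for
    # (None if the client sent no SNI at all).
    sock.context = contexts.get(server_name, default_ctx)

default_ctx.sni_callback = pick_cert

with socket.create_server(("0.0.0.0", 8443)) as srv:
    with default_ctx.wrap_socket(srv, server_side=True) as tls_srv:
        conn, addr = tls_srv.accept()  # handshake runs here, per-host cert chosen
        conn.close()
```

Without that hostname hint arriving before the certificate is chosen, the IP address is the only thing you can key on, which is exactly why it used to cost an IP per hostname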