Rot13 a challenge string. It could be any arbitrary function.
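
A minimal sketch of what that could look like (Python; the helper names and token format are placeholders, and any deterministic function would work just as well):

    import codecs
    import secrets

    def make_challenge() -> str:
        # Placeholder: a random token the client must transform.
        return secrets.token_hex(16)

    def expected_response(challenge: str) -> str:
        # rot13 of the challenge; any other cheap, deterministic function would do.
        return codecs.encode(challenge, "rot_13")

    def verify(challenge: str, response: str) -> bool:
        # Constant-time comparison of the client's answer against the expected one.
        return secrets.compare_digest(response, expected_response(challenge))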

That wouldn’t have the fallback rate-limiting functionality. It’s too cheap.

It’s already too cheap as a rate limiter, if you read TFA.

That's a configurable setting.

There’s no possible setting that makes it expensive enough to deter AI scrapers while preserving an acceptable user experience. The more leading zeros you require, the more real users suffer, while datacenter-hosted scrapers barely notice the extra work.
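
For a rough sense of the scaling (a sketch with assumed hash rates, not measured numbers): each extra required hex zero multiplies the expected work by 16 for everyone, but the per-hash cost gap between a phone's in-browser JS and a scraper farm's native code means the farm never feels it.

    # Sketch: expected hash attempts grow as 16**zeros for a hex-zero
    # prefix target. Hash rates below are assumptions for illustration only.
    HASHES_PER_SEC_PHONE = 200_000       # assumed in-browser JS hashing rate
    HASHES_PER_SEC_FARM = 50_000_000     # assumed native, parallel hashing rate

    for zeros in range(3, 8):
        attempts = 16 ** zeros
        print(f"{zeros} zeros: ~{attempts / HASHES_PER_SEC_PHONE:.1f}s on a phone, "
              f"~{attempts / HASHES_PER_SEC_FARM:.3f}s for a scraper farm")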

Real users suffer much more if the site is entirely down due to being DDoSed by aggressive AI scrapers.

Yeah, and if this tool doesn’t stop them then the site is down anyway.

Right. The choice is presumably between:

Bad: A site being usable for a significant amount of time per day, but also unusable for a significant amount of time per day, and the ratio between usable and unusable time per day significantly deteriorating.

Worse: A site being usable for a significant amount of time per day, but also unusable for a significant amount of time per day, and the ratio between usable and unusable time per day significantly deteriorating _significantly faster_.

Clearly, Anubis is at best an interim measure. The interim period might not be significant.

But it might be. That is presumably the point of Anubis.

That said, the only time I've heard of Anubis being tried was when Perl's MetaCPAN became ever more unusable over the summer. [0]

Unfortunately Anubis and Fastly fought, and Fastly won. [1]

----

[0] https://www.perl.com/article/metacpan-traffic-crisis/

[1] https://www.reddit.com/r/perl/comments/1mbzrjo/metacpans_tra...



