There’s no possible setting that makes the challenge expensive enough to deter AI scrapers while preserving an acceptable user experience. The more leading zeros you require, the more real users suffer, while datacenter-hosted scrapers are barely inconvenienced.
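To make the scaling concrete, here is a minimal sketch of the generic leading-zero proof-of-work scheme that Anubis-style challenges are built on, assuming SHA-256 and a difficulty counted in leading hex zeros; the function and parameter names are illustrative, not Anubis’s actual code or API. Each extra zero multiplies the expected number of hash attempts by roughly 16, and that cost lands hardest on the slowest clients.

```python
# Illustrative sketch only: a generic leading-zero proof-of-work loop,
# assuming SHA-256 and a difficulty counted in hex zeros (not Anubis's
# actual code or API). Expected work is ~16**difficulty hash attempts,
# so each extra zero multiplies the average client's cost by ~16.
import hashlib
import itertools
import time

def solve(challenge: str, difficulty: int) -> tuple[int, float]:
    """Find a nonce such that sha256(challenge + nonce) starts with
    `difficulty` hex zeros; return (nonce, seconds spent)."""
    target = "0" * difficulty
    start = time.perf_counter()
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, time.perf_counter() - start

# A phone doing this in interpreted JavaScript might manage far fewer
# hashes per second than a scraper running native code on many cores,
# so any difficulty that inconveniences the scraper stalls the phone first.
if __name__ == "__main__":
    for d in range(1, 6):
        nonce, secs = solve("example-challenge", d)
        print(f"difficulty {d}: {nonce + 1} attempts, {secs:.3f}s")
```

The asymmetry is that a real user pays the 16x-per-zero factor in wall-clock seconds on one page load, while a scraper pays it in CPU time it can parallelize and amortize across requests.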
Bad: a site that is usable for a significant part of the day but also unusable for a significant part of it, with the ratio of usable to unusable time significantly deteriorating.
Worse: the same, but with that ratio deteriorating _significantly faster_.
Clearly, Anubis is at best an interim measure. The interim period might not be significant.
But it might be. That is presumably the point of Anubis.
That said, the only time I've heard of Anubis being tried was when Perl's MetaCPAN grew increasingly unusable over the summer. [0]
Unfortunately, Anubis and Fastly fought, and Fastly won. [1]