
All this is true, but also somewhat irrelevant. In reality, the amount of actual hash work is completely negligible.

For usability reasons, Anubis only requires you to go through the proof-of-work flow once in a given period. (I think the default is once per week.) That's very little work.

Detecting that you occasionally need to send a request through a headless browser is far more of a hassle than the PoW. If you use LLMs rather than normal internet search, they'll probably consume far more compute as well.
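For scale, here is a minimal sketch of the kind of SHA-256 nonce search such a challenge boils down to. The challenge string and difficulty are made-up values, not Anubis's actual parameters; at four leading zero hex digits (16 bits) this is roughly 65,000 hashes, which finishes in milliseconds even on a phone:

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty_nibbles: int = 4) -> int:
    """Find a nonce so that sha256(challenge + nonce) starts with
    `difficulty_nibbles` hex zeros. Parameters here are illustrative."""
    prefix = "0" * difficulty_nibbles
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce

nonce = solve_pow("example-challenge")
# Verify the solution the same way a server-side check would.
ok = hashlib.sha256(f"example-challenge{nonce}".encode()).hexdigest().startswith("0000")
```

Amortized over a week of browsing, that one-time cost is effectively zero.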





> For usability reasons, Anubis only requires you to go through the proof-of-work flow once in a given period. (I think the default is once per week.) That's very little work.

If you keep cookies. I do not want to keep cookies for otherwise "stateless" sites. I have maybe a dozen sites whitelisted; every other site loses its cookies when I close the tab.


A bigger problem is that you should not have to enable JavaScript for otherwise static sites. If you enable JS, cookies are a relatively minor issue compared to all the other ways the website can keep state about you.

Well, that's not a problem when scraping. Most scraping libraries have ways to retain cookies.
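For example, even with nothing but the Python standard library a scraper can persist a solved-challenge cookie to disk and replay it for the whole validity window. The cookie name, domain, and one-week lifetime below are hypothetical, chosen only to illustrate the mechanism:

```python
import http.cookiejar
import os
import tempfile
import time

# Hypothetical solved-challenge cookie; name/domain/lifetime are made up.
jar = http.cookiejar.MozillaCookieJar()
cookie = http.cookiejar.Cookie(
    version=0, name="anubis-auth", value="example-token",
    port=None, port_specified=False,
    domain="example.com", domain_specified=True, domain_initial_dot=False,
    path="/", path_specified=True, secure=True,
    expires=int(time.time()) + 7 * 24 * 3600,  # e.g. one-week validity
    discard=False, comment=None, comment_url=None, rest={},
)
jar.set_cookie(cookie)

# Save to disk, then reload in a "new" scraper session.
path = os.path.join(tempfile.mkdtemp(), "cookies.txt")
jar.save(path, ignore_discard=True)

jar2 = http.cookiejar.MozillaCookieJar()
jar2.load(path, ignore_discard=True)
names = [c.name for c in jar2]
```

Higher-level libraries make this even easier (e.g. a session object that carries its cookie jar across requests), so the scraper pays the PoW once per validity period, just like a browser.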


