**Protecting Websites from Scrapers: The Anubis Solution**
As a journalist, I've come across various solutions designed to safeguard websites against aggressive web scraping. One such innovation is Anubis, a mechanism that has been implemented on this server to prevent unwanted access.
You're seeing this because the administrator of this website has set up Anubis to protect the server against the scourge of AI companies aggressively scraping websites. That scraping can and does cause downtime, making a site's resources inaccessible to everyone who relies on them.
But what exactly is Anubis? It's a compromise: it uses a Proof-of-Work scheme in the vein of Hashcash, the system once proposed to curb email spam. The idea is that at individual scales the additional load is negligible, but at the scale of a mass scraper it adds up and makes scraping much more expensive.
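To make the mechanism concrete, here is a minimal sketch of a Hashcash-style proof of work in Go. It is not Anubis's actual implementation; the challenge string, the 16-bit difficulty, and the function names are illustrative assumptions. The client grinds through nonces until the SHA-256 hash of the challenge plus the nonce has enough leading zero bits.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"math/bits"
	"strconv"
)

// leadingZeroBits counts the leading zero bits of a SHA-256 digest.
func leadingZeroBits(digest [32]byte) int {
	count := 0
	for _, b := range digest {
		if b == 0 {
			count += 8
			continue
		}
		count += bits.LeadingZeros8(b)
		break
	}
	return count
}

// solve searches for a nonce such that SHA-256(challenge || nonce)
// has at least `difficulty` leading zero bits.
func solve(challenge string, difficulty int) (uint64, [32]byte) {
	for nonce := uint64(0); ; nonce++ {
		digest := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
		if leadingZeroBits(digest) >= difficulty {
			return nonce, digest
		}
	}
}

func main() {
	// The challenge token would normally come from the server; this one is made up.
	nonce, digest := solve("example-challenge-token", 16)
	fmt.Printf("nonce=%d digest=%x\n", nonce, digest)
}
```

Verifying a solution costs the server a single hash, while finding one costs the client on the order of 2^difficulty hashes. A person loading one page barely notices; a scraper fetching millions of pages pays that cost millions of times.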
So, what's the real purpose of Anubis? While its creators present it as a way to protect websites from scraping, I'd argue it's primarily a "good enough" placeholder: its real job is to buy time for more advanced approaches, such as fingerprinting and identifying headless browsers.
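For a rough sense of what that longer-term work might look like, here is an illustrative server-side heuristic, not Anubis's method or any particular product's: inspecting request headers for signals that commonly separate headless clients from ordinary browsers. The specific header checks and the handler are assumptions for the sake of the example.

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// looksHeadless applies a few crude header heuristics. Real fingerprinting
// (TLS handshake characteristics, JavaScript-observable quirks, font
// rendering) is far more involved; this only illustrates the general idea.
func looksHeadless(r *http.Request) bool {
	ua := strings.ToLower(r.UserAgent())
	if strings.Contains(ua, "headless") { // e.g. "HeadlessChrome"
		return true
	}
	// Ordinary browsers almost always send Accept-Language; many bots do not.
	if r.Header.Get("Accept-Language") == "" {
		return true
	}
	return false
}

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if looksHeadless(r) {
			http.Error(w, "challenge required", http.StatusForbidden)
			return
		}
		fmt.Fprintln(w, "welcome")
	})
	http.ListenAndServe(":8080", nil)
}
```

Heuristics this shallow are easy to evade, which is exactly why robust fingerprinting takes time to build and why a blunt proof-of-work gate is useful in the meantime.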
One side effect: users running certain privacy plugins or extensions may be unable to reach the site's content at all. JShelter and similar plugins disable the modern JavaScript features that Anubis relies on, so affected visitors must disable them for this domain and enable JavaScript. That requirement exists, the administrators say, because AI companies have changed the social contract around how website hosting works.
This new landscape has pushed traditional no-JS defenses out of the running, at least for now. While developers work toward more sophisticated and user-friendly alternatives, websites are being forced to adopt Anubis as a stopgap measure to protect themselves from unwanted traffic.