**Protecting Against AI Scrapers: The Compromise of Anubis**

The world of web hosting has shifted significantly in recent years, as AI companies have become increasingly aggressive in scraping websites for data. In response to this scourge, the administrator of this website has deployed Anubis, a server-side protection system designed to turn away these unwanted visitors.

**What is Anubis?**

Anubis uses a Proof-of-Work scheme in the vein of Hashcash, a system originally proposed to reduce email spam. In simple terms, Anubis makes websites more expensive to scrape by requiring each visitor to perform a small amount of computation before content is served. The compromise is that this cost is negligible at individual scales but becomes prohibitively expensive for large-scale scraper operations.
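To make the idea concrete, here is a minimal sketch of a Hashcash-style proof of work in TypeScript. This illustrates the general technique, not Anubis's actual implementation: the challenge string, the difficulty parameter, and the use of leading zero hex digits are all assumptions.

```typescript
// Minimal Hashcash-style proof-of-work sketch (illustrative, not Anubis's real code).
// Uses the Web Crypto API available in modern browsers and recent Node.js.

async function sha256Hex(input: string): Promise<string> {
  const data = new TextEncoder().encode(input);
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Find a nonce such that sha256(challenge + nonce) begins with
// `difficulty` zero hex digits. Both parameters are assumed here.
async function solveChallenge(challenge: string, difficulty: number): Promise<number> {
  const prefix = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const hash = await sha256Hex(challenge + nonce);
    if (hash.startsWith(prefix)) {
      return nonce; // submitted back to the server as proof of work
    }
  }
}
```

Each additional zero digit of difficulty multiplies the expected work by sixteen, so an operator can tune how expensive the site is to fetch at scale while a single page load stays cheap.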

**How Does Anubis Work?**

Anubis sits in front of the protected site and presents a challenge page to visitors before serving them content. To pass the challenge, the visitor's browser must complete a small computational task that relies on modern JavaScript features. This is where things get tricky: plugins like JShelter, which restrict or disable JavaScript for security reasons, will prevent the Anubis challenge from completing.
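The other half of the exchange is cheap by design. Here is a minimal verification sketch, assuming the same hypothetical protocol as above (Anubis itself is written in Go; the TypeScript below is illustrative, not its real API):

```typescript
import { createHash } from "node:crypto";

// Verify a submitted proof of work (illustrative; the real Anubis
// protocol and token format may differ).
function verifyChallenge(challenge: string, nonce: number, difficulty: number): boolean {
  const hash = createHash("sha256").update(challenge + nonce).digest("hex");
  return hash.startsWith("0".repeat(difficulty));
}

// On success, a server would typically issue a signed cookie so the
// visitor is not re-challenged on every request (hypothetical flow).
```

The asymmetry is the point: the client may grind through thousands of hash attempts, but the server verifies the answer with a single hash.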

**The Catch: JavaScript Is Required**

Unfortunately, this means that browsing with JavaScript disabled is no longer an option here. AI companies have effectively changed the social contract around website hosting, and legitimate visitors must now enable JavaScript to access content. A no-JS solution is still under development; in the meantime, Anubis aims to keep sites reachable for the people who actually intend to use them while shutting out unwanted visitors.

**A Temporarily Necessary Measure**

Anubis is admittedly something of a hack. Its real purpose is to provide a "good enough" placeholder so that more time can be spent on fingerprinting and identifying headless browsers, the goal being that the proof-of-work challenge page eventually no longer needs to be shown to visitors who are very likely legitimate. Until that detection matures, every visitor must pass through this added layer of protection before reaching the content they came for.
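As a rough illustration of the kind of signal such fingerprinting might start from, here are two naive client-side checks. Both are trivially spoofable and are assumptions about the approach rather than Anubis's method; serious detection relies on subtler fingerprints such as font-rendering quirks.

```typescript
// Two naive client-side signals (illustrative only; both trivially spoofable).
// Real detection relies on subtler fingerprints, e.g. font rendering behavior.
function looksLikeHeadlessBrowser(): boolean {
  // WebDriver-driven browsers are required to expose this flag.
  if (navigator.webdriver === true) return true;
  // Headless Chrome advertises itself in the user agent by default.
  if (navigator.userAgent.includes("HeadlessChrome")) return true;
  return false;
}
```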

**Conclusion**

Anubis represents a necessary step in protecting websites against AI-powered scrapers and unwanted visitors. As the web landscape continues to evolve, it's essential for website administrators to stay vigilant and adapt to emerging threats. In the meantime, users who value uninterrupted access to their favorite websites will need to navigate Anubis's added layers of protection.