Protecting Against Bot Scrapers: The Story Behind Anubis

You are seeing this message because the administrator of this website has set up Anubis, a security measure designed to protect the server against the scourge of AI companies aggressively scraping websites. This is a necessary compromise in an effort to mitigate the impact of these malicious activities on our resources.

Anubis uses a Proof-of-Work scheme inspired by Hashcash, a system originally proposed to reduce email spam. The extra computation is barely noticeable for an individual user, but it becomes a significant deterrent at mass-scraper scale. The aim is to make scraping much more expensive and far less practical.
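The core idea can be sketched in a few lines. The following is an illustrative Hashcash-style scheme, not Anubis's actual implementation: the difficulty level, hash construction, and function names are assumptions for the sake of the example. The client must burn CPU searching for a valid nonce, while the server verifies the result with a single hash.

```python
import hashlib

# Number of leading zero hex digits the hash must have (assumed for illustration).
# Each extra digit multiplies the expected client-side work by 16.
DIFFICULTY = 4

def solve(challenge: str) -> int:
    """Brute-force a nonce so that SHA-256(challenge + nonce) meets the target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int) -> bool:
    """Verification costs one hash; solving costs ~16**DIFFICULTY hashes on average."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)
```

The asymmetry is the point: a legitimate visitor solves one challenge and moves on, while a scraper issuing millions of requests pays the solving cost millions of times.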

The true purpose of Anubis is to serve as a "good enough" placeholder: it buys time and resources for the harder work of fingerprinting and identifying headless browsers, a crucial step in determining whether or not a visitor is legitimate.

However, Anubis comes with its own set of trade-offs. It requires modern JavaScript features that plugins like JShelter will disable. Users are therefore advised to disable JShelter or similar plugins for this domain. Unfortunately, this means enabling JavaScript is mandatory to pass the challenge.

This development reflects a significant shift in the social contract around website hosting brought on by AI-powered scraping. A no-JS solution is still a work in progress, but the situation underscores the importance of adapting to these emerging threats.