Protecting Against Bot Scrapers: The Anubis Solution

You are seeing this page because the administrator of this website has taken measures to protect its server against aggressive scraping by AI companies. This protection, known as Anubis, is a compromise: it aims to deter abusive bots while still allowing legitimate users to access the site.

Anubis employs a proof-of-work scheme in the vein of Hashcash, a method once proposed to combat email spam: the client must perform a small amount of computation before the page is served. At an individual scale this extra load is negligible, but at mass-scraper scale it adds up, making large-scale scraping of the website considerably more expensive.

That said, Anubis is less a complete defense than a stopgap. Its real purpose is to provide a "good enough" placeholder so that effort can go into more sophisticated ways of identifying legitimate users, such as fingerprinting and headless browser detection, with the eventual goal of showing the proof-of-work challenge page only to visitors who are unlikely to be genuine.
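As a rough illustration of what headless-browser heuristics can look like (a hypothetical sketch, not Anubis's actual checks), a script might inspect a navigator-like object for telltale automation signals such as the `navigator.webdriver` flag or an empty plugin list:

```javascript
// Hypothetical heuristics; Anubis's real detection is not shown here.
// Takes a navigator-like object so the checks can run outside a browser.
function headlessSignals(nav) {
  const signals = [];
  // Per the WebDriver spec, automated browsers set navigator.webdriver.
  if (nav.webdriver === true) signals.push('webdriver flag set');
  // Ordinary browsers report preferred languages; bare automation often doesn't.
  if (!nav.languages || nav.languages.length === 0) signals.push('no languages');
  // Some headless setups ship with an empty plugin list.
  if (nav.plugins && nav.plugins.length === 0) signals.push('no plugins');
  return signals;
}

// Mocked automated browser vs. mocked ordinary browser.
console.log(headlessSignals({ webdriver: true, languages: [], plugins: [] }));
console.log(headlessSignals({ webdriver: false, languages: ['en-US'], plugins: { length: 3 } }));
```

No single signal is conclusive, which is why such checks are combined with fingerprinting rather than used alone.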

Anubis requires modern JavaScript features, which plugins like JShelter disable. To pass the challenge, you must disable JShelter or similar plugins for this domain. Unfortunately, you must also have JavaScript enabled to proceed, as a no-JS solution is still in development.

The rise of AI companies has dramatically altered the social contract around website hosting, with Anubis being one of the responses to this new landscape. By employing this compromise, the website's administrators hope to strike a balance between protecting their site and allowing legitimate users to access it while awaiting more robust solutions.