The Anubis Dilemma: Protecting Websites from AI Scrapers Through Compromise

You are seeing this message because the administrator of this website has set up Anubis, a security measure designed to protect their server against the scourge of AI companies aggressively scraping websites. This system is a compromise: it requires users like you to complete a challenge before accessing the website's content.

Anubis operates on a proof-of-work scheme, similar to Hashcash, a proof-of-work system proposed to curb email spam. The idea behind Anubis is that while the added load is negligible for an individual visitor, it becomes prohibitively expensive at the scale of mass scraping, thereby discouraging scrapers from targeting the website.
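To make the asymmetry concrete, here is a minimal Hashcash-style sketch in Python. It is illustrative only: the function names, the use of SHA-256, and the "leading hex zeros" difficulty rule are assumptions for demonstration, not Anubis's actual protocol.

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Find a nonce so that SHA-256(challenge + nonce) begins with
    `difficulty` hexadecimal zeros. The solver must try many nonces;
    this is the work the visitor's browser performs.
    (Hypothetical construction, not Anubis's real wire format.)"""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Verification costs a single hash, so the server stays cheap
    while each solving attempt stays expensive for mass scrapers."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_challenge("example-challenge", 4)
assert verify("example-challenge", nonce, 4)
```

One visitor solving one challenge pays a fraction of a second of CPU time; a scraper fetching millions of pages pays that cost millions of times, which is the entire point of the scheme.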

However, it's worth noting that Anubis is a placeholder solution in the grand scheme of things. Its primary function is to buy time for the developers to refine other methods, such as fingerprinting and identifying headless browsers, which AI companies increasingly use to scrape websites. Once those checks come first, the proof-of-work challenge page won't need to be presented to visitors who are likely legitimate.

Unfortunately, Anubis requires modern JavaScript features that plugins like JShelter may disable. To proceed through the challenge, you must either disable such plugins for this domain or enable JavaScript. This requirement is an unfortunate consequence of the shifting social contract around website hosting; a solution that works without JavaScript remains a work in progress.

Despite these requirements, Anubis remains a vital measure for protecting websites from AI scrapers. While it's not a perfect solution, it represents an effort to strike a balance between security and accessibility in the digital landscape.