Making Sense of the Anubis Challenge: Protecting Against AI Scrapers
You are seeing this page because the administrator of this website has taken steps to protect it against the threat of AI-powered scrapers. This measure, known as Anubis, is a compromise: it aims to keep the scourge of automated scrapers from overwhelming the websites we rely on.
Anubis is a proof-of-work scheme similar to Hashcash, designed to make large-scale website scraping slower and more expensive. For an individual visitor the extra work is a minor inconvenience, but across a mass scraping operation the cost adds up quickly. By adding that layer of complexity and expense, Anubis seeks to deter would-be scrapers and preserve the integrity of online content.
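To make the Hashcash-style idea concrete, here is a minimal sketch of such a proof of work in TypeScript, assuming a SHA-256 hash, a server-issued challenge string, and a difficulty expressed as a number of required leading zero hex digits. The names, parameters, and difficulty value are illustrative assumptions, not Anubis's actual implementation.

```typescript
// Minimal Hashcash-style proof-of-work sketch (illustrative, not Anubis's real code).
// The client searches for a nonce whose hash meets a difficulty target; the server
// verifies the result with a single hash. Cheap once, expensive at scraper scale.

async function sha256Hex(input: string): Promise<string> {
  const data = new TextEncoder().encode(input);
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Find a nonce such that SHA-256(challenge + ":" + nonce) starts with
// `difficulty` hex zeroes. Higher difficulty means exponentially more hashing.
async function solveChallenge(challenge: string, difficulty: number): Promise<number> {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const hash = await sha256Hex(`${challenge}:${nonce}`);
    if (hash.startsWith(target)) {
      return nonce; // the nonce is submitted back to the server for verification
    }
  }
}

// Hypothetical usage with a made-up challenge token:
solveChallenge("example-challenge-token", 4).then((nonce) => {
  console.log("solved with nonce", nonce);
});
```

The asymmetry is the point: verification costs the server one hash, while solving costs the client many, so the burden falls disproportionately on anyone issuing millions of requests.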
But what is Anubis's true purpose? Some might argue that it serves as a temporary solution, a placeholder that buys time while more sophisticated methods are developed. In reality, Anubis is part of a larger strategy to distinguish legitimate users from bots. By analyzing subtle cues in how a browser renders a page, such as font rendering, Anubis aims to build a more targeted challenge that is presented only to visitors who are likely to be scrapers, sparing those who are not.
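As a rough illustration of the font-rendering signal mentioned above, the sketch below measures text layout metrics in the browser, which can differ between full browsers and headless scraping stacks. The function name and probe string are hypothetical; this is a sketch of the general idea, not Anubis's actual fingerprinting code.

```typescript
// Hypothetical font-rendering probe (illustrative only).
// Text measurement depends on the installed fonts and the rendering engine,
// so the resulting width can serve as one weak signal among many.
function fontRenderingSignal(): string {
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  if (!ctx) return "no-canvas"; // headless or canvas-blocked environments stand out here
  ctx.font = "16px 'Times New Roman'";
  const width = ctx.measureText("Anubis fingerprint probe").width;
  return width.toFixed(3); // subtle differences hint at the underlying rendering stack
}
```

On its own such a measurement proves nothing; the value lies in combining many small signals so that only suspicious clients need to face the full proof-of-work challenge.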
However, there's a catch: Anubis requires modern JavaScript features that plugins like JShelter disable. If you're using one of these plugins, please disable it for this domain to proceed. Unfortunately, the challenge cannot be completed without JavaScript.
This website is running Anubis version 1.20.0. A no-JS solution is still a work in progress, so enabling JavaScript is currently necessary to pass the challenge. It's worth noting that the rise of AI-powered scraping has changed the social contract around how websites are hosted and accessed. Solutions like Anubis are part of an ongoing effort to adapt and protect our digital infrastructure.