Merge branch 'rs/merge-compact-summary': The Battle Against AI Scrapers

You are seeing this message because the administrator of this website has taken proactive measures to protect it from AI companies aggressively scraping its content. The system in place, known as Anubis, safeguards the server against this abusive traffic, albeit at a cost to visitors.

Anubis uses a Proof-of-Work scheme reminiscent of Hashcash, which was proposed as a way to reduce email spam. The cost of solving a single challenge is negligible for an individual visitor, but at the scale of a mass scraping operation it becomes prohibitively expensive. The idea behind Anubis is to make scraping more costly and difficult for AI companies, thereby creating a barrier that encourages them to explore alternative methods.
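To make the Hashcash-style idea concrete, here is a minimal sketch in Go of how such a scheme could work: the client must find a nonce whose SHA-256 digest, computed over a server-issued token plus the nonce, starts with a given number of zero characters, while the server verifies the answer with a single hash. The function names, the difficulty value, and the hexadecimal zero prefix are illustrative assumptions, not Anubis's actual implementation.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// solve searches for a nonce such that SHA-256(challenge + nonce) begins
// with `difficulty` hexadecimal zeroes. The client must burn CPU to find it.
func solve(challenge string, difficulty int) (int, string) {
	prefix := strings.Repeat("0", difficulty)
	for nonce := 0; ; nonce++ {
		sum := sha256.Sum256([]byte(fmt.Sprintf("%s%d", challenge, nonce)))
		digest := hex.EncodeToString(sum[:])
		if strings.HasPrefix(digest, prefix) {
			return nonce, digest
		}
	}
}

// verify is the cheap server-side check: one hash and a prefix comparison.
func verify(challenge string, nonce int, difficulty int) bool {
	sum := sha256.Sum256([]byte(fmt.Sprintf("%s%d", challenge, nonce)))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	challenge := "example-challenge-token" // would be issued per visitor by the server
	nonce, digest := solve(challenge, 4)
	fmt.Printf("nonce=%d digest=%s valid=%v\n", nonce, digest, verify(challenge, nonce, 4))
}
```

The asymmetry is the point: each additional hexadecimal zero of difficulty multiplies the expected search work by sixteen, while the server's verification cost stays at a single hash, so individual visits stay cheap and mass scraping gets expensive.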

"This is a hack of sorts," admits the administrator, "intended as a temporary solution to allow more time to be devoted to fingerprinting and identifying headless browsers. By doing so, we can create an environment where users are presented with a challenge-proof work page only when they are most likely to be legitimate visitors." This approach highlights the evolving nature of the online landscape and the need for websites to adapt to protect themselves against increasingly sophisticated threats.

However, Anubis comes with its own set of requirements. Because it relies on modern JavaScript features, plugins such as JShelter will disable them. To get past the challenge, users are advised to disable JShelter or similar plugins for this domain. This caveat underscores the delicate balance between protecting a website's content and preserving the user experience.

The Anubis version currently in use is 1.20.0, and JavaScript must be enabled to complete the proof-of-work challenge page. Unfortunately, this requirement exists because AI companies have altered the social contract around how website hosting works, and a no-JS solution remains a work in progress.