Protecting Against Bot Scrapers
You are seeing this page because the administrator of this website has taken steps to protect it against the scourge of AI companies aggressively scraping websites. This measure, known as Anubis, is a compromise between security and accessibility.
When an AI company or bot scrapes a website, it can cause significant downtime and make resources inaccessible to everyone. Anubis aims to prevent this by adding a layer of protection to the server. The system uses a Proof-of-Work scheme inspired by Hashcash, designed so that the cost of solving the challenge is negligible for an individual visitor but adds up quickly at mass-scraping scale.
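A Hashcash-style proof of work can be sketched roughly as follows: the server hands the client a challenge string and a difficulty, and the client must find a nonce whose hash has a certain number of leading zeroes. This is a minimal illustration of the general scheme, not Anubis's actual wire format; the challenge string and difficulty below are made up.

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(challenge + nonce), in hex,
    starts with `difficulty` zero digits (Hashcash-style PoW)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Illustrative values only: each extra hex digit of difficulty
# multiplies the expected work by 16, so one visitor pays a few
# milliseconds while a mass scraper pays that cost millions of times.
nonce = solve_challenge("example-challenge", 4)
```

Verification is cheap for the server: it hashes the challenge plus the submitted nonce once and checks the prefix, which is what makes the asymmetry work.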
The idea behind Anubis is that it provides a "good enough" placeholder solution, buying developers time to work on fingerprinting and identifying headless browsers so that the proof-of-work challenge page need only be shown to visitors who are less likely to be legitimate. In that sense it is admittedly a stopgap: a temporary barrier to AI scrapers while a more sophisticated solution is developed.
However, Anubis comes with some caveats. It requires modern JavaScript features, which plugins such as JShelter may disable; visitors should disable JShelter or similar plugins for this domain in order to proceed. The website is currently running Anubis version 1.20.0.
There is a catch: enabling JavaScript is mandatory to get past the challenge. AI companies have altered the social contract around how website hosting works, making it necessary for administrators to prioritize security over accessibility in this instance. A no-JS solution remains a work in progress at this time.