The Battle Against Bot Scrapers

As you're reading this, you're likely wondering why you've been redirected to this page. The reason lies in the efforts of website administrators to protect their servers from the relentless scourge of AI-powered bot scrapers.

These automated programs have become a significant threat to websites worldwide, as they aggressively scrape content without permission or regard for user experience. In response to this growing concern, Anubis has been set up to safeguard the server against these malicious actors.

Anubis is a clever compromise: it employs a Proof-of-Work scheme in the vein of Hashcash, a method once proposed for reducing email spam. While it may seem counterintuitive, the extra computation is barely noticeable for an individual visitor, yet it adds up quickly at mass-scraping scale and makes large-scale scraping far more expensive. Ultimately, Anubis is meant as a "good enough" placeholder, buying website administrators time to work on identifying and fingerprinting headless browsers, for example by how they render fonts, so that legitimate visitors are challenged less often.
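To make that asymmetry concrete, here is a minimal sketch of a Hashcash-style proof of work in Python. It illustrates the general technique only, not Anubis's actual implementation; the use of SHA-256, the leading-zero-hex difficulty rule, and the solve/verify function names are assumptions made for the example.

```python
import hashlib
import secrets

def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so that SHA-256(challenge + nonce) starts with
    `difficulty` hexadecimal zeros: the expensive, client-side step."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Check a submitted nonce: the cheap, server-side step (one hash)."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

if __name__ == "__main__":
    challenge = secrets.token_hex(16)          # random per-visit challenge
    nonce = solve(challenge, difficulty=4)     # ~16**4 = 65,536 hashes on average
    assert verify(challenge, nonce, difficulty=4)
    print(f"challenge={challenge} nonce={nonce}")
```

At difficulty 4 the solver needs roughly 16^4 = 65,536 hash attempts on average, which one visitor's browser finishes almost instantly; a scraper issuing millions of requests has to pay that cost on every single one of them, while the server only recomputes one hash per submission to verify.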

For users, the consequences of this setup are twofold. Firstly, you'll need to enable JavaScript to pass the challenge page presented by Anubis. This is necessary because AI companies have changed the social contract around website hosting. A no-JS solution is still being worked on.

Secondly, Anubis relies on modern JavaScript features that plugins such as JShelter disable; please turn off such plugins for this domain so the challenge can complete and you can access the content without issues. This website is running Anubis version 1.21.3.

A Call to Action

We understand that dealing with bot scrapers and website security measures may seem daunting, but every effort counts in protecting online resources. By staying informed about these issues and taking steps to mitigate them, we can work together towards a safer online environment.