The Unseen Struggle of Keeping Websites Safe from AI Scrapers

As you browse our website, you may notice a brief interruption or slowdown while pages load. This is not because we are experiencing technical difficulties; rather, our administrator has put an advanced security measure in place to protect the site against a growing threat: AI-powered web scrapers.

Anubis, the system behind this protection, uses a Proof-of-Work scheme similar to Hashcash, designed to deter mass scraping by making each automated request more expensive and time-consuming to issue. For an individual visitor the added load is negligible, but multiplied across the countless requests of automated scrapers it becomes a formidable barrier.
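To make the idea concrete, here is a minimal sketch of how a Hashcash-style proof-of-work challenge can be solved. It is an illustration only: the challenge string, the use of SHA-256, and the leading-zero-bits difficulty rule are assumptions made for the example, not Anubis's actual parameters or protocol.

```typescript
// Minimal sketch of a Hashcash-style proof-of-work (illustrative only).
import { createHash } from "node:crypto";

// Count how many leading zero bits a hex digest has.
function leadingZeroBits(hexDigest: string): number {
  let bits = 0;
  for (const ch of hexDigest) {
    const nibble = parseInt(ch, 16);
    if (nibble === 0) {
      bits += 4;
    } else {
      bits += Math.clz32(nibble) - 28; // leading zeros within the 4-bit nibble
      break;
    }
  }
  return bits;
}

// Search for a nonce such that SHA-256(challenge + nonce) has at least
// `difficulty` leading zero bits. Higher difficulty means more work.
function solveChallenge(challenge: string, difficulty: number): { nonce: number; hash: string } {
  for (let nonce = 0; ; nonce++) {
    const hash = createHash("sha256").update(`${challenge}${nonce}`).digest("hex");
    if (leadingZeroBits(hash) >= difficulty) {
      return { nonce, hash };
    }
  }
}

// Example: the server would issue a random challenge token and verify the
// returned nonce with a single hash, so checking is cheap while solving is costly.
const { nonce, hash } = solveChallenge("example-challenge-token", 16);
console.log(`nonce=${nonce} hash=${hash}`);
```

The asymmetry is the whole point: the client burns CPU searching for a valid nonce, while the server verifies the answer with one hash.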

Anubis is not intended to be a foolproof solution; it is a stopgap that buys time for more sophisticated methods of identification and fingerprinting. By using modern JavaScript features, the site can fingerprint visitors well enough that the proof-of-work challenge page does not need to be shown to users who are likely legitimate, while keeping malicious bots at bay.
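As an illustration of the kind of client-side signal such fingerprinting can draw on, here is a small browser-side sketch. It shows a generic headless-browser heuristic, not Anubis's actual checks; the specific signals (navigator.webdriver, plugin count, screen size) are assumptions chosen for the example.

```typescript
// Browser-side sketch of a simple headless-browser heuristic (illustrative only).
// Real systems combine many more signals and weigh them server-side.
function looksHeadless(): boolean {
  const signals: boolean[] = [
    // WebDriver-controlled browsers are required to expose this flag.
    navigator.webdriver === true,
    // Headless environments often report no installed plugins.
    navigator.plugins.length === 0,
    // A zero-size screen is another common artifact of headless rendering.
    window.screen.width === 0 || window.screen.height === 0,
  ];
  return signals.some(Boolean);
}

// A site might skip the proof-of-work page when no signals fire.
console.log(looksHeadless() ? "present challenge" : "likely a normal browser");
```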

However, this comes at a price: JavaScript must now be enabled to navigate the site effectively. Unfortunately, this means that plugins like JShelter will need to be disabled for this domain, as they interfere with the functionality Anubis relies on.

The situation highlights how AI companies have rewritten the social contract around website hosting, making it increasingly difficult to give legitimate users frictionless access to content without compromising security. A no-JS solution is still being developed, but until it arrives we must adapt and find ways to balance the need for security with user convenience.

In this context, Anubis serves as a reminder that online safety is an evolving challenge, one that requires constant vigilance and innovation from website administrators like us. By understanding the motivations behind these measures, we can work together toward a more secure web experience for all users.