**The Battle Against AI Scrapers: Understanding Anubis**

In an effort to protect websites from aggressive scraping by AI companies' bots, administrators have deployed a security measure known as Anubis. This clever hack adds a layer of protection in front of web servers, making it much harder for mass scrapers to harvest content.

The idea behind Anubis is simple yet effective: it uses a proof-of-work scheme in the vein of Hashcash, the system originally proposed to curb email spam. At the scale of an individual visitor, the additional computational load is negligible, but for mass scrapers the cost adds up quickly and makes scraping at scale far more expensive.
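To make the economics concrete, here is a minimal sketch of a Hashcash-style proof of work in Go, assuming a SHA-256 hash and a leading-zero-bit difficulty target. The challenge string, difficulty value, and function names are illustrative, not Anubis's actual implementation:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"math/bits"
	"strconv"
)

// leadingZeroBits counts the zero bits at the front of a digest.
func leadingZeroBits(digest [32]byte) int {
	n := 0
	for _, b := range digest {
		if b == 0 {
			n += 8
			continue
		}
		n += bits.LeadingZeros8(b)
		break
	}
	return n
}

// solve searches for a nonce whose hash meets the difficulty target.
// Each extra difficulty bit roughly doubles the expected work.
func solve(challenge string, difficulty int) uint64 {
	for nonce := uint64(0); ; nonce++ {
		digest := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
		if leadingZeroBits(digest) >= difficulty {
			return nonce
		}
	}
}

// verify recomputes a single hash, so the server's cost is constant.
func verify(challenge string, nonce uint64, difficulty int) bool {
	digest := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
	return leadingZeroBits(digest) >= difficulty
}

func main() {
	challenge := "example-challenge-token" // would be issued per-visit by the server
	difficulty := 16                       // ~65k hashes expected on average

	nonce := solve(challenge, difficulty)
	fmt.Printf("nonce=%d valid=%v\n", nonce, verify(challenge, nonce, difficulty))
}
```

This asymmetry is the whole point: the server verifies a solution with one hash, while the client pays the search cost. Solving once per visit is unnoticeable; solving millions of times across a site crawl is not.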

The true purpose of Anubis, however, goes beyond deterring AI scrapers outright. It is a "good enough" placeholder that buys time for work on more precise techniques, such as fingerprinting and identifying headless browsers (e.g., by how they handle font rendering). The eventual goal is for the proof-of-work challenge page to be presented only to clients that are likely to be bots, sparing visitors who are much more likely to be legitimate.
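For illustration, such gating might look something like the following hypothetical Go middleware, which uses a crude User-Agent heuristic as a stand-in for real fingerprinting. The rule list and handler names are invented for this sketch and are not Anubis's policy format:

```go
package main

import (
	"log"
	"net/http"
	"strings"
)

// suspiciousAgents is a toy stand-in for a real bot-likelihood signal.
var suspiciousAgents = []string{"python-requests", "curl", "Headless"}

func likelyBot(r *http.Request) bool {
	ua := r.Header.Get("User-Agent")
	if ua == "" {
		return true // a missing User-Agent is itself a strong signal
	}
	for _, marker := range suspiciousAgents {
		if strings.Contains(ua, marker) {
			return true
		}
	}
	return false
}

// challengeGate serves the proof-of-work page only to suspect clients,
// letting likely-legitimate visitors straight through to the site.
func challengeGate(next, challenge http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if likelyBot(r) {
			challenge.ServeHTTP(w, r)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	site := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("protected content\n"))
	})
	challenge := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("solve this proof of work first\n")) // placeholder page
	})
	log.Fatal(http.ListenAndServe(":8080", challengeGate(site, challenge)))
}
```

The better the fingerprinting signal, the fewer legitimate visitors ever see the challenge page at all.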

However, Anubis comes with its own set of requirements. To pass the challenge, visitors must enable JavaScript, a necessary evil given how the check works. Unfortunately, plugins like JShelter, which disable the modern JavaScript features the challenge relies on, are not compatible with Anubis version 1.20.0.

This means that users who rely on such plugins will need to disable them for protected sites in order to pass the challenge. A no-JS solution is still in the works; in the meantime, administrators should weigh the benefits of Anubis against these limitations and plan for more robust measures in the future.

As we continue to navigate the complex landscape of online security, solutions like Anubis remind us that even a relatively simple mechanism can have a significant impact on protecting our digital resources. By understanding how these measures work, we can better defend against the ever-evolving threats of the digital age.