**The Battle Against Bot Scrapers: Understanding Anubis and its Impact on Websites**

As a journalist, I've come across numerous stories about websites being taken down or suffering downtime because of aggressive scraping by AI companies. One response to that problem is Anubis, a security measure designed to shield servers from this unwanted traffic. In this article, we'll take a closer look at what Anubis is and how it affects websites like this one.

**What is Anubis?**

Anubis is a proof-of-work scheme in the vein of Hashcash, a proposed solution for reducing email spam. The idea behind it is simple: at an individual scale, the extra load from the challenge page is negligible, but for a mass scraper hitting thousands of pages it makes crawling significantly more expensive. In essence, Anubis acts as a deterrent, raising the cost of bulk scraping rather than trying to identify and block every bad request outright.

**How does Anubis work?**

When you visit a website protected by Anubis, like this one, you're first shown a challenge page, and the site requires JavaScript to be enabled to get past it. Your browser solves a small proof-of-work puzzle and, once it submits a valid answer, you're passed through to the page you originally requested. That brief computation is the whole trick: it's a rounding error for one person reading one page, but a real expense for a bot trying to ingest an entire site.
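
To make the cost asymmetry concrete, here is a minimal sketch of a Hashcash-style proof of work in Go. It assumes SHA-256 and a difficulty expressed as a number of leading zero hex digits; Anubis's actual challenge format, parameters, and implementation may well differ, so treat this purely as an illustration of the mechanism.

```go
// Minimal Hashcash-style proof-of-work sketch (illustrative only; not
// Anubis's actual challenge format).
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// solve searches for a nonce such that SHA-256(challenge + nonce) starts with
// `difficulty` zero hex digits. This is the work the visitor's browser does once.
func solve(challenge string, difficulty int) (int, string) {
	prefix := strings.Repeat("0", difficulty)
	for nonce := 0; ; nonce++ {
		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
		digest := hex.EncodeToString(sum[:])
		if strings.HasPrefix(digest, prefix) {
			return nonce, digest
		}
	}
}

// verify recomputes a single hash, so checking an answer stays cheap for the server.
func verify(challenge string, nonce int, difficulty int) bool {
	sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
	return strings.HasPrefix(hex.EncodeToString(sum[:]), strings.Repeat("0", difficulty))
}

func main() {
	nonce, digest := solve("example-challenge", 4)
	fmt.Printf("nonce=%d digest=%s valid=%v\n", nonce, digest, verify("example-challenge", nonce, 4))
}
```

The asymmetry is the point: finding a valid nonce takes many hash attempts, while checking one takes a single hash, so the server's cost per visitor stays flat even as the difficulty, and the scraper's bill, goes up.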

**The purpose of Anubis**

While Anubis may look like a blunt instrument, it's best understood as a stopgap. Its real purpose is to buy time for work on fingerprinting and identifying headless browsers, which are notoriously difficult to detect, so that eventually the challenge page no longer has to be shown to visitors who are very likely to be legitimate. Until that work matures, the proof-of-work page serves as a "good enough" placeholder.

**Challenges and limitations**

That said, Anubis comes with real trade-offs. It requires modern JavaScript features that privacy plugins like JShelter will disable, so affected visitors have to turn those plugins off for the protected domain before the challenge can run. Visitors who browse with JavaScript disabled entirely are currently shut out as well: the implementation remains a work in progress, and a no-JS solution is still in development.

**The social contract around website hosting**

Lastly, it's worth acknowledging that AI companies have altered the social contract around website hosting. Webmasters used to be able to serve their pages openly without layering on extra defenses, because automated traffic stayed at a manageable level. With aggressive scrapers ignoring those old norms, the burden now falls on individual websites to protect themselves, with tools like Anubis standing in front of every visitor.

In conclusion, Anubis is a compromise that affects not just this website but many others. It imposes a small cost on every visitor in order to impose a much larger one on mass scrapers, and it exists because the alternative for many sites has been going down under the load. As tools like it are implemented and improved, it's worth keeping the broader implications in mind, both for the people who host websites and for the social contract that once made such measures unnecessary.