Protecting against AI Scrapers: The Rise of Anubis
You are seeing this message because the administrator of this website has taken steps to protect their server against aggressive scraping by AI companies. The measure in place is Anubis, a proof-of-work scheme in the vein of Hashcash, which was originally proposed as a way to curb email spam. Anubis is a compromise: it aims to deter mass scrapers while still allowing legitimate users to access the site.
Anubis works on the principle that the extra computation is a negligible inconvenience for an individual visitor, but at scraping scale it adds up, making mass collection more expensive and less practical. The goal is to raise the cost for AI-powered scraper bots without making it impossible for human users to access the content. In essence, Anubis is a placeholder: a "good enough" stopgap that buys developers time to work on more effective methods of fingerprinting and identifying headless browsers, so that the challenge need not be shown to visitors who are likely legitimate.
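To make the economics concrete, below is a minimal sketch of a Hashcash-style proof-of-work exchange in TypeScript. It illustrates the general technique rather than Anubis's actual protocol: the SHA-256 hash, the leading-zero-bits difficulty rule, and the challenge token are assumptions made for this example. The asymmetry is the point: the client must try many nonces on average, while the server verifies any proposed solution with a single hash.

    import { createHash } from "node:crypto";

    // Count the leading zero bits of a hex digest.
    function leadingZeroBits(hex: string): number {
      let bits = 0;
      for (const ch of hex) {
        const nibble = parseInt(ch, 16);
        if (nibble === 0) { bits += 4; continue; }
        // Leading zeros within the first nonzero 4-bit nibble.
        bits += nibble < 2 ? 3 : nibble < 4 ? 2 : nibble < 8 ? 1 : 0;
        break;
      }
      return bits;
    }

    // Client side: brute-force a nonce until the hash of
    // challenge + nonce meets the difficulty target. Cheap once,
    // expensive when repeated across millions of requests.
    function solve(challenge: string, difficulty: number): number {
      for (let nonce = 0; ; nonce++) {
        const digest = createHash("sha256")
          .update(challenge + nonce)
          .digest("hex");
        if (leadingZeroBits(digest) >= difficulty) return nonce;
      }
    }

    // Server side: verifying a claimed solution costs one hash.
    function verify(challenge: string, nonce: number, difficulty: number): boolean {
      const digest = createHash("sha256")
        .update(challenge + nonce)
        .digest("hex");
      return leadingZeroBits(digest) >= difficulty;
    }

    // Hypothetical token; a real deployment would issue a fresh,
    // signed, per-session challenge from the server.
    const challenge = "example-challenge-token";
    const nonce = solve(challenge, 16); // ~65,536 hashes on average
    console.log(nonce, verify(challenge, nonce, 16)); // prints the nonce and `true`

Raising the difficulty by one bit doubles the client's average work while leaving the server's verification cost unchanged, which is why a site can tune the barrier against scrapers without penalizing itself.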
Anubis does come with limitations, however. It requires modern JavaScript features, which plugins like JShelter disable. Affected users are advised to disable such plugins, or to enable JavaScript, for the duration of their visit; a no-JS solution is still in development.
The introduction of Anubis reflects a shift in the social contract around website hosting and web scraping. AI companies now scrape at a scale and pace that site operators cannot absorb, making it necessary for websites to defend themselves with measures like Anubis. As technology continues to evolve, more sophisticated solutions will likely emerge; for now, Anubis is a pragmatic step toward protecting online resources.