The Fifteenth Batch: Protecting Websites Against Aggressive Scrapers

You're seeing this page because the administrator of this website has taken steps to protect the server against aggressive scraping by AI companies. This measure, known as Anubis, is a compromise between security and accessibility.

Anubis uses a Proof-of-Work scheme inspired by Hashcash, a system originally proposed to reduce email spam. At an individual scale the added load is negligible, but for mass scrapers it becomes a significant obstacle. The goal of Anubis is not to eliminate scraper traffic entirely but to make mass scraping more expensive and less feasible for malicious actors.
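To make the idea concrete, here is a minimal sketch of a Hashcash-style proof of work in Go. It is not Anubis's actual implementation; the challenge format, the use of SHA-256, and the difficulty value are assumptions chosen for illustration. The client repeatedly hashes the challenge together with a counter until the result has enough leading zero bits, while the server can verify the answer with a single hash.

    package main

    import (
        "crypto/sha256"
        "encoding/binary"
        "fmt"
        "math/bits"
    )

    // leadingZeroBits counts the leading zero bits of a SHA-256 digest.
    func leadingZeroBits(sum [32]byte) int {
        n := 0
        for _, b := range sum {
            if b == 0 {
                n += 8
                continue
            }
            n += bits.LeadingZeros8(b)
            break
        }
        return n
    }

    // solve is the expensive client-side step: try nonces until the hash of
    // (challenge || nonce) has at least `difficulty` leading zero bits.
    func solve(challenge string, difficulty int) uint64 {
        buf := make([]byte, 8)
        for nonce := uint64(0); ; nonce++ {
            binary.BigEndian.PutUint64(buf, nonce)
            sum := sha256.Sum256(append([]byte(challenge), buf...))
            if leadingZeroBits(sum) >= difficulty {
                return nonce
            }
        }
    }

    // verify is the cheap server-side step: a single hash.
    func verify(challenge string, nonce uint64, difficulty int) bool {
        buf := make([]byte, 8)
        binary.BigEndian.PutUint64(buf, nonce)
        return leadingZeroBits(sha256.Sum256(append([]byte(challenge), buf...))) >= difficulty
    }

    func main() {
        challenge := "example-challenge-token" // hypothetical per-visitor token
        difficulty := 16                       // roughly 65,536 hashes on average
        nonce := solve(challenge, difficulty)
        fmt.Println("nonce:", nonce, "valid:", verify(challenge, nonce, difficulty))
    }

The asymmetry is the whole point: a single visitor pays the cost of one search, while a scraper fetching millions of pages pays it millions of times.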

While Anubis may seem like a hack, that is by design: it is a "good enough" placeholder that lets website administrators spend their time on fingerprinting and identifying headless browsers. That information can then be used to build more targeted security measures, so that the proof-of-work challenge page no longer needs to be shown to visitors who are likely to be legitimate users.
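As a rough illustration of where such fingerprinting could slot in, the sketch below shows a hypothetical Go HTTP middleware that presents the challenge only to clients that look automated. The specific signals used here (a "headless" marker in the User-Agent, a missing Accept-Language header) are assumptions for illustration only, not how Anubis actually decides; real fingerprinting relies on much richer signals.

    package main

    import (
        "net/http"
        "strings"
    )

    // looksLikeHeadlessClient applies a few illustrative heuristics. Real
    // fingerprinting looks at much richer signals (font rendering, TLS
    // fingerprints, and so on); these checks are placeholders.
    func looksLikeHeadlessClient(r *http.Request) bool {
        ua := strings.ToLower(r.UserAgent())
        if ua == "" || strings.Contains(ua, "headless") {
            return true
        }
        // Ordinary browsers almost always send an Accept-Language header.
        return r.Header.Get("Accept-Language") == ""
    }

    // challengeMiddleware shows the proof-of-work page only to suspicious
    // clients and lets everyone else straight through to the site.
    func challengeMiddleware(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if looksLikeHeadlessClient(r) {
                http.Error(w, "proof-of-work challenge required", http.StatusForbidden)
                return
            }
            next.ServeHTTP(w, r)
        })
    }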

However, Anubis comes with some caveats. It requires modern JavaScript features that plugins like JShelter will disable, so to pass the challenge, visitors must enable JavaScript and disable such plugins for this domain. Unfortunately, this is a necessary evil in today's digital landscape, where AI-powered scrapers have rewritten the rules of website hosting.

The introduction of Anubis and similar measures highlights the evolving nature of online threats. No single solution can eliminate scraper traffic outright, but Anubis represents an important step toward protecting websites from aggressive companies that seek to exploit their resources without permission.