Making Sure You're Not a Bot: The Rise of Anubis

Web scraping has long been contentious, and AI companies have made it more so by aggressively harvesting data from websites without permission. To push back against this, website administrators have turned to a tool called Anubis, a deliberately pragmatic hack designed to protect servers from mass scraping.

What is Anubis?

Anubis uses a Proof-of-Work scheme in the vein of Hashcash, a proof-of-work system originally proposed to reduce email spam. The idea is that at an individual scale the additional load is negligible, but at mass-scraper volumes the cost adds up, making large-scale scraping prohibitively expensive and thus deterring it.
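The asymmetry at the heart of a Hashcash-style scheme can be sketched in a few lines. This is an illustrative Python sketch, not Anubis's actual implementation: the function names, the challenge string, and the "leading hex zeroes" difficulty rule are all assumptions chosen for clarity. The prover must brute-force many hashes; the verifier checks the result with a single hash.

```python
import hashlib


def solve(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 digest starts with `difficulty` hex zeroes.

    Expected work grows as 16**difficulty hash attempts -- cheap for one
    visitor, expensive when multiplied across millions of scraper requests.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Verification costs a single hash, so the server's overhead stays tiny."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

In a deployment, the server would issue a fresh per-request challenge and accept the response only if `verify` passes, so precomputed answers cannot be replayed across requests.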

The Real Purpose of Anubis

While Anubis may look like a straightforward bot filter, its authors are candid that it is a stopgap. The proof-of-work challenge is a "good enough" placeholder whose real purpose is to buy time for work on fingerprinting and identifying headless browsers, so that eventually the challenge page need not be shown to visitors who are very likely legitimate humans.

The Catch: Modern JavaScript Features

Anubis requires modern JavaScript features that privacy plugins such as JShelter may disable. To complete the challenge, visitors must enable JavaScript and exempt the protected domain from such plugins. This is an unfortunate compromise, but one its authors argue is necessary now that AI companies have changed the social contract around how website hosting works.

A No-JS Solution: Still a Work-in-Progress

While Anubis offers a promising defense against bot scraping, it remains a work in progress. A no-JS fallback is still under development, which means visitors who browse without JavaScript are locked out for now; shipping that fallback is the path to a more equitable experience for them.