A Measure Against AI Scraping

The website you are viewing has taken measures to protect itself against an increasingly common threat: automated web scraping by artificial intelligence (AI) companies.

As a compromise, the administrator of this website has set up Anubis, a system that uses a Proof-of-Work scheme similar to Hashcash. The approach trades a small individual cost for a large collective one: in theory, the additional load Anubis places on a single visitor is negligible, but when mass scrapers converge on a site, the cumulative cost of solving a challenge for every client makes scraping prohibitively expensive.
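
To make the asymmetry concrete, here is a minimal sketch in TypeScript of a Hashcash-style proof of work, assuming a SHA-256 challenge in which the client must find a nonce whose digest begins with a given number of zero hex digits. The challenge format, difficulty rule, and function names are assumptions for illustration, not Anubis's actual protocol.

    // Minimal Hashcash-style proof-of-work sketch using the browser
    // Web Crypto API. The challenge string and leading-zero difficulty
    // rule are illustrative assumptions, not Anubis's wire format.
    async function sha256Hex(input: string): Promise<string> {
      const data = new TextEncoder().encode(input);
      const digest = await crypto.subtle.digest("SHA-256", data);
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

    // Brute-force nonces until the digest meets the difficulty target.
    async function solveChallenge(
      challenge: string,
      difficulty: number,
    ): Promise<number> {
      const prefix = "0".repeat(difficulty);
      for (let nonce = 0; ; nonce++) {
        if ((await sha256Hex(challenge + nonce)).startsWith(prefix)) {
          return nonce; // sent back to the server, which verifies with one hash
        }
      }
    }

A single visitor pays this cost once per session, while the server verifies the answer with a single hash; a scraper issuing millions of requests must repeat the expensive search millions of times, which is the asymmetry the scheme relies on.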

The true purpose of Anubis is not to prevent all web scraping but to buy time for its developers to refine their approach. With this "good enough" placeholder in place, they can focus on more sophisticated methods, such as fingerprinting and identifying headless browsers, so that the proof-of-work challenge page does not have to be shown to visitors who are very likely legitimate.
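
For a sense of what such fingerprinting might look like, here is a hypothetical sketch of simple headless-browser signals; these particular checks are illustrative assumptions, not Anubis's actual detection logic.

    // Hypothetical headless-browser heuristics; illustrative only.
    function looksHeadless(): boolean {
      // WebDriver-controlled browsers are required to set this flag.
      if (navigator.webdriver) {
        return true;
      }
      // Older headless Chrome builds advertise themselves in the user agent.
      if (/HeadlessChrome/.test(navigator.userAgent)) {
        return true;
      }
      // A browser reporting no preferred languages is unusual for real users.
      if (navigator.languages !== undefined && navigator.languages.length === 0) {
        return true;
      }
      return false;
    }

Real detection would combine many such signals, since any single one is easy for a determined scraper to spoof.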

However, Anubis has limitations. It relies on modern JavaScript features that some plugins, such as JShelter, disable. To pass the challenge, visitors must enable JavaScript in their browser and disable any plugins that block these features.
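
A challenge page in this position might first probe for the features it needs before attempting the proof of work. The sketch below shows one way to do that, assuming the challenge depends on the Web Crypto API; the specific checks are assumptions, not Anubis's actual requirements.

    // Hypothetical feature probe run before attempting the challenge;
    // the APIs tested are illustrative assumptions.
    function hasRequiredFeatures(): boolean {
      return (
        typeof crypto !== "undefined" &&
        typeof crypto.subtle !== "undefined" &&
        typeof crypto.subtle.digest === "function"
      );
    }

    if (!hasRequiredFeatures()) {
      // Tell the visitor to enable JavaScript or disable plugins
      // such as JShelter for this domain.
      console.warn("Required JavaScript features are unavailable.");
    }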

Unfortunately, this means that Anubis remains a work in progress and does not yet offer a no-JS option. AI companies have shifted the social contract around website hosting by pushing the boundaries of acceptable behavior, and the web community must adapt with measures like Anubis in response.