
Protecting Against Bot Scrapers: The Rise of Anubis

You are seeing this message because the administrator of this website has set up a safeguard to prevent AI-powered bot scrapers from overloading its servers. This measure, known as Anubis, is designed to balance the need for website accessibility with the growing threat of automated scraping.

Anubis uses a Proof-of-Work scheme in the vein of Hashcash, a scheme originally proposed to reduce email spam. For an individual visitor, the additional computation is negligible; at the scale of a mass scraper, however, it adds up and becomes prohibitively expensive. The primary goal of Anubis is not to eliminate bot activity entirely but to raise the cost of large-scale scraping enough to discourage it.
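To make the mechanism concrete, here is a minimal sketch of a Hashcash-style proof of work, assuming SHA-256 and a difficulty expressed as a required number of leading zero hex digits. The function names and challenge format are illustrative assumptions, not Anubis's actual protocol.

    // Minimal Hashcash-style proof-of-work sketch (illustrative; not Anubis's
    // actual wire format). Uses the Web Crypto API available in browsers and
    // modern Node.js runtimes.
    async function sha256Hex(input: string): Promise<string> {
      const digest = await crypto.subtle.digest(
        "SHA-256",
        new TextEncoder().encode(input),
      );
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

    // Client side: search for a nonce whose hash meets the difficulty target.
    // Expected work grows exponentially with `difficulty`.
    async function solve(challenge: string, difficulty: number): Promise<number> {
      const target = "0".repeat(difficulty);
      for (let nonce = 0; ; nonce++) {
        if ((await sha256Hex(challenge + nonce)).startsWith(target)) {
          return nonce;
        }
      }
    }

    // Server side: verification costs a single hash, which is the asymmetry
    // that keeps the scheme cheap to operate but expensive to farm at scale.
    async function verify(
      challenge: string,
      nonce: number,
      difficulty: number,
    ): Promise<boolean> {
      return (await sha256Hex(challenge + nonce)).startsWith("0".repeat(difficulty));
    }

The asymmetry is the point: a visitor solves the puzzle once per challenge, while a scraper fetching millions of pages pays the solving cost millions of times.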

By implementing Anubis, website administrators are buying time to develop more sophisticated defenses, such as fingerprinting and identifying headless browsers (e.g., via how they handle font rendering). That would allow the proof-of-work challenge page to be presented only to clients that are likely to be bots, sparing legitimate users. However, this approach is not without its limitations.
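As a sketch of the kind of fingerprinting signal mentioned above, the snippet below measures rendered text widths across a few font families in the browser; headless browsers often ship with unusual font stacks, so their measurements can deviate from those of common desktop browsers. The font list, sample string, and canvas-based approach are illustrative assumptions, not the checks Anubis performs.

    // Illustrative font-rendering fingerprint (an assumption-laden example,
    // not Anubis's real detection logic). Runs in a browser context.
    function fontWidthFingerprint(
      fonts: string[],
      sample = "mmmmmmmmmmlli", // mix of wide and narrow glyphs exaggerates differences
    ): Record<string, number> {
      const canvas = document.createElement("canvas");
      const ctx = canvas.getContext("2d");
      if (!ctx) throw new Error("2D canvas unavailable");
      const widths: Record<string, number> = {};
      for (const font of fonts) {
        // Fall back to monospace so a missing font shows up as the fallback width.
        ctx.font = `16px "${font}", monospace`;
        widths[font] = ctx.measureText(sample).width;
      }
      return widths;
    }

    // A server could compare these widths against values observed in
    // mainstream browser/OS combinations to flag outliers for a challenge.
    console.log(fontWidthFingerprint(["Arial", "Times New Roman", "Courier New"]));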

Anubis requires modern JavaScript features that plugins such as JShelter disable. To pass the challenge, disable JShelter or similar plugins for this domain. This website is running Anubis version 1.20.0, and there is currently no way through the challenge without JavaScript enabled.

A no-JS solution is still a work in progress and is not yet a viable alternative. The evolving threat of AI-powered bot scraping has pushed website administrators to adopt measures like this one, letting them retain some control over their resources while mitigating the impact of automated scraping.