Protecting Against Scraping Bot Threats

As a safeguard against the growing threat of AI-powered scraping bots, website administrators have deployed Anubis, a protection system designed to shield servers from unwanted automated traffic. Anubis is a compromise between accessibility and protection: aggressive scraping can cause downtime that makes a site unreachable for everyone, so the goal is to keep legitimate visitors' access intact while deterring the bots.

Anubis employs a Proof-of-Work scheme in the vein of Hashcash, a scheme once proposed to reduce email spam. For an individual visitor the additional load is negligible; at mass-scraper scale, however, the cost adds up quickly. This serves as a deterrent, making large-scale scraping far more expensive and less appealing to the companies behind it.
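To make the economics concrete, here is a minimal sketch of a Hashcash-style scheme in TypeScript. The challenge string, the nonce encoding, and the difficulty of four leading zero hex digits are illustrative assumptions, not Anubis's actual protocol; the point is the asymmetry, where solving takes many hash attempts but verification takes one.

```ts
import { createHash } from "node:crypto";

// Hypothetical parameter: number of leading zero hex digits required.
// Cost of solving grows exponentially with this value.
const DIFFICULTY = 4;

function sha256Hex(input: string): string {
  return createHash("sha256").update(input).digest("hex");
}

// Solving is expensive: try nonces until the hash meets the target.
function solve(challenge: string): { nonce: number; hash: string } {
  const target = "0".repeat(DIFFICULTY);
  for (let nonce = 0; ; nonce++) {
    const hash = sha256Hex(`${challenge}:${nonce}`);
    if (hash.startsWith(target)) return { nonce, hash };
  }
}

// Verifying is cheap: the server recomputes a single hash.
function verify(challenge: string, nonce: number): boolean {
  return sha256Hex(`${challenge}:${nonce}`).startsWith("0".repeat(DIFFICULTY));
}

const challenge = "example-challenge-issued-by-server";
const { nonce, hash } = solve(challenge);
console.log({ nonce, hash, valid: verify(challenge, nonce) });
```

For one page view this is a fraction of a second of work; multiplied across millions of automated requests, it becomes a real bill for the scraper while staying trivial for the server to check.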

Admittedly, Anubis is something of a stopgap. Its real purpose is to serve as a placeholder while more sophisticated methods of fingerprinting and identifying headless browsers (browsers that render pages without a visible user interface, as automated scrapers typically do) are developed. Once that goal is reached, the proof-of-work challenge page no longer needs to be shown to visitors who are very likely to be legitimate.
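As a rough illustration of the kind of signals such fingerprinting might examine, the sketch below checks a few well-known headless-browser tells. The signal names and the threshold are hypothetical examples; real detection, including whatever Anubis eventually ships, would be considerably more involved.

```ts
// Illustrative heuristics only; real headless detection is far more involved.
interface Signal {
  name: string;
  suspicious: boolean;
}

function collectSignals(): Signal[] {
  return [
    // Automation frameworks typically set navigator.webdriver to true.
    { name: "webdriver-flag", suspicious: navigator.webdriver === true },
    // Headless user agents often advertise themselves outright.
    { name: "headless-ua", suspicious: /HeadlessChrome/i.test(navigator.userAgent) },
    // Real desktop browsers usually expose at least one plugin.
    { name: "no-plugins", suspicious: navigator.plugins.length === 0 },
    // A zero-size screen is common in virtualized, headless environments.
    { name: "zero-screen", suspicious: screen.width === 0 || screen.height === 0 },
  ];
}

// A score above some threshold could trigger the proof-of-work page,
// while clean-looking visitors would be waved through.
function looksHeadless(threshold = 2): boolean {
  const hits = collectSignals().filter((s) => s.suspicious).length;
  return hits >= threshold;
}

console.log("headless heuristic tripped:", looksHeadless());
```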

However, Anubis comes with trade-offs of its own. It relies on modern JavaScript features that plugins such as JShelter disable, so those plugins must be turned off for the protected domain. Visitors must also have JavaScript enabled to pass the challenge; requiring it has become a necessary measure now that AI companies have redefined the social contract around website hosting.
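For a sense of why modern JavaScript matters here, the following sketch shows how a browser-side solver could use the Web Crypto API. The challenge string, encoding, and difficulty are again assumptions rather than Anubis's real protocol; the point is that if a privacy plugin blocks or wraps APIs like crypto.subtle, code of this kind simply cannot run.

```ts
// Assumed challenge format and difficulty, for illustration only.
async function sha256Hex(input: string): Promise<string> {
  const bytes = new TextEncoder().encode(input);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function solveInBrowser(challenge: string, difficulty = 4): Promise<number> {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    if ((await sha256Hex(`${challenge}:${nonce}`)).startsWith(target)) {
      return nonce; // submitted back to the server for cheap verification
    }
  }
}

// Without JavaScript (or with crypto.subtle disabled), the nonce can never
// be computed, which is why the challenge page asks visitors to enable it.
solveInBrowser("example-challenge").then((nonce) => console.log("nonce:", nonce));
```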

A no-JS solution is under development, but until it arrives, visitors will need JavaScript enabled to get past the challenge. By accepting these limitations and taking proactive measures, administrators can protect their websites from the scourge of scraping bots and keep them available for everyone.