Protecting Against Bot Scrapers

The growing threat of AI-powered bots that scour websites for valuable information is hard to ignore. To combat this scourge, website administrators have deployed Anubis, a Proof-of-Work scheme designed to deter mass scrapers and protect their servers from exploitation.

Anubis operates on the principle that a proof-of-work challenge is an insignificant load for an individual visitor, but at the scale of a mass scraper the cost adds up quickly. By making each request more expensive to issue, the challenge deters AI-powered bots from hammering the websites it protects.
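To make the mechanism concrete, here is a minimal sketch of how a hashcash-style challenge might be solved in the browser. The function names, the choice of SHA-256 via the Web Crypto API, and the leading-zero-bit difficulty test are illustrative assumptions, not Anubis's actual implementation.

```typescript
// Hashcash-style proof of work: find a nonce such that
// SHA-256(challenge + nonce) has at least `difficulty` leading zero bits.
// `challenge` and `difficulty` would be handed out by the server.
async function solveChallenge(challenge: string, difficulty: number): Promise<number> {
  const encoder = new TextEncoder();
  for (let nonce = 0; ; nonce++) {
    const data = encoder.encode(challenge + nonce);
    const digest = new Uint8Array(await crypto.subtle.digest("SHA-256", data));
    if (leadingZeroBits(digest) >= difficulty) {
      return nonce; // submitted back to the server as the proof
    }
  }
}

// Count the leading zero bits of a digest.
function leadingZeroBits(bytes: Uint8Array): number {
  let bits = 0;
  for (const b of bytes) {
    if (b === 0) { bits += 8; continue; }
    bits += Math.clz32(b) - 24; // clz32 counts within a 32-bit word
    break;
  }
  return bits;
}
```

Each extra bit of difficulty roughly doubles the expected number of hashes needed to find a valid nonce, which is where the asymmetry against mass scrapers comes from.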

Anubis's proof-of-work scheme is in the vein of Hashcash, which was originally proposed as a way to reduce email spam; the idea has been adapted and refined here for website protection. Its stated goal is deliberately modest: to provide a "good enough" placeholder solution, buying time for more advanced fingerprinting techniques that let administrators tell legitimate users apart from headless browsers.
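Part of what makes a Hashcash-style scheme practical as a placeholder is that verification is cheap: the server recomputes a single hash, while the client had to try many. A minimal verification sketch, under the same hypothetical parameters as the browser-side example above:

```typescript
import { createHash } from "node:crypto";

// Server-side check of a submitted nonce: one hash, versus the many the
// client had to try. Parameters mirror the hypothetical client sketch;
// Anubis's real protocol may differ.
function verifySolution(challenge: string, nonce: number, difficulty: number): boolean {
  const digest = createHash("sha256").update(challenge + String(nonce)).digest();
  let bits = 0;
  for (const b of digest) {
    if (b === 0) { bits += 8; continue; }
    bits += Math.clz32(b) - 24;
    break;
  }
  return bits >= difficulty;
}
```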

However, Anubis requires modern JavaScript features that plugins like JShelter disable. To pass the challenge, users must disable such plugins for the site or otherwise allow its JavaScript to run. This is an unfortunate but necessary compromise, a consequence of how AI companies have changed the social contract around website hosting. A no-JS solution remains a work in progress.

By acknowledging these challenges and deploying measures like Anubis, website administrators can take proactive steps to safeguard their servers against AI-powered bot scrapers. While this may require users to disable certain plugins or allow JavaScript to run, protecting website resources outweighs the temporary inconvenience.