Protecting Against AI Scrapers

As you're reading this, you may be wondering why you've been redirected to this page in the first place. It's because the administrator of this website has set up a security measure called Anubis to protect our server from the threat of aggressive AI-powered web scrapers.

Anubis is a clever hack that uses a Proof-of-Work scheme, similar to Hashcash, to deter mass scrapers from targeting our site. The idea behind it is that at an individual scale, the additional load may be negligible, but when applied on a massive scale, it becomes prohibitively expensive for scrapers to continue their nefarious activities.
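Anubis's actual challenge format is its own, but the underlying Hashcash-style idea can be sketched in a few lines: the server hands the client a random challenge string and a difficulty, and the client must grind through nonces until a hash of the challenge meets the difficulty target. Verifying the answer costs the server a single hash; finding it costs the client many. The function names and the "leading hex zeroes" target below are illustrative assumptions, not Anubis's real protocol.

```python
import hashlib

def solve_pow(challenge: str, difficulty: int) -> int:
    """Grind nonces until sha256(challenge + nonce) starts with
    `difficulty` hex zeroes. Cheap for one visitor, expensive at
    scraping scale. (Illustrative sketch, not Anubis's actual scheme.)"""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify_pow(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server-side check: a single hash confirms the client did the work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_pow("example-challenge", 3)   # a few thousand hashes on average
assert verify_pow("example-challenge", nonce, 3)
```

Raising the difficulty by one hex digit multiplies the expected client work by sixteen while the server's verification cost stays constant, which is exactly the asymmetry that makes the scheme cheap for individual visitors and costly for mass scrapers.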

The real purpose of Anubis, however, goes beyond just deterring AI-powered scrapers. It's also designed to buy developers time to work on more advanced countermeasures, such as fingerprinting and identifying headless browsers, so that the JavaScript proof-of-work challenge doesn't need to be presented to users who are much more likely to be legitimate.

However, this comes with a caveat: Anubis requires modern JavaScript features that plugins like JShelter disable. To complete the challenge, you'll need to disable such plugins for this domain and make sure JavaScript is enabled. Unfortunately, there isn't a no-JS solution available at this time.

The situation has changed significantly since AI-powered web scraping became prevalent. The social contract around website hosting has shifted, and protecting sites from this kind of load is now a practical necessity. Anubis is not a perfect solution, but it's a step in the right direction towards a safer online environment.