Mascot: The trusted reference standard for protein identification by mass spectrometry for 25 years

Posted by Web Master (March 27, 2026)

Mitigations against uncontrolled web crawlers and bots

The server running matrixscience.com is being overwhelmed by millions of HTTP requests from distributed botnets, which amounts to a distributed denial of service (DDoS) attack. Under this load, matrixscience.com has been periodically unavailable this week, as the server does not have the hardware resources to cope.

We are now forced to restrict access to certain pages of dynamic content:

  • Protein Family Summary reports of example searches shipped with Mascot
  • Protein View reports of example searches shipped with Mascot
  • Peptide View reports of example searches shipped with Mascot

If you submit a database search to the free Mascot service, your own search results are not restricted. The restrictions above are designed to impact only bot activity.

The full explanation is below, and if you have any questions, please contact us by email.

Why are these pages unavailable?

We use the Robots Exclusion Protocol (robots.txt) to indicate areas of the website that should not be accessed by automated bots. This is a long-standing mechanism to mitigate server overload, widely recognised as an industry standard.
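To illustrate how the protocol is meant to work, here is a minimal sketch of a well-behaved crawler consulting robots.txt before fetching a page, using Python's standard-library parser. The rules and paths shown are hypothetical, for illustration only; they are not Matrix Science's actual robots.txt.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt excluding dynamic report pages.
# (Illustrative paths, not the real matrixscience.com rules.)
robots_txt = """\
User-agent: *
Disallow: /cgi/protein_view.pl
Disallow: /cgi/peptide_view.pl
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler calls can_fetch() before requesting each URL
# and skips any URL the site has disallowed.
print(parser.can_fetch("*", "https://example.com/cgi/protein_view.pl"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

A crawler that honours the protocol never even requests the disallowed dynamic pages, so they generate no server load; the bots described below simply skip this check.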

Since 2024, the world’s largest tech companies have been using large-scale web crawlers to repeatedly and regularly hoover up every page on the public Internet, including websites operated by small companies like Matrix Science. Unlike crawlers operated by search engine companies, these new crawlers (or bots) have been designed to deliberately ignore robots.txt. Because the bots follow every link, they are causing massive processing demand on this server.

Worse, the bots have been stripped of any identifying features, and they are operated from botnets of 10,000 or more IP addresses. There is no practical way to automatically detect or block them. So, we have no choice but to make some dynamic pages unavailable to everyone.

Who is responsible?

We are being hit by distributed botnets run by the world’s largest tech companies. However, because the bots have been stripped of any identifying information, we don’t know who is operating them.

What can be done?

The best thing you can do is contact your elected representative. Lobby them to force governments to regulate tech companies. Lobby them to force these companies to honour the Robots Exclusion Protocol (robots.txt) with any automated web crawler or bot they create or operate.

Comments are closed.