Chaotic, intensive requests place a heavy load on servers and transport channels, noticeably slowing a site down. By scanning, attackers copy a site's content and probe for weaknesses in its defenses, causing significant damage, and the requests generated during scanning degrade performance on their own. Slowdowns most often affect large portals with high traffic, but small sites are not immune: even with low traffic, a site can come under heavy load from the many robots that constantly scan it. Under such load, the site may slow down dramatically or become unavailable altogether.
I have implemented one method of protecting a site from scanning and chaotic, intensive requests: count the requests received within a given time interval and impose a time delay once a set threshold is exceeded.
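The idea can be sketched as a simple per-client rate limiter. This is a minimal illustration, not the author's actual implementation: the class name, parameters, and the choice of a sliding window of timestamps are all my assumptions. It keeps recent request timestamps for each client, and when the count inside the window exceeds the threshold, it returns the delay the server should apply before answering.

```python
import time
from collections import defaultdict, deque


class RequestThrottle:
    """Sketch of the technique described above (hypothetical helper):
    count requests per client in a sliding time window and return a
    delay to apply once the threshold is exceeded."""

    def __init__(self, max_requests=10, window_seconds=1.0, delay_seconds=2.0):
        self.max_requests = max_requests      # threshold per window
        self.window = window_seconds          # length of the counting interval
        self.delay = delay_seconds            # penalty delay when exceeded
        self.history = defaultdict(deque)     # client_id -> request timestamps

    def check(self, client_id, now=None):
        """Record one request; return the delay in seconds (0.0 if none)."""
        now = time.monotonic() if now is None else now
        q = self.history[client_id]
        q.append(now)
        # Discard timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        # Too many requests inside the window -> slow this client down.
        if len(q) > self.max_requests:
            return self.delay
        return 0.0
```

A request handler would call `check()` with the client's IP address and `time.sleep()` for the returned number of seconds before serving the response; well-behaved visitors never hit the threshold, while a scanner firing bursts of requests is throttled.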