Apache Web Server Forum

I have a little spider trap that I picked up here at WebmasterWorld and over time the number of IPs listed in the .htaccess file grows. I'm wondering how many can be blocked before the page load takes a noticeable hit?

I recommend that you go through the .htaccess lines generated by the script and look for patterns of repeated access from the same IP address ranges -- Sorting the lines by IP address helps to do this.
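One way to do that sorting from the shell (a sketch -- GNU sort's -V "version sort" happens to order dotted quads numerically, which plain lexical sort does not):

```shell
# Pull out the per-IP Deny lines and sort them by address.
# -k3 sorts on the third whitespace field (the IP itself);
# -V (GNU version sort) treats each dotted quad numerically,
# so 2.0.0.1 sorts before 10.0.0.2 instead of after it.
grep '^Deny from ' .htaccess | sort -k3 -V
```

Once they're sorted, runs of near-identical addresses jump right out at you.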

If you see a large number of IP addresses in the same range getting banned, then delete all of those lines and add a simple mod_access "Deny from" directive that blocks the entire range of addresses. You might block 256 IP addresses at a time, or block an ISP, or even block a whole country -- That is up to you, and depends on your site, its normal visitors, and its abusers. But, for example, I removed several dozen lines of code from my .htaccess file last year, and replaced them all with
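A before/after sketch of that consolidation, using made-up documentation addresses (192.0.2.x) and the same mod_access syntax:

```apache
# Before: a pile of single-IP bans the spider trap accumulated
#   Deny from 192.0.2.15
#   Deny from 192.0.2.17
#   Deny from 192.0.2.201

# After: one directive covering the whole /24 (256 addresses)
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
```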

Deny from 38.100.0.0/12

You'll need to look up the IP addresses in ARIN, RIPE, APNIC, LACNIC, AFRINIC, etc. to find out who/what they are, and how big of a range they are part of. If you're not sure, start small.
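On "how big of a range": a CIDR prefix of /nn covers 2^(32-nn) IPv4 addresses, so the shorter the prefix, the bigger the block. A quick shell sketch:

```shell
# Number of IPv4 addresses covered by a CIDR prefix: 2^(32 - prefix).
prefix=12
echo $(( 1 << (32 - prefix) ))
# A /12 spans 1048576 addresses; a /24 spans 256; a /32 is one host.
```

That's why it pays to start small -- a single short prefix can ban over a million addresses at once.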

Otherwise, you might just want to delete all those lines once a year, or once a month, or whenever the server slows down... :)

I recommend that you go through the .htaccess lines generated by the script and look for patterns of repeated access from the same IP address ranges -- Sorting the lines by IP address helps to do this.

I had a feeling this needed to be done, and now with your advice I will. It seems like the bots and log spammers just keep getting worse. Thanks!