If you have server monitoring or Google Analytics on your site and you see hits/second or pages/second climbing without a matching increase in visitors… then there’s probably one bot or another crawling your site (I’ve noticed that robots.txt is not always respected…), or, as in my case, somebody requesting 500 or more pages in 3 seconds from one specific IP address. Not accusing anybody of attacking my servers…

This area is called onerps and is allocated 10 MB of storage. Using the $binary_remote_addr variable reduces the size of each state to 64 bytes. Since a 1 MB zone can hold about 16,000 states, 10 MB allows for about 160,000 states. This should be sufficient for your visitors.
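Putting those parameters together, the zone would be declared in the http {} context. The exact line isn’t shown above, so treat this as a sketch matching the stated values (zone name onerps, 10 MB, keyed by client IP, 1 request per second):

```nginx
http {
    # One state per client IP ($binary_remote_addr), stored in a
    # 10 MB shared-memory zone named "onerps", at 1 request/second.
    limit_req_zone $binary_remote_addr zone=onerps:10m rate=1r/s;
}
```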

The connection rate is limited to one request per second. You can also use something like 30r/m, which puts a rate limit of 30 requests per minute in place.
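For example, the same kind of zone declared with a per-minute rate would look like this (the zone name onermin is hypothetical, just to distinguish it from onerps):

```nginx
# Hypothetical variant: 30 requests per minute instead of 1 per second.
# nginx enforces this as roughly one request every 2 seconds.
limit_req_zone $binary_remote_addr zone=onermin:10m rate=30r/m;
```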

We’re not finished yet: to put this limit to work, we use the limit_req directive. You can use this directive in http {}, server {}, and location {} containers.

However, the best option is to use it in the location {} containers that pass requests to your app servers, e.g. PHP-FPM. Otherwise, if you load a single page with lots of images, CSS, and JavaScript files, you would probably exceed the given rate limit with a single page request :-p

So let’s put this in a location ~ \.php$ {} container:


limit_req zone=onerps burst=5;
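In context, a minimal PHP location block might look like the following. The fastcgi_* lines are assumptions for illustration (socket path, index file); adapt them to your own PHP-FPM setup:

```nginx
location ~ \.php$ {
    # Apply the rate limit only to requests that reach the app server,
    # allowing a short burst of up to 5 queued requests.
    limit_req zone=onerps burst=5;

    # Assumed PHP-FPM upstream; adjust the socket/address to your setup.
    fastcgi_pass unix:/var/run/php-fpm.sock;
    fastcgi_index index.php;
    include fastcgi_params;
}
```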

If your visitor makes more than 1 request per second (or more than 5 in burst mode), the next request is queued and executed the next second. If the number of waiting requests exceeds burst, the request is terminated with the code 503 “Service Temporarily Unavailable”. By default, burst is zero.

If delaying excess requests within a burst is not necessary, you should use the option nodelay:
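The directive from above then simply gains the nodelay flag:

```nginx
# Excess requests within the burst are served immediately instead of
# being spaced out; anything beyond the burst still gets a 503.
limit_req zone=onerps burst=5 nodelay;
```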