This will set 30 requests per minute. However, what if I wanted to stack a second rule, so the logic is "30 requests per minute, or 1 request per second, whichever comes first"?
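For reference, the single-rule setup being described would look roughly like this (a sketch assuming nginx's `limit_req` module; the zone name is just an example):

```nginx
# One zone, keyed on the client IP, capped at 30 requests per minute.
limit_req_zone $binary_remote_addr zone=permin:10m rate=30r/m;

server {
    location /search {
        limit_req zone=permin;
    }
}
```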

The reason for this is that we still want a cap of 30/min, but we're seeing some odd attempts that spam 10 requests within a 1-2 second period. Requests coming through that fast cause some issues. So we would like to limit users to 1 request per second, AND a cap of 30 per minute.

I read the documentation but didn't see much about stacking different rate rules.

I don't need multiple zones. I need multiple rules on the same zone.
– emmdee, Jul 25 at 1:20

From how I understand your question, multiple zones are precisely what you want. One zone is one rule. Are you possibly missing the fact that you may define multiple zones for the same key?
– anx, Jul 25 at 1:34

1 Answer

It is perfectly valid to apply multiple zones to the same block. You may also create multiple zones that store requests for the very same key (typically some prefix of the client IP address). This costs only minimal processing and memory.

The following will limit users, identified by their network address, at both one-second and one-minute intervals, and allow them to temporarily exceed the defined limit for short bursts. You want to allow that, as it will make your website appear less broken for users with unreliable input devices and/or network access, who might issue the same request twice:
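A configuration along these lines does that (a sketch; the zone names and burst values are examples to adjust for your traffic):

```nginx
# Two zones storing requests for the same key (the client address),
# one enforcing the per-second rate and one the per-minute cap.
limit_req_zone $binary_remote_addr zone=persec:10m rate=1r/s;
limit_req_zone $binary_remote_addr zone=permin:10m rate=30r/m;

server {
    location /search {
        # Both limits apply; whichever is exceeded first rejects the request.
        # "burst" queues short spikes instead of rejecting them outright,
        # and "nodelay" serves the queued requests immediately.
        limit_req zone=persec burst=5 nodelay;
        limit_req zone=permin burst=10 nodelay;
    }
}
```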

That being said, if your web server cannot handle a short burst of search requests, you may need to offload your search engine to another machine or apply a different kind of rate limit. You need a significant margin between typical human usage and usage that exceeds your machine's ability to keep up. Otherwise you might not find a safe value between annoying legitimate users and failing to restrict abusive/bot usage.

Appreciate the concise, direct answer + example + extra details making it a well-rounded solution to the question. Also, no worries about annoying legit users, as the limits imposed are way outside a normal human-use scenario; we're getting several requests per second, and no bot should be traversing /search... (it's denied in robots). Again, appreciate the input and information.
– emmdee, Jul 25 at 3:43