I am hosting a public heavy CPU web service, and I would like to configure Apache2 to only allow 1 simultaneous connection per IP address to the location at hand, to prevent single clients from using too much resources.

Is there a neat Apache2 solution for this? I have looked into mod_bw, but it does not seem to do the trick (its MaxConnections setting applies to all users combined, not per IP). There is also a module called apache2-mod-limitipconn, but it has no precompiled packages and I think it is no longer maintained, as the website is dead. I would prefer something that I can include as a formal dependency on Ubuntu.

4 Answers

From my experience it's a little more complicated than just limiting "one connection per IP" I'm afraid (isn't it always :-)).

There are, however, other configurable parameters such as MinSpareServers, MaxSpareServers and StartServers (I'm not sure whether these still exist, but they were certainly in Apache 2.0).
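For illustration, those directives belong to the prefork MPM configuration; the values below are placeholders for tuning, not recommendations:

```apache
# Hypothetical prefork MPM tuning (example values only)
<IfModule mpm_prefork_module>
    StartServers          5    # child processes started at boot
    MinSpareServers       5    # keep at least this many idle children
    MaxSpareServers      10    # reap idle children above this count
    MaxClients          150    # global cap on simultaneous connections
</IfModule>
```

Lowering MaxClients caps total concurrency across all clients, which limits worst-case CPU load even though it is not a per-IP limit.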

Also, apparently compressing the output with mod_deflate can reduce bandwidth needs and speed up response times by up to 75%, if that helps.
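A minimal mod_deflate setup (assuming the module is enabled, e.g. via a2enmod deflate) might look like this; the MIME-type list is just an example:

```apache
# Compress common text responses; binary formats are left untouched
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css \
                                  text/xml application/javascript
</IfModule>
```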

"mod_bandwidth" may offer some assistance rather than trying to limit a single connection and is probably the route I'd go down after looking at compression.

Or if you can write C then you could create a module yourself to count connections per IP.

The thing to watch out for is what happens once an IP has been served its single connection. Let's just say it's usually not a pretty sight! You can drop to a "Server Busy" page or similar, but I'd go down the route of optimising your server so it's as good as it can be, rather than frustrating visitors.

My "mod-bandwidth" comment was put because by giving a small amount of bandwidth to each user (eg 32kbps) you're effectively limiting how much CPU and overall resource they can use.
– Jonathan Ross, Mar 28 '11 at 6:12

However, as mentioned in other questions, be careful: this rule might block some legitimate robots (like the Google crawler) or ISPs/organisations that use NAT and share a single IP address among a large number of users.

The problem with using the firewall is that it takes a while for the firewall to notice that the connection is gone. So if the user runs a couple of short jobs, each over in a couple of seconds, the firewall will keep blocking incoming requests for e.g. 300 seconds.
– Jeroen, Mar 30 '11 at 22:34

I used 'sudo iptables -I INPUT 4 -p tcp --syn --dport 80 -m connlimit --connlimit-above 5 -j REJECT --reject-with tcp-reset' and then opened 8 tabs to the same page on the apache server and it failed. What am I missing?
– flickerfly, Oct 3 '13 at 17:05

One connection per IP address is not going to work. A web browser will use one connection to download the web page, then 10+ simultaneous connections to fetch all the images, CSS, JavaScript files, etc. So if you do limit by IP, the user will get the main page, maybe a few images, and that is all.

The only use case where limiting by IP works is a dedicated download server on which you don't want people using download accelerators (RapidShare, for example).

You need to look into how the abusers are abusing your services and target them specifically. If you limit everyone, then everyone is going to get hurt.

If it's a case of simply too much traffic, then you need to optimise the site or add more CPU cycles with more/faster hardware.

I am indeed using dedicated servers for this web service, so that is not a problem. I specifically want restrictions on both the number of connections per IP and the total number of active connections.
– Jeroen, Mar 30 '11 at 22:32