For the past 2 months I have been building a PHP web application using the following technologies:

PHP 5.3.4

MongoDB

MySQL

I just got my dedicated server running Ubuntu 10.04 LTS x64 with the following hardware:

100Mbit networking speed

120GB SSD

16GB RAM @ 1600 MHz

AMD FX 6100 six-core CPU @ 1400 MHz per core

Linux kernel 2.6.33 (to support SSD TRIM)

Ubuntu 10.04 LTS x64

The static content for the web application totals 200 KB (uncompressed, unminified).
The application needs to scale; it could see a lot of traffic right from launch.

Now I have a few questions:

What should I configure in terms of DDoS protection? I have no competitors and the project is underground and unknown, so what should I consider? There are so many options floating around, such as an Nginx module or iptables. Which of these would actually help, and how do I configure them?

How can I calculate the bandwidth, and how much traffic can the server handle?


1) There is no real protection against DDoS. Disable keepalive, but the rest should be your hoster's business. 2) It depends on your visitor count and your application (I can't really advise you there; you will see in production). The theoretical maximum traffic is about 31 TB on a 100 Mbit (one-way) connection ;)
– zaub3r3r Mar 8 '12 at 22:50

@zaub3r3r What do you base that 31 TB figure on? Is that per month? What do you mean by one-way? And is there a way to test my server's network speed?
– Mike Vercoelen Mar 8 '12 at 23:00

100 Mbit ≈ 12 MB/sec, and at full speed for 30 days that's ~31 TB (that's just a rough calculation, not the point). By one-way I mean this is only the down traffic (toward the client); there will of course also be some up traffic (from user to server, e.g. the requests). You can measure your network speed with test files (just search for "server speedtest file"). This is not an accurate measurement, but you will never get a perfectly accurate one anyway.
– zaub3r3r Mar 8 '12 at 23:15
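The arithmetic behind the ~31 TB figure can be verified with a quick shell sketch. Using the exact 12.5 MB/s (rather than the rounded-down 12 MB/s in the comment) gives ~32.4 TB per 30-day month:

```shell
# Monthly transfer ceiling of a 100 Mbit/s link, saturated one-way.
# 100 Mbit/s = 12.5 MB/s; the comment's "12mb/sec" is rounded down,
# which is where its ~31 TB comes from.
bytes_per_sec=$((100 * 1000 * 1000 / 8))     # 12,500,000 B/s
seconds_per_month=$((60 * 60 * 24 * 30))     # 2,592,000 s
total_bytes=$((bytes_per_sec * seconds_per_month))
echo "$total_bytes" | awk '{ printf "%.1f TB/month\n", $1 / 1e12 }'
# -> 32.4 TB/month
```

This is a theoretical ceiling; real throughput will be lower once protocol overhead and uneven load are accounted for.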

2 Answers

Protecting against DDoS can be difficult, because much of the time your ISP runs out of resources (stateful firewalls or bandwidth) and will sinkhole your IP block.

If you're expecting DDoS, choose an ISP that offers DDoS prevention and knows how to deal with high traffic loads. Ask them for examples and for the systems they have in place that can help. Turning you off is not an answer.

Be prepared to block the netblocks of aggressive DDoS machines, or indeed entire DDoS network blocks, with iptables or your own upstream firewall.
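A minimal sketch of the iptables side of this: generate the DROP rules from a list of offending netblocks (the CIDR ranges below are RFC 5737 placeholders; substitute the real offenders from your logs). Printing the rules instead of executing them lets you review before piping to a root shell:

```shell
# Placeholder netblocks -- replace with the ranges actually attacking you.
BAD_NETS="203.0.113.0/24 198.51.100.0/24"

# Emit one DROP rule per netblock; inserted at the top of INPUT so they
# match before any accept rules.
drop_rules=$(for net in $BAD_NETS; do
    printf 'iptables -I INPUT -s %s -j DROP\n' "$net"
done)
printf '%s\n' "$drop_rules"
```

Once reviewed, the output can be run as root (`printf '%s\n' "$drop_rules" | sudo sh`).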

Consider using inbound/outbound QoS to share outbound bandwidth fairly among clients.

Consider splitting the database, application logic, and web serving onto different hardware.

Consider a load balancer with some beefy caching nodes to soak up small attacks. But beware: entering a resource war with your attacker is not the best idea; they will win! :-(
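Since the question mentions an "Nginx mod": soaking up small floods at the frontend can also be done with nginx's stock limit_req module, which throttles requests per client IP. A sketch with illustrative values (tune the rate and burst to your real traffic):

```nginx
# Per-IP request throttling -- values here are examples, not recommendations.
http {
    # 10 MB shared zone keyed by client address, 10 requests/sec allowed.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;
        location / {
            # Allow short bursts of 20 extra requests, then reject with 503.
            limit_req zone=perip burst=20 nodelay;
            # proxy_pass / fastcgi_pass for the PHP app goes here
        }
    }
}
```

This only helps against small, unsophisticated floods; a real volumetric attack saturates the pipe before nginx ever sees the packets.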

Consider adding a caching layer between your application and the database, which should keep load off your database server for repeat requests.

If the DDoS is targeting static content rather than web scripts that require database resources, consider a CDN (e.g. CloudFlare) that hides your actual IP address from the rest of the internet and helps distribute load geographically. Your users get content delivered faster, and as a side effect you get some DDoS protection.

If you don't need UDP, get your ISP to block that traffic at their border. If you only need ports 80 and 443, get your ISP to block everything else at their network perimeter. If your ISP doesn't know what UDP or ports are, get a new one... :-).
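If the ISP can't filter at the border, an approximation of the same policy can run locally. A sketch that generates a default-drop allow-list ruleset for review (port 22 is kept so you don't lock yourself out over SSH; again, print first, execute as root only after checking):

```shell
# TCP ports to keep open: SSH for admin, 80/443 for the web app.
ALLOW_TCP_PORTS="22 80 443"

fw_rules=$(
    # Default-deny inbound, then allow replies to established connections.
    echo "iptables -P INPUT DROP"
    echo "iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT"
    # One ACCEPT rule per allowed TCP service port.
    for p in $ALLOW_TCP_PORTS; do
        printf 'iptables -A INPUT -p tcp --dport %s -j ACCEPT\n' "$p"
    done
    # Explicit UDP drop, per the advice above.
    echo "iptables -A INPUT -p udp -j DROP"
)
printf '%s\n' "$fw_rules"
```

This is local filtering only: the packets still traverse your uplink, so it protects CPU and state tables, not bandwidth.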

Host your DNS on separate infrastructure, with someone large who can deal with DDoS. If you must host DNS yourself, place it on separate infrastructure and a different network.

If you're using SSL, make sure you can handle the CPU hit of SSL handshakes; SSL accelerators are expensive. Perhaps develop a system where only paid-up or registered, authenticated customers can connect via SSL. The same goes for port 80: make sure your users are registered before they have access to the application. That could stop DDoS attacks from reaching deep into your application.
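A back-of-envelope way to size that CPU hit: each full handshake costs the server roughly one RSA private-key operation, so handshake capacity is about signs-per-second times cores. The per-core figure below is an assumption for illustration; measure your own box with `openssl speed rsa2048`:

```shell
# Assumed RSA-2048 private-key ops/sec on one slow core -- NOT a measured
# number; replace with the "sign/s" column from `openssl speed rsa2048`.
signs_per_core=1000
cores=6

# Rough ceiling on new full TLS handshakes per second before the CPU
# saturates (resumed sessions and other work ignored).
capacity=$((signs_per_core * cores))
echo "~${capacity} new SSL handshakes/sec (rough ceiling)"
```

If an attacker can force fresh handshakes faster than this, the CPU, not the network, becomes the bottleneck.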

The comments and the other answer are completely sound, but I would also recommend putting Varnish in front as a cache for static content (and some dynamic content) as well as a reverse proxy. NGINX does both as well, but Varnish is better in these two areas.

This basically describes the infrastructure at my old company, where we ran that setup for both PHP and Rails apps.