The 3000 charts:
The server handles up to 600 clients per second, which I interpret to mean active connections. At 600 everything plateaus and the timeouts begin. That's at a rate of about 50 new requests per second while the server responds with 30-40 responses per second. That's roughly 20 req/sec building up because they can't be handled fast enough. After 30 seconds that's 20*30=600 requests bogging down the server - which is where the 600 figure comes from. The timeouts start because the server can't accept any more connections while it's waiting for existing requests to be fulfilled.
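That backlog arithmetic is easy to sanity-check. A quick sketch, using the numbers read off the chart above (50 in, 30 out, 30 seconds to plateau):

```python
# Back-of-envelope backlog model for the 3000-client chart.
# All numbers are the chart readings quoted above, not measurements of my own.
arrival_rate = 50       # new requests per second
response_rate = 30      # responses per second (lower end of the 30-40 range)
backlog_per_sec = arrival_rate - response_rate   # 20 req/sec left unserved
seconds_until_plateau = 30

backlog = backlog_per_sec * seconds_until_plateau
print(backlog)  # 600 -- matches the ~600-connection plateau on the chart
```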

The 10000 charts:
The server eventually handles ~1550 clients per second (again, I read that as active connections). There are about 170 req/sec coming in and 40 responses per second going out. That's ~130 requests per second building up, and after about 12 seconds it hits its threshold: 130*12=1560.

Why is one 600 and the other 1550? I don't know. I wonder if the 600 is a database limit and 1550 a web server limit.

What you have to decide is how many requests per second you expect (a) on average and (b) in a burst of X seconds. Then a bit of math will tell you how many active connections you need to support and how fast responses must be. For example, if you want to average 25 requests per second, then you need a response time averaging 1/25 = 0.04 seconds, because anything much more than that means requests will start building up and you'll eventually hit a connection limit and timeouts. If you want to handle a burst of 200 requests over 5 seconds, then your server needs a connection pool that can handle 200 connections. They might not all be served in 0.04 seconds, but the client isn't timing out, so it's okay because it's just a burst.
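The two rules of thumb above can be written down in a few lines. The 25 req/sec and 200-request burst figures are the illustrative numbers from the paragraph, not anyone's real traffic:

```python
# Single-server capacity rules of thumb, per the paragraph above.
# The numbers are illustrative assumptions.

# Steady state: a single stream of work keeps up only if the average
# response time stays at or below 1 / (requests per second).
avg_rps = 25
max_avg_response = 1 / avg_rps   # 0.04 s before a queue starts to build

# Burst: worst case, every request in the burst window is still open
# at once, so the connection pool must hold the whole burst.
burst_size = 200    # requests arriving...
burst_window = 5    # ...within this many seconds
pool_needed = burst_size

print(max_avg_response)  # 0.04
print(pool_needed)       # 200
```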

That's all for a single server, of course. With multiple servers and load balancing, the requirements are loosened a bit because there are more servers to distribute the traffic, but then you have to double check that your database(s) can handle everything.

It seems to me like the chart is saying each request takes a few seconds? That's not good. Or is it aggregate?

No one knows the answer to that question. This is marketing software that shows a countdown timer on screen. Clients could send as many visitors as they like, and the timer is loaded from my database based on their records. They could do a product launch to a list of 1 million prospects.

I noticed the competition has a maximum of 100,000 leads. How many connections at once, I don't know.

The front end, the sales pages, etc. - I have no idea what to do! I am talking to server people. There has to be a good solution. I mean, there are sites with billions of visits per month.

Then you have to come up with one. The only way to measure the performance of an application is using metrics, so if you don't have metrics then you need to make some. Otherwise how will you know it's good enough?

1 billion requests per month is ~400 requests per second. (Oh, by the way, that's a metric.) If you must support that then you need to hire people who understand how to design and work with distributed systems because you're not going to get one web server and one database server to support that.
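For reference, the conversion behind that ~400 figure is just seconds-in-a-month arithmetic (assuming a 30-day month):

```python
# 1 billion requests/month expressed as a sustained per-second rate.
seconds_per_month = 30 * 24 * 60 * 60    # 2,592,000 for a 30-day month
requests_per_month = 1_000_000_000

avg_rps = requests_per_month / seconds_per_month
print(round(avg_rps))  # 386 -- i.e. roughly 400 req/sec, around the clock
```

And that's the *average*; real traffic peaks well above its average, which is why a single web/database pair won't cut it.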

At some point, you simply have to scale out to more servers if you want to handle large amounts of traffic. You can only do so much with a single server, and trying to just build a bigger/better server isn't always the most economical option. That's why cloud based setups are so popular these days. You can get a strong server to host your database, then multiple smaller VPS's to handle the web traffic. With some providers you can even set things up so that during peak load times you can spin up extra VPS's to handle the extra traffic, then get rid of them after traffic dies down.

Without having access to your code (and since I know nothing about CodeIgniter), it's hard to say whether you could make any improvements there to increase your load capability. Probably you could, but how hard would that be? Is it worth the effort? Can't say.

If you're currently running both your database and your website off the same server, the easiest way to try and increase your capacity is to get a separate database server. The server you host your database on should be fairly strong with plenty of memory so it can quickly and efficiently process queries. It should also be on a high-speed LAN along with your webservers rather than elsewhere on the internet. How strong your web server machines need to be depends on whether you want to handle everything with one or spread the load across many.

I've never personally worked on something with a lot of traffic. We don't really track traffic that much, but running the last couple months of server access logs through a quick script seems to indicate we only get about 6 requests/second on average, with the largest peak being 497 requests in a single second. That's basically nothing compared to the big boys out there. For reasons unrelated to traffic, we are planning on moving from a single dedicated server to a dual-VPS setup on Azure, with the database on a 4-core/16 GB VPS and the web server on a 2-core/4 GB VPS. From my testing that should support our needs just fine, and if not, it's simple to scale by either adding another web server or scaling them up to something stronger.
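The "quick script" for that kind of log analysis is only a few lines. A sketch of the idea, assuming the standard common/combined access log format (the regex and function are mine, not the original script):

```python
# Count requests per second from an access log in common/combined format,
# e.g.: 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512
# This is a sketch of the approach, not the original script.
import re
from collections import Counter

TIMESTAMP = re.compile(r'\[(\d+/\w+/\d+:\d+:\d+:\d+)')

def requests_per_second(lines):
    counts = Counter()
    for line in lines:
        m = TIMESTAMP.search(line)
        if m:
            counts[m.group(1)] += 1  # bucket by whole-second timestamp
    if not counts:
        return 0.0, 0
    avg = sum(counts.values()) / len(counts)  # average over seconds with traffic
    peak = max(counts.values())               # busiest single second
    return avg, peak

# Usage: avg, peak = requests_per_second(open('access.log'))
```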

Creating a stable infrastructure that can handle high volume is complicated. That's why companies pay lots of people lots of money to do it.


I am gonna move the API, members area and timers to separate locations. By locations, I mean somewhere they won't interfere with each other.

Maybe a sub-domain? Kicken mentioned a different database server. Would a sub-domain be effective at all? Or if the site has tons of traffic, do sub-domains get affected too?

Once I get a CodeIgniter hello world working 100% at 500 requests per second, then I'll get into optimizing the app's code. Tbh the app code is already super tiny and optimized: it has one query that builds the timer. I already moved the timers under another account on MAMPP.

About CodeIgniter: the main reason I picked CodeIgniter is its speed and simplicity. It outperforms all the other frameworks I looked at, and it's simple.

Edit:

Once they say "this is as good as the server gets," then I'll tell them about separating the database server from the web server. I'll also suggest Kicken's idea:

You can get a strong server to host your database, then multiple smaller VPS's to handle the web traffic. With some providers, you can even set things up so that during peak load times you can spin up extra VPS's to handle the extra traffic, then get rid of them after traffic dies down.