Database Administrators Stack Exchange is a question and answer site for database professionals.

1 Answer

There's no formula here. You should limit your database to whatever is reasonable for your application's needs.

Typically, servers whose applications use a connection pool shouldn't need more than a few hundred concurrent connections. Small- to medium-sized websites may get by with 100 - 200.

I usually set up a new server with max_connections somewhere around 500 - 800 and see how it goes. You can always change it dynamically via:

SET GLOBAL max_connections = 567;
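Note that a GLOBAL change like this does not survive a server restart, so you'll want to mirror it in your configuration file as well. A quick sketch of how you might check what you actually need before picking a number (the value 567 is just the example from above; SET PERSIST requires MySQL 8.0+):

```sql
-- Current configured limit and the high-water mark of
-- connections actually used since the server started.
SHOW VARIABLES LIKE 'max_connections';
SHOW STATUS LIKE 'Max_used_connections';

-- MySQL 8.0+ only: apply the change AND persist it across restarts.
-- On older versions, run SET GLOBAL and also edit my.cnf.
SET PERSIST max_connections = 567;
```

Comparing Max_used_connections to max_connections over time tells you whether your limit is generous or about to be hit.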

Make sure, though, that you also set a proper open_files_limit. On Linux, a process is limited to 1024 open files by default. That is very low for MySQL, since every thread, connection, and, of course, table file consumes a file descriptor. So set open_files_limit to some generous number (say 8192) so the operating system can accommodate your many connections.
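A minimal sketch of the relevant settings in the MySQL configuration file (the exact path and values vary by distribution; 800 and 8192 are just the example figures from above):

```ini
# /etc/my.cnf (or /etc/mysql/my.cnf, depending on the distribution)
[mysqld]
max_connections  = 800
open_files_limit = 8192
```

When MySQL is started via mysqld_safe, it will try to raise the process's OS file-descriptor limit to match open_files_limit; under systemd you may additionally need to set LimitNOFILE in the service unit, since the OS-level limit caps what MySQL can request.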

I should note I have worked with MySQL servers with thousands of open connections - it's cool. But most of the time, the vast majority of those connections would sit and do nothing (be idle).
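One way to see that idle/active split for yourself (assuming MySQL 5.1+, where information_schema.PROCESSLIST is available; idle connections show up with Command = 'Sleep'):

```sql
-- Break down current connections by what they are doing.
SELECT command, COUNT(*) AS connections
FROM information_schema.processlist
GROUP BY command
ORDER BY connections DESC;
```

On a busy pooled application you will typically see a large 'Sleep' bucket and only a handful of connections actually executing queries.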

To sum up, I would provision for what appears to be normal application need, plus some headroom for spike events.