All the Perl that's Practical to Extract and Report



This is something a lot of people don't seem to get -- CPU and disk are all but free in our era of 2GHz+ chips and terabyte drives, but memory (and cache) are a huge bottleneck. The limiting factors in many programs I write (bioinformatics) are memory and sometimes address space, not CPU time or disk space; and my most frequent source of UI irritation is not waiting for the CPU, but waiting for an application that has been paged out.

Yet we see man-years devoted to ever-more-complicated CPU schedulers and file compression algorithms, while language runtimes become increasingly complex and approximate LRU remains the state of the art in virtual memory management.
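To make "approximate LRU" concrete, here is a minimal sketch in Python of the strict least-recently-used policy that virtual memory systems only approximate (real kernel page replacement is far more involved; the class and page numbers here are purely illustrative):

```python
from collections import OrderedDict

class LRUPager:
    """Strict LRU over a fixed number of frames: every access moves a
    page to the most-recently-used end; when full, evict the LRU page."""
    def __init__(self, frames):
        self.frames = frames
        self.resident = OrderedDict()  # page -> contents, oldest first

    def access(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)    # hit: mark most recent
            return True
        if len(self.resident) >= self.frames:
            self.resident.popitem(last=False)  # miss: evict least recent
        self.resident[page] = None             # "page in" from disk
        return False

pager = LRUPager(frames=3)
for p in [1, 2, 3, 1, 4]:   # touching 4 evicts page 2, the least recent
    pager.access(p)
print(list(pager.resident))  # [3, 1, 4]
```

Real kernels approximate this (e.g. with reference bits and clock-style sweeps) because tracking exact recency on every memory access is too expensive, which is exactly why there's room for smarter policies.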

I've seen applications where the best possible optimization was to remove a cache and recalculate a modestly cheap value; many modern processors are fast enough that throwing a few extra cycles in the pipeline is more efficient than paying the price of a processor cache miss.
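A toy sketch of that trade-off in Python (a hypothetical example, not the poster's actual application; Python timings reflect interpreter overhead rather than hardware cache misses, so in a real setting you'd measure with a profiler on the machine in question):

```python
import timeit

# A deliberately cheap computation: the kind of value people cache.
def recompute(i):
    return i * 3 + 1

# The "optimization" being questioned: a large precomputed table.
# If the table falls out of the processor cache, each lookup can cost
# a memory stall that outweighs a few cycles of arithmetic.
N = 1 << 20
table = [recompute(i) for i in range(N)]

def via_table(indices):
    return sum(table[i] for i in indices)

def via_recompute(indices):
    return sum(recompute(i) for i in indices)

indices = list(range(0, N, 7))
assert via_table(indices) == via_recompute(indices)  # same answer either way

# Time both; which one wins depends on table size vs. cache size.
t_table = timeit.timeit(lambda: via_table(indices), number=5)
t_calc = timeit.timeit(lambda: via_recompute(indices), number=5)
print(f"lookup: {t_table:.3f}s  recompute: {t_calc:.3f}s")
```

The point is only that a lookup is not free: whichever variant keeps its working set in cache tends to win, so removing a cache can genuinely be the optimization.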

I think this just goes to show how different problems need different optimizations. I mainly work on web applications, and there we are mostly concerned with disk performance, CPU, and network, not memory. With 64-bit boxes all over the place, having 16GB+ of memory is pretty common. So memory is never my problem, and I appreciate all the work that's been put into making CPUs faster.

This really surprises me. From my admittedly outdated experience, I would have guessed that you would either be network-bound or, with too many simultaneous (slow) connections, memory-bound. What's sucking up all that CPU? Is it the kernel copying buffers around, or your app generating a bunch of dynamic content?

We do have a lot of I/O-bound issues (especially with the DB), and most of our content is dynamic. Since we try to minimize the load on the shared resources (the DB), we push the computation into the web servers as much as possible, so having fast-executing code is nice.

Memory is so cheap that throwing a few extra gigs into a box is a non-issue. I've never found myself thinking "wow, I wish Perl used less memory". Again, this is just my experience with the kinds of applications I develop.