Is overclocking over?


While we were tossing around story ideas this week, one of the ExtremeTech guys lobbed out this doozy of a press release he received from Kingston. It basically explained how a Romanian overclocking team smashed a few overclocking records, including pushing Kingston’s HyperX DDR3 memory to an incredible 3600MHz (at CL10). The Lab501 team set this record, and its others, with the aid of liquid nitrogen, which cooled the RAM down to a frosty -196°C.

That certainly qualifies as extreme, but is it news? One of our number wasn’t so sure. Ten years ago, he pointed out, overclocking memory involved a certain amount of investigation, research, and risk, but in these days of super-fast RAM and manufacturers’ warranties it seems a less intoxicating prospect. “I think the enthusiast passion for overclocking has cooled off as it’s become increasingly difficult to justify what a person should overclock for,” he said.

He’s right, of course. And he’s wrong.

I’m a good case in point. When I started building computers in the 1990s, overclocking held a deeply magical allure. Because there was a lot less hardware to choose from, and what little you could find was generally expensive, squeezing whatever extra power you could from it was a necessity if you didn’t want to redo the entire system the following year just so you could play the next Ultima game. (I built one of my first PCs for just that purpose, by the way.) And because the Web either didn’t yet exist or was in only its most formative stages, it was challenging to connect with others who could help you. Maybe there was a BBS or a Prodigy discussion group you could peruse, but those skills — like so many others — were passed down via an oral tradition we’ve sadly gotten away from now that there are almost as many enthusiast websites as there are enthusiasts.

But truth to tell, I never loved overclocking, and I stopped doing it as soon as I could. I appreciated the art, intensity, and rigor of it, but I had trouble appreciating them all together. Getting everything apparently right only to have a game crash and take an hour or two of progress with it, forcing you back into the BIOS to make another tweak and hope that this time it worked… It annoyed me. Plus, when my meticulous (some might say OCD) nature caused me to spend more time ensuring that every detail was perfect rather than enjoying the fruits of my labor, I eventually had to let that aspect of the build world fade into the background.

Today’s computers are an absolute blessing for gamers and enthusiasts in a way those of yesteryear were not. Because the big hardware makers have come to a better understanding of which tasks can be done where and how, raw processing power from a single chip matters a lot less, as an end rather than a means, than it once did. Will a super-speedy CPU like the Intel Core i7-3960X help you get more out of, say, Batman: Arkham City or The Elder Scrolls V: Skyrim? Sure. But a powerful video card will do a lot more, both for the base visuals and for some tasks traditionally relegated to the processor (such as the physics now handled by Nvidia’s PhysX engine).

If you’re willing to sacrifice some performance, the Llano processors AMD released earlier this year deliver compelling DirectX 11 graphics without requiring a discrete card. Creative, Asus, and other companies still put out sound cards if you don’t want to take any chances in that department. Heck, if you want to offload your game’s network processing, you can even get PCI Express cards, like the Killer NIC, that do just that. (Whether those are really worth the money is a subject for another time.) In lots of ways, you simply don’t need a supersonic CPU to have a good time.

Even if you do want one, the out-of-the-box experience is pretty terrific on most of them. With some mainstream chips having four processing cores, amazing multithreaded performance is within the grasp of ordinary people for the first time. Add in additional factors like Intel’s Hyper-Threading technology, and four cores can seem like eight, or six cores like 12; this kind of power is unprecedented for consumers. Much of the time, you don’t have to worry about processors’ clock speeds at all, and when that’s the case, of course overclocking loses its luster. When you had to do it, it was one thing. Now, it’s something else.
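To put a number on that claim, here is a minimal Python sketch (assuming the third-party psutil package is installed) that compares a machine’s physical core count with the logical count the OS actually schedules on; on a Hyper-Threading chip the logical figure is typically double:

```python
import psutil  # third-party: pip install psutil

# Physical cores: actual execution units on the die.
physical = psutil.cpu_count(logical=False)

# Logical cores: what the OS schedules onto. With Hyper-Threading,
# each physical core presents two hardware threads.
logical = psutil.cpu_count(logical=True)

print(f"Physical cores: {physical}")
print(f"Logical cores:  {logical}")
if physical and logical and logical > physical:
    print(f"SMT active: {logical // physical} hardware threads per core")
```

On a four-core i7 with Hyper-Threading enabled, this reports 4 physical and 8 logical cores: exactly the “four cores can seem like eight” effect.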

What? With virtually one-click overclocking from motherboard makers for DIYers, and most PC builders offering overclocked gaming-machine options, overclocking has become almost mainstream. The entire K line from Intel shows just how good OCing can be for the average Joe.

http://www.cardinalphoto.com David Cardinal

Amen. I love the simple overclocking I can now do on my desktops, tablets and phone. I’m not a gamer, but all of them definitely feel snappier and of course Photoshop & compilers all run that much faster.

Lupius

I was very excited about overclocking my first build over 5 years ago. It turned out that my usage pattern was not CPU intensive at all, and I could barely notice a difference.

Now I regularly underclock my CPU to keep the fan noise down and my room cool. I still don’t perceive any performance drop.
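For what it’s worth, on a modern Linux system that sort of underclocking can be approximated by capping the frequency-scaling limit. A minimal sketch, assuming the standard cpufreq sysfs interface and root privileges (the 2.0GHz cap is just an illustrative value):

```python
import glob

MAX_KHZ = "2000000"  # cap at 2.0GHz; scaling_max_freq is in kHz

# Write the cap to every core's cpufreq policy (root required).
for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(MAX_KHZ)
    print(f"capped {path}")
```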

http://www.facebook.com/people/Level-Ten/100002996609274 Level Ten

You can never have too much power.

Anonymous

Overclocking is to nerds what tuning is to petrolheads. There’ll always be someone doing it, either cuz they want to save money or for the sheer fun of it.

http://twitter.com/xarinatan Alexander (Ced)

No. Just because it’s easier to step up the speeds a bit doesn’t mean it’s dead.

In fact, the number of options for doing so has immensely increased, allowing for more kinds of overclocking.
Just cranking up the multiplier isn’t overclocking; do you think the FSB will keep up with that? The PCIe bridge? The north/southbridge?
And what about the video card? CPUs these days aren’t just faster; they’re so fast that they’re often no longer the bottleneck.

Simple example, in an AMD scenario:

1. Raise your multiplier a notch at a time until you run at 4GHz (which should be fairly easy, even with no voltage changes), leave the rest as it is, and run a few benchmarks and a game, checking the FPS.
2. Now lower the multiplier and raise the FSB speed until you’re at 4GHz again, and run the same benchmarks.
3. Now lower the CPU speed again, increase the video card’s GPU clock (not the memory!), and run the benchmarks.
4. Now lower the GPU clock and increase the video memory speed.

Et cetera, et cetera: each will yield different performance increases in different scenarios.
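The arithmetic behind the first two steps shows why they stress different parts: the core clock is the reference (FSB) clock times the multiplier, and the reference clock also feeds HyperTransport and other bus clocks. A sketch with hypothetical numbers (200MHz is the usual stock reference clock on AMD boards of this era):

```python
# Core clock (MHz) = reference/FSB clock x multiplier. Raising the
# reference clock also overclocks every bus derived from it.

def core_clock_mhz(reference_mhz: float, multiplier: float) -> float:
    return reference_mhz * multiplier

# Route 1: stock 200MHz reference clock, raised multiplier.
route_1 = core_clock_mhz(200, 20.0)  # 4000MHz, buses at stock speed

# Route 2: lowered multiplier, raised reference clock. Same core
# speed, but everything tied to the reference clock runs faster too.
route_2 = core_clock_mhz(222, 18.0)  # ~3996MHz

print(f"multiplier route:      {route_1:.0f}MHz, buses at stock")
print(f"reference-clock route: {route_2:.0f}MHz, buses {222 / 200 - 1:+.0%}")
```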

No, overclocking ain’t dead; it only got more complex. And GHz no longer represents an accurate measurement of a computer’s speed. (Actually, it never did.)

http://www.bestclubsin.com Jacob E. Dawson

With video and audio editing applications, squeezing an extra 1.2GHz from a 2600K has been well worth it, and pretty easy with a solid mobo. I’d never bothered with it before, but after delving into the possibilities I can say I’ll be OC’ing for the foreseeable future, at least until HAL takes over… (“…just what do you think you’re doing, Dave?”).

vox nulla

For years the whole overclocking hype has yielded marginal gains in speed, but it is partly responsible for a massive decline in quality and for inertia in development. As part of the gigahertz, terabyte, gigaflop madness that dominates the lower ranks of computer users, like gamers and solitaire-playing porn collectors, these lowest computing denominators have influenced the industry really negatively.

Whereas the push for new technology from the mid-’90s should have been toward smaller, lower-wattage, more efficient, solid-state, and cleaner designs, we have been burdened, up to this day even, with noisy fans that break, obsolete rotating media, and ceramic cook plates for CPUs and GPUs.

We are ten years behind what we could have had, due to such simpleton views on hardware and how to operate it. It’s about time this wasteful tweaker nonsense went out of fashion.

Applaud the person who can get the most performance per watt instead of the dimwit who employs a coal power plant to reach theoretical benchmark numbers that he or she won’t even use or notice in real life, and that only shorten the lifespan of normally perfectly tuned hardware.

People who think half the weight of a computer ought to be heatsinks and massive fans should be taken outside and shot.

http://www.cardinalphoto.com David Cardinal

vox, I’ve definitely gone in both directions. For machines that need to stay on I’ve built low-power systems (I wrote an article here on ET about that a couple of months back), and for Photoshop I have a pretty beefy unit that’s overclocked, with two graphics cards. But it goes to sleep when I’m not using it.

vox nulla

Parallelisation is a perfect example of how you can attempt to get maximum performance per watt. Your use of two GPUs when needed is a good example. I’m not sure that PS really benefits that much from clocking, especially weighed against the extra power consumption, heat production, and reduced MTBF for the hardware. Currently I’m experimenting with second-hand thin-client boxes as a dedicated number-crunching render farm. I need a couple of them to match the speed of a serious powerhouse, but when I do, the TCO is a fraction of a massive workstation’s and the wattage is about 80% of the multi-core beast’s. The best part is, they’re off when not batch rendering. Imagine what would happen if I actually used a low-power chip that was designed with floating point in mind.
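To make that trade-off concrete, the figure of merit is simply throughput divided by power draw. A sketch with hypothetical numbers, chosen only to match the “about 80% of the wattage” claim above:

```python
# Hypothetical: four thin clients match one overclocked workstation's
# render throughput while drawing ~80% of its power.

def perf_per_watt(frames_per_hour: float, watts: float) -> float:
    return frames_per_hour / watts

WORKSTATION_FPH, WORKSTATION_W = 120.0, 450.0  # hypothetical figures
THIN_CLIENT_FPH, THIN_CLIENT_W = 30.0, 90.0    # hypothetical figures
FARM_SIZE = 4                                  # clients in the farm

farm_w = FARM_SIZE * THIN_CLIENT_W             # 360W, 80% of 450W

print(f"workstation: {perf_per_watt(WORKSTATION_FPH, WORKSTATION_W):.3f} frames/hour/W")
print(f"farm:        {perf_per_watt(FARM_SIZE * THIN_CLIENT_FPH, farm_w):.3f} frames/hour/W")
# And idle thin clients can simply be powered off between batch runs.
```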

Anonymous

Overclocking is not as compelling as it once was, but for a different reason: consoles. The gaming market is catering to them now instead of PCs, with the consequence that developers are aiming for stagnant hardware targets instead of the constant advances we saw in the ’90s with a PC-centric market. As a result, even relatively modest PCs built on a budget a couple of years ago can still manage to run cool modern titles like Skyrim respectably, no upgrades needed. Ever-expanding development cycles raise the stakes even further by making it necessary to cater to the greatest possible audience. If those developers were free to push the envelope the way they used to, it would be a very different story.

Christian Cann Schuldt Jensen

Skyrim was a poor example. It’s a very CPU-bound game, and a speedy CPU will help it much more than the latest, greatest graphics card.

http://twitter.com/iWankTV Raphael Baker

Too many words; couldn’t be bothered to read all through that. You could simply have said you’re not an overclocker but some people are. I am and always have been: a liquid-cooled overclocker. I used to use a Peltier and liquid to chill my CPU to -7°C. There, I said it, and in less than four pages.

Anonymous

In my laptop I have an AMD A6-based CPU. Sure, I have a dedicated graphics card, but it can only do so much, and the 1.4GHz CPU is not enough for today’s games. So the second it came out of the box I overclocked it to 2.35GHz, and now I can do everything my old overclocked Core 2 Quad could do, but mobile. I will always overclock everything I can: my phone, my CPUs, my GPUs, whatever it is.

Anonymous

I’ve never overclocked, simply because if you did it wrong, well, then you got to enjoy a deep-fried CPU/GPU.

Better safe than sorry has always been a guiding motto of mine.

Anonymous

My impression was that few developers went in for overclocking, since it wouldn’t be clear whether a crash was due to a bug or to a hardware glitch triggered by the overclocking. Has this changed at all?

I guess if I absolutely had to squeeze out a few more FPS to play my game, I might give it a shot. Or maybe I’d just upgrade.

Dennis Reiley

Overclocking is ridiculous! It doesn’t take that much more money to simply increase the speed in other ways.
