Anyone else not give a single hoot about overclocking

OK, I realize HardForum readers are hardcore enthusiasts and that the vast majority view overclocking as critical and necessary.

However, although I'm a lower-end enthusiast, I have no interest at all in overclocking.

For me, it offers very little value at all.

I now have an i5-8400 on an H370 motherboard, since I will never overclock it. The upgrade over my 3570K has been really significant, especially in CPU-intensive games like Endless Legend and Civ 6 and some RPGs.

When I first got the 3570K, I spent about a week going through the whole process of overclocking the chip to get the highest possible speed. Seemingly endless tweaking and running torture programs to see if it was stable, checking my CPU temps, agonizing over whether I used the right thermal paste and installed the cooler correctly, and running endless benchmarks to see the frankly minor overall speed increases.

A year or two later, I needed to go into my BIOS and discovered that somehow all my overclocking settings had been reset and the chip was running purely at stock. In real-world gaming and use, I had not even remotely noticed that the overclock had fallen off.

On the flip side, I HATE noise. I need my PC to be completely silent. I do not game with a headset, and noise drives me bonkers. Any noise. Clicking fans, coil whine, all of it. I will do what's needed to eliminate it.

I would never consider a Founders Edition video card or any video card that was loud. Noise is the first page I read in any component review.

I had always gotten the K processors and SLI "just in case" I wanted to overclock.

My i5-8400 idles at 23C, has NEVER gone above 45C at any point in normal use, and is completely silent from 3 feet. I couldn't be happier.

I care about overclocking and tweaking hardware, I just don't do it myself anymore. I've read a ton of information about overclocking this 7700K I have; I just can't be bothered to actually do it.

Years ago, overclocks were often 50% and the difference over stock was huge. Now high-end chips are much closer to their max speed when you buy them. When this 7700K starts feeling slow, a 15% overclock won't save it.

As boost algorithms get more intelligent (Turbo Boost, Precision Boost, XFR, etc.), the chips run closer to their theoretical maximum anyway. I'm all for it. You throw more cooling capacity at it, and it automatically gets you higher stable top clocks. That's hard to argue with.

Sure, it was fun playing with pencil traces on a Duron. But I won't miss finding an overclock has gone unstable after a few months and having a machine restart/hard lock.

Nowadays, I only do a modest OC, if at all (maybe to 4GHz). Even then, I might not keep it for long; just to test it out. Especially since newer CPUs are getting faster and faster at stock.

Like the OP, I prefer the silence and stability. I want it to boot on the first try every time vs. a little extra speed and having to deal with flaky/random Windows crashes.

Given the focus on GPU over CPU for the apps & games I use, it's easier to just upgrade the video card. At least it was until the crypto wars broke out and GPU prices went batshit crazy. I'll be on my current GPU setups for a while...

It's not that I don't enjoy the hobby. I still read about hardware all the time, and read reviews and benchmarks. But I need stability overall. I still build annually with whatever I can afford.

I just miss the duron days. Long live my 2500+ Barton core.

And my Celeron 300a. I miss you guys. Back when I could overclock and actually feel the difference, not just in synthetic benchmarks.


Yeah, there is so little tangible benefit. I still read about hardware all the time as well, but this was actually the first time I have upgraded in 4 years. I couldn't afford to do it yearly. Glad to see I am not alone.

I still care about it because it's fun. I enjoy tinkering in the BIOS, stress testing and benchmarking. That's a big part of why I always gravitated to AMD chips; they had more unlocked models with tweaking options.

GPU overclocking I don't think is worth it anymore, though. Higher-end AIB boards like an FTW or Strix are already overclocked pretty well and probably near their max, and their boost clocks are getting really good nowadays. My trusty old 290X won't go much higher, and although I did have a go at it, the benefit was like 5% at a lot more heat, so I just leave it at stock. When I upgrade it, I'll probably leave the new one at stock as well.


I guess part of my initial displeasure with overclocking was that I found the process itself super tedious, and I was not really all that excited about the minimal increases. If I want to go faster, I'll just buy faster parts. I do get why people like it, though; I'm not trying to say overclocking is wrong or that people should not do it.


I totally get that. Back when I first started in this thing of ours, I was broke as shit and could only afford an $80 CPU and $110 GPU, so I hammered the nuts off those things to get as much performance as I could. Now I'm able to afford better gear, so the "need" to overclock isn't really there anymore; it's just for the fun of it.

Years ago, overclocks were often 50% and the difference over stock was huge. Now high-end chips are much closer to their max speed when you buy them. When this 7700K starts feeling slow, a 15% overclock won't save it.


And this is the main point.

Years ago, you could buy a 300MHz Celeron and easily overclock it to 450MHz, making it as fast as a chip costing 3x as much.
In the early Athlon days, you could buy a 700MHz chip and overclock it to 900MHz.

Even in the early i3 days, you could buy a 2.8GHz model and overclock it to 3.6GHz
(the wife is still running one, as it's still plenty for email & web browsing).

Now, the lower-end chips are locked, with limited/no overclock potential.
If you want to overclock, you have to buy a higher-end chip that is already close to its max speed, meaning you only get a small overclock. Even my better-than-average 5GHz overclock is only an 11% increase. Almost not worth the effort.
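Those shrinking margins are easy to put in numbers. Here's a quick Python sanity check of the gains quoted in this thread (the 4.5GHz stock figure for the 5GHz chip is my assumption; the post doesn't say which CPU it is):

```python
# Percent overclock gains for the examples quoted in this thread.
# The 4.5 GHz stock clock for the 5 GHz chip is an assumption; the
# other pairs are taken straight from the posts above.
examples = [
    ("Celeron 300 MHz -> 450 MHz", 300, 450),
    ("Athlon 700 MHz -> 900 MHz", 700, 900),
    ("i3 2.8 GHz -> 3.6 GHz", 2.8, 3.6),
    ("modern chip 4.5 GHz -> 5.0 GHz", 4.5, 5.0),
]
for name, stock, oc in examples:
    gain = (oc - stock) / stock * 100
    print(f"{name}: {gain:.0f}% gain")
```

The headroom really has collapsed from roughly 50% down to about 11% across the generations mentioned here.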

In this day and age, most of those who overclock do it for fun and not out of necessity. I don't have to overclock, but I do because I enjoy it. I enjoy cranking up the power to achieve the highest stable speed for my use. Just because I can.

For a game like Civilization 6, as you mentioned, clock speed is the most significant factor in performance. Even a few seconds' improvement per turn is a significant time savings when games can easily run 600 turns.

That's 30 minutes saved per 600 game turns if you can shave off a simple 3 seconds per turn. (I'm not 100% sure 3 seconds is a realistic result from overclocking in Civ 6, but it sounds reasonable to me.)
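The arithmetic checks out; a quick sanity check (keeping in mind the 3 seconds/turn is the poster's guess, not a measured result):

```python
# Time saved over a long Civ 6 game if an overclock shaves a few
# seconds off each turn. The 3 s/turn figure is the guess from the
# post above, not a benchmark result.
seconds_saved_per_turn = 3
turns = 600
minutes_saved = seconds_saved_per_turn * turns / 60
print(f"{minutes_saved:.0f} minutes saved over {turns} turns")  # 30 minutes
```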

Personally I love to overclock. I will overclock each core individually to see which are the best cores. Potentially disabling a dud core if more clock speed is helpful.

Now I'd spend a week fiddling, stressing my 2700X out, for what, 200MHz? It just isn't worth it anymore.


Back in the day, when I found out about overclocking and turned my Radeon 9550 from a total turd into something that could play games, it was a revelation. Of course, those were unique in that they had a 9600 Pro core software-locked at 250MHz. I overclocked mine to 455MHz. Now that was worth it. I still enjoy tweaking every once in a while to get more out of my system, but I care a lot more about noise and stability now, as others in the thread have mentioned.

It's crazy to think we were able to squeeze so much more performance out of those chips. That's probably right around when most of us old-school overclockers got involved in the hobby.


I got started much earlier than that. I overclocked my i486 DX2 66MHz to 80MHz. From there I overclocked some Pentium 100s and other chips as well. I had a dual Pentium Pro 180MHz setup, overclocked to 200MHz, that I ran for a long time.

It's crazy to think we were able to squeeze so much more performance out of those chips. That's probably right around when most of us old-school overclockers got involved in the hobby.


I started by overclocking my IBM AT from 6MHz to 8MHz by changing the clock crystal. Now that's old school.

Later I was overclocking 386/25 chips to 33MHz, and later 486/25s to 33MHz.

Come to think of it, I even overclocked my old RadioShack Color Computer. I could double the clock rate (from 0.9MHz to 1.8MHz) with software.
The only problem was, when running at the faster speed it would scramble the screen. So, to speed up some slow programs I was writing, I'd switch it to 2x speed, run the long process with the scrambled screen, and then switch it back.

Overclocking is mostly fading because the CPUs literally do it by themselves, and they are going to do it more and more. More self-awareness of voltage, temps, load, etc. will be leveraged to maximize the work performed within a given set of constraints.

Especially with competition firing up again, neither camp is going to leave a ton of free performance on the table when they can charge for it. Turbo Boost Max 3.0's "magic core" and the latest Ryzen PBO stuff are great examples of where things are going.
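The core idea behind those boost algorithms can be sketched as a constraint problem: step through a ladder of clock/voltage states and keep the fastest one whose estimated power and temperature stay inside the limits. Everything below is a made-up toy model (the state table, the power formula, the thermal constant are all invented for illustration), not how Turbo Boost or Precision Boost actually work, but it shows why better cooling automatically buys higher clocks:

```python
# Toy model of a boost algorithm: from a ladder of clock/voltage states,
# pick the fastest one whose estimated power and temperature rise stay
# inside the limits. All numbers are invented for illustration.
P_STATES = [  # (clock in MHz, core voltage)
    (3600, 1.00), (4000, 1.10), (4300, 1.20), (4700, 1.35),
]

def estimated_power(clock_mhz, voltage, load):
    # Dynamic power scales roughly with frequency * voltage^2 * load.
    return clock_mhz * voltage ** 2 * load * 0.01

def pick_boost_clock(load, power_limit_w, headroom_c, c_per_watt=0.5):
    """Return the highest clock whose power and temp rise fit the limits."""
    best = P_STATES[0][0]
    for clock, volts in P_STATES:
        watts = estimated_power(clock, volts, load)
        if watts <= power_limit_w and watts * c_per_watt <= headroom_c:
            best = clock
    return best

# Better cooling (more thermal headroom) lets the same chip boost higher.
print(pick_boost_clock(load=1.0, power_limit_w=95, headroom_c=30))  # 4000
print(pick_boost_clock(load=1.0, power_limit_w=95, headroom_c=45))  # 4700
```

Same chip, same power limit; the only change is thermal headroom, and the "algorithm" hands back a higher clock by itself, which is exactly what manual overclocking used to be for.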

I used to. My first real foray into OC'ing was with my Athlon 64 3000+ Venice. I was able to put a 33% OC on it with absolutely zero bump in Vcore. That made quite a difference in games at the time.

However, I have strayed away from OC'ing because the 4C/8T I have available makes all the difference compared to my old single and dual cores, plus I finally have a strong enough graphics card to be very GPU-bound at 1080p @ 144Hz.

I've dabbled with OC'ing my 3770K and was able to get a stable 4.4GHz with a very slight bump in Vcore, but the damned thing runs hot enough as it is, because of the smeared dogshit under the IHS, that it's just not worth it, imo.

I'll likely be moving to a 2700X (or the next-gen Ryzen) next upgrade cycle, because I want a soldered IHS back.

What required a lot of tinkering a decade or so ago is now done automatically by the CPU & GPU. I also think the speed of modern PCs is plenty for most people, so OC'ing gets you a lot less than it used to, and what it does get you isn't even appreciable.

Not to mention most of the folks on here have way more important things to do in their lives than when the site started. Married, mortgage, kids, the little time you get to game, it had better just work, a few FPS here and there be damned.

Unless you just enjoy tinkering, it's mostly a matter of noise and aesthetics now.

I still have a 4770K desktop ONLY because I got it plus a board at Microcenter for $250 on a Black Friday. I only replaced my E3-1240 v3 Xeon-based Windows box because the motherboard died. The i5-8400 + Z370 & 1070 SLI runs everything I throw at it. My game budget is no more than $5-$7 a game, so you can guess the age of the games I play.

I'd much rather learn about networking, routing & firewalling these days than overclocking.