Let's say you have a Core i5 2500K and an overclockable S1155 motherboard: the P67, Z68, Z75, or Z77 kind. You also have a GeForce GTX 680 or Radeon HD 7970, but you want to play games on your brand new 120/144 Hz gaming monitor at 1920x1080 with a stable 120/144 FPS. However, your PC can barely push 60 FPS in demanding games at the best of times. The first obvious choice would be to get something like a GeForce GTX 1080, but then comes the CPU dilemma: sell the whole PC and buy a new Core i7 6700K/DDR4 system, or just upgrade the Core i5 2500K to a Core i7 3770K? The Ivy Bridge Core i7 3770K is the best CPU you can get for that socket, and since your motherboard can overclock it, would it be worth keeping instead of buying a new Core i7 6700K system? This question has been asked many times, and there are already many published tests of this situation, but I always like to test things myself, so here it goes.

I overclocked both processors to 4.5 GHz, since that way you can also simulate Core i7 7700K performance: it reaches 4.5 GHz with just Turbo Boost and no OC. As you can see, I used "basic" frequency RAM for both processors: 1600 MHz is the common "starting" DDR3 frequency for the Core i7 3770K, and 2133 MHz is the common "starting" DDR4 frequency for the Core i7 6700K. However, I do have to admit that the Core i7 3770K benefits much more from this situation than the Core i7 6700K, because this Crucial Ballistix Tactical 1600 MHz {7-7-7-20} DDR3 is one of the best DDR3 kits you can get. I tested it against Corsair Vengeance Pro 2400 MHz {11-13-13-31} DDR3, and the Crucial turned out to be the faster RAM!
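
For those curious why a tight-timed DDR3 kit can hang with DDR4, the first-word latency in nanoseconds can be worked out from the transfer rate and CAS latency. A minimal sketch: the DDR3-1600 CL7 figure is the Crucial kit above, while the DDR4-2133 CL15 timing is an assumed typical value for such a kit, not something measured here.

```python
# First-word latency in nanoseconds: CAS cycles divided by the I/O clock.
# For DDR memory the I/O clock in MHz is half the MT/s rating, so:
#   latency_ns = cas * 2000 / transfer_rate_mts
def cas_latency_ns(transfer_rate_mts: int, cas: int) -> float:
    return cas * 2000 / transfer_rate_mts

ddr3 = cas_latency_ns(1600, 7)    # Crucial Ballistix Tactical DDR3-1600 CL7
ddr4 = cas_latency_ns(2133, 15)   # assumed typical DDR4-2133 CL15 kit

print(f"DDR3-1600 CL7:  {ddr3:.2f} ns")   # 8.75 ns
print(f"DDR4-2133 CL15: {ddr4:.2f} ns")   # ~14.06 ns
```

So despite the lower clock, the DDR3 kit answers a read in roughly 60% of the time, which is why the 3770K gets more out of this matchup than raw frequency suggests.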

If you would like to know the gaming difference between DDR4 2133 MHz and DDR4 3000 MHz, open this first:

So now that you know how much Core i7 6700K would gain by having faster RAM, let's get on with the benchmarks!

On both computers the games were tested with Fraps, using 3 or 4 runs of the same gameplay sequence, 15 seconds of benching each, and averaging those runs into a single result. To minimize GPU load, I only tested at 1080p and no AA was used in any game. Considering the GeForce GTX 1080 has more than enough horsepower to push games at 1080p, this should ensure a fair battle between the Core i7 3770K and the Core i7 6700K.
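
The averaging step above can be sketched like this (the run numbers are illustrative, not the actual benchmark data):

```python
# Average the min/avg/max FPS columns from several identical Fraps runs
# into a single result, as described above.
def average_runs(runs: list[dict]) -> dict:
    n = len(runs)
    return {k: round(sum(r[k] for r in runs) / n, 1)
            for k in ("min", "avg", "max")}

# Three 15-second runs of the same gameplay sequence (made-up values)
runs = [
    {"min": 88, "avg": 112, "max": 131},
    {"min": 90, "avg": 115, "max": 129},
    {"min": 86, "avg": 113, "max": 135},
]
print(average_runs(runs))  # {'min': 88.0, 'avg': 113.3, 'max': 131.7}
```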

This is typically what you should expect. 3000 MHz DDR4 would add 2-3 additional FPS for the Core i7 6700K.
CALL OF DUTY BLACK OPS 3

The win for the Core i7 3770K definitely comes from the great latency of the Crucial RAM. 3000 MHz DDR4 would add 3-4 additional FPS for the Core i7 6700K, thus equaling the processors. For this game I ran 6 identical sequences for each CPU.

DYING LIGHT

While there is not much of an average FPS difference, the minimum FPS definitely cements the Core i7 6700K's superiority. 3000 MHz DDR4 would add 1-2 additional FPS for the Core i7 6700K.

With so many FPS on the display, the small victory for the Core i7 3770K is not that important. 3000 MHz DDR4 would add 1-2 additional FPS for the Core i7 6700K, so the Core i7 3770K would still top the chart.

METRO LAST LIGHT REDUX

What's up with this game liking older processors? The Core i5 2500K easily defeated the Core i5 6500 in this game, and here the Core i7 3770K is superior to the Core i7 6700K. Bizarre; I have no words to explain it... 3000 MHz DDR4 would surely help the Core i7 6700K's maximum FPS, but not decisively the average or minimum...

MIDDLE EARTH SHADOW OF MORDOR

This is the only in-game benchmark used for this CPU battle. The maximum FPS is not important at all, since it falls within a random variation of ±10%. However, the minimum FPS is more or less constant, making the Core i7 6700K a no-doubt victor. 3000 MHz DDR4 would add 1-2 additional FPS for the Core i7 6700K.
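
That ±10% judgment can be made concrete by computing the relative spread of each metric across repeated runs; a metric is only worth comparing between CPUs when its spread is small. A sketch with illustrative numbers, not the actual benchmark data:

```python
# Relative spread (range / mean) of a metric across repeated runs.
# Large spread = run-to-run noise; small spread = a stable, usable metric.
def rel_spread(values: list[float]) -> float:
    mean = sum(values) / len(values)
    return (max(values) - min(values)) / mean

max_fps = [150, 168, 141]   # noisy: spread well over 10%, not comparable
min_fps = [62, 64, 63]      # stable: spread ~3%, worth comparing
print(round(rel_spread(max_fps), 2))  # 0.18
print(round(rel_spread(min_fps), 2))  # 0.03
```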

That was shocking! To rub salt in the wound, 3000 MHz DDR4 would add a staggering 20-25 FPS boost for the Core i7 6700K! This has to be the most memory-bandwidth-dependent game I have ever seen, never mind the latencies...

RISE OF THE TOMB RAIDER

Amazing stuff! This means that if you get the best RAM for your Core i7 3770K and overclock it to Core i7 6700K levels, you get a computer that matches a 3-year-newer system frame for frame in games! 3000 MHz DDR4 would add 4-5 additional FPS for the Core i7 6700K, thus equaling the processors.
WITCHER 3 WILD HUNT

This pretty much sums up the whole point of not upgrading from your Core i7 3770K system to a Core i7 6700K system. This does not count if you have a piece of shit H61 motherboard. 3000 MHz DDR4 also adds no benefit in this game.

------------------------------------------------------------------------------
Please note that whenever I say DDR4 3000 MHz adds minimal FPS improvement over DDR4 2133 MHz, I am only talking about standard situations where frames are being drawn steadily. I do not extend that statement to processing-heavy scenes; in those, FPS can benefit more from higher memory bandwidth.
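
One way to see why an average can hide those demanding-scene dips is to derive FPS metrics from per-frame times instead, e.g. the worst 1% of frames. A sketch with made-up frametimes:

```python
# Average FPS vs "1% low" FPS computed from frametimes in milliseconds.
# A handful of slow frames barely moves the average but dominates the 1% low.
def fps_metrics(frametimes_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)
    one_percent = worst[: max(1, len(worst) // 100)]  # slowest 1% of frames
    low_fps = 1000 * len(one_percent) / sum(one_percent)
    return round(avg_fps, 1), round(low_fps, 1)

# 990 smooth frames at 10 ms plus 10 stutters at 40 ms in a busy scene
times = [10.0] * 990 + [40.0] * 10
print(fps_metrics(times))  # (97.1, 25.0)
```

The average still reads close to 100 FPS while the stutters drag the 1% low down to 25, which is exactly the kind of gap a bandwidth-starved demanding scene produces.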

I will draw no performance summary this time, since the results were so random and unexpected. I knew there would be a minimal difference between the Core i7 3770K and the Core i7 6700K clocked at the same speed with the same amount of RAM, but I surely expected the Core i7 6700K to win in every game. The fact that the Core i7 3770K actually managed to beat the Core i7 6700K in some games, albeit by a little, was a nice big shock.

Great job, you obviously put a lot of work into this. However, I have one point of contention, or maybe just an observation and a question. Do you not feel there would be a fundamental difference, discounting hardware differences, simply from using Windows 7 on one rig and Windows 10 on the other?

Again though, excellent work. Very interesting and a good argument for keeping those "old" dusty rigs a while longer.

Thanks. Perhaps someone else would be willing to answer that; I'm not even sure what to think. I mean, sure, Windows 7 and Windows 10 perform quite a bit differently, but I tried to keep the specs and drivers of both computers as similar as possible. No DX12 API was used for any game on Windows 10.

I'd be interested to see the results using the same OS. Though it may not make much of a difference, that's a variable which shouldn't be introduced.

I'd also be interested in seeing results at actual in-game settings on the GPU (read: ultra with AA) instead of artificially deflated settings which potentially exaggerate differences. People with a 1080 run ultra settings and AA at 1080p. Again, likely not a big difference... but a variable that shouldn't be added.

I'd be interested too. Perhaps someone ELSE will do that stuff; I am not interested in testing GPU settings. This is not a GPU battle. And though I agree that AA should be used when gaming at 1920x1080, it would only increase GPU load and shrink the difference between processors, so the benchmarks would have been irrelevant and we would see no difference between the Core i7 3770K and the Core i7 6700K.

For me, even people with CPUs like the i7 2600K or even an i5 2xxx are up to date until the covfefe (Coffee Lake) launch, when more cores will make a difference, but that's just normal. Sandy Bridge is still kicking arses all over the world.

Love the comparison, great job.
I'm going to see if the upgrade to the Intel Z77 Extreme☠️ I got last month will make any difference from my DZ68 ☠️ board. It will be interesting to see if it will still clock 5.0-5.2, and how those killer LP Samsungs fare, even though Intel said they were not Z68 compatible.

I'd be interested too. Perhaps someone ELSE will do that stuff; I am not interested in testing GPU settings. This is not a GPU battle. And though I agree that AA should be used when gaming at 1920x1080, it would only increase GPU load and shrink the difference between processors, so the benchmarks would have been irrelevant and we would see no difference between the Core i7 3770K and the Core i7 6700K.

Yeah... that was the point, actually.

Where people actually play games, there is less of a difference, or perhaps none at all. I suppose I just don't understand the point of creating an unrealistic testing environment to bring out a result that isn't normally there with the settings people normally run.

I can tell you for sure that i5-2500Ks will choke in BF1 multiplayer; the campaign is much easier on the CPU for some reason. I did some testing after I got my 6700K, when I still had a 980, and even on the 6700K at 4.7 GHz, turning hyperthreading off slightly hurt my FPS and caused drops in GPU usage. Overclocked Sandy i7s and up should be OK, though; it's the 4 threads vs 8 in that particular game that seems to be the real breaker, and even overclocked 8-core FX CPUs do better than vanilla 4-cores. At this point, if Coffee Lake is a good overclocker, the end of this year is probably a great time to upgrade off old platforms, with both that and the Ryzen options. Games are starting to justify the need for more than 4 cores for sure, but it will take a long time until that's the norm, just like it did with 4 cores becoming the norm.

Love the comparison, great job.
I'm going to see if the upgrade to the Intel Z77 Extreme☠️ I got last month will make any difference from my DZ68 ☠️ board. It will be interesting to see if it will still clock 5.0-5.2, and how those killer LP Samsungs fare, even though Intel said they were not Z68 compatible.

Perhaps I have missed your previous posts, but what kind of balls are you smoking with a Core i7 2600K at 5 GHz on stock voltage?

Thanks for this benchmark!! This justifies my decision even more not to upgrade yet. Coming from an i7-2600K @ 4.5 GHz, ever since 4th gen came out I've been thinking of upgrading to a new CPU. I'd always say: "NOPE, too expensive"... DDR4 came out, still nope. I switched between 4 different cards up to my current GTX 1070, and still nope.

To upgrade my CPU right now I'd have to change my mobo and RAM as well, not to mention assembling and installing the system, plus selling the old parts and spending a lot of cash. Not worth the few FPS increase, IMO. I haven't even maxed out my GPU yet. I could upgrade to a GTX 1080 at a lower price and still get better FPS than I'd get by upgrading my CPU instead.

The sticking point here is actually the 'stock' voltage claim. I think mine was 5 GHz at 1.275 V actual (from MM and the voltage read point). I was at 5.3 GHz before the multiplier crapped out, and 5.4xx with BCLK. That was on custom 3x120 water.

Perhaps I have missed your previous posts, but what kind of balls are you smoking with a Core i7 2600K at 5 GHz on stock voltage?

There are plenty of screenies from waay back testing the Samsung RAM with Dave @ 102 BCLK.
It's been running like this for years, since they came out.
And yes, it's on a modded old skt478 Scythe Ninja Rev. B cooler with 1 old fan.

The fact that you keep swapping the 6700K and 3770K in your charts depending on which one wins is pretty confusing. I looked at the first three games in the list, where the 6700K won and thus was on top, then went to Fury Road where the 3770K won, and if I hadn't read your comment I would've assumed the top graph was for the 6700K as well, not the 3770K.

As for Metro Last Light, I'm guessing that game engine is more sensitive to DRAM latency as opposed to bandwidth.

Some forum members recommended that I do the graphs THIS way, and I agreed with them. People expect the winner to top the chart.

There are plenty of screenies from waay back testing the Samsung RAM with Dave @ 102 BCLK.
It's been running like this for years, since they came out.
And yes, it's on a modded old skt478 Scythe Ninja Rev. B cooler with 1 old fan.

Wait. If I remember right, the default core voltage is surely higher than 1 volt. WTH is going on?