The difference is probably in how those numbers were TESTED, not the CONFIGURATIONS.

There is variance in what constitutes "power consumption under load", but that's to be expected. Most seemed to use some common sense: x264, multi-tasking, multithreaded gaming... but of course there are always going to be some who view anything short of LinX as a cheat; probably the same kind of people who think transient peak load under OCCT represents "real world" testing.
My point was that if the system was the same, the cooling was equal and adequate (to take throttling out of the equation), and both CPUs were tested in the same review on any given site, then the only remaining parameters of contention are the methodology for measuring power draw and chip-to-chip variation, which I tried to eliminate by using a larger pool of reviews. An outlier would probably be an early ES versus a late stepping, but that would likely show up in comparative testing in earlier reviews.

Who cares if it's old news or not? The point is that it's as good as a 2600K, which is what Bulldozer was meant to be last year, and the 2600K is not a slow CPU either, am I right?

It completely matters. Comparing a brand new chip to one that's over a year old and not even sold anymore is hardly a fair comparison. Again, all comparisons of the 8350 should be done against the 3570K!

Not to me it doesn't. In my eyes the 2600K is a very fast CPU (regardless of whether it's new or not), and in fact it's still faster than the 3570K, is it not? So I'm quite happy to compare the 3570K to the FX-8350; doesn't worry me in the least.

BTW... I've been reading tons of news about AMD's plans over the past weeks (maybe not so much in the past few days, though), but I don't remember anything about Vishera 2.0, Trinity 2.0, the Orochi die and so on... can someone get me up to speed? A link would help too.

I am planning to build a new rig to replace my aging C2Quad, and my only hesitation about buying this chip is its power consumption. I found out that my monthly usage needs to stay under 500 kWh if I want to remain on my 13-cents-per-kWh billing tier. If I go over, my rate changes to 18 cents per kWh. I could stomach the added cost if the 18-cent rate applied only to the excess, but it won't; it becomes the new rate for my entire monthly consumption. They say it's to give people an incentive to save power, but since I'm already averaging about 490 kWh a month, I don't want the added cost. Between my younger sister and my gaming habits we account for about 6-7 hours of usage a day, so I might end up getting the i7, since the difference in power consumption would most likely cover the price difference between the two processors in about 4 months of usage.
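To illustrate the tiered-billing trap described above, here is a minimal sketch. The 13c/18c rates and the 500 kWh threshold come from the post; the extra wattage for the hungrier CPU and the hours figure are illustrative assumptions, not measured numbers.

```python
# Tiered billing: past 500 kWh the WHOLE month is repriced at the higher
# rate, not just the excess. Rates/threshold are from the post above.

def monthly_bill(kwh: float) -> float:
    """Return the monthly cost in dollars under the tiered scheme."""
    rate = 0.13 if kwh <= 500 else 0.18
    return kwh * rate

base = 490.0          # current average monthly usage, kWh (from the post)
extra_watts = 60      # hypothetical extra draw of the FX-8350 under load
hours_per_day = 6.5   # midpoint of the 6-7 h/day mentioned above
extra_kwh = extra_watts * hours_per_day * 30 / 1000  # ~11.7 kWh/month

print(f"now:     ${monthly_bill(base):.2f}")
print(f"with FX: ${monthly_bill(base + extra_kwh):.2f}")
```

Under these assumed numbers, crossing the threshold by only ~12 kWh raises the whole bill by roughly $27 a month, which is why a modest wattage difference can matter more than the sticker prices of the two CPUs suggest.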

Well, it's a good CPU IMO.
It's certainly not enough to compete with Intel for us average gamers/users, but it sure has its uses for a different type of user, who works with applications that benefit a lot from AMD's eight-core advantage.

"If you are looking to upgrade a full system then it's impossible to recommend. It's too slow, it draws too much power, it's too hot. It's just not worth it."

Yeah, of course it's slow, the 2600K is one mega-slow CPU, hey? Power draw is down a good margin from BD, so yeah, totally terrible. And 53°C is smoking hot, OMG. Under $200? Yeah, spot on man, WAY overpriced, what were they thinking? :shadedshu

What kind of response is this?

The 8350 is slower than much cheaper Intel chips in a lot of the benchmarks that some people actually care about - see the 99th percentile Skyrim graph above. In addition, it uses far more power than any recent competitive Intel offering of similar performance. Particularly when overclocked, using more power translates to getting hotter.

I happen to agree that the 8350 is not impossible to recommend, but instead is only recommendable to a certain type of buyer (one who does not prioritise games, who does prioritise certain highly threaded tasks, who does not stress the CPU enough for the electricity bill to eliminate the savings). But when you put forward an argument like that, you destroy any chance of persuading anybody of its merits.