There is no novelty to these issues. It's business. Buy any AMD X4 965+ and OC to 4GHz... that's the novelty part.

Having ALL your best products - even those costing almost $300 - be slower than the competition's $200 lower-end CPUs is not fun.

I have a #2 desktop that is rendering videos daily (converting my OLD VHS) - and I'll need to upgrade its mobo/CPU to speed up the process. A NEW CPU will speed things up at least 4-6x. (It's an OLD AMD X2.)

The Intels use less power, and there is that odd-ball combination that allows the Intel chip's GPU to be used to help render video faster.

So yeah, when looking at a $150~200 CPU, it's performance that counts - not MHz.

Still, for most people - any $75~100 CPU will DO just fine. Including gaming.

Considering that the AMD Phenom II X6 1100T is only $54 more and beats the new chip in most benchmarks while using less power under load (using max turbo @ 3.7GHz, the same clocks as the new 980 BE), I'd say that this new 4-core is a poor value comparatively.

I'm sure the PhII is already stuck at 3.8GHz at reasonable voltages and has been this way for a long time.

I wonder how AMD feels when the mobile i7-2820QM is just as fast as their 4.2GHz OCed Phenom II X4. Bulldozer's single-threaded performance and power consumption have to be at least the same as Nehalem's to stand a fighting chance.

In "Gaming Performance", the last two charts show Core i5 frame rates which are a little on the low side. Thanks for saving the images in PNG format though. You'd be surprised how many technical authors save images which have large areas of flat color in JPG format. :)

Amazing that AMD's Phenom II is still clock-for-clock on par with the old Q6600. It might have a few more refinements and higher clock speeds, but its raw horsepower is still on par with a 5-year-old chip.

AMD is two gens behind Intel in performance; I don't see how BD can reasonably expect to be in the same ballpark as Sandy Bridge. I'm sure they'll compete in a budget-based scenario, but the high end is a moon shot. With Intel's X58 replacement in the queue, I just don't see AMD even sniffing a piece of the high-end action.

You know, I am with you. But who believed that AMD would ever have released the Athlon and beaten Intel for as long as they did? I have hope, and we all should, for our own wallets' sake. AMD's inability to beat Intel is the reason my Q6600 is still a fast CPU (granted, overclocked to 3.2GHz, it isn't exactly stock).

Nothing would make me happier than a fast AMD CPU to make me consider upgrading.

Don't get me wrong, I'd love it if AMD were neck and neck with Intel. Competition helps my wallet and keeps high-end gear in my beige box.

It's just that at this point in the game, BD cannot logically advance far enough past their last generation to compete with Intel's lineup. I'm sure that BD will be priced competitively and give some trouble to the low-end Sandy Bridge chips. It won't be because they're more advanced, though; it'll be because the AMD chip will be a power-hogging, quad+ core with unlocked multipliers up against a lean, locked-down Intel dual core.

All they need to do is add a couple of ARM cores to their existing Bobcat design. Then they need to get in bed with Microsoft and make sure that Windows 8 properly implements schedulers that can seamlessly integrate ARM and x86 code. If AMD and Microsoft both execute correctly, an 18-watt Brazos chip is all we'd need for a desktop OS once the Windows kernel is all ARM. Office would follow in a year. Antivirus and photo/video editing would go to the SIMDs. The x86 cores would only handle our legacy apps.

Why would you EVER want to do that? Then you would have developers building apps that have to run on two architectures at the same time. If you want to do fixed-function stuff, do something like Intel's Quick Sync for video, and then leave an x86 chip as an x86 chip.

But I'm not disagreeing that the Phenom II is unimpressive. The Phenom II is essentially the same as the Phenom released in 2007, but with more L3 cache. The Phenom itself wasn't all that different from the K8 from 2003, which in itself was just an evolution of the original Athlon.

You could trace current Intel CPUs back to the Pentium Pro in the same way, but they have gone through many more radical changes over the years. Hopefully those radical changes will come to AMD's CPU architecture with the release of BD.

"When you look at it that way Intel's latest gen chips giving you an extra 10fps isnt that amazing."

FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance....

"Look at it another way and you could say AMD have done an amazing job keeping whats essentialy an 8 year old design in the running."

The Core i series can trace its origins to the original Core Duo processors that debuted with Apple's transition to x86 (5 years ago). Intel has had 4 extremely impressive architecture changes since, each improving performance by ~20% to ~100% in some cases... AMD has executed 2. The first Phenom was a HUGE disappointment and couldn't compete with Core 2 Duos, let alone Core 2 Quads. Phenom II, now gaining traction, is still barely competing against those same Core 2 Quads. Your same ~10fps difference could be said of my old Core 2 E6600 against your Phenom II X6 in GPU-limited scenarios...

AMD's entire success in surpassing Intel came from placing the memory controller (MCU) on the CPU die. With that move, they blew their load. Thanks to the power-hungry beast that was the NetBurst processors, Intel worked on improving caching algorithms, multi-threading, parallel processing, etc. The end result is Intel with an extremely efficient CPU (born from its mistakes) and an integrated MCU, while AMD is left with just a so-so CPU and an integrated MCU.

"FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance...."

So is he saying the game tests in this review do not show the same trends as the other tests? That is what he is saying, in effect. They look perfectly in line with expectations, AFAIK. The sentence on its own means very little, if anything. He is either saying this review is doing something wrong or that gaming tests are of little or no worth.

"nd getting on someone for saying extremely impressive instead of just impressive, Facepalm!!"

ZOMGWTFBBQ111!!!!!!! LOL!!!11

Being a native English speaker, I know there is a difference between impressive and extremely impressive.

Skipping a stone on a lake 20 times is impressive. Walking on the water there is extremely impressive.

"Thanks to the power hungry beast that was the netburst processors, Intel worked on improving caching algorithms, multi-threading, parallel processing, etc The end result is Intel with an extremely efficient CPU(born from it's mistakes) and an integrated MCU, AMD is left with just a so-so CPU and an integrated MCU. "

I have said this exact same thing many times before. At this point in time, making the P4 helped Intel because they had to optimize the heck out of EVERYTHING in it to even be remotely competitive, and well, now all that work is done.

Price is a function of demand. AMD HAS to sell them this cheap to move them at all. They do not sell them as cheap as they do because they are more cost-effective to produce, so your comment in this context is completely irrelevant.

Yep. I've bought at least 10 CPU+MB combos for $99 at Microcenter over the last year, for clients, friends, and family. Fast enough for most purposes, and you are GPU-limited in most games, so you're better off spending the cost savings on a better video card. For $300, you can get a much better-performing AMD system versus Intel. Maybe not good for AMD, but good for me.

But it doesn't. There are plenty of benchmarks around to show that a comparative Phenom II X4 gets a few wins, especially in encoding. Aside from a few very strongly Intel-optimised situations, there's really very little in it, and anyway, if you ignore the very high price of the upper Core 2 Quads, you can get a faster Phenom II X4 or X6 for less.

AMD has a new platform on the way, and that is where the focus has been. For now, any new Athlon II or Phenom II processor will just be a bit more of the same with a higher clock rate, but still with the exact same design as previous chips. Bulldozer, on the other hand, is the big push to get back to being competitive.

The big problem is, and remains, the lack of a 32nm fab process, which Intel has had for a long time now.

I knew from the beginning that AMD wanted to milk more performance out of its Deneb architecture. I wasn't surprised when I saw this item on this website that I visit from time to time.

I currently own one of the first Deneb CPUs (PH II 965 BE) with a clock of 3.4GHz, rated at 140W! I didn't even bother OCing this regrettable purchase of mine. How I wish I had waited for the cooler versions, but I was in a hurry to build a PC that I could use at home for my networking classes.

I will buy the 1100T soon. I will be OCing it to 3.7GHz with turbo core disabled.

I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy Bridge.

"I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy-Bridge"

Exactly. Even if it did, look at the turbo boost scenarios from the Intel SNB processors. There is so much headroom on these processors. Intel is already holding back performance because of a lack of competition.

It was proven useful in one of the Phenom II X6 reviews (can't find it now, though), where performance in HAWX shot right up by about 20%, I believe, when upping the uncore from 2GHz to 3GHz.

It's a shame most, if not all, reviewers don't overclock their uncores. Or at least, they don't tell you, and they don't put a CPU-Z Memory tab screenshot in the review showing the uncore frequency.

I know pretty much all hope is lost for this aging design (Deneb), but as an owner of this furnace of a CPU called the Phenom II X4 C2 (yes, 140W at 3.4GHz), I'd like to know how much faster the Intels are compared to my 3.7GHz/2.4GHz (core/uncore) OC.

I haven't had an Intel CPU since the P4 Willamette. I've been happy with AMD's bang-for-buck, as performance seemed sufficient, and overclocking always covered any shortcomings. Nowadays, I see mid-level Intel CPUs beat AMD's top-end offerings every release. And honestly, I'm really bottlenecking in the CPU department, but I don't see AMD offering a solution (running a Phenom X3 @ x4 3.5GHz). I've been waiting 2 years to upgrade the processor, and I'm getting tired of this. Don't make me cross the Sandy Bridge, AMD. Make BD happen. And it better be good.

Where I work, AMD still outsells Intel by a factor of.. well... over 20-to-1 if not more. What we always tell customers, is that if price is no object, the Intel platforms are higher performance. But best bang-for-the-buck is AMD, and it's not like we're comparing an i7 to a 486-SX. We try to explain it in best terms, but there's Intel processors in our display case that are actually gathering dust. Which frankly makes an AMD fan like me happy :) But I digress.

We carry a full line of Intels, from the Celeron cheapies, all the way up to i7. And we finally closed out all of our 1156s and only sell 1155s.

Like it or not, our customers are amazed that they can pair up a decent mobo, an Athlon II 250, and 4GB of 1333, for less than $160. Most spring for the 1075T for the price, too. Whether or not it's faster than a similarly priced i5, people like the ability to say they have six cores. When they can get 75% the performance of a comparable chip, for less than 50% of the cost... people bite.
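That pitch is easy to sanity-check with a performance-per-dollar ratio. A minimal sketch in Python, using the comment's own hypothetical figures (75% of the performance at 50% of the price), not measured benchmark data:

```python
def perf_per_dollar(relative_perf, relative_price):
    """Value metric: performance delivered per unit of money spent."""
    return relative_perf / relative_price

amd_value = perf_per_dollar(0.75, 0.50)    # 75% of the perf at half the price
intel_value = perf_per_dollar(1.00, 1.00)  # the baseline chip

print(amd_value / intel_value)  # -> 1.5
```

So under those assumed ratios, the cheaper chip delivers 1.5x the performance per dollar, which is exactly why customers bite.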

We're quite excited to start carrying Zambezi chips, when we're able to, since they'll be more competitive. But it's always going to be darkest before the dawn, and it's nice that AMD is throwing in a speed bump or two (1100T, etc) before the architecture change, instead of letting their chips languish until BD.

Must admit I haven't ever made an Intel box for a customer; it's always AMD.

I check out the Intel CPU range every now and then to see what price the bottom non-Celeron Intel chip is going for, then the cheapest decent-brand Intel motherboard, and after seeing any profit just vaporise, I roll my eyes and go back to the AMD section.

Intel isn't worth the extra cost for most ordinary folks. Intel is total overkill. The good old 3GHz dual-core Athlon with a mATX MB and 4GB of DDR3-1333 works a charm every time.

If a customer came to me and said he wanted to do loads of transcoding and video editing and had £1200 to spend, then let's go Intel. But as most come to me with a budget of £400-500 and I need to take my cut, it's not going to happen.

This is exactly how I feel. I've always owned AMD because it's been fast enough and cheaper. If I had to build a PC today I'd choose a Sandy Bridge processor, but I'm not building one because my AMD 955 BE still does everything I need it to. I have tons of windows open on my 22" monitor and play my games on my 23" and have no issues.

A lot of that Intel performance gain falls into the 'can't even tell' category for many users.

Would the savings made by buying AMD be paid back in electricity? Let's say, after 2 years? Just want to see if the higher power consumption would translate somewhere. If any of you have done any comparison in this area, I would very much appreciate it if you could give some highlights.
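A quick back-of-the-envelope version of this calculation in Python. Every figure here is an assumption you would swap for your own (the wattage gap, daily load hours, and electricity price are illustrative, not measured):

```python
# Assumed figures: the AMD box draws 40 W more under load, is loaded
# 4 hours/day, and electricity costs $0.12 per kWh.
extra_watts = 40
load_hours_per_day = 4
price_per_kwh = 0.12
days = 2 * 365  # the two-year window from the question

extra_kwh = (extra_watts / 1000) * load_hours_per_day * days
extra_cost = extra_kwh * price_per_kwh
print(f"Extra electricity over 2 years: {extra_kwh:.0f} kWh, ${extra_cost:.2f}")
# -> Extra electricity over 2 years: 117 kWh, $14.02
```

Under these assumptions the power gap costs only about $14 over two years, so for light or idle-heavy usage, the purchase-price savings usually dominate; heavy 24/7 rendering loads would change the picture.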

This is ridiculous. BD is one month from launch, and Llano has been shipping for weeks. Yet you guys don't have a single benchmark of either. Or maybe you do and are just too spineless to post them. It is sad that we had better sources of information 10 years ago. Now there is nothing because everyone seems to care too much about NDAs. Who cares about NDAs? If you know someone who has the hardware, you should post their review ASAP and not worry about NDAs. What good does it do to release a review the same exact day as two dozen other sites? If you want more eggs, sometimes you have to sacrifice a few chickens.

How about reviewers that enjoy getting engineering samples, new products to review, and support from the manufacturers for free? How about any reputable, respectable reviewer? How about anyone who frowns upon breaking a contract?

I asked myself the same thing: did he really just ask that? Someone never took ethics courses, or has any ethics for that matter. Just because you want to drool like a baby over specs is no excuse to ask these reputable authors to break NDA. On the day of release, feel more than free to go view any hardware site you wish.

Yeah, well, you must be a spineless chicken too. There are plenty of ways to get chips. All those engineering samples out there, and no one can get their hands on one? Yeah, right. If all NDAs were broken, there would be no NDAs. Your brain is full of fail if you cannot understand that. Just a bunch of spineless cowards.

Are you a total idiot? It's called competitive advantage. If they broke the NDA, no engineering samples for them for the next round. If everybody breaks the NDA, no launch-day reviews. That simple.

The engineering samples would go to different people, but chances are one of them would hook up with someone who has the reputation for having some freakin' cojones. Especially if money was involved. There IS money to be made this way, and it's not illegal. Again, it just takes cojones.

I'm glad to see a compile test in the lineup. Can I ask - how did you choose the number of threads for the test? Was it the same for all (12), or based on the number of physical or hyperthreaded cores? In my experience, on AMD chips most compilations are faster with overloaded threads, say 2x the number of physical cores - did you test this possibility?
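One way to answer this empirically is to time a clean build at 1x and 2x the core count and compare. A minimal Python sketch (it assumes a project with a Makefile and a `clean` target in the current directory; the helper names are mine, not from the review):

```python
import multiprocessing
import subprocess
import time

def candidate_jobs(physical_cores):
    """Job counts worth comparing: 1x and 2x the physical core count."""
    return [physical_cores, 2 * physical_cores]

def time_build(jobs):
    """Time one clean parallel build (assumes a Makefile with a clean target)."""
    subprocess.run(["make", "clean"], check=True, capture_output=True)
    start = time.perf_counter()
    subprocess.run(["make", f"-j{jobs}"], check=True, capture_output=True)
    return time.perf_counter() - start

# Example invocation (commented out so the sketch stays side-effect free):
# for j in candidate_jobs(multiprocessing.cpu_count()):
#     print(f"-j{j}: {time_build(j):.1f}s")
```

Whether oversubscription (2x) wins depends on how often compiler processes stall on I/O; that is exactly what a test like this would reveal per chip.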

Some programs don't work well with non-power-of-2 architectures either, which would harm the performance of the X3 and X6 processors (Windows Media Encoder 9), or even worse, cripple things completely (DivX or XviD on VirtualDub uses just one of an X3's cores, so you need to set the affinity to two cores). I suppose, logically, overstressing a hyperthreaded CPU would mean that its execution units are fully utilised and, as such, logical cores won't actually make any difference, so it would be in these situations where the X6 could perhaps close the gap a little.
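On Windows, that affinity fix is done through Task Manager or `start /affinity`. The same idea can be sketched on Linux with Python's `os.sched_setaffinity` (Linux-only API; this is a minimal illustration of pinning, not anything VirtualDub-specific):

```python
import os

# 0 means "the current process"; to pin an external encoder you would
# pass its PID instead.
original = os.sched_getaffinity(0)

os.sched_setaffinity(0, {0})       # restrict the process to core 0
pinned = os.sched_getaffinity(0)   # -> {0}

os.sched_setaffinity(0, original)  # restore the original core set
print(pinned)
```

For an X3 where the encoder only spawns one worker, pinning it to a known core (or two) at least keeps the scheduler from bouncing that one thread around.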

Man, this is some boring news. I would prefer to get some more inside or background info from AMD, Intel, or the ARM camp, even if it takes years, compared to this. But I guess this is better business :)

I don't see this mentioned in the review or the benchmarks, so someone correct me if I am wrong, but when those benches are made, are they made with Turbo disabled? If not, I don't see it as a fair comparison if you are running stock speeds when comparing a 3.3GHz i5-2500K vs a 3.33GHz i7-975. You have games that make use of all cores, and some that use only a few, so you may get a higher turbo on a Sandy Bridge chip. This is not exact science, but making the GHz speed an exact comparison by disabling Turbo would give a little more insight into the product.

This is just a final release to cap off Phenom II. They know they were beaten; they are just giving it a little speed boost for more value. If this were Bulldozer, then yeah, it would be bad, but we already know what Phenom II has to offer, so why is there so much discussion around a final revision chip? They are just squeezing out what may be the most speed they can from that core until BD arrives.

"AMD originally introduced the Phenom II architecture over two years ago to compete with Intel's Core 2 lineup. Intel has since been through one major microarchitecture revision (Sandy Bridge) and Phenom II is beginning to show its age."

Phenom II came to market after Nehalem. Yes, it wasn't exactly meant to compete with it, and was more about sorting out what Phenom did wrong, but the unfortunate truth is that the i7 beat it to market.

529th - I think the issue would be that you would be disabling a feature and thus not showing accurate performance; however, that in itself would show the difference given by having Turbo in the first place. Doing the same with Thuban would be interesting.

Zosma would've made more sense than continuing on with Deneb; however, AMD quickly shelved that idea.

Nah, it's just that Intel has a huge advantage in fabs. They also have a clock-for-clock advantage in most workloads, but AMD's engineers are by no means incompetent. Brazos is the only recent great product, though. If both Bulldozer and Llano fail spectacularly, I'll start leaning towards your side. In recent years, Intel has alienated me with horrible business ethics and artificial restrictions on processors for market segmentation, with very high prices on the chips that were not gimped (disabled functional units). Hopefully power consumption will be better with GloFo.

Intel actually DOES need to make its compilers more vendor-agnostic since the settlement. If a piece of software works rather poorly on an AMD CPU and much better for an Intel offering, this isn't always down to the Intel CPU being stronger; it might actually be the result of the AMD CPU working on an inefficient codepath. Of course, it may actually be the limit of the processor. If something is compiled using a neutral compiler, I'd be far more inclined to trust the results of that.

I doubt Intel is forcing people to use their compiler, nor are they making them develop for Intel-only platforms; however, in the interests of fair competition, it's only right that checking for the CPU ID tag be removed, if that is what they're doing.
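The disputed dispatch behaviour can be illustrated with a toy example in Python. `GenuineIntel` and `AuthenticAMD` are the real CPUID vendor strings; the two functions below are a simplified sketch of the idea, not Intel's actual compiler code:

```python
def pick_codepath(vendor, has_sse2):
    """Vendor-gated dispatch: the fast path requires both the feature
    flag AND the Intel vendor string."""
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2"
    return "generic"

def pick_codepath_fair(vendor, has_sse2):
    """Vendor-agnostic dispatch: check only the capability bit."""
    return "sse2" if has_sse2 else "generic"

# An SSE2-capable AMD chip lands on the slow path under the first scheme:
print(pick_codepath("AuthenticAMD", True))       # -> generic
print(pick_codepath_fair("AuthenticAMD", True))  # -> sse2
```

This is why a benchmark gap between chips isn't always down to raw hardware strength; the AMD part may simply be running the inefficient codepath.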

The most popular CPUs are towards the lower end. Intel doesn't have a quad core CPU below $100 whereas AMD has multiple. There's no reason why an Athlon II X4 setup shouldn't easily undercut an i3 setup, and occasionally, four true cores will be a benefit.

This is great hardware, but it seems that most everyone I know who is an average user has left their tower systems and switched to smartphones, iPads, and gaming consoles for the most part. These items are now being discussed at family get-togethers instead of the PCs that were the topic just a few years ago. Some advanced users are using the latest graphics cards for computational projects instead of gaming - the CPU is no longer seen as the best way to fold. It's the graphics card doing the grunt work while the quad-core processor sits mostly idle.