120 Comments

I still believe in AMD, and I know they will release a nice product line-up in the coming months, or probably in 2009. I have a Core 2 PC now because I can't deny the fact that they are really strong CPUs, but my DDR3 upgrade will be on an AM3 system. I think they will be really competitive.

I didn't comment on this review when it first came out because I didn't want to read the whole thing. It reads more like a rant on a blog than a review; he didn't want to go to Cali, so what. In the time since this article the Phenom has proved to be a good CPU. I noticed in the last week that HP and Gateway have started selling systems with Phenoms in Best Buy and Circuit City. This CPU was rushed out and it will take a bit of time to mature. It's the same thing we saw with the Athlon 64 from 2003: had it been as developed as it needed to be, they would not have gone from Socket 754 to 939 to AM2 and so on. AMD should have made the smaller leap to a quad-core Athlon 64 until Phenom was ready, but they have bad decision makers these days, it seems.

There was no rant in this article. There was a stern condemnation of an attempt by AMD to control the benchmarking and review process, to influence what should be an independent and transparent review of a product to the marketplace.

Well, I think Intel has the processing lead, but I really don't think the Phenom was ready in the first place. AMD needs time and money to get a revision into good, debugged working order; Intel had that money and time, and really took their time before showing up with Core 2, almost two years I would say. AMD already had only a small part of the market even when they gave the best performance for the price, even compared to the best from Intel. To be honest, I was really hoping the true native quad-core Phenom would be better at processing too, but on the other hand there are a lot of technologies in AMD chips, key to every AMD CPU, that people are too stupid to look at and understand. Like power consumption: come on, there's a bus and a memory controller on the die, while my NVIDIA chipset could fry an egg and it only has to run the PCI and PCIe buses.

I remember having to take food out of my own mouth to save up for the first ill-conceived Pentium, or the 486. How Intel set those prices evades me; they maximized their profits with no competition. Yes, I know Intel produces a faster chip, if faster is the right word. However, when it comes down to competition and the markets, AMD is the strategically right choice. Of course, if you think the Walmart buy-cheap-toys-from-China idea is your consumer ideology, then you should stick with Intel; that is actually like Nixon going to China. Or you believe consumer choices should keep competition open.

Maybe AMD just had to release these early to make a little capital this year. Personally I think it is good that AMD let people know SOMETHING is close. I am hoping that with a little tweaking, AMD and some motherboard partners can get performance up a little and be competitive with Intel sometime in 2008. I have seen simple driver tweaks work wonders on other hardware; maybe just a little more time can help.

Reading through the benchmarks, where a single core (of a quad) is compared to all four cores running, it looks like an 8-core version of the "Phenom" would scale even better than the four-core one. Barcelona, by the way, also shows this behavior: one core is just a little faster than the Opteron core, but the four running in tandem scale very well.

Because of the not-so-good MHz numbers it might not take AMD to new glorious heights for now, but when (soon?) 8 cores arrive, AMD MIGHT be able to do better because of their better core-scaling factor.

It looks like they HAVE to do something like gluing together two Phenoms to at least do 8 cores before Intel, and before Intel gets TOO fast for AMD's liking and the ability to catch up slips away.

Unless AMD is already working on a true 8-core design, which would probably scale even better than a glued one. And by incorporating knowledge of multithreading from ATI's GPU design work, they might be able to do something even more serious in the future.
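The core-scaling factor being discussed can be computed directly from benchmark scores. A quick sketch of the arithmetic, using the roughly 3.79x (Phenom) and 3.53x (Core 2 Quad) one-core-to-four-core Cinebench speedups quoted later in this thread:

```python
def speedup(single_core_score, all_cores_score):
    """Multi-core scaling factor: how much faster all cores run vs. one."""
    return all_cores_score / single_core_score

# Scaling figures quoted from the review's Cinebench runs.
phenom_scaling = 3.79   # Phenom, four cores vs. one
core2_scaling = 3.53    # Core 2 Quad, four cores vs. one

# Relative scaling advantage works out to about 7%, matching the review.
advantage = phenom_scaling / core2_scaling - 1
print(f"{advantage:.1%}")  # 7.4%
```

As the thread notes, a ~7% scaling edge is real but far too small to offset a larger per-core performance deficit.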

I really wish AMD had never gone down the road of L3 cache for these processors. Since the majority of applications still used in the desktop/workstation market today only use one or two cores, the shared cache itself probably causes more of a hindrance than it helps.

Personally I would have liked to see 128KB of L1 and 1MB of L2 for the higher models, and simply 512KB of L2 for the lower models. The tweaks to the individual cores would almost let them catch up clock-for-clock with Intel without this L3 cache latency getting in the way, and powering down the individual cores would then also power down all the cache they use. I realize that routing L2 cache in larger quantities is trickier and consumes more die space than L3, but they should also be able to gain significant cost savings on parts produced without L3 and compete better in the $180-or-so market.

Granted, if a single-threaded application were the only one taxing the system, taking advantage of all the L3 at once while the other three cores slept, it would be at a slight disadvantage, but that's an extreme situation.

I think AnandTech's power consumption graphs are way off. Phenom power consumption should be compared to Penryn power consumption plus northbridge power consumption. Does anybody see any mention of this fact, and do the graphs properly account for it?

Phenom effectively has the northbridge memory controller built in. After looking at the architecture, I now know why Intel is always trying to increase front-side bus speed: they only have one external bus to feed and communicate between all the CPUs.

After all of this being said, I have only a few words for you:
- AMD's fate is in the hands of its ability to cut prices even further.
- Cheaper by 13% is just not enough! Unless they cut prices even more, they'll lose more customers!
- Of course they can trust in their marketing strategy ("true quad core"), but not for long: it's PERFORMANCE and PRICE/PERFORMANCE that matter, not STYLE or FASHION.

One thing to keep in mind here is that the quad-core AMD chip is a 64-bit part, like its predecessor. Performing these tests without giving it a chance in 64-bit mode is not a fair comparison. I would like to see some subset of these same tests on Windows Vista Ultimate 64-bit; the drivers should be out there for that configuration. Let's see how it performs in that mode; I don't think there will be much competition from Intel then. When everything finally goes 64-bit, AMD will have the advantage, since they got in on the ground floor.

I was checking the holiday buyer's guide and noticed that in the tests you used a $335 motherboard and $580 of DDR3 memory for Intel (I think I must be wrong here), while for AMD you used a $180 board and $121 of DDR2 memory. That would be comparing a $915 platform against a $301 one, am I correct? Don't misunderstand me; I just think I didn't understand what you used to test what.

I wondered too, but I guess they figured the newest AMD chipset was most appropriate for the test. Seeing as availability there is shaky as well, there aren't many options. I'm not sure why they didn't choose a cheaper Intel board and DDR2 (since there's no DDR3 for AMD), but at those speeds their tests don't seem to show much extra performance for DDR3 over DDR2 anyway.

Hopefully the IPC-improved Shanghai parts are coming along flawlessly in AMD's R&D lab right now. Phenom will be worth buying once the TLB erratum is fixed and AMD quickly ramps up to a stable 3GHz CPU.
Quad CrossFire will be attractive to hardcore high-end gamers once the Phenom 9 series can put out something at 3GHz that is able to give some real competition to Intel's Penryn and Kentsfield.

AMD's OverDrive utility looks very good, and it's extremely exciting to hear it actually works. nTune's screens only avoid scroll bars when sized very large, something that's annoyed the crap out of me since its release. While I do have a couple of high-res displays, most of the stations in my workshop have only 1024x768 or 1280x1024. I very much like how AMD has designed the interface, and I look forward to trying it some time soon.

Other thoughts:

There is no doubt good value in these new quad cores.

Monetary value, if and when AMD cuts pricing: if the last year is any indication of their commitment to this, we may see much more than just sub-$200 quads. For mid-range parts these processors are plenty fast, and any mainstream user would be plenty happy using one. The non-overclocker does not consider how much headroom is in a CPU, so the fact that the Intel equivalents will clock much faster than AMD's parts is completely nullified. It gets right back to basics: price. The overclocking community is growing fast, however, and it's nice to see both AMD and Intel recognizing a dedicated and loyal clientele by releasing the Black Editions and showcasing that new chip of Intel's.

Value in technological advancement: AMD doubtless needs Phenom core technology, and the accompanying chipset and socket technology, in order to evolve into the next generation. Like Formula 1 racing (or any racing, for that matter), not moving forward in technological development equates to moving backwards, because if you are standing still, everyone else is passing you. AMD is learning a tough lesson about letting the competition catch up after coasting for so long with the Athlon 64.

Value in power savings: each core is independent and can be clocked at a different speed?! Cool'n'Quiet on two cores while the other two happily crunch at full speed in some game? Lower overall temperatures in the end? Oh yeah, baby, watch the silent-PC enthusiasts and HTPC builders flock to this!

Value in reduced spoilage: one core out of four not quite up to snuff? Disable it and bin the part as a tri-core CPU at ridiculously attractive prices! In the end, less waste means lower costs for us all, as the overall cost of all the products will be greatly reduced. Over time fewer spoils per wafer will doubtlessly be achieved, but by then we should see spoiled and binned 8-core CPUs appearing as 7-core, 6-core, 5-core, maybe even 4-core parts! This is a tough thing for Intel to match until they get their quads all on one die.
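The binning argument can be sketched with a simple binomial model. This is purely illustrative; the 90% per-core yield and the independence of per-core defects are made-up assumptions, not anything AMD has published:

```python
from math import comb

def core_count_distribution(p_core_good, n_cores=4):
    """Probability that exactly k of n cores on a die are functional,
    assuming independent per-core defects (a simplifying assumption)."""
    return {k: comb(n_cores, k) * p_core_good**k * (1 - p_core_good)**(n_cores - k)
            for k in range(n_cores + 1)}

dist = core_count_distribution(0.90)
# With a hypothetical 90% per-core yield, ~66% of dies bin as quad-core
# and another ~29% can be salvaged as tri-core instead of being scrapped.
print(f"quad: {dist[4]:.2f}, tri: {dist[3]:.2f}")  # quad: 0.66, tri: 0.29
```

Under these toy numbers, tri-core salvage turns nearly a third of otherwise-defective dies into sellable parts, which is the cost advantage the comment describes.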

Are people just completely ignoring the fact that the listed Phenom prices are PER 1,000? There's no way you'll be able to buy a $280 2.3GHz Phenom at "launch" (if you can find a retailer who has them in stock).

Also, the Q6600 is a completely fictional product, and can't be bought from Newegg retail for $279.00 USD. This is why people are pointing to the Phenom as the clear price/performance leader. *boggle*

While I'm glad AMD fans can find some redemption in Phenom by pointing to the inevitable price cut, this is incredibly short-sighted. Simply put, AMD CANNOT AFFORD to play the margin-slashing game with Intel forever. Phenom needed to *destroy* the Q6600. That would have given AMD the ability to charge a performance premium, and start MAKING MONEY. High-end Phenoms would have been great chips to bridge the gap in performance between the ridiculously priced Extreme chips and the Q6600 (and be priced accordingly), but instead it's outperformed by chips over a year old, and has difficulty keeping up with the still amazing Athlon 64s (albeit in single-threaded apps, but still...)

Kyle was exactly right: AMD needed to launch a high-end 3GHz part and bin down to 2.2GHz to fill out the processor family. They did not do this, and they are in trouble.

I was really hoping AMD could come out with something special, and gain some much needed market share.

Right now it's clear Intel is holding back (whether to soak up profits, or for fear of completely destroying AMD and bringing anti-trust/monopoly investigations down on their heads, or both, is dependent on how you spin it), and a revitalized AMD would have kicked Intel into high gear, bringing advances to the market and the end consumer faster and lower-priced.

I really hope AMD gets over its yield issues FAST and is bolstered by sales of its Spider platform, but overall this is bad news for the industry. Competition pushes technology forward, and while Intel's tick-tock cycle (and frankly the necessity for Intel to compete against its own previous-gen products, if nothing else) will keep moving us forward, no one doubts a stronger AMD would have pushed both companies to bigger and better things.

I posted that because I kept reading posts over and over from people saying the Phenom has the best price/performance ratio. It clearly does not. I simply wanted to emphasize that based on the initial pricing from AMD, the Q6600 is a huge problem for the Phenom, as it's simply the much better buy.

First time posting here. I've noticed that no one has made any comment about the power usage of the processors. The claim that the Intel processor is more efficient at idle only tells half the story and is misleading.

Intel has quite clearly claimed the performance crown with the Core 2 Duo, but anyone who has been following the direction of the industry over the last couple of years will notice that performance is not as much of an issue as it used to be. Even a three-year-old processor can quite happily perform most everyday tasks; QED, performance is no longer king.

A lot of effort is being placed into efficiency and performance per watt. Evidence of this can be seen in VIA still being alive and well selling its C7 processors, the growth of AMD in the data centre, and the growth of dual-core chips.

I won't say the Phenom is a stellar performer; it's not. But if you look at the improvements that have been made to it, you will notice that most of them centre around power management and efficiency.

Back to my original point: comparing the power draw of only the processors at idle is incompetent at best and deliberately misleading at worst, because the memory controller is built onto the Phenom's die. You really need to see the power draw of the whole system at full load to make any reasonable comparison.

Consider that quad cores are more likely to end up in a data centre than a home system in the near future, and that for every 1W of energy dissipated by your computer you will spend roughly 1.5W to keep it cool, and the Phenom starts to become much more interesting. Add the extra memory bandwidth and native quad cores and I think you will find the performance gaps close a little on server-oriented applications. (Obviously this statement is conjecture; I'd really like to see some comparisons of server-type loads if anyone has them, but I suspect the differences would be much smaller than shown on end-user loads.)
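The cooling-overhead claim above amounts to a simple multiplier on any power-draw difference. A quick sketch using the commenter's ~1.5W-of-cooling-per-watt figure (the comment's own assumption, not a measured datacentre number):

```python
def facility_watts(it_watts, cooling_per_watt=1.5):
    """Total facility draw: the IT load plus the power spent removing its heat."""
    return it_watts * (1 + cooling_per_watt)

# A hypothetical 20W power gap between two CPUs becomes a 50W
# facility-level gap once cooling overhead is included.
print(facility_watts(20))  # 50.0
```

This is why small per-chip wattage differences get amplified in datacentre cost discussions, which is the commenter's point.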

The comments on this topic were great. AMD really let us down on this one. I trade Intel stock but only buy AMD chips, and today I am so disappointed. I hope AMD reads these reviews and gets their act together. Patience is a virtue, so I am going to wait for another review after better yields have been produced. Guess I can tuck my cash back into my wallet.

Hmmm, maybe AMD was joking with this chip. Could Santa be delivering the real quad? After this review we had better hope so.

I hope all of you choke on your comments. You're judging AMD entirely on pre-production samples that will mature and give more representative benchmarks by the time they hit stores. No, it's not a Core 2 killer, but AMD never planned for it to be. They laid out a roadmap some time back and are following it as best they can with the resources they have. It's the first native quad core; Intel can't say that. It's brand new, so its full potential hasn't been seen and probably isn't fully developed yet. You gobshites are just like the rest of America I have to live with every day and want it NOW. You sound like a bunch of two-year-olds screaming mine, mine.

Anand stated that AMD has promised availability by the end of the week, so unless they shipped him below-average samples to test, this is indeed representative of what will hit stores at first. AMD may or may not have planned it as a Core 2 killer, but if Intel outperforms them on speed and power usage at the same prices, who is gonna buy AMD? Being the first native quad core is an interesting Trivial Pursuit fact, but it won't win many sales unless performance goes up or price goes down.

All we can do as consumers is make educated choices based on the information we can get.
Your argument is "wait and see," which is not a bad position to take, but this is the first TANGIBLE evidence of what AMD can do with its new chips, which most of us have been waiting over a year for. The information we have right now says that the current best AMD quad-core chip can't compete in performance with the slowest Intel-based quad core, which, if our information is correct, will cost the same.
By comparison, the first time Anand got his hands on a test sample of Conroe, the results eventually foreshadowed real-world performance.

The same way CPU-Z showed the Phenom as single-channel memory, this Sandra test is probably showing the result from one of the memory controllers. Thus both channels together are undoubtedly pumping out 10000+, which makes a LOT more sense.

Yeah, I was in a similar position a couple of months back too, but I could tell which way the wind was blowing: the Phenom wasn't going to be much better than the Core 2, if at all, so I sprang for the Q6600. Looks like (unfortunately) my estimation was quite correct =/

Currently, the L3 cache/NB on these chips runs at a fixed frequency that's actually lower than the rest of the CPU frequency: 2.0GHz. We tested Phenoms running from 2.2GHz all the way up to 2.6GHz, and in all cases the L3 cache and North Bridge ran at 2.0GHz.

What's the performance impact of this limitation? I don't know how to quantify it, but it seems like it might have a material effect. Perhaps Anand can re-run his benchmarks at 1.8 and 2.0GHz to get a sense of performance when this limitation doesn't exist. From that we might be able to determine how much this shortcoming affects performance at speeds greater than 2.0GHz.

My Athlon 4200+ (2.2GHz) overclocks to 2.7GHz stable. Anything past that and the second core skyrockets in temperature and fails Prime95 torture tests, while the first core has yet to fail ANY test and runs up to 10 degrees C cooler.

My point is that individual core overclocking is the only positive I see in Phenom. If I had this ability in my current Athlon 4200+, it'd be clocked at 2.7 and 3.2+ easily.

If it had managed to be as cool and quiet, then I would have gone there for a quad because of the price; plus they have better and more consistently cheap, well-featured motherboards.

sux.

Well, on that thought, the new 790X motherboards with this chip might actually be low power once it's all set up and done. The FX and X versions do relatively the same thing, I think. Isn't the chipset supposed to be incredibly low wattage?

Everyone knows AMD processors have always been, and will always be, far superior to the crap from Intel. Any article suggesting otherwise is clear evidence of pro-Intel bias, that indeed you all get weekly checks from Intel for the favorable press. The reality is that most Intel processors really don't even work at all; all the supposed PC's sold with Intel processors secretly use AMD processors instead, but again Intel pays off the companies to say they're Intel Inside. Intel has an endless supply of money because of their unfair business practices and the Magic Money Fairy.

quote:and the company was committed to delivering Phenom before the holiday buying season

Good idea, but didn't they pick the wrong parts to focus on? I doubt many of these quad-cores are going to find their way into budget systems, which is where they might see significant sales volume. Unless AMD is hoping hype from the Phenom launch will turn into better sales of Athlon X2 systems, it is odd to focus on parts which are high-end for your current line but not in the greater market.

Well, I have seen reports that AMD is expecting to release higher clock speeds, up to 3.0GHz.

When Intel released the X6800 at 2.93GHz, it was already obvious the line would max out at 3.06GHz with a model 6900.

Now AMD has the 9900 at 2.6GHz, the 9800 at 2.5GHz, the 9700 at 2.4GHz, and so on.
Tell me, what model name will AMD use for a 2.7GHz part?
At least Intel, with the 6800, still had the 7xxx, 8xxx, and 9xxx ranges left.
For a brand-new product line, isn't that strange? Unless AMD will kill it soon with a Phenom 2, but then again, why not release it as an Athlon X4?

I understand AMD's need to match future Intel model numbers, but the scheme fails miserably since its higher model numbers fail to meet the performance expectations they set.

What gives? This was a LOUSY review, something I don't expect from AnandTech. If you want a THOROUGH review, try Tom's Hardware. Anand, please remove your lips from Intel's stinking rear end long enough to actually review the competition's products.
For those of you switching to Intel, I hope you enjoy replacing your entire platform AGAIN next year, since Intel is clearly unable to upgrade its products without requiring a new socket every year.

Tom's Hardware's review is more thorough, but it's also WRONG. They made a huge mistake that completely invalidates their conclusions, which are arguably the most important part of the article: they state the price of the Q6600 as $329 (on page 20), when in fact it's readily available at $279 or less. Consequently, when they conclude that the Phenom offers equal _value_ to Intel, it's not true; the Intel chip offers roughly 15% more performance at the same price as the Phenom, which means Intel beats AMD in both performance AND value.
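The value argument above reduces to a performance-per-dollar comparison. A quick sketch using figures from this thread (the ~15% performance edge and $279 Q6600 price from this comment, the roughly $280 Phenom price quoted earlier; the relative scores are illustrative, not measured):

```python
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

# Phenom as the 1.0 performance baseline; Q6600 ~15% faster per this comment.
phenom = perf_per_dollar(1.00, 280)
q6600 = perf_per_dollar(1.15, 279)

# At essentially the same price, the faster chip wins on value outright.
print(f"Q6600 value edge: {q6600 / phenom - 1:.0%}")  # Q6600 value edge: 15%
```

When prices match this closely, the perf/dollar ratio tracks raw performance almost exactly, which is why the pricing error flips Tom's conclusion.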

I do agree that Anand's review could have included more tests (and preferably some older dual-core C2Ds and X2s, to see where the new chips stack up). However, I also expect Anand will post a follow-up in the very near future with a much more detailed analysis of the chip and its various caveats.

Tom's Hardware has a TON of testing information; however, you have to take into account that Tom's tests were done under the direct supervision of AMD, on a system AMD specifically built for testers. Anand's tests, while perhaps not as plentiful, were at least conducted in the same lab environment as every other chip they've tested, creating a level playing field.

I myself am let down, as I was hoping to have a nice upgrade route for my X2 6000+ in the near future, but it looks like I'll be holding onto this CPU for at least another year now. Like it or not, Intel has the best CPUs at present. Trust me, as an admitted AMD fanboi I don't like it myself, but that's just the way things are at this point in computer history.

Never mind that most people don't upgrade systems every year, so whatever new socket comes with Nehalem won't matter until they upgrade. And those who do upgrade every year are probably already on Core 2 and know it is expensive to constantly upgrade.

quote:The platform sell is a great one to an OEM, but it's simply not compelling enough to the end user - if Phenom were more attractive, things would be different.

To make the CPU more attractive, AMD desperately needs to drop the price.

I disagree. When an average Dell customer (who probably doesn't read anandtech CPU reviews) goes to Dell, he thinks more expensive = more performance. He'll never know that in a controlled benchmark, the $269 Core2 edges out the $289 Phenom. He'll just see that one config costs more than the other, and if he wants to pay for the "upgrade" he will.

This isn't as relevant since the Intel models are separated from the AMD models, and Intel has much more expensive parts available if that's what you're after.

But I think it's important to note that being attractive to enthusiasts is not the same thing as being attractive to OEMs and their customers, so one conclusion can't be shared with another.

I have been recommending Core 2 Duo machines to friends since its inception. With this latest round from AMD, that won't change. I can't justify buying a product that is so much slower than the competition.

At the very least they could have lowered its power consumption. Slower and more power-hungry is bad news, imo.

I'm not surprised that Intel's chip uses significantly less power (although it's curious that the QX9770 runs significantly hotter than the QX9650). Believe Intel's marketing or not, they have made a breakthrough in reducing transistor gate leakage, which has recently been a huge source of power consumption in these high-performance processors. Intel is now using what they call "high-k dielectric (hafnium-based) plus metal gate transistors," as opposed to the SiO2 gate insulator and polysilicon gate.

- Motherboard support is still bad.
- They promised better performance on AM2+ and couldn't deliver.
- The yields are still bad.
- Where the hell did "Spider" come from? What happened to the 2.4-2.6GHz Star? Good grief.
- The Intel quad is cheaper and better, so AMD can't even offer the processor at a market-penetrating price (and they are about 1-2 years late).

I suppose the results aren't too surprising given AMD's lack of openness regarding Phenom. No less disappointing, though.

@ Anand - Kudos for taking the high road, and not accepting the trip to Tahoe on AMD's dime.

I almost had a fit, though, when I read that Intel just stuffed an unreleased ES chip with a market value of over $1000 into a box and FedExed it off. An armored car complete with armed guard somehow seems more appropriate :P

quote:Looking at the multi-threaded results we see that AMD does gain some ground, but the standings remain unchanged: Intel can't be beat. We can actually answer one more question using the Cinebench results, and that is whether or not AMD's "true" quad-core (as in four cores on a single die) actually has a tangible performance advantage to Intel's quad-core (two dual-core die on a single package).

If we look at the improvement these chips get from running the multi-threaded benchmark, all of the Phenom cores go up in performance by around 3.79x, while all the Intel processors improve by around 3.53x. There's a definite scaling advantage (~7%), but it's not enough to overcome the inherent architectural advantages of the Core 2 processors.

Overall the article is good, but the above statement is not true. There are no "inherent architectural advantages of the Core 2"; there is just one CPU fed with Core 2-optimized code (SSE128, etc.) and another CPU fed with, at best, K7-optimized code (no SSE128).
Thus, if you are running Cinebench you should definitely buy a Core 2 CPU, but you cannot draw the conclusion that the Core 2 has architectural advantages. Maybe the other programs also suffer from that code problem, but it can only be proven for Cinebench:
http://www.maxon.net/pages/download/cinebenchtech_...">http://www.maxon.net/pages/download/cinebenchtech_...

It would be interesting if you could contact Maxon and ask whether they are working on K10 optimizations.

Anyway, Intel would still be in the lead; with 45nm they could easily launch a 4GHz part, but that's not even needed, as AMD just got to 2.3GHz. Disappointing indeed.

You haven't got a clue what you're talking about. The Core 2 and K9 (I'm not buying this is a K10, I think the benchmarks prove it's a dog. Besides, why skip a number?) are internally quite different. To say one does not have inherent advantages over the other is incredibly ill-informed. They each have advantages, it's just that Intel's tend to matter more. For example, if you are running x87 based software, the K9 is your pooch of choice. But who cares about this now? Why did AMD allocate so many resources to an obsolete instruction set (it's not even part of x86-64)? They should have killed it and saved the transistors, but they don't have the design resources. The big advantage Intel has is scheduling; it has memory disambiguation, whereas the K9 is stuck at a P6 level of scheduling loads after stores. It's a huge advantage. The whole internal architecture is different, it's very easy to understand why each would possess different performance characteristics.

Software efficiency can make a difference, but your compiler-vocabulary argument is off base. They are both being fed the same instructions in most applications, and the K9 gets crushed. Now, before you whine to the moon about how it's Intel-optimized, maybe you should think about how well the P8 (or P6++, if you prefer) did when it first came out. Certainly even you can understand that it did not have optimizations yet, and it still easily outperformed the K8, to the point where it became obvious the K8 was obsolete. So, tell us again how optimizations for the Core 2 made it so vastly superior, OK?

The reality is, the P6 was ALWAYS better than the K7. When the K7 came out it was slightly faster (mainly because of the memory bandwidth), but considering the power use and die size, it was generally an inferior processor. The Coppermine proved to be a terrific processor and Tualatin was even better. When Intel moved it to the Pentium M line and finally increased the bandwidth, AMD could never compete in the mobile space. Face it, the K7 sucked, the K8 was dreadful, and the only reason either was successful was that the P7 was arguably the worst processor ever made and sold in any numbers. It was mind-bogglingly bad, and incredibly it got worse with the Prescott. The K8 was the death of AMD; it didn't advance the processor design enough to compete with a good design, it was just better than a very poor one. When I saw the K8 I figured AMD was doomed, but the P7 only got worse, so I ended up being wrong. Now Intel is back to the P6-derived cores, and AMD simply cannot compete. It's not about optimizations; they simply don't have the design talent or resources, or the manufacturing. It's not about them not understanding something, or some weird conspiracy. It's like a mouse fighting a lion: it's fine when the lion is asleep or sick, but when it wakes up, you can't really blame the mouse for getting eaten. It never had a chance. AMD has NO CHANCE against a healthy, well-functioning Intel. They never have. They can thrive only when Intel makes egregious mistakes. Intel's superior design, and soon the superiority of their hafnium-based manufacturing, will create such a delta that AMD needs to look for a buyer before they go belly-up.

To put it all in perspective, this new release is supposed to be AMD's leapfrog over Intel, until Intel can release the Nehalem. If they were competitive companies, that is how it would play out. Instead, it's an absolute blowout for Intel. On the release of the new AMD chips, the disparity between the two companies is the greatest it has ever been since the Pentium Pro was pitted against the 486 from AMD. AMD can improve performance though, I think it is clear the cache arrangement is very far from optimal, and can be improved significantly. The clock speeds will certainly go up quickly as well. But even with this, they will end up being slower clock normalized for most apps, and use much more power. And then, the few advantages they have go away with Nehalem. It's just going to get ugly.

AMD has returned to its well-established role as a bottom-feeding catfish. It can suck up all the waste Intel leaves it, but as a competitive company, it's over. It's been that way for a year, and it's only getting worse.

Their only hope is IBM buys them. Combined they would have the ability to compete with Intel. After all, the Power 6 blows away the Core 2 and Itanium 2. AMD as an independent company makes no sense. IBM/AMD makes a lot of sense. Maybe Sun will buy the AMD corpse, but that makes less sense than IBM. It makes more sense than AMD alone though.

Is it a coincidence AMD is looking to open a fab in upstate New York? I hope not. Reply

AMD has had its improvements, and you are generally right, except about IBM.

Remember the days when IBM had the Cyrix processors? You know, the ones with the model numbers? Where did that go? They went to plucky VIA. So IBM ditched its x86 business; what makes you think they can compete now?

Well, that is what you get if you are Intel: Have multiple teams work on processor designs, just in case one sucks.

Still, what I always say I will say again: the vast majority of people who need a computer can have their fill with the lowly D201GLY2 board. For those who need gaming or the Aero Glass stuff, you can have a cheap AMD/NVIDIA 7025 system.

As it is painfully clear now, AMD should never have done a native quad core for the desktop. Maybe for servers. Reply

First off, how much do you know about instruction sets and compilers? I'm going to assume nothing. If you do know something, then consider this for the edification of other readers.

Compilers take code written in one language and produce output in another. Specifically, compilers take code written in a language that can be easily understood by humans (e.g., C++) and output code that a machine can understand (machine code, or bytecode in the case of a JVM).

Now, the problem with enhanced instruction sets, like SSE4.1 and SSE4a, is that they require compiler support. Imagine an instruction set as a vocabulary, and compilers as the programs that produce books using a specified vocabulary. The simple truth is that Intel makes the extra effort to get its vocabulary into use. Thus, Cinebench was most likely compiled with Intel's latest vocabulary, and not AMD's.
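The vocabulary analogy can be sketched with a toy model (Python; this is not real code generation — the `width` parameter is a stand-in for SIMD width): a richer instruction set lets the "compiler" express the same computation in fewer instructions, while the answer stays the same.

```python
def compile_sum(values, width):
    """Toy 'compiler': lower a sum over `values` into (result, instruction count).

    width=1 models a scalar-only vocabulary; width=4 models a SIMD
    vocabulary like SSE that can add four floats per instruction.
    """
    total, ops = 0.0, 0
    for i in range(0, len(values), width):
        total += sum(values[i:i + width])  # one "machine instruction" per chunk
        ops += 1
    return total, ops

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
print(compile_sum(data, width=1))  # (36.0, 8) - scalar vocabulary
print(compile_sum(data, width=4))  # (36.0, 2) - SIMD vocabulary: same answer, fewer ops
```

Same book, different vocabulary: unless the compiler is taught (and told) to use the wider vocabulary, the CPU never gets to show it off.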

So, part of the K10 update was that it allowed SSE operations to be completed much faster, and I'm presuming this requires the use of its new instruction set. If so, that means that Cinebench was basically running in K8 mode.

Not so far-fetched; it just means AMD has to make sure people update their code. The instruction set issue is another reason why RISC CPUs are generally simpler and faster. Reply

Actually, there shouldn't be any code modifications needed to make use of the new SSE functionality. The difference is internal to the CPU, meaning that it now processes SSE instructions without splitting them. The instructions used are the same as before.

That said, there may be untapped potential in the CPU that can be uncovered by the use of a different compiler (due to other reasons). Though, as far as I know, Intel compilers often produce faster code even for AMD CPUs...

About the "architectural advantage" Anand mentioned: Of course the Core 2 has an architectural advantage. It's pretty obvious from the fact that it performs faster, clock-for-clock, in almost all cases while having much higher frequency potential. Not even AMD's integrated memory controller can raise the computational efficiency of their CPU enough to really challenge Intel. AMD may have a more elegant external design and interface to the rest of the system (native quad, HyperTransport, integrated memory controller), but Intel obviously has the more refined internal design. Sadly for AMD, a computational advantage seems to weigh heavier than a neat system/core interface in this case. Reply

quote:Actually, there shouldn't be any code modifications needed to make use of the new SSE functionality. The difference is internal to the CPU, meaning that it now processes SSE instructions without splitting them. The instructions used are the same as before.

Yes, I know what you mean; the SSE instructions are the same, they are just executed faster (in 1 clock compared to 2 clocks before). That is correct. However, I wonder how much code is out there that was compiled with the old Intel compilers, up to version 9.x.

The problem with these compilers was that they did not execute the SSE2 code path on AMD chips, even if the CPU was capable of executing it. Instead, slower FPU code was used for AMD K8s.

The newest Intel 10.x compilers now have new compiler flags that can generate SSE2 code for non-Intel CPUs; however, I have not seen benchmarks of these so far.

All in all, I guess there are a lot of programs out there that disable SSE on AMD CPUs :( Therefore, a plain compile test of several open-source programs with GCC / Sun / PathScale compilers would be nice. Intel CPUs could be benched with the Intel compiler, too; any CPU should get its best code.
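The dispatch behaviour described above can be sketched like this (Python; the vendor strings are real CPUID identifiers, but the dispatcher is a made-up model for illustration, not Intel's actual compiler logic):

```python
def pick_codepath(vendor, has_sse2, vendor_gated=True):
    """Toy model of a compiler's runtime CPU dispatcher.

    vendor_gated=True mimics the old behaviour described above: the fast
    path is only taken when the vendor string is 'GenuineIntel'.
    vendor_gated=False mimics checking the actual SSE2 capability bit.
    """
    if vendor_gated:
        return "sse2" if (vendor == "GenuineIntel" and has_sse2) else "x87"
    return "sse2" if has_sse2 else "x87"

# An SSE2-capable AMD K8 under the two policies:
print(pick_codepath("AuthenticAMD", True, vendor_gated=True))   # x87  (slow path)
print(pick_codepath("AuthenticAMD", True, vendor_gated=False))  # sse2 (capability-based)
```

The point being: gating on the vendor string instead of the feature bit sends a perfectly SSE2-capable AMD chip down the slow x87 path.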

Let's imagine an English native speaker... would he understand Spanish? No, not much... but maybe his friend, who learned Spanish in school, is better at speaking Spanish. Nevertheless, he won't be as good as a native Spanish speaker...

Who would be the guy with the "superior, best language capabilities" now? The Spanish speaker, the English speaker, or his friend?

quote:I'm extremely disappointed with phenom, I was planning to get the entire spider platform for my yearly upgrade cycle, but that seems to be a bad idea

I'm waiting for the review on Quad-Crossfire first...
I figure I can get 4 x 3850s for about the same price as an 8800 Ultra. The question is, is it worth it?
If XfireX is good, then I will pull the trigger on 4 x 3850s (or 3870s if I can get them before Xmas), a 790FX mobo, and probably a Phenom 9500...
Then I'll upgrade the Phenom in March when the B3s finally come out. Reply

quote:If AMD had its way, today's Phenom review would have been done from beautiful Lake Tahoe, on a system that AMD built, running at a frequency that isn't launching.

Good that you did not agree to this.

It is sad, though, that you agree to Intel's tactics:
reviewing a CPU that has no platform and will not be released for months.
It is maybe the worst paper launch ever, and you happily benchmark it (the QX9770 or whatever it is).

Let's not go back to the late 90s, when AnandTech was a huge Intel fanboy site with paid
reviews by Intel and 90% of your ad revenue from them.

Now you are one of the best quality hardware review sites out there; let's not ruin it.
Reply

Wait... wait... the entire enthusiast community was beggin', pleading, for any early preview data on Barcelona and Phenom; it was not uncommon for companies in yesteryear to provide test samples weeks or even a month before launch for preview data. AMD gave us none of this, just PowerPoints and promises that did not pan out.

Now, Intel provided this CPU as a 'spoiler' for the Phenom launch, no doubt... but hasn't AMD pulled the same shebang? Of course; every IDF there is something new announced to spoil the party... what do you expect??

Come on... not only is Intel providing more performance and more options, they are doing it with ease... give them credit for that... if there is a travesty here, it is that AMD cannot be competitive, and the cost factor, if you want top-notch performance, is going the opposite of what we would want... you blame Intel? Blame AMD for a crappy showing. Reply

I don't think Anand's preview of the QX9770 shows any favoritism for Intel at all. Yes, Intel is distributing a pre-release version of its chip, but this fact was clearly stated in the review. Allowing AT to benchmark the chip prior to release is similar to what they did prior to the C2D launch, and it is therefore not a paper launch but simply a performance preview. Being a site for computer enthusiasts, AT would be crazy to refuse to evaluate this processor -- such a decision would only drive readers to other sites.

I believe that an enthusiast site's primary duty is to remain objective when evaluating products, and in this case I feel Anand's preview of the QX9770 was quite objective, highlighting both the positives (performance) and negative (throttling at stock speeds, heat, power). Truth be told, in my many years of visiting this site, I have never felt that any review has ever been unfairly biased... Reply

Well, they also went ahead and OCed the Phenom chips they had to simulate the 9900 and 9700, which are supposed to arrive in Q1 '08, i.e. roughly the same time frame as the QX9770 and X48 chipset. Furthermore the article for the most part seemed to emphasize the performance of the OCed chips and ignore the stock-clock parts, which will actually be available soon. So in a way you can say they did no more favors to Intel than AMD, in that this review here largely hangs on future processors as well. Reply

Have you ever played Monopoly and had too good of luck for your own good at the beginning? Then your opponents start making ridiculous deals amongst each other to bring you down? I'll trade you Park Place and Boardwalk in exchange for Water Works and $50. What should NVIDIA do?

AMD has really managed to slow progress down this year. We're still talking about beating an 8800GTX, and can't seem to find a better deal than a Q6600 for miles to come.

AMD's products aren't really terrible, but they just can't compete with the very best. I think we can talk about Intel/Nvidia b/c though they are different companies, they are run the same way... faster, leaner, more agile, more desirable.

It's interesting to look at the AMD roadmap and see that they expect an integrated GPU on the CPU by 2009. Is this why they merged? Do they really think Intel and NVIDIA wouldn't cooperate, if they had to, on a competing product? When AMD was successful, NVIDIA was a big part of that success. Now NVIDIA has to buddy up with Intel, and nobody could fault either NVIDIA or Intel for "monopolistic practices" for giving each other preferential treatment. AMD needs to choose its opponent... is it Intel? Or is it NVIDIA? Because at this rate it's going to lose to both. They need to make sure to treat NVIDIA right when it comes to their CPU business. NVIDIA has nothing to gain from buddying up with Intel... but maybe cooperating on a chipset with NVIDIA that allowed for SLI and CrossFire at the same time would be good.

Anyhow, hopefully AMD can continue on until they come up with something good. CPU-wise I think they're handily more screwed than graphics-wise. Just think: if Intel were under pressure, it could sell a 3.0GHz quad for $215.00 without breaking a sweat... it could do so within a month's notice. Intel is going to start releasing its next generation because it absolutely has to in order to survive... they can't wait much longer... and it's not about drowning AMD. It's about the fact that Intel has to compete with the Intel chips it sold last year and the year before that. Because CPUs keep ticking for decades, and that's a huge part of the competition (paid-for chips that are 80% as good as new ones are going to beat pricey new chips; that's all there is to it).

I'm currently wondering how much of the cache difference between AMD and Intel is performance related. Intel is shipping CPUs with ~12MB of cache, while AMD today still ships 2MB/512K. How much of the gap between AMD and Intel is due to cache sizes?

Because if you take away this difference, what happens then? I strongly believe that AMD has a good architecture once again, but Intel has a superior manufacturing process, and this makes possible both higher clock speeds and larger caches, and that is one reason (among some things) why Intel is faster. Reply

Intel designs its cache structure around its manufacturing strengths. AMD designs its cache structure around its manufacturing weaknesses. That's what you get with two totally different-sized companies. Reply

The difference in cache sizes (between Phenom and Penryn) really doesn't make a significant difference in performance. They both have pretty large caches, and increasing the cache will only decrease the miss rate by so much. Their caches are so big that most programs probably don't suffer from capacity misses, i.e. misses from limited space in the cache. Making the L2 cache too big would increase the hit time by too much, so I think this is partly why they're using an L3 cache now.

I think the main reason Intel makes its cache so big is for marketing purposes. They've got more transistors than they know what to do with, so they just put them in the cache. I think AMD was smart to reduce its cache sizes (I think in K8?) from 1MB to 512KB, because it probably didn't make a significant performance difference (don't know for sure though) and it saved them money. Of course, it's important to remember that each architecture will use the cache differently. Reply

There are benchmarks with equal processors (from Intel) differing only in cache size. Usually (in the processors benchmarked, at lower frequencies) the difference is under 10 percent.
Maybe in quad cores at higher frequency (2.6 instead of 1.8-2) the difference will be bigger. Reply
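The under-10-percent figure above is consistent with a back-of-envelope average-memory-access-time model. All the numbers below (hit time, miss penalty, miss rates, fraction of time spent on memory) are illustrative assumptions, not measured values for any real CPU:

```python
def amat(hit_time, miss_rate, miss_penalty):
    """Average memory access time, in cycles."""
    return hit_time + miss_rate * miss_penalty

small_cache = amat(3, 0.05, 200)  # 5% miss rate -> 3 + 10 = 13 cycles
large_cache = amat(3, 0.03, 200)  # 3% miss rate -> 3 + 6  = 9 cycles

# Memory stalls are only part of total run time. If they account for ~25%
# of it, a big cut in miss rate still shrinks run time by just a few percent:
mem_fraction = 0.25
new_time = (1 - mem_fraction) + mem_fraction * (large_cache / small_cache)
print(round((1 - new_time) * 100, 1))  # ~7.7% overall speedup
```

Which is roughly why doubling the cache rarely doubles much of anything.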

quote:I'm wondering how much of the gap between AMD and Intel is due to cache sizes?

Doesn't matter, unless you intend to buy one of those parts and purposefully disable part of the cache for some reason. You might as well ask "I wonder how much of a gap would be between those parts if a clock cycle was added to the amount of time it took to execute every instruction in the Intel part". The cache is a part of the design (it's just more easily scaled).

I wonder if the Ferrari 365 Daytona would have performed as well if it had only 8 cylinders instead of 12! You gonna buy one and replace the engine in it to find out? Reply

Little bone to pick at the bottom of page 9: "if you're looking at quad-core, chances are that you're doing something else with your system other than game."
Future games will use several threads, I'm sure (i.e. Alan Wake). Crysis performance seemed to improve for me when I OC'ed my E6600 to 3.0GHz, holding at a consistent 21-22FPS in the Crysis demo during the more demanding views with my X1900XT. While I'm not exactly running benchmarks under lab conditions, the increase in CPU power did seem to help. IIRC, Crytek has stated that quad cores will be used by Crysis.

Other than that sentence, excellent article, touching on all the major points.

I would like to compliment and say thanks to you for Page 3: First Tunisia, Then Tahoe? The situation and your actions described on that page are a primary reason why you and this site have so much credibility with us, the readers, which is why we come here. Keep it up. Reply

Had to reply and clarify. The phrase refers to current setups, in which quad-core gaming is not a primary reason to purchase said processors. Alan Wake, Crysis, and others, while able to take advantage of quad-core setups, are not out yet, and I would guess that less than 3% of games out there now are quad-core capable; scaling in such games is probably hardly optimized even where it exists. You speak of a future in which these games will be available, but even then an abundance of them will still be a ways off.

Nonetheless, Anand, that might be your best review in my book to date. It speaks honestly and depicts a seemingly structureless company that only has time on its side to pull itself up from potential disaster. AMD has not shown too many positive strides as a company lately, and mindless spending on what should have been a direct "let 'em have at it" approach only shows what was speculated previously: the company has a vacuum in leadership that needs to be filled by capable hands. And we all know proper administration starts from the very top; Hector's stepping down will not be mourned by me. Dirk has his cup filled, but his past credentials speak highly and show him capable. I can only wish him well.
Phenom needs to be priced competitively, simple enough, and it needs higher-quality yields for overclocking. It's amazing how they can stay just one step behind at almost every step currently. I hope the 7-series mobos bring about better competition vs. Intel and NVIDIA boards. We as a community need this to happen. Reply

That's such a mixed bag it makes me sick. Phenoms are mediocre at best at almost everything, but somehow magically thrash Intel in Crysis. I'll have nightmares. WTF is going on internally in those four cores to make this happen? I hope software manufacturers code well for AMD so they can shine. The pro-Intel software market is huge, and that's where the fight is. Unfortunately it doesn't look good for AMD there either, because programmers hate having to learn new code. Reply

You really think those shining numbers are realistic from pre-production sample CPUs? I think you should all wait till sites have full production motherboards and CPUs and can give real data with all their tests; then you can decide whether it's a steaming sack of buffalo droppings. Until then, you'll just sound like a raucous bunch of squabbling crows. Reply

"It turns out that gaming performance is really a mixed bag; there are a couple of benchmarks where AMD really falls behind (e.g. Half Life 2 and Unreal Tournament 3), while in other tests AMD is actually quite competitive (Oblivion & Crysis)."

The UT series and Half-Life 2 are both very CPU-intensive, while Oblivion and especially Crysis are video card killers. It's hard to say, but it sucks in games as well. You'd want to see a difference in Crysis, though; scale it down to about 640x480, because it's just too demanding on the graphics card to compare at 1024x768. In layman's terms, in HL2 the graphics card is usually picking its nose waiting for the CPU, so a stronger CPU will make a big difference. In Crysis especially it's exactly the other way around, so that's why scores are closer together.

Why is this true? In Half-Life 2, there are 50 frames between the best and worst of the lineup. In UT3, there are also 50 frames between the best and the worst of the lineup, regardless of clock speeds or architecture, and the frames per second are measured in the hundreds, even by such a new and graphics-intensive game as UT3 (though it should be noted that, as far as I know, the beta demo did NOT ship with the highest-resolution textures, to keep file size down). Now, Crysis has about a 9 frames per second difference between a 2.2GHz Phenom and a 2.66GHz Intel proc, and Oblivion manages about 16. Same system, different game: it can only be concluded that the game is much more graphics card intensive, which shouldn't be hard to imagine, since Crysis is well known to be the graphics card killer of the moment... and Oblivion of the last generation (HDR and AA, anybody?).

AMD is 10-30% slower in every other test, and in gaming as well. I believe the difference would've shown more if they had used an SLI or CrossFire solution, though I understand that's not possible with the chipsets and drivers existing at the moment.
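The CPU-bound vs GPU-bound point above can be sketched with a crude frame-time model (Python; the millisecond figures are made up for illustration, and real engines only approximately overlap the two stages):

```python
def fps(cpu_ms, gpu_ms):
    """Crude frame-rate model: each frame costs roughly the slower of the
    CPU and GPU stages, assuming the two overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# HL2-like (CPU-bound): the GPU finishes early, so CPU speed sets the rate.
print(fps(cpu_ms=10.0, gpu_ms=4.0))   # 100.0
print(fps(cpu_ms=7.0,  gpu_ms=4.0))   # ~142.9  (faster CPU = big gain)

# Crysis-like (GPU-bound): CPU differences barely show up.
print(fps(cpu_ms=10.0, gpu_ms=40.0))  # 25.0
print(fps(cpu_ms=7.0,  gpu_ms=40.0))  # 25.0    (faster CPU, same fps)
```

Which is why dropping the resolution (shrinking `gpu_ms`) is the standard trick for exposing CPU differences in game benchmarks.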

Who cares what this goofball said in his review, or all you other haters. AMD is following their roadmap as they've laid it out; they released early samples because of the noisy minority screaming for a new chip, and it didn't turn out right. You think actual production chips will have these issues, or won't mature over the next few weeks? Good God, there are a lot of you people out there with unrealistic expectations and some severe perception issues. If you can't say anything worthwhile about the Phenom launch, why don't you just quit typing and spare some of us that look forward to seeing this chip actually release and mature, and give us a quad core that isn't the Intel P3-based quad core. Reply

That's how all AMD fans feel. I was hoping at the VERY least that AMD CPUs would have lower idle/max power usage. That would have been a selling point even if Intel CPUs are faster. Too bad AMD's processors are lacking all across the board. But as a college student, if they price it REALLY cheap, I'd be willing to stick with AMD. Reply

This was a very sobering article. On a slightly brighter note, Tom's Hardware was able to get their Phenom to 3.0GHz stable using air cooling, so perhaps there is some hope just yet. The nice thing is that AM2 motherboards can use this processor, so the upgradability is theoretically friendlier than with some of Intel's boards. As long as AMD prices it accordingly, I think it will do alright. It will be nice when AMD really gets going *someday* and we have another successor with a similar bang to what the K8 had at its introduction. Reply

Tom got the same overclock Anand did. Anand ran a comprehensive suite of stability tests which forced him to step the overclock back to 2.6ghz for stability. Tom ran 3dMark06 once and did not/could not/was not allowed to test for stability.

This is what happens when you drink the Kool-Aid. And yes, I'm looking at you Tom's Hardware. Reply

Why wait? Why on earth would you want to wait? Read Anand's rant, which I agree with completely. AMD has become its own worst competitor. It's bending to pressure, steering away from its own customers and focusing on how to "compromise" its competitor.

They didn't even know what they were going to launch until the last minute. So why on earth would you want to wait for more broken promises and disappointment? Maybe we should all take a trip to Tahoe. And while we're there, we can take every upper-level manager out for an evening of electro-shock therapy.

How can such a customer-centric company, after making a blockbuster product called the K8, collapse so quickly under pressure? They had a golden hand and folded it. Granted, a CEO has to take risks to grow the company, but what he cannot do is ignore what they were successful at. As in any business SWOT analysis, starting with strengths and weaknesses, you improve your strengths and risk the weaknesses. For the past 4-5 years, AMD has been risking its strengths and improving its weaknesses.

I've never seen such a failure of upper management since Audigy made the blunder of completely focusing on their strengths and ignoring everything else. AMD has completely lost its competitive edge. Without it, well, you can see all the red ink AMD has to share with its shareholders.

R&D and its efficiency are only the tip of the iceberg of what they'll have to improve. What they have to worry about now is finding a new target market for their processors, which keeping manufacturing prices down could have helped, but which they completely ignored from day one. They also need to offer better platform support, because their major weakness is their manufacturing. Why on earth AMD's marketing department is trying to sell things their manufacturing and partnered chip makers can't deliver is beyond any explanation I can give. They're obviously not setting any real, obtainable goals for themselves. They failed every goal this year. The only goal I can see they attained this year was merging with ATI, and doing so just for the sake of growth. Oh yes, and they did shrink to 65nm after already switching to DDR2, which made everybody who owned a 90nm DDR2 K8 so enthusiastic about upgrading. (SARCASM). Reply

I wonder if it means anything that there is no such thing as LightWave 9.5

??

The highest version is 9.3.1, and it JUST became available on 11-20-2007, and there are no "special", "early", or "advanced" releases from NewTek for anyone in any way, shape, or form. Oh well, it's good for a chuckle at NewTek, I guess. :)