Op-ed: AMD may be facing irrelevance

New desktop processors don't do much to help the company's precarious position.

AMD's new processors are an improvement, but the company needs to be making bigger moves. Credit: AMD

Today, AMD is announcing the availability of a few new A-series desktop processors. Versions of these chips, codenamed Trinity and based on AMD's Piledriver architecture, have been shipping in laptops and desktops for some time now, but this marks the first time the chips will be available to consumers and system builders directly.

The chips aren't bad—the integrated graphics processors on some of them are capable of beating Intel's HD 4000 GPU in gaming performance, though the CPUs can't quite match last year's Sandy Bridge processors at similar clock speeds. For AMD, the issue is that it's still talking mostly about desktop and laptop chips.

AMD is being shut out of new markets

That's not to say that Intel has moved away from desktop and laptop processors—the PC industry may not be growing much these days, but it's still sizable and profitable for the people making the chips (if not always the people who make the computers themselves). Intel continues to roll out new Ivy Bridge processors for mid- and low-end desktops and laptops, and desktops, laptops, and Ultrabooks factored heavily into its presentations about its upcoming Haswell architecture at IDF this year.

There's one thing in the Haswell presentations that is conspicuously absent from AMD's latest Steamroller architecture update: tablets. With Haswell, Intel is pushing the power consumption of its flagship x86 CPUs lower than ever before, with a 10W processor aimed at Ultrabooks and tablets. Lower still on the power consumption scale, its upcoming Clover Trail Atoms (and next year's Bay Trail chips after them) are going to end up in lower-cost tablets and even smartphones. It has taken Intel a few years to develop serious, competitive chips to combat the ARM-based offerings prevalent in tablets and smartphones today, but by this time next year I expect Intel chips to be available in many more mobile devices than they are presently.

AMD has produced a lower-power processor based on the Bobcat architecture found in AMD-based netbooks and budget laptops—codenamed Desna, the AMD Z-01 has found its way into very few tablets to date, with last year's MSI WindPad 110W being one of the only examples. A follow-up, codenamed Hondo, is apparently in the works—the latest information on the chip is that it's a 40nm part with a 4.5W TDP and that its graphics performance should be superior to Clover Trail's—but details are very difficult to come by. Hondo could, theoretically, give AMD some presence in the tablet market, but Atom tablets have so far been much more numerous, and even then the company would still lack both a higher-performing Haswell competitor and a more power-efficient version for use in smartphones.

AMD did recently rehire Jim Keller, who was lead architect on the K8 architecture that saw AMD through its early-2000s heyday; Keller later moved on to PA Semi and worked on Apple's A4 and A5 chips. This may speak to a desire to prioritize chips for small, low-power devices, but even if so, it could be years before Keller's hire makes a difference: chips take a long time to design and manufacture, and the AMD of the last few years has shown a frustrating tendency to miss the deadlines it sets for itself. If there are new tablet- and smartphone-oriented chips coming from AMD, we aren't going to see them soon.

AMD is losing ground in its traditional markets

So far, AMD hasn't seen much success in the tablet market, and it hasn't shown up to the smartphone fight. What's even more worrisome is that its relevance to anything but low-cost PCs is also melting away bit by bit. This has been happening slowly but steadily since 2006: first, Intel's Core 2 processors beat AMD's then-current Athlon X2 processors in clock-for-clock performance, though an aggressive price war kept AMD competitive in the mainstream market.

As quad-core processors became more common, a cheaper quad-core processor from AMD could beat a dual-core processor from Intel in heavily threaded workloads, but AMD's dual-core processors were confined to the low end of the market, and the introduction of Turbo Boost, Hyper-Threading, and ever-better performance-per-watt eventually helped Intel overcome this disadvantage in the Sandy Bridge and Ivy Bridge era.

Finally, AMD had its integrated graphics performance to fall back on. Its argument in recent years has been that its chips offer more "balanced" performance than Intel's—a "good enough" CPU paired with an integrated graphics part that could replace low-end previous-generation dedicated graphics chips. This argument still largely holds for AMD's current Trinity products, but even it is slowly losing its potency as Intel's integrated graphics solutions improve by leaps and bounds; just two years ago, Intel's graphics were basically unusable for gaming, but current chips provide a reasonable level of performance for older games or newer titles with the settings turned down.

The trouble with AMD is that it doesn't have big plans to chase new growth markets like tablets and smartphones, and it's running out of niches to occupy in its traditional desktop and laptop PC markets. The PC market needs AMD (or a company like it) to keep Intel on its toes and to keep prices down, but the company's fumbles and inability to execute have made it hard to have much faith in it. In both the PC and server markets, AMD offers a decent alternative to Intel for the price-conscious, but it looks increasingly unlikely that it'll ever again be anything more.

153 Reader Comments

As mentioned above, you will not be able to detect an AMD rig vs. an Intel rig in a blind "taste test".

The whole "I went with AMD and I'm really sorry I did" sounds like a crock of shit.

Well, your insightful argument definitely trumps my having two machines sat next to one another on which I run identical tasks to determine the facts from which I made my statement. And on which I often profile the games I write for a living.

But you probably know best.

It will depend on your use case obviously. I'm sorry if my "crock of shit" statement came off sounding harsh.

I program too... a program I wrote on my Phenom machine took 5 secs to run when I tested it yesterday. I just ran it on my Ivy Bridge machine an hour ago, and it ran in about 2 secs.

So there... Intel saved me three seconds (I am being generous and not exact with my measurements). Bottom line: Intel's new CPU just kicked the fuckin crap out of a 4-year-old AMD chip.

But the point I was trying to make there is that that three-second delta in performance does not make me regret buying the AMD CPU, which has provided me with 4 years of service and will probably give me several more.

Which is fine - but your scenario is expected (new == better), whereas I have a newer AMD chip which (for some tasks, like compression) is slower than an older similar-level Intel part. And that is not a good thing for AMD.
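For what it's worth, a comparison like this is easy to make fairer: run the same command several times on each machine and compare medians instead of a single eyeballed run. A minimal sketch in Python; the "./my_program" workload is a hypothetical stand-in for whatever you're comparing:

```python
# Minimal cross-machine timing sketch. Run this same script on both boxes
# and compare the medians; single runs hide caching effects and noise.
import statistics
import subprocess
import time

def median_runtime(cmd, runs=5):
    """Time `cmd` over several runs and return the median in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

if __name__ == "__main__":
    # "./my_program" is a hypothetical stand-in for the real workload.
    print(f"median runtime: {median_runtime(['./my_program']):.2f}s")
```

Medians are less sensitive than averages to one slow run, which matters on machines with background tasks running.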

Sorry to jump in late in the discussion – I haven't had a chance to read all the comments.

Will AMD ever develop a Hyper-Threading core design like Intel? AMD has been losing their most vocal and loyal market, the fanboys, in the last two years. The performance gap between a stock Intel CPU and an AMD FX/Bulldozer is ludicrous. It doesn't matter how you overclock the AMD product – the performance/Watt and performance/$ numbers are ugly at the high end.

A 125W 8-core AMD FX gets soundly spanked by an Intel Sandy Bridge CPU, even after Microsoft's scheduler optimizations. The 95W 8-core FX-8100 to 8140 CPUs have been MIA from the retail market since they were listed in Fall 2011. I can only find one HP Pavilion desktop that uses an FX-8100, and it's listed for $750 with a discount.

Also, AMD's lack of progress (Enduro?) against Intel and nVidia's Optimus mobile solutions has kept them at the bottom of the notebook market. They badly need to get more 28nm GPU design wins and support them with intense driver development.

Regarding the newly-available Fusion APUs (http://www.anandtech.com/show/6332/amd- ... iew-part-1): the A10-5700 looks like it hits the sweet spot with a 65W TDP and a 7660D GPU. That will probably make 75% of the desktop computing market happy if it is priced aggressively.

Part of the problem is that AMD approached the problem in a "chicken" manner. They didn't want to lose their money from graphics processors, and they didn't want to lose the money from CPUs.

They half-assed it by creating CPUs with weak GPUs integrated. What they SHOULD have done was offer GPUs with integrated CPUs. The idea of pairing up the two in a single chip/card to improve speed is a good one, but they forgot that the majority of the people who are going to drive the PC market want GREAT graphics.

Had they built graphics cards with an integrated CPU that sped up operations (say, for video games), offloading work from the main CPU to a CPU/GPU tandem on the card that handled all of the processing much like the current CPU/GPU combo does on the mobo, but with the full graphics power of a dedicated card, they might have seen success. The problem was they didn't look beyond the current mobo/CPU/vid card architecture. Had they simply said, "Hey, we're going to bundle a solid CPU in with the graphics card, and you select the level of graphics power you want"... I think they'd have had more success, because their CPU wasn't going to take home any prizes on its own.

Regarding the newly-available Fusion APUs (http://www.anandtech.com/show/6332/amd- ... iew-part-1): the A10-5700 looks like it hits the sweet spot with a 65W TDP and a 7660D GPU. That will probably make 75% of the desktop computing market happy if it is priced aggressively.

It still hits the problem in the article: AMD is trying to catch up in an area that's a shrinking market with low margins. That's not exactly success, even if they do do it (and it looks like they have; this chip finally catches up to Intel for a gaming rig as well. If it's priced at the i3 point, then I know what's going into my next gaming rig).

Right now AMD should be releasing something innovative to actually get ahead of the curve. This might win them back 1-2% of the market, which is really important for AMD, but they seem too set on just copying what Intel does. They have tried making high-end chips and failed, they've half-assed their low-power chips in the hope that someone else will do the marketing for them, and they've finally hit a decent middle-ground chip about 2 years later than they needed to.

edit: One thing I forgot to add: AMD is still kind of nice on the socket front. I planned my upgrade last year and picked up an X6 Phenom II and motherboard for $200. That same motherboard will take one of the Piledriver CPUs coming out next month, so I'll likely upgrade to one of those next year. I generally stick with a 2-year update cycle; however, since my RAM, motherboard, SSDs, etc. are all good, I'll go with just a straight GPU/CPU upgrade when the pricing looks good, depending on whether I need the CPU power (generally I don't, but there are times it would be nice).

AMD is nice on the wallet if you really know what you're looking for; however, this isn't something well communicated to customers. If AMD gave release plans 3 years ahead, telling consumers they could drop in a new CPU in 2-3 years with the hardware they have now, they could do something in the market that Intel isn't doing and possibly gain some of the business market; that kind of stuff could work well with business upgrade cycles.

If price is your /only/ concern, then yes, AMD is still a good choice. Especially if you buy the last gen chips (Phenom II) which are better than their new chips, but cheaper because they're older.

And CPUs are so fast these days, do you really need the extra performance per yen? Only if you're a gamer or hardcore data cruncher.

Even if you aren't, it is very useful. I had a Core 2 Duo E7500 for a few years that I just replaced with an i5-3570. Clocked at 4GHz, in heavily threaded applications it returns literally 500% of the performance of my old Core 2 Duo (for example, in Handbrake).

I really didn't expect performance that much better. I figured double the cores, some advances in architecture, okay, maybe slightly lower power despite a higher TDP, and oh, maybe we'll get 250% of the performance. Nope, 500% of the performance.

The difference is massive in just about any application I'd care to name, even ones I wouldn't consider overly demanding.

"simple things" like lightroom exports happen several times faster. Exporting 40 RAW images to JPEG takes maybe 3 minutes now compared to roughly 12-15 minutes previously. Just opening a RAW file and getting lightroom default settings applied takes a small fraction of time. Lots of applications load faster too, and that is when running off the SAME SSD I had in the old core 2 duo system.

That is coming from a processor 5 years old (2007 manufacture date stamped on the processor). That isn't something from a decade ago; I'd consider it relatively recent, and the performance is just radically different.
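The surprise checks out with back-of-the-envelope math: doubling the cores only accounts for 2x on a well-threaded workload, so the rest has to come from per-core gains in clocks and IPC. A rough sketch with those figures plugged in (the per-core multiplier is an assumption, not a measurement):

```python
# Back-of-the-envelope speedup check; all inputs are rough assumptions
# based on the numbers above, not benchmarks.
old_cores, new_cores = 2, 4   # Core 2 Duo E7500 -> i5-3570
per_core_gain = 2.5           # assumed combined clock + IPC improvement

# For a heavily threaded workload (e.g. Handbrake), total speedup is
# roughly the core ratio times the per-core gain:
speedup = (new_cores / old_cores) * per_core_gain
print(f"estimated speedup: {speedup:.0f}x the original, i.e. ~500%")
```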

In the desktop space, I think you are not entirely wrong. For less demanding users, the performance of the faster Bulldozer and likely Piledriver chips is good enough in comparison to the Intel ones. However, the lower-end ones... well, they are frankly pathetic in comparison to even Intel's lowest-end chips. If you used a low-end Intel system back to back with a low-end Bulldozer (and likely Piledriver) system, you'd notice a huge difference in most things you'd do, all else being equal. A high-end Bulldozer/Piledriver versus a low/mid Intel Sandy or Ivy Bridge processor? Meh, you might not notice too much difference in casual use.

The price difference just isn't that extreme any more, especially if you talk overall system price. $50-70 extra for a mid to top-of-the-line quad-core i5 over what Trinity seems to be priced at (or its likely pricing, anyway) will net you what looks to be such a massive increase in processor power that, unless you are a gamer on a serious budget crunch who is forgoing any discrete GPU, Trinity doesn't make a lot of sense in overall system price.

Even for a relatively low-cost $600 budget system, an extra $50 upping it to $650 isn't much if you are getting an increase of 30-80% in processor performance (and less heat/noise/electric bill too). Go up to a $1,000 system and that price differential is even more trivial. A $1,500 system? Really, why would you even think about considering AMD?

At best AMD might appeal to those who need the lowest cost system possible. It might appeal to some who know nothing about computers. It might appeal to some AMD fanbois.

That is about it.

Go to the mobile market and the story is much, much worse. In the ULV space, Trinity CPU performance is WELL below half of Intel's basic ULV i5 performance (I think it manages to weigh in around 30-40% of the CPU performance). Even graphically, the only Trinity ULV SKU isn't up to par (roughly 70-80% of the gaming performance of the i5-3517U). Go to a full standard-voltage mobile part and Trinity can beat out Ivy Bridge in graphics, but CPU performance is still abysmal and power use isn't quite as good. Once you are stepping up to those levels, with the overall cost of most laptops, the price savings for worse CPU performance often isn't enough to entice a lot of buyers. Heck, slap a $100 optional discrete GPU into most laptops and they'll perform a lot better than a Trinity iGPU will, and the CPU performance will be way better.

The price of mobile (well, laptop) CPUs also tends to be higher than their desktop brethren's, so AMD is probably missing out in a bigger way there, as that space is getting more popular and likely has higher margins.
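The pricing argument above reduces to marginal arithmetic: a fixed $50 premium shrinks as a share of the system price while the claimed performance delta stays the same. A quick sketch with those numbers (the 30-80% delta is the estimate quoted above, not a benchmark):

```python
# Marginal cost vs. marginal performance for the Intel upgrade, using the
# rough figures from the comment above (assumptions, not benchmarks).
EXTRA_COST = 50    # assumed premium for a mid/high i5 over Trinity
PERF_GAIN = 0.50   # midpoint of the quoted 30-80% CPU performance delta

for system_price in (600, 1000, 1500):
    cost_pct = EXTRA_COST / system_price * 100
    print(f"${system_price} build: +{cost_pct:.1f}% cost "
          f"for ~+{PERF_GAIN * 100:.0f}% CPU performance")
```

The premium falls from about 8% of a $600 build to about 3% of a $1,500 build, which is the whole argument in two lines of arithmetic.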

Interesting, but looking at what Intel is finally bringing to the table with their next Atom refresh, it sounds like Atom is being redone almost from the bottom to the top and likely is going to deliver significantly better IPC, power efficiency, offering quad core designs, etc. It will be interesting to see if Jaguar is going to manage to handle Atom.

Current Bobcat I think is still a little better than Clover Trail, but the next Atom refresh looks like it promises a lot more than 15% IPC and possibly 10% better clock rate results (and likely deeper power savings moving from 32nm to 22nm FinFET).

I think the market has changed from the antitrust days though. PCs (and by proxy laptops and tablets) are now appliances, whereas even 5-8 years ago they weren't quite there yet. If AMD completely faded away, Intel wouldn't be able to artificially cause a price spike, because margins simply are too thin for OEMs, and consumers simply won't buy computers if we go back to the days of the $800 commodity desktop/laptop/tablet. Suddenly, the public will decide to go back to paying a technician to fix their computer instead of just junking it and replacing it. Mind you, I consider this a net positive.

I don't think that's really the problem though, prices. I think the real problem is that Intel still has to pour huge amounts into R&D to make sure AMD doesn't catch up, and as long as AMD is around and fighting, Intel will have to keep innovating. As soon as AMD is gone, Intel can just relax in the desktop and server market where they have no competition, and release marginal updates every two years to keep people buying but barely expend any R&D efforts on it.

This seems much more critical with video cards than it is with general-purpose CPUs. We seem to be already well past the saturation point with CPUs, whereas a less leading-edge machine (especially an Intel-based one) can be an ugly prospect when it comes to the GPU. I hear the remarks about Intel allegedly finally catching up in this regard and am still somewhat skeptical. Anything non-Intel always seems to be a dramatic improvement.

It seems that AMD needs to be around to help clean up after Intel on the GPU front.

I just built myself a new desktop PC: an 880G mobo and a 6-core FX. In total it cost me 200 dollars less than an Intel build. I am super happy with it; it does everything I need it to, and I see no performance lags at ALL. I have always built AMD, and will continue to build AMD. Just can't beat them for the price.

Several years ago I built an AMD 1055T system with a Gigabyte motherboard. In the long run, it's been a disappointment. It's significantly slower than Intel (i7 Nehalem) for graphics work, which is not a showstopper, but I'd expected better performance. The onboard graphics failed shortly after I started using it (sadly, several months after I'd bought it), so I had to buy a GPU. Next, the onboard Ethernet went out. I have issues with Gigabyte and AMD now.

Next I got an OCZ Agility 3 SSD, but even after setting AHCI in the motherboard, I saw no speed improvement. My son has an Athlon II X3, and its motherboard does not seem to support AHCI at all. Even on my wife's old Q6600 motherboard, AHCI is supported and the SSD boot takes seconds.

I've been keeping AMD in my lineup hoping things would turn around, but right now I just don't see that coming. I'm planning on replacing the 1055T with an Ivy Bridge when I can afford it, and my son's computer will probably get upgraded to an i5 (depending on what's available at the time).

"The trouble with AMD is that it doesn't have big plans to chase new growth markets like tablets and smartphones, and it's running out of niches to occupy in its traditional desktop and laptop PC markets. The PC market needs AMD (or a company like it) to keep Intel on its toes and to keep prices down, but the company's fumbles and inability to execute have made it hard to have a lot of faith in it. In both the PC and server markets, AMD offers a decent alternative to Intel for the price-conscious, but it looks increasingly unlikely that they'll ever again be anything else."

I for one don't want to see AMD pursue every new growth market. I would like to see AMD concentrate on what it does best: giving us a low-cost, decent-performing alternative to Intel server, desktop, and laptop products. Chasing down the tablet or smartphone markets is a waste of AMD's finite resources and can only lead to "irrelevance". Want to be relevant? Become the best at what you already do.

Desktop Trinity was reportedly held back in the spring to allow stocks of Llano parts to clear. Now it's too late for it to make a dent, with the Intel Ivy Bridge-based i3s having appeared in August. AMD basically gave up 6 months only to find itself two months behind its largest competitor in the budget market. Great move, AMD.

I program too... a program I wrote on my Phenom machine took 5 secs to run when I tested it yesterday. I just ran it on my Ivy Bridge machine an hour ago, and it ran in about 2 secs.

So there... Intel saved me three seconds (I am being generous and not exact with my measurements). Bottom line: Intel's new CPU just kicked the fuckin crap out of a 4-year-old AMD chip.

But the point I was trying to make there is that that three-second delta in performance does not make me regret buying the AMD CPU, which has provided me with 4 years of service and will probably give me several more.

Those deltas add up.

Case in point: years ago I used to edit my videos with my Athlon (~1.5GHz, as I recall). It was not fast enough to reduce frame size inline, so I had to save (and edit) raw video. Making a DVD took all afternoon to create and all night to render and burn.

Next I got a 933MHz P3. It was actually faster and could reduce frame size inline, meaning that I downloaded and edited a smaller file, and things were faster in general as well. I could now start at 5PM and have a finished DVD in my hand by 11PM.

With a modern camera, most of those steps go away, but the last time I edited a miniDV tape with an old Core 2 E6300 (the slowest and cheapest Core 2 Duo), I could edit 2 or 3 videos in an afternoon, taking my time, and the process was much smoother.

So basically, whatever your program does, the difference in execution times scales, and for compute-intensive tasks like video creation, you can save a whole boatload of time (or get higher quality). And if you want to do several things at the same time, say play Red Alert 2 while the video renders, you can do so.
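That scaling is the whole point of "those deltas add up": the same speedup factor that saves three seconds on a five-second run saves hours on a render. A quick illustration (the 2.5x factor is an arbitrary assumption):

```python
# A fixed speedup factor applied to jobs of different lengths
# (illustrative numbers only; the 2.5x factor is an assumption).
speedup = 2.5

for name, old_seconds in [("quick script", 5),
                          ("batch photo export", 15 * 60),
                          ("overnight DVD render", 6 * 3600)]:
    new_seconds = old_seconds / speedup
    saved = old_seconds - new_seconds
    print(f"{name}: {old_seconds:,}s -> {new_seconds:,.0f}s "
          f"(saves {saved:,.0f}s)")
```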

The 65W A10 looks to be a fantastic all-around consumer desktop chip; however, if it's just being stuck in mid-tower cases, it's hard to make an argument for it relative to an i3. After all, even a low- to mid-range $100 graphics card that doesn't require a 6-pin connector will still beat it in GPU performance, and the i3 will likely beat it in CPU performance as well, even with half the cores (albeit only integer cores; the FPUs are shared in Trinity).

The size of the market is no doubt in question, but what would be interesting is for OEMs to produce sub-$500 (easily doable, perhaps even $400) extremely small form factor desktops. I'm not talking about the Acer Aspire/HP Slimline, but models significantly smaller than that. Pair them with a notebook hybrid HD and you've got a very capable desktop (which will become more so as more apps use the GPU), and potentially an excellent Steam "Big Picture Mode" box for gaming on the HDTV.

Now, of course, this is not a solution to AMD's market share woes, but it at least makes a case for Trinity on the desktop to exist at all. A super-cheap Mac Mini competitor that can play games better than the consoles for ~$400 and does 95% of what consumers need in a desktop system is a decent value proposition, especially as Intel really has nothing to counter it with at similar power usage/price (the last preview I saw had it wiping the floor with the identically priced i3 with HD 4000 video, over twice as fast in games; plus AMD's longer expertise in graphics drivers can't hurt).

Out of all the processors that I have ever used, I always went with AMD exclusively. I loved how they were always more affordable than Intel's chips, and I had fewer issues with them than with computers featuring Intel chips. Case in point: I built my own PC using a Duron back in 2002. It was only 1000MHz (puny by today's standards), but it totally outperformed a laptop featuring a newer Intel chip that my mother bought. To this day, that 10-year-old computer is STILL faster than the most recent laptop we acquired (also an Intel chip!), even though it is way obsolete. I will continue to purchase AMD because they represent a cost-effective philosophy that I can use to make a darn good computer for as little money as possible.

Since then, nVidia seems to at least have caught up and in many areas is pulling ahead, not to mention the now-relevant-more-than-ever issue of driver compatibility. In more than a few games I have to disable CrossFire for compatibility or to improve performance, and Borderlands 2 - a premium title - looks like a different game when comparing visuals on Radeon and GeForce cards with PhysX.

You can actually turn on PhysX while running an ATI/AMD card (within Borderlands 2)... In fact, the last couple of generations of AMD cards run PhysX better than Nvidia's...

Way too many people have been drinking the 'end of desktops' Kool-Aid in this thread... My desktop is not going anywhere. I may love my Transformer tablet, but at best it replaces my laptop, and my desktop owns it for functionality. And my desktop is a 'lowly' Phenom II X6 1100T with 8GB of RAM and a Radeon HD 6970. Which, by the way, can easily run every game I've seen in the last several years at max settings, do video editing of HD video, and all sorts of crazy things we couldn't even think of doing 10 years ago.

I hope AMD can hold on to its place in the graphics market, the low-end desktop until Intel makes another error, and its space in the server market (which everyone here seems to have ignored).

I just bought a 3-core Rana for my son's machine. I got the mobo + proc for $110. I can't even buy a decent Intel 3+ core processor for that much...

But you could buy a dual-core Pentium G850 + mobo for that much, and it would outperform the 3-core Rana in the vast majority of applications while using a good chunk less electricity. More cores != better, just like more GHz != better.

Way too many people have been drinking the 'end of desktops' Kool-Aid in this thread... My desktop is not going anywhere. I may love my Transformer tablet, but at best it replaces my laptop, and my desktop owns it for functionality. And my desktop is a 'lowly' Phenom II X6 1100T with 8GB of RAM and a Radeon HD 6970. Which, by the way, can easily run every game I've seen in the last several years at max settings, do video editing of HD video, and all sorts of crazy things we couldn't even think of doing 10 years ago.

I hope AMD can hold on to its place in the graphics market, the low-end desktop until Intel makes another error, and its space in the server market (which everyone here seems to have ignored).

I don't think it's an "end of desktops"; rather, an "end of making money out of desktops". Even if AMD competes, they get a growing share of a smaller market. Intel dominates the workhorse market, ARM/Intel are eating up the low-power/low-end market, and the PC gaming market, while significant, isn't really enough. PC margins are likely to be pushed further and further toward the bottom as cheap ARM vendors pop up selling low-end PCs, because for most users that will also be good enough. AMD does have to keep selling billions of dollars' worth of chips to sustain itself, though.

The problem for AMD for the past couple of years, though, is that they can't do better at the same price. The 1100T was great against the i7-920; however, it was stomped by the 2500K a couple of months after it came out. AMD sells chips that let you do crazy things we couldn't do 10 years ago, but the problem is always that Intel lets you do those things better for the same price (until AMD does its usual price slashing, but at that point they're not making money anymore). Look at every market, it's the same: anyone will sell you a perfectly functional phone or tablet, but only certain ones sell well because they're deemed slightly better.

Only if their GPU can compete in power usage with those from Imagination and others. No point in having a more powerful GPU if it sucks several times as much power as the rest of the SoC.

Here's a funny story: Qualcomm's SoCs use a GPU with the name "Adreno", which is an anagram of "Radeon", because ATI designed them, but they were sold off to Qualcomm after the AMD takeover. Hopefully the GPU designers at AMD will be able to do that again, in time to get in on the ARM game while the market is still growing.

AMD is run by idiots. They sold off their ARM license to Qualcomm at the same time as Adreno as you point out above. Dumb move by a company run by idiots.

The only thing that AMD sold to Qualcomm was the group and IP behind Adreno. I think there's a perception that ATI designed the Scorpion core that powers much of the Snapdragon lineup. That is not true; Qualcomm designed the Scorpion core, and it would likely have been as successful had it been using a PowerVR GPU. AMD made the right choice when it sold off the Adreno line; it would have bankrupted the company to stay competitive in the ARM application processor market. TI, the #4 smartphone SoC maker, is exiting the market, while nVidia has taken about half a billion in losses on the Tegra line.

I would love to see IBM come into the picture and purchase a controlling 50.1% of the company, then license some more IBM IP to AMD and give the company some direction. AMD no longer trails behind Intel at a constant distance. I'm glad to see Keller back in the mix, but I wonder why it took so long. I'm not expecting some miracle; I just hope some kind of improvement will be had by bringing him back.

I don't think that's really the problem though, prices. I think the real problem is that Intel still has to pour huge amounts into R&D to make sure AMD doesn't catch up, and as long as AMD is around and fighting, Intel will have to keep innovating. As soon as AMD is gone, Intel can just relax in the desktop and server market where they have no competition, and release marginal updates every two years to keep people buying but barely expend any R&D efforts on it.

Then you really know nothing about Intel or its corporate culture. R&D is their lifeblood. Building new chips is the blood that flows through their veins.

They are innovating in areas where AMD has zero presence: radio in silicon, the Thunderbolt interconnect, Knights Corner, tunable lasers in silicon.

There has never been a time in Intel's history when R&D hasn't been high on its list. They innovate their way out of recessions and bust cycles.

Now, AMD hasn't been a lightweight in this area either. Regardless of who 'won' this war, neither was going to stop innovating or slow R&D. It is not in their nature.

I see lots of people saying they would only consider Intel now. Just thought I'd add that I consider myself a hardcore gamer, with well over 100 games in my Steam library, most purchased over the last 2-3 years, and I use an AMD Phenom II X4 at 3.2GHz. I've had it for quite a while now, and still there are no games which make me feel the need to upgrade it. Basically, I feel CPU performance from either Intel or AMD is largely irrelevant nowadays; games are mostly built for consoles now and upscaled for PC, so as long as you have a CPU spec higher than the consoles, you're OK for gaming.

On my 1080p monitor with a high-end graphics card, I get 60fps in pretty much everything I play. Sure, with a higher-res monitor things might be a little different, but I can't see the picture changing that much, as higher resolutions are even more GPU-dependent. I also saw the benchmarks showing that individual frame drawing latencies, which can be noticeable, are higher on AMD CPUs than Intel's, but we're talking milliseconds of difference here; sure, it's noticeable, but it's a fairly insignificant difference to all but the most hardcore players who obsess over minuscule input lag discrepancies.

...I don't know where to even start. I am just going to say that you are all wrong: AMD is fine, they are not becoming irrelevant. The industry is moving in their direction with the HSA platform/philosophy (there is nowhere else for them to go). Intel is still 5-6 years behind in real GPU tech (not fake-GPU x86 EUs) and integration compared to AMD. There is so much fucking wrong with this article that it would need another article to debunk it.

GPU tech is only useful in niche situations, and if you've checked review sites recently, you'd see that Intel integrated graphics is almost to "make do" levels of performance.

For a more direct reply to your thoughts: sadly, perception is reality, as anyone who's owned a Zune or bought an Apple product could tell you. To a large extent, articles like this one pile onto an existing perception (which may or may not be correct).

Intel's Haswell is coming out next year. From the demos at IDF, it uses about 66% less power under load and 20x less power at idle compared to Ivy Bridge, which is already kicking the crap out of AMD.

It also has vastly superior iGPU performance compared to AMD's parts. Intel has been showing off Haswell silicon at trade shows playing Skyrim at 1080p with high settings at playable framerates. Llano- and even Trinity-based hardware is usually only able to play modern games at around 720p with medium to low settings to get playable framerates.

First of all, I want to say that I created this account so I could comment on your terrible and poor reporting. I laugh when I hear that the PC industry is not growing; your opinion is just that, an opinion. The PC market is thriving, especially in the casual gaming market.

With laptop and desktop chips like the new A series, you can bet that the casual gamers are going to eat this stuff up. Have you heard of WebGL? That's right, 3D-powered web games, where the A-series APU will thrive. Look at the benchmarks for the A-series CPUs; they are impressive. Wait two more generations of processors and you'll start to see PCs connected to TVs, because the Fusion processors will be able to handle hardcore games at 1920x1200. Hence Steam creating Big Picture mode: they see it, I see it, you don't. All mathematical calculations are moving to the GPU as well.

Intel is about to peak, because they don't have a real GPU, and Nvidia does not have a real CPU. What about the fact that all three next-gen consoles are using AMD parts?

One thing AMD needs to fix is the Linux drivers; then they have themselves a winner.

AMD kills in the server market on power consumption and memory bandwidth; the top-performing server chips are all from AMD, not Intel.

I often dream of a world where AMD was able to sell its superior Athlon 64s in an open market, reap the financial rewards, and use them to fund proper development and improved fabs. They would have had a chance to preserve their lead, and may even have been able to beat Intel to a nanometer mark or two. But they were shut out by Intel's actions, the money was not there, development was starved of funds, they had to spin off their entire fab division, and they are falling further behind.

I worked at AMD as an intern/co-op for 8 months and worked on the first K8. I wouldn't say the people there were super smart, and they seemed to be somewhat disorganized. They had a substantial performance lead after K8, and I would say profits too: their stock went from $8 one year to $16 a year later, and I remember that because those were the prices at which I bought and sold, respectively. Hector Ruiz essentially mismanaged the company and blew the lead it had. It bought ATI at a way-inflated price without even knowing what it was going to do with it. Only something like 5 or 6 years later did they come out with the APU, while totally neglecting their CPU performance.

Interesting, but looking at what Intel is finally bringing to the table with their next Atom refresh, it sounds like Atom is being redone almost from the bottom to the top and likely is going to deliver significantly better IPC, power efficiency, offering quad core designs, etc. It will be interesting to see if Jaguar is going to manage to handle Atom.

Current Bobcat I think is still a little better than Clover Trail, but the next Atom refresh looks like it promises a lot more than 15% IPC and possibly 10% better clock rate results (and likely deeper power savings moving from 32nm to 22nm FinFET).

Clover Trail comes in 1- and 2-core. Jaguar will be up to 4. (And at these lower speeds it really matters.)

When's the next Atom gen planned? I doubt they'll go over two cores to avoid competing with their lucrative ultrabook Core processors, but I could be wrong.

The idea that the market is moving toward notebooks/laptops doesn't hold for me, because I don't use my notebook computer enough for it to be useful. In this case, I will use desktops; in five years, I will still use desktops. Notebooks for me are still very far in the future. If I want a mobile device, I will go with a tablet, because the keyboard in a notebook computer gets in the way.

As for AMD and its future, it still comes down to this: if the board kicks out the arrogance, AMD will improve. AMD's board had better do it now, or it will be too late, and too late is tomorrow. I doubt that Intel hurt AMD during the K7 and K8 days; I think AMD's arrogance hurt the company during those times. AMD's marketing did nothing to embrace the performance of the K7 and K8, and it did nothing to say that the low power consumption of the K8 was a good thing, so it is AMD kicking AMD's butt.

How does the game console industry play into AMD's position? They produce the GPU in 2 of the 3 current-gen systems and, based on rumors, possibly the GPUs in all 3 of the next-gen systems, as well as the CPU in one (the PS4). Given the size of the console market, that means hundreds of millions of units of AMD hardware getting sold over the next 5 years or so. Surely having that stable revenue stream would help keep them afloat.

Also, it seems to me that for these APUs it's less a question of absolute performance and more a question of "Do I want the ability to do some light gaming on my $400 word processing and internet box?" Plenty of people will never do video editing or other CPU-intensive work on their machines but would definitely appreciate halfway-playable game performance without a price premium. AMD's big problem is getting enough visibility among these people that they won't just go with the "Intel inside" because that's what they know.

...One thing AMD needs to fix is the Linux drivers; then they have themselves a winner.

That would be a great place to start.

ifgriffi wrote:

AMD's big problem is getting enough visibility among these people that they won't just go with the "Intel inside" because that's what they know.

I don't think the average person looking for a budget box knows or cares whether it's "Intel Inside" or not. I could be wrong, but I have non-geek friends that send me promo ads all the time for computers they are going to buy, in order to get my opinion, and most of them have no clue what the difference is.

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.