
Barence writes "In 2006, AMD could seemingly do no wrong. Its processors were the fastest in the PC market, annual revenue was up a record 91%, expansion into the graphics game had begun with the high-profile acquisition of ATI, and it was making exciting plans for a future where it looked like it could 'smash Intel's chip monopoly' for good. Now the company is fighting for its very survival. How did AMD end up surrendering such an advantageous position, and was it given an unfair shove on the way down? This article plots AMD's decline, including the botched processor launches, the anti-competitive attacks from Intel and years of boardroom unrest."

It's really simple--Intel made better products. Once Intel abandoned the dead end of the Pentium 4 and changed tack with the first low-power Core chip, AMD never had a valid response. The article details some predatory behavior on the part of Intel which was eventually settled, but I don't think the outcome would be different today had that not occurred.

AMD made good products too, they just made them for the wrong market. This is why commercial semiconductor manufacturing is so difficult. You give your engineers a set of constraints, and about 5 years later you have a product. Intel, back around 2003, bet heavily on laptops and power-conscious servers. AMD bet on desktops. Intel's market predictions were better, so the products it brought to market turned out to be the ones customers wanted by the time those products arrived. AMD's were not. Of course, Intel only made this bet after seeing how badly it had underestimated the importance of power consumption with the Pentium 4, so, if anything, its later products were an overreaction to the poor performance of the P4.

So, in summary, AMD's problem was that it didn't screw up the previous generation, so it assumed the next one could be more of the same, and it missed the industry shift to mobile devices.

Intel also bribed big OEMs to use its processors instead of AMD's. Dell was a prime example of this: in the K7/K8 days they'd make noise every year or two about how they were considering selling AMD-based systems rather than being exclusively Intel, and those of us in IT who wanted *better* computers would get very excited, but then Intel reliably came along and gave Dell an even better sweetheart deal on its CPUs, which was probably Dell's objective the whole time.

It wasn't AMD's fault for choosing the wrong market; they'd made a far better desktop and mobile processor than the P4, it was just that Intel was abusing its market position.

Dell was a prime example of this: in the K7/K8 days they'd make noise every year or two about how they were considering selling AMD-based systems rather than being exclusively Intel, and those of us in IT who wanted *better* computers would get very excited

Um, if you wanted better computers, why did you keep buying from a company which didn't sell them?

This is the problem with the whole 'evil Intel' argument; you're assuming that customers would continue to buy second-rate products when they could buy better PCs from another company. When AMD released AMD64 they owned much of the server and workstation market and most of the high-end desktop market, because if you cared about CPU performance you didn't buy an Intel space-heater.

If you want to buy 5000 computers every year how many companies can you buy from?

If you want to buy one computer a year you can build your own for all it matters. If you want 1 computer every 5 years you probably don't have the desire or skills to build your own, nor is saving that small amount of money worth it for a lot of people.

When you apply either of those two constraints, Dell, IBM and HP were the big dogs for a long time, and they were basically in bed with Intel. People who don't have the skills to build their own want to buy from a name brand that will stay in business long enough to honour a warranty, and people who want to buy 5,000 computers this year are only going to buy from a big outfit, for basically the same reasons, and because there aren't a lot of places that can supply you with 1,000 computers by the end of the week. If you're a really big outfit, you're looking at buying something like 20-100k computers a year, and when you start talking numbers like that, even your Acer, Asus and Toshiba guys will have trouble keeping up.

AMD has the same problem in two different sectors. They had one really good product, and then someone released a better one. In the GPU business AMD will have the best parts for a couple of months, then Nvidia will come along and take the crown, and neither of them is competing in the high-volume business desktop market that Intel owns (Intel has gone so far as to put the GPU into the CPU package).

In the CPU business Intel has been toying with them for at least 6 years. How do you know that? Because you can easily overclock an i7 (or a Core 2 series) by 30% on air. Every time AMD gets close to matching the performance/watt, performance/dollar or whatever, Intel just ups the clocks a bit and boom, they're back in first place. They're basically a full process node ahead of AMD, and they always have been, which gives them a huge advantage.

In the GPU business AMD is doing as well as they can; if you look at the Steam numbers, they're up around 40% of the market. The problem is that the gaming market, which is where the money is on a per-unit basis, isn't all that big. Nvidia has a revenue of about 3.7 billion USD, AMD 6.4, and Intel 54. The money is in volume, and AMD can't get volume because their price per unit, per performance and per watt just aren't a match for Intel. Yes, Intel was anti-competitive for a while, but they only needed to do that for about 4 years to get themselves back out into the lead by a wide margin.

Some big-name entity makes noise about going with the "little" man supplier, and then their old compatriot casually passes them a back-room deal to make them stick with the compatriot's products. I swear, corporate contracts really need to be out in the open, or else they undermine democratic principles.

Some big-name entity makes noise about going with the "little" man supplier, and then their old compatriot casually passes them a back-room deal to make them stick with the compatriot's products. I swear, corporate contracts really need to be out in the open, or else they undermine democratic principles.

You make it sound like it's shady, but it's not. That's just negotiation. There's nothing wrong with a company lowering its prices to respond to a threat from a competitor.

I'm guessing you didn't read about the Dell lawsuit. Even Michael Dell was in the crosshairs and had to pay a hefty fine. Dell reported quarter-after-quarter profits that pumped up its stock price in a kind of two-tiered Enron scheme with Intel. At the same time, companies like Compaq and HP had to merge and sell at razor-thin margins, while IBM sold its PC division to Lenovo.

It certainly didn't help that computer manufacturers treated AMD as a budget CPU for many years. Looking back through history, a fair number of AMD CPUs were actually superior to Intel CPUs, but they got paired with crap motherboards by manufacturers trying to nickel-and-dime everywhere they could (emulated sound card? why not, it won't tax the CPU that much; emulate part of the video card using the CPU, as supposedly happened in a few cases? why not, that won't tax the CPU much). With the CPU so overtaxed dealing with things it shouldn't have to, you get crap performance, and people begin to associate that brand of CPU with crap in general.

If I were a major computer manufacturer these days, I'd spec in AMD CPUs (Black Editions, etc.), then attach a self-contained coolant system to it, and crank it until it reached the temperatures that the i7 normally operates at. The $500 in cost savings would appeal to my customers, and I'd be able to price my competitors out of the market. If I spec'ed in SSDs for the primary OS, and a large media drive for what-have-you, and let potential customers test-drive it, they'd change their minds about Intel in a week. Tackling Intel's marketing arm is something of a b*tch, from what I understand.

If I were a major computer manufacturer these days, I'd spec in AMD CPUs (Black Editions, etc.), then attach a self-contained coolant system to it, and crank it until it reached the temperatures that the i7 normally operates at. The $500 in cost savings would appeal to my customers, and I'd be able to price my competitors out of the market.

Now what if I were to tell you that you can get a motherboard that ticks all the same boxes as the other one, for $129? ( http://www.newegg.com/Product/Product.aspx?Item=N82E16813157271 [newegg.com] ). That brings the total cost down to $498. Could you please enlighten me on how it is possible to *save* $500 by building an AMD solution instead of Intel, when the Intel option is less than $500, at retail?

The only consumers spending over $1000 on a CPU are the folks with too much money and not enough brains. And while you can spend that much on an Extreme Edition 6-core Intel processor, you're forgetting that it's also overclockable, by about 30%, and that you'd really be pushing things if you tried to get an FX6 running stably at 4.5GHz. You'd also be forgetting that unless something is massively parallel, the i7 still retains a performance edge over the Bulldozer architecture.

Chiefly, though, you'd be forgetting that for 99% of what you do, you'll never see the difference between the i7 2700 and the FX6. One exception: the ability to use a small SSD as a cache to improve spinny-platter drive performance, something built into the Intel Z68 motherboard chipset and, at least last I checked, nonexistent on the AMD platform, would actually give the i7 a boost in real-world usage for significantly less money (pair a 32GB SSD with a 3TB spinny-platter drive, and you get the write speed of the SSD coupled with the capacity of the platter drive).

You may see a performance increase in things like video encoding, depending on the software you're using, but it's not going to get you any more frames per second in Skyrim. And truthfully? When I rip a DVD, I queue up the transcode in Handbrake before I go to bed, set it to turn the computer off when it's done, and don't really care if it finishes 2 minutes earlier.

The CEO of Intel always said he wanted to be the McDonald's of CPUs: efficiently produce massive quantities of CPUs. They spent far more on manufacturing facilities than AMD. Whenever you hear Intel people speak, they almost always focus on their fab plants, not their chips.

So, you are going up against a company that's 10 times your size and can produce large runs of chips cheaper than you can. Intel could always discount its chips more than AMD and still have more money than AMD.

Right, the question isn't "what went wrong at AMD," but "how did AMD challenge Intel for a major product cycle during the '00s?" The answer is that Intel made some missteps around that time. But David normally does not beat Goliath. Especially not in the long run.

They could have, if not for anti-competitive tactics. The AMD/Intel licensing deal said AMD could not have its processors manufactured by a third party, so when AMD was capacity-constrained it could not outsource production and use the windfall to buy more fabs later. Otherwise it could have bootstrapped production at Chartered, which had a similar, little-used plant based on the IBM process. AMD still made enough money to build two more fabs and upgrade the old one at Dresden; the second fab was also at Dresden.

Actually, AMD bet on servers and it won; x86-64 was a result of this. Intel still sells crap like 32-bit Atom processors. We would probably be stuck in 32-bit land today if Intel had had its way.

Sure, they hurt on desktops, but it was no major issue. The problem was when Intel actually started making decent server processors.

I don't see more cores as a win... Parallel programs are always harder to write than serial programs (and in far too many cases effectively impossible). More cores is a crummy consolation prize that we've had to accept since serial speed increases died in about 2003-4. I'll be the first person to preach about how much I love nVidia's s20x0 processor cards (Half a double precision teraflop in a card? Yes please!), but I'd love a quad-core chip running at 40GHz way, way more.

Mainly because CUDA is the only language I've ever encountered where if a programmer told me "I can't get this simple finite difference function to work" I would nod in shared pain instead of thinking they're stupid. I have never encountered a language that was harder to get things right in. Never.

Tell that to my two servers with two 8 core Magny Cours 6128s per machine. Linux+KVM+fast RAID on these machines equals lots of responsive virtual machines at a price point way below what Intel could deliver. Is virtualisation a niche market? Really?

It's not all about the heat. We're pretty close to the speed of light being an issue. Electrical signals travel slightly below light speed, so for a 3GHz chip, your signal could make it approximately 10cm, round trip, in one cycle. Which means we may be able to get to ten or twenty GHz inside chips but when it comes to memory access, that's going to then take tens of clock cycles. And that also assumes a 1cm square chip, all straight paths, and zero latency in the transistors. My guess is that even if you made a diamond substrate, you wouldn't be able to get it too much faster than existing chips.
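The numbers above are easy to sanity-check. A quick sketch, assuming an idealized signal moving at exactly the speed of light (real on-chip signals are somewhat slower):

```python
# Back-of-the-envelope check: how far can a signal travel in one clock
# cycle at 3 GHz, assuming it moves at exactly the speed of light?
C = 3.0e8      # speed of light, m/s
FREQ = 3.0e9   # 3 GHz clock

period = 1.0 / FREQ             # seconds per cycle (~0.33 ns)
path_cm = C * period * 100.0    # total path length per cycle, in cm

print(f"cycle time: {period * 1e9:.2f} ns")
print(f"path per cycle: {path_cm:.0f} cm "
      f"(about {path_cm / 2:.0f} cm out and back)")
```

That's the ~10cm round-trip figure from the comment; at 10 or 20 GHz the reachable distance shrinks to a fraction of a chip's width, which is why off-chip memory access would cost tens of cycles.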

How about a small but perfect example: running more than one program at once.

I'm not sure where this idea that multiple cores aren't useful came from, but it's dead wrong. Many individual programs might not be written to take advantage of parallel processing, but the OS will quite handily dispatch different programs to different cores.

Exactly, and with AMD you can have lots o' cores cheap. I love the fact that my 6-core can have two cores ripping a DVD while the other 4 are gaming, and my customers love being able to hook that nice cheap quad to their widescreen via HDMI and game while surfing, listening to music or ripping. Take a $100 AMD quad, add a $50 HD4850, and you can game quite nicely.

If you want to know where AMD is fucking up, it's killing Thuban and AM3 when it would be trivial to keep the low-to-midrange market on AM3 with a single chip. Take a 95W Thuban like the 1035 or 1045: any with bad cores are your Phenom X4s and X3s, and any with bad cache are your Athlons. Voila! You can crank out one chip and pretty much keep the AM3 line (which has some damned nice boards dirt cheap) while you fix the bugs in Bulldozer.

As it is, their sockets are changing too quickly. AM3+ is gonna get maybe 1 more chip, and Piledriver will be FM2, thus killing FM1, which is barely a year old. Right now I wouldn't touch a non-AM3 AMD desktop, as it's like Socket 423 and 939: sockets that are here today and gone tomorrow. Their E series are great for netbooks, but their A series suck too much power for not enough performance IMHO; they should roll out 4-, 6- and 8-core Brazos chips to again give them product while they fix the problems with BD. They should have learned with Phenom I that trying to shove bad chips out the door hurts your rep too badly; better to stick with tried and true while they fix the errata.

I worded that badly, but for one product a shell script (which the user creates via a GUI) just calls a separate program per trace, and the user just tells it how many to kick off at once. So it is just running a separate program per core, and that is incredibly trivial. In some cases it even does it via rsh (old program) or ssh and just runs one program per core on the remote nodes.
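The "one separate program per core" pattern is about as simple as parallelism gets. A minimal sketch (the trace names and the worker command here are hypothetical stand-ins; the original tool isn't specified, so `python -c` plays the part of the per-trace program):

```python
# Launch up to one worker process per core, wait for each batch to finish,
# then kick off the next batch -- no threading or shared state needed.
import os
import subprocess
import sys

traces = [f"trace_{i}" for i in range(8)]   # hypothetical input files
n_workers = os.cpu_count() or 1             # "how many to kick off at once"

for start in range(0, len(traces), n_workers):
    batch = traces[start:start + n_workers]
    procs = [
        subprocess.Popen(
            [sys.executable, "-c", f"print('processed {t}')"]
        )
        for t in batch
    ]
    for p in procs:            # block until the whole batch is done
        assert p.wait() == 0

print("all traces processed")
```

Swapping `sys.executable` for an `ssh host program trace` command line gives the remote-node variant the comment describes.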

You'll see AMD pretty much only in Cray offerings, where they have a proprietary interconnect currently married to HyperTransport. One big thing Cray talks about nowadays is moving to a more processor-agnostic interconnect, so they'll soon be selling Intel-based systems.

For everything built since Nehalem came out, absent such considerations pretty much all of them went Intel, because that was the point where Intel began stomping AMD on both work done per clock *and* memory performance. Before Nehalem, some workloads still indicated AMD because their memory performance was better, even if the Core 2 architecture was besting them on performance per clock.

The first-tier vendors that carry AMD now largely do so because AMD hasn't demanded a socket change in a while, so the vendors can support new AMD products in 'old' designs with little incremental investment. This, along with AMD's aggressive pricing, makes pretty inexpensive systems possible. At very large scale, however, the additional operational expense of more servers sucking down more power and HVAC to get the same work done becomes difficult to ignore.

On what? AVX specific benchmarks optimized for Sandy Bridge which do not use Bulldozer's FMA instruction which would give it a 2x performance boost? Poorly programmed OS schedulers which do not scale well beyond 4 cores? Or what?

I would say that the complete opposite is true. Intel has better IPC and IPW. Their branch predictors are the best in the business. The Achilles heel of AMD's fusion processors is the fact that they bundled a solid GPU with a mediocre CPU. Well it turns out that you can't really upgrade the CPU in a Llano rig. On the other hand, it's trivial to upgrade one of Intel's crappy integrated GPUs with a discrete card.

I mean, look at what the 3850 compares to in CPU-intensive benchmarks: http://www.anandtech.com/bench/CPU/2 [anandtech.com] - it's behind Wolfdale chips that were released 2 years earlier and aren't even available anymore. Let's just face it: AMD chips are terrible. That doesn't mean that they're a bad purchase - they're still great value. But you get what you pay for. Intel's technology is much more advanced. And honestly, I'm firmly in the don't-buy-from-Intel-cause-they're-evil camp. Thankfully I can build my own desktop, but a Linux-based AMD laptop is pretty much impossible to find.

We've yet to see where Bulldozer can go and it's definitely a design aimed at a 6+ core future.

Throwing cores at the problem isn't really a solution for the desktop. Most desktop apps are still single threaded and even games are usually unable to use more than four cores.

The big question has to be: why are AMD losing money?

Because Intel is both bigger and technically ahead (both better designs and better processes, afaict). This means a few things.

1: Intel can almost certainly produce equivalent or better chips to anything AMD can make, at a lower cost.
2: Intel can produce chips that are faster than anything AMD can make. These chips can be sold with no competition (at prices that go up by big chunks for each minor step-up in performance).
3: Intel can spread their R&D costs over more units.

AMD got ahead of Intel briefly because Intel went up a dead end with the Pentium 4, but Intel fixed that with the C2D, and afaict AMD CPUs have been behind Intel's ever since. Afaict AMD has an advantage in integrated graphics, but Intel is working hard to destroy that too, and any serious gamer will probably go discrete anyway.

Where AMD has chosen to not compete

I'm pretty sure that if AMD COULD compete in the high end desktop market they would. The very existence of the FX brand implies that they want to.

BS. Intel was fashionable, that's all. AMD has always been at least neck and neck with Intel when they weren't outright ahead, despite all of Intel's dirty tricks. I've been very satisfied with AMD ever since my 486DX4-100, and my Sempron and Turion based boxes just build on that. Was Intel smart enough to buy ATI, or is it more familiar to associate them with Nvidia? Need we mention Itanium? Who sold 64-bit CPUs first?

AMD has always been at least neck and neck with Intel if they weren't ahead of them, despite all of Intel's dirty tricks

AMD was ahead of Intel between 1999, when they introduced the first Athlon, and 2006, when Intel finally shitcanned Netburst in favor of Conroe and its successors. Since then, they haven't been anywhere near "neck and neck" with Intel. They have gotten beaten decisively in everything except the low-cost sector.

Our testing of everything up to quadcores says that clock for clock, AMD made mincemeat out of Intel.

Tom's Hardware disagrees. [tomshardware.com] Basically, the newest AMD K10 cores are about on par, clock for clock, with a 2006 Core 2 Duo, and get the pants beaten off them by Nehalem and Sandy Bridge.

Now, should I believe Tom's Hardware, or some random Slashdot poster named "postbigbang"?

One big thing you leave out: Intel stepped up their anti-competitive behavior, buying off the big computer makers to get them to cancel AMD-based computer lines.

Dell had the Optiplex 740 line. It was a damn good line, very effective, came in $150 under an "equivalent" Intel computer. What was Intel's response? They stepped up their monopolist subsidization and got Dell to back down. Repeat for a number of other manufacturers, and AMD's stuck in a bind again.

And one big thing you leave out - Intel was never "found guilty" of the alleged practices. Lots of high-volume innuendo, but the manufacturers who supposedly got pressured denied that it happened at all. Yes, Intel settled because it was costing too much to have to deal with the discovery and allegations. Look at the recent settlement with the New York State AG (that lawsuit was politically motivated, in my opinion, in exchange for AMD saying it would build a fab in NY.)

Bullshit. Opteron is a brand name composed of several products. The original Opteron line was quite innovative. When AMD introduced the K10 core they had several issues: Intel took the technological low road (multi-chip module) and got quicker results, while AMD had bugs in the more complex K10, which was introduced late. However, the bugs were all resolved in Shanghai/Istanbul. The FPU latencies were in many cases cut in half, the TLB bug was fixed, and AMD had the same performance as an Intel processor of the same period, at a lower price.

With Bulldozer the same thing is happening. Bulldozer has a lot of bugs which cripple its performance. Once the bugs are fixed, if ever, the performance will climb up. The theoretical peak performance of Bulldozer in FP for example is 2x that of an Intel processor because of the FMA instruction. This should make the processor very popular in supercomputing and other workstation environments. Its peak integer performance is already awesome as well.

That's not what it says in the fine article. The European Commission fined Intel a billion dollars after finding that "Intel has harmed millions of customers by deliberately acting to keep competitors out of the market for many years". HP and Dell didn't seem to be denying the allegations either, and with 3/4 of their operating budget paid for by Intel, how could they? Finally, Intel admitted culpability as part of the settlement to drop the US lawsuits.

Intel didn't need to be creepy. This doesn't mean that they weren't. Sometimes a competitor gets under your skin, and when you see the chance to squash them like bugs you take the shot. I remember working for Borland in the mid-'80s. The Borland CEO just loved poking Bill, and I'm pretty sure there was no love lost on the M$ side. At some point a limo would pull into the Borland parking lot every couple of days over an entire summer. Each time, another critical Borland language developer would go out to lunch and never come back. It was discovered that they were being offered ridiculously lucrative opportunities at Microsoft. Over that summer, the entire language development team was simply sucked out of Borland. MS was sued for predatory practices and Borland won the case. Bill opened his wallet, pulled out a hundred million dollars (in other words, pocket change) and said "Here... now go away, you're dead!" Businesses are soluble in money. Pour enough on them and they go away.

Well, he's listed me as one of those "shill accounts", so he clearly has absolutely zero proof (although I can't prove that to you, obviously).

I've been posting on slashdot since my first time around at university, so that would be 1999-2000 or something? Maybe 2001 - it might have been in my second year when I got a computer of my own rather than the ones in the lab. My UID is whatever was assigned to me when I made the account, and this is the only account I have.

Potentially. AMD is still favored by many people who don't mind tinkering. For instance, for under $100 you can get an AMD X4 with a top end of 3.8GHz or more. My development box has an AMD X6 that was $130, running all 6 cores at a solid 4.2GHz. To buy anything comparable from Intel would be well over $400. It's the same story in server land. AMD vs Intel really depends on the application. AMD has true physical-core superiority. Intel bet on hyperthreading, and it works well for many projects, until you actually need 12 physical cores for number crunching and not just thread spawning. Then it's AMD by a mile.

I use Intel Xeons in my mid-to-low web/caching servers and AMD 12-cores+ in my data servers/larger VM hosts. It just seems to be the recipe that gets me the best bang for my buck, but to each their own.

This post looks like something from three years ago. Seriously, most apps are multithreaded now. Office is. Firefox is. Most Valve games are. WoW is. I could go on, but I don't think I need to. If one uses a Mac, most apps are multithreaded because of libdispatch/GCD.

AMD has their version of hyper threading. One can debate if it's better or worse than Intel's, but I'm not impressed by the benchmarks. By doing fusion and hyper threading, AMD has said they don't care about core count anymore. There's just not room on the die for it. AMD went from shipping 6 core chips to quad cores with their lame HTT and marketing them as 8 core. They're doing all the wrong things Intel did now. You can complain about performance, but not tactics.

Intel blows AMD out of the water in gaming benchmarks because their chips are faster. AMD doesn't want the performance market anymore. They want the lame consumer laptops people buy to use Facebook. Best Buy is full of them. AMD has a good product lineup for this market. Actually, if I were buying a cheap laptop I wouldn't even consider Intel because of the GPUs. AMD graphics are better for light gaming and video.

This is hugely incorrect. A Bulldozer module contains two, physically separate integer cores. It shares only the decoder, L2 cache, and FP coprocessor (which can handle loads from both processors simultaneously, unless they're AVX instructions, but it still executes them fast enough anyway). Hyper-threading uses the same core for two threads, sharing all of the execution resources - it's much less efficient in terms of performance per thread.

Why is AMD doing this? By separating out the FP unit, they can replace it with a GPU-based SIMD unit, allowing them to _vastly_ increase their FP processing ability. Heavy workloads are either already FP-based, or shifting to FP. Once they get this going, I'd expect to see a four-core module built around a variant of the Graphics Core Next compute unit - four sets of 512-bit wide SIMD units, allowing truly enormous FP throughput* with a large amount of integer capacity, while having all the benefits of a large, shared cache. The only problem with this design is the decoder throughput, and I'm certain that AMD is aware of this.

*I wouldn't be surprised to see an instruction set for 512-, 1024-, and 2048-bit wide SIMD instructions, to take full advantage of the FP performance with fewer integer threads. There are also out-of-order advantages to having so many FP ports available, even when dealing with narrower instructions. Or they might use a cut-down version - one 512 bit unit - instead, presumably with a two-core module.

This is exactly what I was trying to say about their new fusion design.

AMD has realized that the critical aspect of CPU performance is math capability, and with their new Fusion designs they have begun replacing the x87 math coprocessor with something that offers far better performance, using elements from GPU designs such as the stream processors.

Yeah, I've heard these crazy counter claims. They didn't duplicate everything so it's not a full 8 core chip.

Intel Atom chips have hyper threading technology; well at least the newer ones do. They may have made independent ALUs in the AMD parts, but it still performs like HTT on most operating systems. Unless schedulers are modified, an OS will not know how to efficiently schedule processes for these chips and they will never perform like they're supposed to. I consider that a design fail.

You mean few games are multithreaded. The very use of a PC is a multi-core experience, your anti-virus running at the same time as your OS at the same time as your video chat encodes video at the same time as your chat decodes video at the same time as you play some facebook game...

You don't need to write multi-threaded software to take advantage of the latency reduction from a multi-core CPU on a modern OS.

True, they run much cooler than their AMD counterparts, but try running that i5, or i7 in my case, with a stock fan/heatsink at 100% load on all cores, as I do when ray-tracing. In the first 15-20 minutes the temp gets above safety limits. Since renders can take hours or days, I can't use Intel stock fans. But the Intel chips have much better protection mechanisms than the AMD counterparts: Intel chips will first start deferring instructions to the next clock, then after a while will execute a HALT instruction to protect themselves. I have seen AMD chips go POOF under these conditions.

What? Why? That's crazy, I've run compiles for days (yes, I'm still a Gentoo user) on 10-year-old, crappy, about-to-be-tossed-in-the-trash, filled with dust, consumer hardware more times than I can count. I'm currently running a compile on my new(ish) consumer-grade CPU under a virtual machine, while typing this response, that'll be going for hours on end, and if anything went wrong because of the CPU I'd consider it a broken CPU.

Thank god more people don't think the way you do, if they did we'd already be buying "unlimited" CPUs. Your viewpoint is an asinine one that is just begging to be charged more for no reason other than you've outright told them they can get away with it.

Intel has had its share of buggy and bad designs, and that's even without getting into the HMSS Itanic. Some AMD chips do a great job of bang for the buck; my laptop has a nice dual-core one that made the cost much less than a comparable Intel chip would have.

Still, AMD needs to take more risks, with heavy investment in more advanced design and fabs. Mediocrity just isn't tolerated in processor design.

Itanic is not a buggy or bad design. It's just a design without a good market. If you were doing a lot of computation where that last 0.01% of performance was important, and you had the time/budget to write Itanium-specific assembler, you'd love Itanium (64 64-bit registers is nice). It just solves problems that most people don't have.

If [..] you had the time/budget to write Itanium-specific assembler, you'd love Itanium (64 64-bit registers is nice)

I thought one of the major problems with Itanium was that it used EPIC architecture [wikipedia.org] which relies heavily on the compiler explicitly figuring out how the parallel instructions should be scheduled (rather than the CPU itself doing this at runtime)... except that apparently such a compiler was never really written.

(Interesting quote I just came across in the Itanium WP article [wikipedia.org] from Donald Knuth- "The Itanium approach...was supposed to be so terrific- until it turned out that the wished-for compilers were basically impossible to write".)
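The EPIC idea the parent describes can be illustrated with a toy static scheduler: pack a dependency graph of operations into fixed-width bundles entirely at compile time, the way an Itanium compiler has to. The op names and the bundle width are illustrative (IA-64 bundles do hold 3 instruction slots, but real templates are far more constrained):

```python
# Toy "EPIC-style" static scheduler: the compiler, not the CPU, decides
# which independent ops issue together, by packing a dependency graph
# into fixed-width bundles level by level.

from graphlib import TopologicalSorter

WIDTH = 3  # instruction slots per bundle

def schedule(deps):
    """deps maps op -> set of ops it depends on. Returns a list of bundles."""
    ts = TopologicalSorter(deps)
    ts.prepare()
    bundles = []
    while ts.is_active():
        ready = list(ts.get_ready())      # ops whose inputs are all done
        for i in range(0, len(ready), WIDTH):
            bundle = ready[i:i + WIDTH]   # issue up to WIDTH ops together
            bundles.append(bundle)
            ts.done(*bundle)
    return bundles

# a,b,c are independent loads; d and e consume them; f consumes d and e.
deps = {
    "a": set(), "b": set(), "c": set(),
    "d": {"a", "b"}, "e": {"b", "c"},
    "f": {"d", "e"},
}
for bundle in schedule(deps):
    print(bundle)
```

This also shows why the approach struggled: the scheduler only sees static dependencies. If load "a" misses in cache at runtime, the whole schedule stalls, which is exactly the dynamic information an out-of-order CPU exploits and a compiler cannot know in advance.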

"Once Intel realised they were falling behind, they dropped their brain-dead policies and pushed out better chips than AMD's."

Hmm. Not so much. More along the lines of: they had an "Oh Shit!" moment and cross-licensed AMD's 64-bit design (an intellectual-property swap) to get back in the game. Even Intel's earliest attempts at a 64-bit x86 architecture were pathetic in this area, with numerous complaints about their broken, half-assed 64-bit support (it supported, at first, only a handful of 64-bit instructions).

The article is pretty explicit about how AMD dug its own grave. I don't think blaming an Intel monopoly is all that convincing.

Really? The article mentions how Intel paid Sony to cancel ALL AMD shipments, and how they paid Dell roughly 3/4 of a billion dollars in a single quarter to not use AMD chips. But I'm sure you're right; I'm sure keeping AMD out of all of the major OEMs (except, to some extent, HP) had nothing to do with it.

they paid Dell roughly 3/4 of a billion dollars in a single quarter to not use AMD chips

When they're buying back multiple billions of dollars in product, 3/4 of a billion dollars off is called a discount - not a monopoly. Businesses use that exact strategy all the time. "If you buy a huge order of our product instead of the competitor, we'll offer you $____ off!" They also have the option of sending as many 'promotional' free products as they want in order to convince the potential customer, even if it is half of the customer's order.

Three-quarters of a billion dollars was their operating profit. I'm not sure how you quantify that as a discount. That would mean Dell was essentially losing money on every chip sold prior to that quarter. So let's not beat around the bush: Intel had the money and the willpower to forcibly squeeze AMD out of the truly lucrative markets.

For the record though: Intel CPUs are faster, but in everyday business use AMD chips were probably a much better buy for office machines. In fact, if you look at the cheap CPUs for the

Intel seems to be winning because of marketing. Their top-end CPUs outperform AMD's, but few people actually buy those.

Intel's MID-RANGE CPUs beat AMD's high-end, even though the AMD CPUs are 50% larger. That's a recipe for disaster, because AMD are forced to sell their most expensive CPUs for less than Intel's mid-range. Few people can see any good reason to buy a slower, more power-hungry AMD chip instead of an i5 unless the price is low enough to justify that.
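The "50% larger" point matters because a bigger die means fewer chips per wafer. A rough sketch using the standard gross-dies-per-wafer approximation makes the scaling visible; the die areas below are hypothetical, chosen only to show the effect, not actual AMD/Intel die sizes:

```python
# Rough gross-dies-per-wafer estimate on a 300 mm wafer, using the
# common approximation: pi*r^2/A - pi*d/sqrt(2*A), where A is die area.
# Die areas are hypothetical placeholders, not real product figures.

import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Approximate number of whole dies that fit on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

small = gross_dies(300, 200)   # hypothetical smaller die
large = gross_dies(300, 300)   # 50% larger die
print(small, large, round(small / large, 2))
```

Since wafer processing cost is roughly fixed, fewer (and lower-yielding) dies per wafer means a higher cost per chip, which is exactly why selling a larger die at mid-range prices squeezes margins.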

I wouldn't bet against Bulldozer in the long term because the benchmarks I've seen seem to indicate some kind of unexpected bottleneck in their hyperthreading implementation; if that's the case then a new generation could actually make some use of all those extra transistors. But for now it's hard to see how they're going to make enough money from it to fund development of the next generation.

Let's not forget the underhanded tactics that Intel used. They were forced to pay a mere $1bn to AMD for it. I always thought that was too small an amount for losing their position as leaders in the CPU market. And now look how things turned out...

This comment, and those below, are ignoring the real, underlying issue for AMD: outsourced production. Once AMD made the decision to decouple the design and manufacturing of chips, they were dead. RTFA! Customers who want to buy AMD chips can't get their hands on them. Is that due to underhanded marketing tactics? No, poor decision-making at the top. Having worked at Intel, I've seen how the tight coupling of design, testing and production can work wonders.

I completely fail to see how AMD agreeing to a price and later regretting it because they sold their advantage is in any way Intel's fault, or an 'underhanded tactic'.

$1,000,000,000.00 is a lot of friggin' money. If a huge company like AMD can't make use of that much money to better their products and come out with the next new thing Intel wants to license from them in the future, then they deserve the failure they bring on themselves.

When Intel produced laughable chips for years, they still remained the absolute market leader, because of their unethical tactics.

The original P4s were comparable to the Athlons available at the time; when I bought my P4 it was a bit faster than the comparably priced AMDs for the uses I had for it. It was only when AMD released the Athlon 64 while Intel was struggling to push the P4 faster that they raced ahead... at that point everyone who knew about technology was saying 'Don't buy Intel, buy AMD, P4 sucks', and no amount of 'unethical tactics' could have kept Intel ahead for long.

AMD's budget range is still better than Intel's when compared at a constant price against Atom. But with the netbook/nettop market starting to flatline (or so I've heard), maybe they just made a wrong strategy decision. Also, the botched Bulldozer launch: they should have used the number of complete modules in the processor name, instead of the number of integer units. That way they wouldn't have a 6-core which was actually a 3-core, but rather a 3-core which performed better in hyperthreading than an equivalent Intel. Getting the driver issue sorted out before launch would have helped as well.

Correct me if I'm wrong, but isn't the major cost of a chip the recovery of R&D costs, and not the cost of the physical materials? If so, it would have only meant a longer time to recover R&D costs, not an inability to make a profit at lower prices.

AMD will never fail because Intel won't let it fail, since it is their DOJ defence against being a monopoly. The couple of times AMD got ahead of Intel on technology, like x86-64, Intel started a money-losing price war to put AMD back in its place. When AMD is struggling, Intel raises profit margins on its products to help them out. There are also less-advertised ways Intel helps keep AMD afloat: patent sharing, employee no-poaching agreements, joint tools development like OpenAccess, etc. Having worked in that industry, I was always surprised that the DOJ never came down on them for those agreements. The patent sharing and joint tools ones are official, even though Intel puts something like 10X more into them than AMD does. I left that industry after 5 years since I saw it as a dead end: you only have a few companies competing for your skills. As my manager at Intel told me, "I won't give you a raise, since there's only one other place that would even care about the skills you picked up here, AMD, and we really control them too."

As bad as your post paints it I'm afraid the reality is even worse. The reason the DOJ has not come down on Intel is that our government has been fully captured by corporations. Intel pays its 'donations' and tells them what laws to write, what laws to enforce, blah blah blah.

And what we end up with is not only lesser products, because there is no real competition in some markets, but what you experienced on a personal level. And I'll end it at that, because I'm sure /.'s rabid far-right-wing mod squad is going to blast me down, and I don't want them to take your very good post down with me.

AMD might have made okay CPU's but their partners made junk. You simply can't buy quality motherboards for AMD. All of it seems to be low-end crap with weird flaws. Every AMD system I have put together I wound up regretting. Things would crash randomly, freeze randomly, or just act downright strange. With Intel-based systems, I rarely have this problem (though I always pair it with a boring, plain-vanilla intel motherboard).

Bottom line, I simply cannot recommend AMD-based systems. Sure it costs less, but you pay for it in frustration.


Strange. I was able to find a quality board for an AMD processor from several choices. I've used AMD for years and never EVER had hardware-based crashes. I think the trick is to do your research, buy a name-brand board, and spend more than $80 on it. Yeah, I could have gone with the $30 board, but then I'd be in the same boat you're in.

My last system was a dual core Opteron 175. Something in the system finally died after years of abuse. I don't know if it was the processor, RAM, MB or even the power supply. Frankly, I didn't care as the system had outlasted its usefulness and it was way past time for an upgrade. My current system is a Phenom II 965 with a Gigabyte board and 12 GB of name brand, PC1666 RAM. No problems whatsoever. Sure, it's not as fast as the 7-series, but I saved hundreds by going with AMD and frankly, I never wait on anything. It's still much faster than what I need.

The processor is really not the bottleneck any more for the vast majority of people.

Asus makes Crosshair motherboards, which have been pretty freaking awesome for AMD chips, and they've done pretty well with my current motherboard, the Crosshair Formula 4. No frustration here.

On a side note, there do appear to be some possible issues with NewEgg, however. If you check out the Crosshair V (5) section, there have been some comments which suggest that NewEgg has been recycling equipment (DOAs and whatnot; many of the comments are recent), and Asus may be feeling some indirect hate for that. Personally, I've had two Corsair H70s, ordered as 'new' (i.e. not open-box), show up with obvious signs of previous use. I had been told, after the first incident, that it had been a mistake ("Someone must have grabbed things from the wrong pile"), but after the second incident, I am not so sure. I find this entire business incredibly annoying, as NewEgg has been a good supplier of equipment in times past... but I do not appreciate the problems they are causing me (Corsair has the latest H70 and is replacing it directly; still, it's taking almost a month to get this mess cleaned up).

The only reason they boom is when Intel makes a mistake. In the mid-2000s Intel bet on that crappy Pentium 4 line. This allowed AMD to gain a huge foothold. It was only temporary, until Intel figured out they'd goofed and corrected course. AMD sat on their hands and didn't invent the next thing, so Intel just stomped all over them.

This isn't the first time this has happened. The same thing happened in the days of the 486 and 586: AMD gained a huge share, then lost it all as Intel corrected their mistakes and AMD failed to continue to innovate.

It's almost like AMD shows the way then Intel does it better. It will probably happen again assuming AMD doesn't eventually just die.

That campaign really had a lot of success. The only people who buy AMD are geeks, who only do it when it gives a good price/performance ratio. It does for me: going AMD simply means you can spend your budget on a fast SSD, which will do a hell of a lot more for your performance than a faster, more expensive Intel CPU with a regular HD.

But people like me are the exception, and AMD never really managed to remove "a computer has an Intel inside" from the consumer's mind. Just try your local electronics store.

Netbooks were a chance. AMD didn't put restrictions on its netbooks, but they failed to push high-end netbooks before Intel again stole their thunder with smartbooks. My netbook has got 8GB in it, which makes it a very smooth machine: light and cheap enough to lug around without worrying about it getting dented or, worse, stolen. Netbooks partially failed because they sold with slow HDs and tiny amounts of memory, hurting their performance no end.

AMD just never had the clout to sell its chips on even terms. And it is sad, because Intel dropped the ball completely when they believed they had no competition. There is a reason that 64-bit Linux is reported as AMD64. Intel failed and AMD delivered, but for AMD to have truly broken through they needed a long string of victories and no losses like Bulldozer.

If AMD wants to succeed, they might consider something that Intel is also thinking of doing. Intel is having trouble getting its chips into tablets and especially phones, so they have considered making their own... AMD could do a lot better getting their CPUs into PCs if they started selling them. Control the whole supply line, pass the savings on to the consumer, and beat Intel and the Intel-Inside PC makers on price. Intel can't do that for fear of pissing off all its customers, but AMD doesn't have many bridges to burn.

Yes, making PCs is a very low-margin industry, but that is partly because you are buying all the parts from third parties. AMD wouldn't be doing that. The profit on the CPU inside the PC would be part of the profit on their PC. The profit on the graphics card would be part of the profit on the PC.

Risky and unconventional but unless THEY build the PC, they are always going to have a hard time getting their CPU into the PC.

Dell – then the world’s biggest PC maker – received billions of dollars to “remain monogamous” with Intel. At their peak in the first quarter of 2007, payments from Intel made up 76% of Dell’s quarterly operating income: $723 million against a total of $949 million.

And I really wonder why Intel hasn't been gutted and salted for monopoly abuse, with its CEO and main backers arrested. How could it be made any clearer than that?!

The movie character Gordon Gekko got famous for the words "Greed is good" in the movie Wall Street, but he was wrong, and AMD has proven it, as so many others have before them: greed is indeed NOT good; it's a destroyer of all things good.

Why?

Because AMD was Warner Brothers, while Disney always bet their money on cute & politically correct. AMD appealed to the young student generation, the people who wanted POWER but didn't buy into the heavily advertised Intel hype. Sure, nothing wrong with Intel; I was an avid Intel fan myself. The AMD processors were notorious for overheating, and for several issues with certain math operations, but AMD overcame those issues and produced some absolutely AMAZING processors that even outperformed their competitor at a staggering third of the price. Back then it was a no-brainer: every geek wanted an AMD in their computer, and many of them were excited about overclocking their AMD CPUs to unseen speeds. It was indeed the "rogue" choice, but people (like me) loved it, and certainly took advantage of it.

But anyone who gets up there gets taken by GREED. It's kind of like Nintendo, who just couldn't understand why no one would pay the same price for their toy when it was 3 times slower than the competitor's; it's like Sony, who simply didn't understand why no one wanted their proprietary formats, and couldn't see the need for an open platform when they could be in total control instead...

Yep, story of our lives as computer geeks & users: history repeats itself, and it never fails to tell it like it is.

AMD had a wonderful technical position. Intel bet the farm on Itanium and NetBurst. AMD countered with an x86 architecture that was much, much more efficient than NetBurst, a 64-bit implementation that didn't break backwards compatibility, and, to further embarrass Intel, an affordable NUMA architecture with on-package memory controllers. For all this, 'Intel Inside' *still* carried some marketing weight despite the horrible tech behind it at the time. AMD failed in two ways:

- They failed in marketing execution to erode the value of 'Intel Inside'.
- When they did succeed, they didn't really come up with any *new* game-changing plays.

Intel's QPI was catch-up to HyperTransport, but since then Intel has continued with superior fab technology, advancing performance per clock, adding more memory channels per package, and incorporating features for particular sore spots like AES and h264 encode/decode. AMD's biggest advantage at the moment is that Intel GPUs are relatively poor, and the Fusion line can quite thoroughly embarrass Intel at gaming. The problem being that the gaming market is very comfortable with discrete GPUs, and this difference matters for a relatively small slice of the market.

This makes it difficult to be sure how much better Intel chips really are than AMD chips. When the Intel chip scores higher on a benchmark, and the benchmark includes Photoshop, was the Intel chip actually better or was Photoshop compiled with the Intel compiler?

Sadly, I think Intel chips really are better now; given that Intel is leapfrogging past AMD on process technology, they have major advantages so their chips ought to be better.

But I still buy AMD. Yeah, I'm giving up some increment of performance... but the chips these days are so fast, I can survive on only 90% performance or whatever. And I prefer to avoid doing business with a company that continues to sell a compiler that sabotages performance on competitor's chips.

Personally, I would love to see AMD sell a line of processors that return "GenuineIntel" for the CPU ID, and thus run Intel compiler code at full speed. When Intel sues them, they can argue that this is necessary for full compatibility with the code produced by Intel's own C compiler. (Yeah, I know. It will never happen. It's a fun daydream but that's all.)
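The behavior these comments criticize boils down to dispatching on the CPUID *vendor string* rather than on feature flags. This is a sketch of that dispatch logic only; the function names and feature sets are hypothetical, and this is not Intel's actual compiler runtime:

```python
# Sketch of vendor-gated vs feature-based code-path dispatch.
# All names here are hypothetical illustrations of the pattern discussed.

def pick_code_path(vendor, features):
    """Vendor-gated dispatch: non-Intel chips fall to the baseline path
    even when they report the same SIMD features."""
    if vendor == "GenuineIntel":
        if "sse3" in features:
            return "sse3_optimized"
        if "sse2" in features:
            return "sse2_optimized"
    return "generic_x87"  # slow fallback, regardless of actual features

def pick_code_path_fair(vendor, features):
    """Feature-based dispatch: what a vendor-neutral runtime would do."""
    if "sse3" in features:
        return "sse3_optimized"
    if "sse2" in features:
        return "sse2_optimized"
    return "generic_x87"

# An AMD chip reporting SSE2/SSE3 support via CPUID feature flags:
amd = ("AuthenticAMD", {"sse2", "sse3"})
print(pick_code_path(*amd))       # vendor-gated: lands on the slow path
print(pick_code_path_fair(*amd))  # feature-based: lands on the fast path
```

This also makes the parent's daydream concrete: a chip returning "GenuineIntel" would pass the vendor check and get the fast path under the gated scheme, with no other change.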

Even if AMD doesn't have the top performing chips, they continue to score very well on price/performance, and the performance is good enough for me. And they are less evil than Intel. So I remain an AMD customer.

HP convinced Intel that it was not going to be possible to scale the performance of the x86 architecture any further, and that the only option was a new architecture that exposed massive processor resources directly to the compiler, which would use its broad compile-time visibility of the code to schedule it explicitly and optimally for execution. The result was the Itanium architecture.

Unfortunately (for them) this turned out not to be a win, because even the best compilers can't predict what's really going to happen at runtime. They also took WAY too long to get Itanium working, and what they were doing was pushing a hardware problem back up into software, with an apparent belief that a silver bullet would be available to slay the problem of the ridiculously complicated piece of software they needed to develop (the compiler).

If not for the delays and one other little problem they probably would have succeeded in replacing x86 with Itanium, whether or not it was a technically viable idea.

That one other little problem was AMD, who decided that yes you could push the x86 architecture quite a bit further along than where it was when Intel effectively abandoned it during the Itanium development. AMD pulled some rabbits out of hats and basically took over x86 development from Intel for a while. Eventually Intel was forced to get back into the x86 world in a serious way, when it became clear that AMD was going to pretty much take over the market with 100% compatible processors (and with 64-bit capability) where Intel was going to be stuck pushing an overweight, incompatible, late, and under-performing alternative.

Unfortunately for AMD, once Intel finally got the boat turned around in the right direction, they had the money and the engineers and architects to do x86 even better, and they have proceeded to produce a series of incredibly impressive implementations which squeeze the most out of both process improvements and architectural advances.

I think that without the Itanium detour there would never have been an opportunity for AMD to do what they did, but without AMD we would probably all be struggling with the baroque complexities of Itanium-powered PCs (which is too bad, because then I would have been able to make lots of money hand-coding Itanium machine instructions for people :)

If not for the delays and one other little problem they probably would have succeeded in replacing x86 with Itanium

No way was that going to happen. Intel for some crazy reason forgot that one of their *biggest* draws in x86 land was backwards compatibility. Consider the fact that even this very day, most applications ship as 32-bit x86 applications. Over 10 years after Itanium's launch and 9 years after the initial availability of x86_64, day-to-day life is still largely based around x86 compatibility. PAE is still a serviceable workaround for 99% of applications out there. While it sucks, we'd probably still be on

The article looks back at the year 2006. 2006 was also the year when Apple switched to x86 computers. At the time, there was obviously the choice between AMD processors and Intel processors. And many people at the time said that Apple should have gone with AMD; we now know that would have been a big mistake.

In 2006, AMD processors were better than Pentium 4 processors. The Pentium 4's design goal was "highest possible clock rate" with complete disregard for actual performance, because customers bought GHz. AMD countered by using processor numbers: instead of a 3800 MHz Intel chip you could buy an "AMD 3800" chip, which many customers thought was the same as the Intel chip, but which in reality had a much lower clock speed and slightly higher performance.

At the same time, the old Pentium 3 had much better performance per megahertz than the Pentium 4. The Pentium M was a slight improvement on that, and Core Duo was again an improvement. Apple built a few Pentium 4 3.6 GHz Macs for developers; my first MacBook with a 1.83 GHz Core Duo (May 2006) ran faster.

I think that with the introduction of Core Duo, and with Intel getting rid of the abomination that was the Pentium 4, AMD was beaten. They just didn't know it for a long time. Reading the description of the Pentium 4 internals, all I could think was "WTF". Same with Itanium (which was WTF squared). Athlon was in comparison a clean design, which was why it performed so much better per megahertz. So was Core Duo, and since then Intel has managed to stick with clean designs and improve performance bit by bit.

Intel has made a great success out of the designs originating in its Israeli operations, the Core chips for example. These designs gave Intel a tremendous product for years and years, made it a lot of money, and became a matter of pride for Israel. This is not noticed so much in the US, where nobody thinks much about where a design came from. But it matters back where the product was born.

And it did not go unnoticed by Israel's enemies, the Arabs. Nor was it missed by AMD. When AMD needed funding, it went to those Arab investors and pitched investing in AMD as both a way to make money and a way to smack at Israel via Intel. The Arabs bit. AMD got cash and turned out some decent products, and there were collaborative efforts too, like the Arab investment in Ferrari, which went as far as Arab firms and AMD sponsoring Ferrari's Formula 1 team. The stickers on that car tell the story.

But the results were not enough, the fabs were woeful, and Intel came out with even better stuff like the Core i3/i5/i7 series. AMD's Arab backers saw that they weren't going to win with this horse and refused to keep pumping in money without something to show for it. They wanted a major win and got a lot less.

The Arabs have a lot of cash but they tend to be shrewd about it and demand results. AMD failed to deliver.

That's how AMD lost. The money to fight a war has gone away. All they can do now is fight small skirmishes.

It's a pity. As an AMD fan, and a fan of underdogs, I wish they'd continue to stick it to the man. But at the same time, even though I think highly of my quad-core Phenom II X4 965 desktop, I see it get whooped in the ratings by similar Intel products. The only place AMD is winning is in being cheaper.

First, as a Linux user I always went with AMD; most of the machines with the AMD moniker had an AMD processor and kickin' nVidia graphics on board. So most AMDs at the time just fell into the "Just Works" category.

Now, since AMD's acquisition of ATI, it's AMD with Radeon, so the "Just Works" appeal is gone. ATI support in Linux is still commonly flaky (yeah, you can get it to work, but "just works" isn't the catchphrase associated with ATI).

Second is the new branding, AMD Fusion and AMD Vision, which is just as incomprehensible as Intel's i# labeling in my book. I can't easily remember which one was better than the other. Is it Fusion or Vision? And how do those stack up against a Phenom II CPU?

AMD lost their customers because they made their customers "lost". Just this month my Linux box died... I got too confused by the AMD branding, so I finally went with an Intel i5, made sure the MB supported PCIe, popped in my nVidia card, and am not looking back... Maybe by my next purchase they will have their ducks in a row so I can do some informed shopping. Until then, well... I just don't really know what is what with AMDs.

We can see some of the same behavior with MS, where they basically stopped doing anything with IE and slowed down considerably the Windows development in the 2k/XP run. Then all of a sudden they find that Mozilla and Linux can be credible threats on the casual home market, their traditional marketing leverage vs corporate office sales. Just consider the quote from Gates about him preferring people pirating Windows than considering alternatives. The central issue is one of mindshare. If a potential employee already knows the product from home, MS can claim that there will be little to no training time once hired.

Then when you look at the server market, they do not compete at all anymore. Performance per watt and per dollar both lag badly behind the Xeon.

That's untrue in my experience, and has been for 5+ years. In the higher-end 2-4 CPU server range, AMD has had the best performance/price ratings for a long time, because competitive Xeons are much too expensive. For example, the highest-performing 4P Linux servers (e.g. by SPECint2006 rate) are currently Xeon E7-4870 based, followed closely by the Opteron 6282 SE, but for the Xeons you'll pay 4 times as much as for the Opterons. A typical server configuration with 256GB memory will cost you ~8000 EUR ($10500) if you go with the Opterons and ~20000 EUR ($26000) if you go with the E7-4870 (if you can actually find one on the market).

More affordable Intel-based servers are not competitive performance-wise with the 6282 SE. If you don't need much parallelism and a lot of RAM, you might be able to get a more affordable offer using Xeons (with 2 of their 4C CPUs, e.g.), but even there C32-based Opterons will offer much better performance per dollar at comparable or even lower TDP (e.g. 2 x X7542 vs 2 x Opteron 4238).

We've always compared closely when purchasing beefy 1U/2U servers over the past 10 years, and Intel has not had competitive offers since their socket 604/Clarksboro Xeons, when they allowed decent amounts of RAM (24 FB-DIMM sockets) in 1U compared with socket 940 Opterons (at somewhat sane prices). YMMV if your CPU needs are different, although I'd like to know how...
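The parent's price argument can be made explicit with a quick performance-per-euro calculation. The prices come from the post above; the SPECint_rate figures are hypothetical placeholders standing in for "followed closely", not published scores:

```python
# Rough perf-per-euro comparison using the parent's stated prices
# (~8000 EUR Opteron 6282 SE box vs ~20000 EUR Xeon E7-4870 box).
# The specint_rate values are illustrative placeholders only.

configs = {
    "4P Opteron 6282 SE": {"price_eur": 8000,  "specint_rate": 1000},
    "4P Xeon E7-4870":    {"price_eur": 20000, "specint_rate": 1100},
}

for name, c in configs.items():
    ppe = c["specint_rate"] / c["price_eur"]
    print(f"{name}: {ppe:.4f} SPECint_rate per EUR")
```

Even granting the Xeon a modest absolute lead, the 2.5x price gap leaves the Opteron configuration with roughly twice the throughput per euro, which is the parent's point.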

Yes and no. The over-complexity of the x86 architecture is a factor that makes hitting this limit easier. If they had put all the effort into ARM or another RISC architecture that they put into x86, the end result would be a leaner, faster CPU. For 64-bit, we do have PPC64 as an architecture choice. Or, if Intel had switched over to ARM long ago, we might already have ARM64 by now. Anyway, a RISC architecture like ARM could go further than x86. I don't know if it could make it to 15 GHz. But I do b