colinneagle writes "It's a darned shame, but the writing is on the wall for AMD. The ATI graphics business is the only thing keeping it afloat right now as sales shrivel up and the company faces yet another round of staffing cuts. You can only cut so many times before there's no one left to innovate you out of the mess you're in. Qualcomm, on the other hand, dominates this space, and it has the chips to back it up. The Snapdragon line of ARM-based processors alone is found in a ridiculous number of prominent devices, including Samsung Galaxy S II and S III, Nokia Lumia 900 and 920, Asus Transformer Pad Infinity and the Samsung Galaxy Note. Mind you, Samsung is also in the ARM processor business, yet it is licensing Qualcomm's parts. That's quite a statement."

Intel is already years ahead of AMD. They have well over 80% market share in the PC market and over 90% in the server and workstation market. There's a large performance spread between AMD's processors and Intel's in both single-threaded performance and overall performance per watt. If Intel wants to bend consumers over, they are already in a position to do so. However, they seem to be sticking to their roadmap despite the fact that AMD has been falling farther and farther behind.

If Intel wants to bend consumers over, they are already in a position to do so. However, they seem to be sticking to their roadmap despite the fact that AMD has been falling farther and farther behind.

Have you looked at Intel CPU prices lately? It hasn't been this bad since the Pentium II days. I would also point out that there are no Ivy Bridge server processors available, nor is there a 6-core processor based on Ivy Bridge, despite the first Ivy Bridge processors coming out a long time ago.

What are you talking about? In performance per US dollar, Intel has been winning the race hands down. I can get a Core i5-3570K for about $220 USD. With decent memory, motherboard and cooling I can clock it up to 4.4 GHz and it's stable and not running too hot. To get that kind of performance at that price from AMD... oh, wait... I can't.

No, they do it in large part because Intel has engaged in some pretty obnoxious antitrust violations over the last decade or so and got what's barely a slap on the wrist. AMD for its part did some really stupid stuff as well, but it's hard to make a profit when your competitor is paying systems integrators not to use your products.

Also, while most folks here seem to be on the "AMD is walking dead" meme, the fact of the matter is that Intel can't afford for AMD to go out of business any more than MS could have afforded for Apple to go out of business during the '90s. If it really does get to that point, you'll see Intel easing off for a while to let AMD catch up.

The big problem that AMD has right now is old debt and an inability to produce enough chips to satisfy demand. That's not something that's generally true of chips that are being sold for the maximum price people will pay.

Yes, in fact if you read the fine print every Intel processor built today is built on the AMD64 architecture.

IMO the biggest mistake AMD ever made was licensing that tech back to Intel, though it may have been forced on them by some sort of reciprocity clause in Intel's x86 license covering anything developed to supplement x86.

AMD's market share was too small for the courts to go after them for market abuse, though; if existing deals didn't force it, they should have kept it to themselves.

I think that would have gotten ugly, considering they were licensing tech from Intel. I don't remember the details, but I remember there being some kind of reciprocity there. I'm sure someone more informed than me will clarify.

The Phenom IIs were not that bad. They were the best value. It was the Bulldozer ones that suck goatballs. But they are from 2009 and their time has come. They were only 10% slower than the first-generation Core i5s/i7s, but I could get a whole CPU + motherboard for $229! Not just the CPU. If you wanted the Extreme Edition from Intel, you would pay $700 just for the chip, and that would cover the cost of the whole AMD system.

Today though you are correct. The newer Bulldozers that just came out are competitive with i

I don't think Intel can afford that type of strategy anymore, even if AMD was gone. If they started charging $550 for i5s, then gamers and light users would turn to consoles and ARM chips, respectively. Demanding professional users (video editors, programmers etc.) would be stuck, but only for a while. I think what Intel is doing now is as far as they can go: artificially disabling advanced features on cheaper chips. They're the top dogs because x86 is the standard for desktops, and it's only still the standard because it's cheap.

They're the top dogs because x86 is the standard for desktops, and it's only still the standard because it's cheap.

No, they're top dogs because of "Wintel". If Windows had been running on ARM since XP it would be a whole different story.

Yep. Corporations are the bread and butter. Consumers are fickle and dirt cheap. Corporations will happily pay $1800 for a desktop if no competition exists, because that is what their tools require. This is what they used to pay back in the 1990s. Do you think Intel actually cares about gamers? Then why such horrible, crappy graphics, 10 years behind and so terrible that game developers are quitting the PC platform, despite Intel owning 70% of the graphics market?

The pricing is, but not the CPUs. The problem is there is a finite amount of 22nm capacity. Right now Intel has only one 22nm fab online. They are in the process of converting their fab in Israel to 22nm, but right now the one in Chandler is it.

That being the case, there is only so much they can choose to produce on that process, and what they are choosing to do is mainstream desktop and laptop processors. They've changed their strategy from using the newest process on the highest-end parts first to using i

If Intel wants to bend consumers over, they are already in a position to do so. However, they seem to be sticking to their roadmap despite the fact that AMD has been falling farther and farther behind.

Of course they are, because their process and IPC improvements are how they have such a huge gross margin - I think around 62%. AMD's has been in the 40s, but their last quarter was an abysmal 37%. Look at this chart [anandtech.com] of die sizes. From Lynnfield in 2009 to Ivy Bridge in 2012, their mainstream die has shrunk from around 300mm^2 to 150mm^2, which makes the chips far cheaper to produce while their prices stay high and Intel pockets the difference. That might be good for Intel but with fierce competition they could
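The die-size economics above can be sanity-checked with the usual back-of-envelope dies-per-wafer formula. A rough sketch only: it assumes a 300mm wafer and square dies, and ignores yield, scribe lines, and real reticle layouts.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die count via the common approximation:
    DPW = pi*(d/2)^2 / S  -  pi*d / sqrt(2*S)
    The second term is an edge-loss correction for square dies."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Lynnfield-class (~300mm^2) vs Ivy Bridge-class (~150mm^2) dies:
print(dies_per_wafer(300))  # -> 197 candidate dies per wafer
print(dies_per_wafer(150))  # -> 416 candidate dies per wafer
```

Since a processed wafer costs roughly the same no matter what's printed on it, halving the die area slightly more than doubles the dies per wafer, which is where the margin headroom comes from.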

I can't speak for other industries, but for the semiconductor industry gross margin is measured as revenue from a chip minus the immediate production costs. For AMD this would be how much they paid GloFo for the chip (or rather averaged across the wafer), plus the costs of testing, assembly/packaging, boxing, and shipping. It does not include advertising, R&D, taxes, etc. And as I stated earlier, R&D is a massive expense. All of those engineers designing the next chip are a huge cost that have to be paid.

You can take a look at AMD's finances first-hand and see how this plays out; AMD has never made a profit with gross margins below 44% or so. Intel would be an even better example: 13.5B in revenue, 3B in net income, and a gross margin of 63.3%. That would put Intel's profit margin at 22% versus their gross margin of 63.3%. Where did all the money go? R&D and fab upgrades. Gross margin only covers your immediate expenses in the semiconductor industry.
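The gross-versus-net distinction can be made concrete with the rough figures quoted above (the commenter's approximations, not audited financials):

```python
# Rough check of the quoted figures: gross margin covers only direct
# production costs; R&D, fab upgrades, taxes, etc. come out afterwards.
revenue = 13.5e9       # quarterly revenue, USD (approximate)
net_income = 3.0e9     # net income, USD (approximate)
gross_margin = 0.633   # fraction of revenue left after production costs

gross_profit = revenue * gross_margin       # ~8.5B
profit_margin = net_income / revenue        # ~22%
other_expenses = gross_profit - net_income  # ~5.5B on R&D, fabs, taxes...

print(f"gross profit:    ${gross_profit / 1e9:.1f}B")
print(f"profit margin:   {profit_margin:.0%}")
print(f"everything else: ${other_expenses / 1e9:.1f}B")
```

The ~$5.5B gap between gross profit and net income is exactly the "where did all the money go?" being described: expenses that gross margin never sees.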

That's bullshit. AMD couldn't afford to build a new fab because Hector Ruiz blew AMD's cash reserves buying ATI, lock, stock and barrel, above the stock market price just before the 2008 market crash. In fact, this particular little deal smelled so bad that a lot of people went to court and Hector was forced to quit his post.

It's a lot more complicated than that. Photolithography is a very complex process. As dies shrink to a smaller node size, it becomes increasingly difficult to fabricate a single chip until that process matures.

All 150+ of the 4/6/8-core Sandybridge processors were sourced from only 5 different chips with 2/4/8 cores apiece and varying amounts of cache. The yield on the 8-core dies is low even on the mature 32nm process, so they command a huge price premium. Those with defective cores have some disabled and are sold as 6-core variants.
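The binning described above can be illustrated with a toy simulation. The per-core defect probability here is made up purely for illustration; real defect rates are proprietary.

```python
import random

def bin_dies(n_dies=10_000, cores=8, p_core_bad=0.05, seed=0):
    """Toy model: each core fails independently; dies with one or two
    bad cores are salvaged as a 6-core SKU instead of being scrapped."""
    rng = random.Random(seed)
    bins = {"8-core": 0, "6-core": 0, "scrap": 0}
    for _ in range(n_dies):
        good = sum(rng.random() >= p_core_bad for _ in range(cores))
        if good == cores:
            bins["8-core"] += 1
        elif good >= 6:
            bins["6-core"] += 1
        else:
            bins["scrap"] += 1
    return bins

print(bin_dies())
```

Even with only a 5% per-core failure rate, roughly a third of the dies come out imperfect (1 - 0.95^8 ≈ 0.34), which is why the fully-enabled top part carries the premium and the salvaged SKU exists at all.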

Since defects are fairly consistent per wafer, yields on a 200mm^2 Sandybridge are exponentially higher than on a ~400mm^2 Sandybridge. The same is true for Ivybridge. I'm not sure if Intel's 22nm process has matured enough to make 8-core Ivybridge processors economically feasible quite yet. Thus, 220mm^2 yields on Intel's 32nm process may be comparable to or even higher than 160mm^2 yields on Intel's 22nm process.
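The "exponentially higher" claim is literal under the textbook Poisson yield model, a first-order approximation in which a die survives only if it catches zero defects. The defect density below is illustrative, not Intel's actual number.

```python
import math

def poisson_yield(defects_per_cm2, die_area_mm2):
    """Fraction of defect-free dies under the Poisson model: Y = exp(-D*A).
    Die area is given in mm^2 and converted to cm^2 to match D's units."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

d = 0.5  # defects per cm^2, illustrative only
for area in (160, 200, 220, 400):
    print(f"{area:3d} mm^2 die: {poisson_yield(d, area):5.1%} defect-free")
```

Doubling the die area squares the per-die survival probability, so a 400mm^2 die yields disproportionately worse than a 200mm^2 one, and an immature process (higher D) widens the gap further, which is the comparison the comment is making between the 32nm and 22nm nodes.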

TSMC's 28nm process was backlogged for quite some time due to low yields. The GTX680 was unavailable for the longest time because it required that a large chip be fabricated with no defects, the GTX670 which came later allowed for part of the chip to be disabled, thus increasing yields. AMD had the same problem with their HD 7000 series, low yields on the top end processors reduced their ability to ship those processors. Fortunately for them they had a stripped down version (HD 7950) ready to go at the same time rather than months later.

Intel is a remarkably conservative company. They're not known for announcing a product unless they know that they can make it available, and thus it doesn't make sense to introduce an 8-core Ivybridge processor unless they know that they can actually deliver it. This is why the Sandybridge-E processors came around much later, and the same will be true for Ivybridge-E.

Well, I'm not entirely sure of that. For example: I do audio work, and video work, and like gaming, and compile my own software. All of those things take a robust desktop architecture to do well. You're not really suggesting that I'd switch to a tablet running BOINC in the background 24/7 while I process high-def audio files, are you?

So let's discuss alternatives. Say AMD goes down. What are my options as a consumer in, say, five years if I want to avoid Intel, but want all the horsepower I can get my hands on for a desktop workstation? I really don't think it's going to be Qualcomm, if they're targeting low-wattage mobile devices. Are there any other CPU manufacturers who are positioned to step into that market?

This is why I'm still selling AMD units, although I've been sticking with the Phenom II and Llano chips. The problem is the Bulldozer chip is really designed for server loads, not desktop. You can find the AMD Phenom II X6 new for less than $120, the Deneb quads for a little less than $100, and the Athlon triples and quads for $60-$80; that's a damned good bang for the buck.

But I wouldn't count AMD out just yet, they did recently hire back the lead designer of the Athlon64 who went to Apple and designed the

As for ARM taking over from X86? That is like saying mopeds are gonna replace pickup trucks; they are simply two different use cases. With ARM most folks toss the device every couple of years or when their contract is up, whichever comes first, whereas with X86 frankly the PCs I was selling on the low end 5 years ago would be more than enough for most people. With ARM you are seeing an early-00s-style MHz race, while on X86 frankly the Phenom I X3s and X4s I was selling 4 years ago give you cycles to spare. It's really no comparison.

You are talking apples vs. pineapples here. Right now, ARM is a phone chip and x64 a laptop chip. But at this point, it depends on how things go between MS, Google and Apple. If Microsoft really blows it with Windows 8, Intel will go down with it, despite all that technological superiority and fab capacity. As the history of RISC in the '90s proved, the best technology doesn't always win.

Intel has sent nothing overseas. Their manufacturing R&D is all done in Oregon, and most of their leading edge chips are made in Oregon and Arizona with fabs in Israel and Ireland as well. They have exactly one fab in China that makes 65nm products, which now just consists of some old chipsets.

We MUST have competition in the high-end processor market. Intel has a long history of abuse and monopolistic practices. Without a decent competitor, you can expect your processor prices to soar, just as they did in the past when Intel was essentially the only player.

The problem is, yes, people can boycott personally. But when IT folks go to work, it would generally be way out of place for an employee to place his personal boycott preferences above the customer's requirements. Intel delivers the best performance-per-socket and performance-per-watt, so that is what people tend to buy.

AMD is still competitive in some small areas, for instance if you needed AES or virtualization acceleration but didn't want to pay for the upgrade to the i5 line (since all AMD procs AFAIK
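Whether a given x86 part actually exposes AES and hardware-virtualization support can be checked from software. A minimal sketch, assuming Linux's /proc/cpuinfo format, where the relevant flag names are "aes" (AES-NI), "vmx" (Intel VT-x) and "svm" (AMD-V):

```python
def cpu_flags(cpuinfo_text):
    """Parse the first 'flags' line of a /proc/cpuinfo-style dump."""
    for line in cpuinfo_text.splitlines():
        if line.split(":")[0].strip() == "flags":
            return set(line.split(":", 1)[1].split())
    return set()

# On a real Linux box you would read the file itself:
#   flags = cpu_flags(open("/proc/cpuinfo").read())
# Here a canned sample keeps the sketch self-contained:
sample = "processor\t: 0\nflags\t\t: fpu sse2 aes svm\n"
flags = cpu_flags(sample)
print("AES-NI present:", "aes" in flags)                   # True
print("HW virtualization:", bool({"vmx", "svm"} & flags))  # True (svm)
```

This is the practical side of the comment's point: on Intel, some of these flags vary by SKU within a generation, while AMD tended to enable them across the line.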

Punish? No, punishment would be a fine or confiscation of assets or something. Dividing the company into two would simply mean that shareholders would have a stake in each company until they sorted things out.

We, the buying public, would have two companies with top-end technical capabilities, duking it out in the market. Unless you want to assert that corporations in ultra-powerful positions should be left alone even when it goes against the interests of, well, eve

Allow me to counter your drivel: people like you should be dragged out into the street, kicking and screaming, then hanged from the nearest light-post.

You sing the praises of paying virtually no taxes, while benefiting from the wonderful things those taxes provide: a stable economy, powerful defensive force, extensive critical infrastructure, political influence in international dealings, etc. You go on to further sing the laurels of leaving the country to avoid tax hikes in the form of loophole elimination

I have my own company, make innovative products with innovative process, and I sing the same tune. Monopoly is bad for everybody, except for the monopolist. It is bad for the economy, it is bad for the democratic political process, it is bad for capitalism.

There is a solid, peer-reviewed theoretical foundation behind the idea that monopolies should be regulated, backed by a mountain of empirical evidence. There is no case for unregulated monopoly regardless of the market it operates in.

Not superior everywhere. AMD has a 16-core 2GHz chip that you can put in 8 sockets, and the closest thing Intel has is a 10-core Xeon that is no faster and a lot more expensive. You can put together a 64-core AMD machine with 128GB of RAM for $9000; with Intel you just about have to add another zero for anything similar, and it's no faster. As a poster above is pointing out, Intel's newer and faster stuff is only for the desktop and hasn't made it into their server chips, and for more than 2 sockets the spee

Superior now. But when AMD were ahead, Intel bribed the major PC makers not to use AMD chips. During that time most of Dell's income came from Intel payments, for example. This is what destroyed AMD, since they could and can no longer afford R&D.

I disagree. Without Intel's backroom dealings, AMD would have made enough money to weather the idiocy of Hector the Sector Wrecker (as my ex-Motorola buddies call him).

The whole reason that GloFo had to be spun off is because AMD invested in huge new fabs because they were fab-limited, but then found out they were Intel-limited: their marketshare didn't increase and their fab capacity was unused. That's crippling for a silicon manufacturing company, so AMD had to stop being one.

Hector was no help, that's for sure, but I really think it was Intel that crippled AMD at the worst/best time.

I guess you're just unaware of when AMD had the superior product, but couldn't get OEMs to sell products at the volumes such price and performance superiority would have suggested, because Intel, still the dominant player, had made deals with them not to sell AMD parts. Their market share was growing, necessitating a new fab, but then they hit the artificial limits defined by Intel, a crippling blow after investing billions in a new fab.

There's only been several verdicts against them by the regulatory authorities of multiple governments, and a lawsuit settled between Intel and AMD in AMD's favor with a $1.25 billion (iirc) payout. A pittance compared to what was lost, of course, but still heavily in the news.

I suppose it would have been easy to miss if you only just started following the CPU industry.

They had a faster processor, but that is only one part of the equation. They had two major problems:

1) They didn't offer a CPU/chipset/mobo solution. Intel does it all for customers; they make the entire core if you want. This is useful to OEMs because there's no finger-pointing when there are problems. It doesn't matter which of those components is broken, the same company is responsible, and they need to find and implement the fix. With the Athlons you could have a three-way finger-pointing match between AMD, VIA, and whoever made the board, all claiming the other guy was responsible for a problem.

2) No good chipset. The processor was all kinds of fast but woe betide you if you wanted to use it with, say a GeForce DDR. The VIA chipset that was the "premier" solution for it implemented the AGP spec improperly and wouldn't work with the GeForce card since the AGP slot wasn't really AGP, basically just a fast PCI slot. This wasn't the only problem, just one of the most major ones.

So it is no surprise that some OEMs shied away from them. I built an Athlon system and it was a couple weeks of hell trying to make it work before I found out that no, there was just no way my GeForce would work with it. Back the parts went and in came Intel parts that functioned without error.

Likewise at work we did have some Athlon systems, Gateway I believe, and they were far more trouble than the Intel systems as a whole.

Intel isn't just popular because of the power, but their stability. It matters in business. AMD never really had a competitive solution in that regard.

I'm not saying Intel didn't also try to squash AMD (IA64 was another attempt, since there is no cross licensing for that instruction set) but AMD did little to help themselves. They produced a good processor without the hardware to support it.

Then they caught another break, with the fuckup that was the P4, but they rested on their laurels and didn't really do much in the way of architecture updates. Intel hit back with the Core 2, then Core i, then Sandy Bridge, all of which are stellar performers per clock, and there was just nothing new from AMD until now: Bulldozer, which is pathetic, worse than their old chips at times.

As demonstrated by their marketshare and margins increasing rapidly until they hit the artificial barriers created by Intel. Whatever problems you believe they had were demonstrably not sufficient to limit AMD. Only Intel was.

1) They didn't offer a CPU/chipset/mobo solution.
2) No good chipset.

OEMs might have preferred an AMD-sourced mobo (and they did exist), but it didn't stop them from using AMD parts in either desktop or server markets.

Also, you seem to be talking about the early to mid K7 days when 1) chipsets were relevant and 2) the VIA chipset was the best perform

Usually, I'm all for efficiency, and think it's a good thing when everybody isn't in on the game, as duplication wastes time for all of us on average. If Amazon eats up all the smaller outlets (including in meatspace), and only two or three car manufacturers remain in the world, I would see that as progress, as it streamlines production without unnecessary duplication (often by those who would be less efficient anyway).

However, I do agree with you that I think at least TWO companies are required for any parti

Well, you are assuming that there are never dis-economies of scale. Which in my personal experience at least is not the case. Not always, and certainly a task such as designing the 787/A350 takes a very large entity. But I have seen many cases where the optimum entity size was exceeded and inefficiencies increased exponentially.

If Amazon eats up all the smaller outlets (including in meatspace), and only two or three car manufacturers remain in the world, I would see that as progress, as it streamlines production, without unnecessary duplication (often by those who would be less efficient anyway).

It certainly would cut out all that complicated "setting competitive prices" stuff that Amazon has to do now. Efficiency FTW!

Strictly speaking, you don't have that now. The best AMD offering is barely at the mid range of the Intel lineup. But that is, believe it or not, quite secondary to the story.

AMD bought ATI; that was probably good in a technical sense, but they went from having no money to having 5 billion dollars less than no money. So they sold their ARM business to Qualcomm, who, if you frequent job boards for these things, are either actually at old AMD/ATI facilities or right next to them, including ATI he

Back when Intel was pretty much the only player in the desktop market (there were a few Z-80 systems later on, but relatively few), the CPU could be half the cost of the whole PC, and that just kept getting worse until they actually got some competition in the market.

You mean like how a new FX-8150 just keeps up with an i7 920 that was released 4 years ago?

FX-8150 $190

If you are spending $190 on a processor, you have two choices.. the FX-8150 or the i5-3330. There is absolutely no doubt that the FX-8150 beats the crap out of that particular i5. Intel has no competitor to the FX-8150 at its price point.
Why do you Intel fanboys always do faulty comparisons?

AMD created x64; Intel licences it from them. In return AMD licences x86 from Intel. If AMD does go tits up at some point, it will almost certainly be Intel at the front of the queue to buy all the x64 rights.

What if they didn't sell, though? That's a funny thought. Since pretty much all OSs and programs are now 64-bit (even Adobe's suite is almost completely x86-64 by now, if I recall correctly), what would happen if Intel simply couldn't make x86-64 anymore because AMD decided they want to take it to the grave? I don't think it will actually happen ever, but Intel would be in a bit of a pickle, wouldn't it?

Samsung has to run its parts and products business independently, otherwise their parts business would lose Apple as a customer, and loathe as anyone is to admit it, Apple is a customer you'd rather not lose.

Yup, the entire article (if you can call it that) is garbage. It's not even really clear to me what they're trying to say... "Qualcomm dominates this space"? What does that even mean? Qualcomm has no slice of the x86 market. Adreno GPU a success where AMD failed? Ummm... pretty sure Qualcomm needed a quality chip to integrate into its Snapdragon and AMD was happy to sell one, unless AMD had some secret ARM program that they were planning on taking over the mobile market with that didn't succeed that I never he

Samsung is licensing the SoCs for the US market only. The flagship products (Galaxy S II, S III and the Note) are all using Exynos for every other market.

Yeah, the summary misses that. And, frankly, my takeaway from this is exactly the opposite of the submitters' - Qualcomm is beholden to the competition, and that's not a good situation to be in. Apple has shown where the future probably lies, being well on their way to bringing their chip production completely in-house; and Samsung obviously has the means to do so. I wouldn't be surprised to see Google itself do some strategic purchasing in this area, if it hasn't already.

We all know what is coming, but it feels like the industry has been moving backwards the past few years. Features that were standard are disappearing, replaced by much more elaborate procedures in the name of idiot-friendliness. The OS is being marginalised by its shell UI. No longer computers, just appliances.

Not to mention that the "article" is making it sound as AMD and Qualcomm are even in the same market:
"Qualcomm, on the other hand, dominates this space". What space is the author talking about, exactly?

Also, Qualcomm is licensing ARM Holdings PLC's technology, like just about everybody else, but you won't find many people waxing lyrical about them.

And yes, we need AMD around -- unless we want to go back to the days when a Pentium cost an arm and a leg just because Intel said so.

Does Qualcomm even "dominate" the ARM space? Last I heard, Nvidia SoCs are showing up in quite a few "prominent devices," too, and there are numerous other vendors. Some of them target niche applications, but what does that matter? It just demonstrates that the ARM processor market is much less homogeneous than the x86/x64 market traditionally has been, so it's less likely that any single chipmaker will dominate. If anything, it's ARM Holdings that wins, not Qualcomm.

From all the reports coming out of AMD, they're doing no more than what every ARM SoC vendor is doing and including GPU cores on the CPU die, which they were doing well before AMD released the Fusion line.

Only for AMD, the SoC design process they've adopted has resulted in newer processors being slower than older ones.

In AMD's defence - CPU speed doesn't actually matter that much. This is one of those odd quirks of where we are in the software-hardware cycle. A good GPU will likely have *much* more impact on your noticeable computer performance than a 10% faster CPU. It's really bad form to release a brand new CPU that is actually slower than your old one (clock for clock, in absolute terms, etc.) and the tech press pounced on them for it. But AMD *could* have and should have made the argument, probably correctly, that you're better off with an AMD Fusion product than an Intel i5 with Intel's on-chip piece-of-shit HD Graphics 3000. Granted, Intel has improved a lot now that they've given up on Larrabee, but their HD Graphics chips are still horrible compared to what AMD (ATI) can bring to the table.

Well, the benefit to using Intel for "everything" is that at least their drivers are open source. I have a brand new i5 laptop for work. While they are handed out with Windows, they do allow you to use any OS that you'd like (unsupported). Running Linux, the i5 machine showed Intel for the processor, wireless, and graphics, and I didn't have to mess with or agree to use "non-free" drivers to operate the machine's wireless or to run dual 24-inch 1920x1200 monitors with no problems.

From all the reports coming out of AMD, they're doing no more than what every ARM SoC vendor is doing and including GPU cores on the CPU die, which they were doing well before AMD released the Fusion line.

Their goal is far more than that, it's not just about the die but integrating CPU and GPU cores into what they call a Heterogeneous System Architecture (HSA) where they live in the same address space and you can alternate between CPU and GPU processing with extremely little overhead. It's a huge change in the way you think of computing. The downside is that nothing changes without software support, your regular CPU or GPU-based code will take no advantage of it and in practice AMD doesn't have the weight to

Qualcomm manufactures ARM chips, like a dozen other companies; there is nothing special about them.

This is explicitly false. Qualcomm designed their own cores that implement the ARM instruction set. They did not license the Cortex A-x designs and glue them together (like every other ARM SoC vendor, including Samsung.) That also ignores the fact that they are the only ones making usable LTE basebands right now. Qualcomm right now is so dominant that if anything, they're the Intel of the mobile world.

Well, Qualcomm designs its own chips. They don't just license a CPU core. In that regard they do something similar to AMD, which licenses the x86 architecture from Intel but designs its own chips. It used to be that AMD had its own manufacturing capabilities, but this is no longer true. We can thank Hector Ruiz for that.

Qualcomm started life in 1985 as a maker of cellular communications semiconductors, and it hasn't strayed far from that formula. It's pretty much the go-to company for CDMA chips and is now taking a lead in 4G LTE as well.

My impression is that what ARM licenses out is the high-level architecture - instruction set, external hardware interface, that kind of thing. But not low-level hardware design that implements all that stuff, so every vendor really does their own thing there. In which case it wouldn't be all that different from AMD, since they also license a lot of the architecture from Intel, and then implement it differently.

My impression is that what ARM licenses out is the high-level architecture - instruction set, external hardware interface, that kind of thing. But not low-level hardware design that implements all that stuff, so every vendor really does their own thing there.

My impression is that ARM licenses both - you can design your own ARM processor core and license only, for example, the instruction set, or you can license one of ARM's core designs to, for example, include in a system-on-a-chip along with your own designs or other licensed designs.

I don't know for certain, but from the reference to "ARM Cortex A5" in the description of the Snapdragon S4 Play on Qualcomm's Snapdragon page [qualcomm.com], I infer that they licensed a core from ARM.

The other Snapdragons have a "Krait" processor, and, from this Qualcomm press release [qualcomm.com], which speaks of another Snapdragon "[featuring] a 1.5GHz quad-core CPU—based on Qualcomm’s Krait micro-architecture", I infer that Krait may be a Qualcomm design rather than an ARM design. Qualcomm's Snapdragon S4 white paper [qualcomm.com]

AMD management made some bad decisions, then got rid of all the people who argued against those decisions. Now they are going to cut costs by firing the engineers who could develop new products. It is now inevitable: AMD is doomed.

"Unless the entire board and their puppets are removed in the next week or two, the little chance AMD has now will vanish. There is no up side here."

"AMD senior management, or (mis)management, as we are now calling them, have delayed the roadmap past the critical point. Project Win was survivable, barely. The churn of technical talent made things worse, far worse, and put the company at the breaking point. Layoffs sapped confidence, and senior management was negligent in not messaging a damn thing to those who mattered internally and externally. The cuts that will follow ensure that the plans in place are not achievable, and SemiAccurate can not see AMD surviving at this point."

Charlie is an armchair CEO of AMD, and his analysis is about as accurate as the name of his website suggests. He has a handful of insiders at mid-level positions, and his views of what's going on in the company are heavily skewed by those people's opinions.

He is an analyst; his job is to write analyses. He has been rather harsh on AMD, but then he has been harsh on Intel and harsh on nVidia also.

I didn't disagree that that's his job; I'm just saying that he does it poorly. He has a sprinkle of insider information and paints a very inaccurate picture around it. He probably heard that AMD is planning on licensing out its graphics cores and misunderstood it as outsourcing. Or he heard that the Markham location will be hit hard and thinks the graphics cores are done for.

The article is wildly inaccurate. Worldwide, all GS2 phones have Exynos processors from Samsung. GS3 phones all have Exynos except for the LTE-capable variants made for the U.S., and those have Krait-series Qualcomm processors, not Snapdragon. The Note 2 is launching with the new LTE-capable Exynos everywhere, further cutting Qualcomm out of the largest Android manufacturer. If the author can't even get the details right, why would I trust their conclusion?

One LTE variant as well, I believe, which accounted for the tiniest fraction of GS2 deployments. The Exynos incompatibility with WiFi was already mentioned, and that was essentially why Verizon held off on the GS2 but made a deal for early Gnexus exclusivity.

Contrary to multiple postings in this thread, Qualcomm designs its own ARM-compatible CPUs (they call the latest version Krait) via an architecture license from ARM. That makes Qualcomm somewhat like the old AMD, which designed its own x86-compatible CPUs via an architecture license from Intel. The new AMD, however, licenses its x86-64 architecture to Intel, which designs its own CPUs (arguably better than AMD's).

On the other hand, Qualcomm acts a lot like Intel in the cell-phone space. They use their patents on CDMA and other wireless communications, plus their first-generation 4G LTE modem/radio, to bully cell-phone manufacturers into using Qualcomm SoC chipsets, very similar to the way Intel uses its CPUs to bully computer manufacturers into using Intel chipsets. They have been known to threaten bundling, bulk pricing, and even limited-availability tricks on other low-end, high-volume phone product lines to convince cell-phone designers to use their chipsets. Thus you see even Samsung forced to use Qualcomm SoC chipsets, even though it makes its own Exynos SoC. In this sense they are definitely not like the new AMD.

To be honest, I'm a lot more worried about the prospect of losing what used to be ATI. ARM is bringing healthy competition to the processor market and Intel is forced to dramatically reorient its business if it wants to keep its edge.

But who's really competent in the GPU market? PowerVR? Give me a break. It's a duel between AMD and Nvidia, and if AMD disappears, Nvidia will jack up their prices even more than they used to. They've got extensive contact with developers and industries reliant on graphics hardware.

Sure, and when Qualcomm and Apple get hold of the 64-bit ARM RTL and hand-tune it (as they have already done with v7 cores), I don't think Intel will have much leeway to gouge, since low-cost alternatives will be available.

ARM played this really well: start from embedded and then move up to workstation silicon, versus MIPS, who did the opposite.

The Galaxy S3 has a Snapdragon only in the US and Japanese markets; only one of the S2 flavors has the Snapdragon. The Galaxy Note comes in two flavors, one with the Snapdragon and one without.

If you're going to post an article hemming and hawing about the state of affairs in the mobile phone chipset market, at least get the chipset right when speaking about the most popular phone in the world right now (the S3), else you'll look like a complete idiot.

No, it's because Qualcomm owns the LTE market. In order to sell a phone with LTE, you have to buy a baseband from Qualcomm, since they make the only capable LTE chips on the market. Qualcomm (i.e., its foundries) has been capacity-constrained for at least a year now, so they can insist you buy their entire SoC with integrated LTE baseband if you want an LTE chip. (That's ignoring the fact that you usually get lower power consumption if your baseband and SoC are on the same die.)