AMD Teams Up with ARM to Transform the Datacenter Industry

Advanced Micro Devices and ARM Holdings on Monday announced an initiative that promises to change the datacenter industry. Under the terms of the agreement, AMD will offer datacenter-class microprocessors based on both the ARM and x86 architectures. The first AMD Opteron chips based on the ARM architecture and the AMD/SeaMicro Freedom fabric are projected to emerge in 2014.

"AMD led the data center transition to mainstream 64-bit computing with AMD64, and with our ambidextrous strategy we will again lead the next major industry inflection point by driving the widespread adoption of energy-efficient 64-bit server processors based on both the x86 and ARM architectures. Through our collaboration with ARM, we are building on AMD's rich IP portfolio, including our deep 64-bit processor knowledge and industry-leading AMD SeaMicro Freedom supercompute fabric, to offer the most flexible and complete processing solutions for the modern data center," said Rory Read, president and chief executive officer at AMD.

AMD Set to Design 64-Bit ARM Server Processors, Will Not Drop x86

AMD's Opteron microprocessors based on the x86 and ARM architectures will integrate the high-performance, production-proven Freedom fabric originally designed at SeaMicro. The Freedom fabric can connect thousands of processor cores, memory, storage and input/output traffic at speeds of up to 1.28 Tb/s (160 GB/s). SeaMicro's fabric supports multiple processor instruction sets, which makes it compatible with both AMD x86 and ARM technologies.
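The two bandwidth figures quoted above are consistent with each other, as a quick unit conversion shows (a minimal sketch; the variable names are mine, the numbers are the ones quoted in the announcement):

```python
# 1.28 Tb/s (terabits per second) is 1280 Gb/s (gigabits per second).
gigabits_per_second = 1280

# Dividing by 8 bits per byte gives the figure in gigabytes per second.
gigabytes_per_second = gigabits_per_second / 8

print(gigabytes_per_second)  # 160.0, matching the quoted 160 GB/s
```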

Integrating the fabric into central processing units, and eventually into enterprise-class accelerated processing units, will allow AMD to address different server markets with largely unified offerings that span different levels of performance and power consumption. AMD does not appear to plan to put x86 and ARM cores into the same server chips, but it will develop common building blocks to be used across both x86 and ARM system-on-chips/microprocessors as well as various accelerated processing units. Essentially, AMD plans to integrate its leading-edge stream processing capabilities, advanced Freedom fabric and other technologies into ARM-based server chips in order to quickly become a leading player in a market that is poised for rapid growth.

AMD stresses that the x86 architecture is not going to disappear from servers or PCs in the coming years, so the chip designer will continue to develop x86 Opterons in addition to ARM-based Opteron products. AMD also does not rule out creating a server chip based on its Jaguar ultra-low-power x86 microarchitecture. It is possible that in the future AMD's Opteron lineup will include processors based on three different architectures: high-performance x86 (Steamroller, Excavator), ultra-low-power x86 (Jaguar) and high-performance ARM (ARMv8 and its successors).

"Over the past decade the computer industry has coalesced around two high-volume processor architectures – x86 for personal computers and servers, and ARM for mobile devices. Over the next decade, the purveyors of these established architectures will each seek to extend their presence into market segments dominated by the other. The path on which AMD has now embarked will allow it to offer products based on both x86 and ARM architectures, a capability no other semiconductor manufacturer can likely match," observed Nathan Brookwood, research fellow at Insight 64.

Offering different products with different architectural peculiarities for different markets is part of AMD's ambidextrous strategy. It remains to be seen how successful AMD will be with such different offerings, and whether internal competition between ARM and x86 [and potential cannibalization of the latter by the former] will do more harm than good for AMD.

Many large server customers find that they do not necessarily use all the performance their servers offer. In some cases, the cost of datacenter equipment and hardware equals the cost of the electricity it consumes over its lifetime, so for the majority of datacenter owners, cutting power consumption is crucial. Although server software in general is not yet compatible with the 64-bit ARMv8 architecture, many big server customers, such as Facebook, are eager to port their applications onto non-standard platforms themselves to save on energy costs.

AMD claims that it is working with several hardware OEMs and numerous software makers to ensure that the right ecosystem exists for x86 and ARM server central processing units. AMD and its partners believe that software developers will be quick to port their programs to the 64-bit ARMv8 architecture and that the slow transition to x86-64 will not repeat itself.

Rory Read believes that ARM-based servers will capture a double-digit share of the server market within three to five years, an optimism shared by numerous high-ranking executives around the industry. Actual penetration of ARM into the commercial server segment will depend on the availability of compatible software and hardware. But software makers will only port their applications if there is a market for them, which means actual deployments. So, at the end of the day, the success of ARM and AMD in servers depends on software developers again…

Discussion

Earth shattering. LoL. This does not mean anything for us consumers, and there is a question of whether AMD and ARM will manage to steal market share from Intel and bring both of them enough profit to justify this move.

That goes to show just how small your level of thinking is. "Doesn't mean anything for consumers," as if. You do realize that everything that happens in the big business world, all the technology employed, researched, and produced for it, eventually works its way down to consumer-grade products, right?

You just assume things, and every single post of yours is to disagree with me. Your level of thinking is completely off. This team-up and these designs will be strictly for datacenters, so I fail to see what this has to do with consumer products. This has nothing to do with consumers, and AMD has clearly stated it is OFF the traditional PC market. I think normal consumers don't need server-grade products.
Bulldozer's design and architecture were server oriented, and unfortunately it failed at both. If AMD can't design and deliver on its own to compete in the server market, that just shows how weak AMD is.

Edit: Oh, I see the other idiot friend of yours jumped in to comment as well. LOL

And where's your second troll head? It hasn't popped up yet today. I was expecting some of the ol' troll tag teaming with this news. Now go find a bridge to hide under, perhaps a sandy or ivy one. I will trade barbs with you any day of the week. But don't forget to try to make intelligent comments on the articles.

I prefer the bridge I am under to be solid, so I can be sure it will not collapse. That is why my Ivy is arriving tomorrow. I will keep you updated and will post some benchmark results for you to see. I will be overclocking this baby to its limits. Let's see if Vishera will be able to keep up.

Even a high school kid can solder, get real. Every computer repair kit has a good pair of soldering tools. Also, that's a bad idea, for various reasons. You have nothing to gain from that, but a lot to potentially lose.

A lot of us bought AMD's CPUs when they were better, during the A64/X2 era, but we don't sit here and trash-talk AMD's inability to compete with Intel, knowing the company is 75x smaller. AvON, if someone gave you $10 million, you still couldn't make me better ice cream than Haagen-Dazs.

No one cares what you'll do with your delidded IVB since in 7 months a $225 Haswell will crush it.

Does anyone else see the irony in this statement? It says a lot about your character, Avon, if you think you are cool buying two Intel CPUs back to back even though that is completely unnecessary, overkill, and a waste of money. Intel brainwashing at its finest lol

It's not hard to see irony and contradiction all over Avon's and 123's posts. They type so many comments like this that, psychologically, they must have conditioned themselves to believe their own words.

Actually, 123 is correct that Ivy Bridge beats Vishera/Piledriver. That graph only shows overall performance across all the tasks TechReport tested, but not everybody does all those tasks on a daily basis; only part of the test reflects what everybody does daily. Click on the gaming results and things get into perspective: taking the gaming graph as a measure of performance vs. price, the i5-3570K is overall the better value.

I have an i3-3225, and while it may score low compared to the A10-5800K overall, the gaming graph shows a completely different story. I do not do any gaming, but in gaming the i3-3225 has performance equal to an FX-8350, and the i3-3225 is cheaper.

Comparing the FX-8350 to the i3-3225 on power consumption, the i3 wins. Even the i5-3570K has lower power consumption than the FX-8350.

Saying that the FX-8350 is best at multithreading will be the AMD light bulb of the day. The amount of programs that are multi-threaded are at low numbers, so multi-threading does not count.

"The amount of programs that are multi-threaded are at low numbers, so multi-threading does not count."

I'm sorry? There is tons of multithreaded software in existence. In fact, it's rare to find a program that isn't multithreaded. Even modern games are getting well threaded.

There is another factor too, a factor benchmarks can't measure: the responsiveness of the system. I've used i7 systems and FX systems. FX systems feel smoother, a lot smoother, with applications and multitasking. Look, tecknurd, the people who need the power and buy products like the FX-8350 are already power users who do a lot of multitasking. These aren't people from the dawn of GUI OSes who still use one program at a time.

Apparently you completely missed the point... again... I think it's time you got off this site and did some research. Let's start from the top, shall we?

Where did I state that consumers are buying server-grade products? All technology starts in the server world, all the way back to ENIAC, and even the Internet. Going back farther, people didn't see the need for telephones, as they were only for big important businesses and government facilities, but eventually that tech made its way down to the consumer world. The same thing goes for electricity, where people said they were fine with their kerosene lamps, yet that too made its way down to the consumer world. The same is true for all the technology in computers. RAM, motherboards, protocols, standards: everything begins in the server world. Server products are server products because of the premium you are paying for higher-end technology. After the technology has had its use in the server world and the servers are moving on to bigger and better things, you find server technology trickling down to consumer products. Let me guess, you are using a multicore processor; where do you think the first multicore x86 processor came from? It was an AMD Opteron server processor.

Well, I will have to say this again: AMD is OFF the traditional PC market. Whatever you try to pass off, the Bulldozer-based AMD Opteron and Vishera suck, and they suck even more for desktop computers. And they will never make it back to high-end desktop products. And that is a fact.

Can you please remind us what this conversation was about in the first place? Instead of ignoring the point of the discussion and bashing both me and AMD, can you get back to it? Oh, that's right, you don't read comments; you just hit the thumbs-down button on everyone and comment on how much you hate AMD.

You have this pattern of derailing the topic as soon as you have lost the argument.

All his responses will lead to the same conclusion: AMD sucks, they failed, blah blah blah.

Then he goes out and buys IVB, what a joke. Seven months before Haswell launches, he wastes $ on a SB refresh that needs delidding to even make it worthwhile over Sandy. He doesn't even sound like an intelligent Intel consumer. Who would be stupid enough to buy into Socket 1155 at this point, when it's near EOL as early as June 2013? He can't even do simple things like timing his Intel CPU upgrades properly, and he criticizes AMD's strategy.

Yes, I need a new computer, because my 1090T is not getting the job done for me anymore.
So my only option is Ivy, which is of course miles ahead of Phenom II and Vishera.
Besides, I only need a motherboard and the CPU; I have all the rest that I need. I am going for the Z77A-GD65 board.

You never answered me. You said this development has no effect on the end user, I explained that yes it does and how, and you did not respond (you lose by default, then). Face it, the events here are big news for everyone, and the fruits of this will make their way down to consumers in the near future whether you like it or not.

I'm sorry you are upgrading a perfectly good processor to something that isn't really an upgrade, even for gaming, more or less a sidegrade if anything. Meanwhile, I have an army of ten FX-8120s in a cluster configuration that can eat your little 'Ivy' alive in performance per watt and sheer computational power. Unlike you, if I buy hardware, I need a lot of power: not one computer, not two, but as many as money can buy.

More than your brain could grasp. A lot of batch jobs involving x264, neroaacenc and ffmpeg, plus various types of servers: web hosting, game hosting, Samba and FTP file servers, email, or whatever else I need at the moment. Idle cycles are spent on distributed computing projects to help medical researchers. As I've said earlier, my home network is more advanced than a typical corporate network. The power consumption really isn't that bad; it's actually quite low. Desktop hardware is good enough to be used as server hardware these days, especially FX processors.

Compared to Sandy Bridge, Ivy Bridge has about comparable overclocking headroom. That is why it is a sidegrade.

In any case, the frequency potential of the new Ivy Bridge processors turned out to be below our expectations. We didn't manage to overclock them even to the same heights as the previous-generation Sandy Bridge. So, we can state that the overclocking potential of the newcomers has become worse, which may have been caused by the reduction of the geometric die size of Ivy Bridge. Its overall size is 25% smaller than the Sandy Bridge die, and the computational cores have become only half the size of the Sandy Bridge cores. However, the contemporary approach to processor die cooling doesn't allow increasing the heat flux density in equal proportion, which causes local overheating of some parts of the processor cores during overclocking. High operational CPU core temperatures indirectly confirm that this problem indeed exists, as the processor cooler remained practically cold in this case.

@mmstick
"I'm sorry you are upgrading a perfectly good processor to something that isn't really an upgrade, even for gaming, more or less a sidegrade if anything."

I hope you are joking with these claims you are making. Those statements you just made were way off. In fact, I can see with my own eyes that in some games this CPU is holding me back and lagging in some cases. Instead of wasting money on an expensive GPU that will cost me 450 dollars or more (and hardware is actually much more expensive for me since I don't live in the United States), I will get an FPS boost from Ivy Bridge of somewhere around 20 to 30 FPS and also have a much better CPU. That is making your upgrades wisely, not just some one-sided fanboy ideas like yours. You are clearly brainwashed, and AMD is like a religion for you.

ROFL, 20-30 FPS? From what, 150 FPS to 180 FPS? Get real, your monitor doesn't even refresh fast enough for that kind of framerate. I'm sorry, Avon, but I'm an avid gamer myself. I bought a $500 GPU to go along with my FX-8120, and I own a Steam list of over 400 games. There isn't a single game I own where the framerate falls below the refresh rate of my 60Hz 1920x1200 monitor, and that includes the most demanding game on this planet, Shogun 2.

You speak of making upgrades wisely, yet you are replacing a perfectly good CPU with something that isn't that much better. You want to talk about fanboy ideas and religions? Take a look in the mirror, Intel fanboy. Intel isn't like a religion to you, it IS your religion.

You are a moron; I can't do 150 FPS, you idiot. I can do 70 to 80 FPS max, and I have an HD 6950, but in some very demanding games it won't go that high. Unless you are claiming idiotic things again, like doing 150 FPS in very demanding games, which would not surprise me based on your silly claims so far. Don't tell me that you play games at silly low resolutions and not at extreme settings.
It's true, in fact, that AMD CPUs suck so much at gaming that it's unbearable to stand this type of performance from silly AMD CPUs. In fact, I don't understand how you AMD fanboys deal with this kind of crap. I bet my 1090T kicks the crap out of your Bulldozer in gaming.

Get real. My system used to be an overclocked Phenom II X6 1100T, then I upgraded to this FX-8120; there was a decent improvement in framerate, but more importantly better responsiveness. Your problem is that you have a 6950 and not a 7950. It has nothing to do with your CPU, and improving your CPU isn't going to magically improve your framerates beyond what you already get.

Avon, you are way too contradictory: first you ignore my comments, then you say I claim idiotic things, and then you go off on the fanboy wagon. ROFL, didn't you bother to read my comment? I game on a 28-inch 1920x1200 monitor. I may not get 150 FPS in games like Shogun 2, but having an Intel CPU wouldn't magically get me there, because graphics cards aren't strong enough. No, the 20-30 FPS improvement you are stating is ONLY visible in a benchmark when you run a game at a really low resolution and are already getting high framerates like 150. The joke's on you, Avon. In fact, even in those 'CPU tests', where games are run at low resolution to test how effectively a CPU can utilize a GPU in these low-demand environments, the difference between AMD and Intel is still so small, perhaps 5%, that any overclock overcomes it, making your FPS point moot.

Name one game that I cannot get at least 60 FPS in with my 7950 and FX-8120 @ 4GHz, I dare you. Shogun 2 is the most demanding game on the market, and I have no problems running it perfectly smoothly. BF3 is so well threaded that even using Intel has a 0% advantage. Please enlighten me about these nonexistent games of yours that I cannot run well; I'd like to play them.

You are a complete ass and an idiot; I never said that I am getting 150 FPS. It's you who pulled those numbers out of your arse. LOL
You are ashamed to state your framerates with your 7950 because you know the CPU sucks. LOL
I am probably getting the same framerates as you with a lower GPU, the 6950. LoL That 8120 is worth nothing at all. LoL
I am curious to know your framerate in F1 2012 all maxed out on Ultra; my FPS does not go below 70. What is yours?

Really now, out of all games you pick the one with the least need for the CPU. F1 2012 is not a demanding game at all; my FPS does not go below 100 at 1920x1200 with 8xMSAA on ultra. Even if I were to get more than 100 FPS, I wouldn't be able to tell the difference, because my refresh rate is only 60Hz.

You are wrong; on my 1090T, when I checked, all six cores were utilized very well, at about 50 to 64% on all cores.
Edit: Why have you downvoted me, just because I am correct?
It's not my fault if your Bulldozer doesn't know what to do with its cores. LOL

AvONbaCK, it seems everyone but you who followed AMD's public strategy knew this announcement had nothing to do with desktop/server x86 CPUs, but with a new direction for AMD. AMD said nearly a year ago that it is no longer interested in competing with Intel in the high-end CPU race. Wake up, man, wake up! It has been publicly known information.

You are seriously lost. AMD is licensing ARM to be able to enter the market much more quickly than if they had chosen to make their own efficient CPU design. Frankly, Intel can't make a more efficient server CPU than ARM, so it's doubtful AMD could have either, given the urgency of being first in this micro-server space.

It looks like you and your other troll friends are the only delusional consumers still thinking AMD will produce an Intel-beating consumer CPU. You might as well ask Honda to make a Bugatti Veyron-beating supercar with a fraction of the resources VW Group has. Stop being delusional.

What do you mean, "looks like"?

Have you and AvONbaCK been living under a rock or something? AMD announced a long time ago that it is done in the high-end CPU space. They will only make gradual improvements, about 10-15% per year, but that's it. No more chasing to beat Intel in high-end x86 CPUs.

Why do you think AMD stepped into the microserver market in March of this year with the acquisition of Silicon Valley startup SeaMicro for $334 million? Their strategy change was evident for a long time to anyone who actually followed this company closely. But it looks like you and AvON are still surprised by this news.

I told him already:
- AMD doesn't have the $ to design a faster CPU than Intel on paper
- AMD doesn't have the one-node manufacturing advantage it needs to execute the design
- Therefore, outside of multi-threaded apps, where AMD competes by virtue of offering more cores, Intel will continue to hold the edge for a long time.

If you are willing to spend more than $200 on a gaming CPU, you go Intel, end of story. But that market of PC gamers is small. Even when AMD had a good gaming CPU during the XP and A64/X2 eras, it barely made a dent. Most consumers will buy Intel CPUs on brand name and marketing alone, even if AMD were to design a better CPU (not that it's happening).

Except AMD has the best graphics card on the market, in gaming, in the professional workstation world, and in science. One 7950 of mine completes well over 3,000 Help Conquer Cancer work units in a day, while Mr. GTX 680 is barely scraping 400 work units per day. Everyone at WCG is swapping their NVIDIA cards for AMD cards, because that is where the power is.

@BestJinjo
I am afraid it is you who needs to wake up.
If AMD is no longer interested in high-end CPUs, that means they can no longer compete in the server market either. That is why they made this deal with ARM: because Rory f**ked up the company so badly and lost so much talent that it can no longer deliver good products, and their only option is to team up. Plain and simple. Something you are having difficulty understanding about what is really going on.

You seem not to understand that there is a market for power-efficient servers that don't require the high processing power of 10-core+ x86 CPUs. For those uses, no one in the world has a more efficient CPU than what ARM offers. You realize no company in the world has been able to dethrone ARM in the smartphone/tablet CPU space? Why would you think AMD can do it when even Intel can't?

There are already companies moving in this direction: Austin, Texas-based start-up Calxeda is also focusing on ARM-based processors for data centers.

You seem to be blaming RR for where AMD is today without realizing that AMD's problems started in 2006 when they bought ATI. They ran out of $ to design high-end x86 CPUs for desktops, laptops or servers. It's amazing it took six years before they ended up where they are today. It's a miracle AMD survived this long against Intel. It is you who is delusional, since you continue to want AMD to do something that's financially impossible given its resources.

You seriously have no idea what you are talking about. You are asking a micro-brewery in Wisconsin to create a better-selling beer to compete with Stella Artois or Heineken. You are asking Mazda to make a 911 competitor. Are you joking? AMD never had a chance against Intel, and it was only a matter of time before they gave up. We knew it was coming sooner or later.

AMD market cap = $1.46B
Intel market cap = $109.82B

You and others here seem to be in denial, expecting a company 75x smaller to compete with Intel in the high-end x86 CPU space. AMD doesn't even have its own fabs. Get real.
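[Editor's note: the "75x smaller" figure repeated in this thread does follow from the two market caps quoted above. A quick check, using the numbers as quoted (in billions of dollars):]

```python
# Market capitalizations as quoted in the thread, in billions of USD.
amd_market_cap = 1.46
intel_market_cap = 109.82

# Ratio backing the "75x smaller" claim.
ratio = intel_market_cap / amd_market_cap
print(round(ratio))  # 75
```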

The only reason AMD even had a glimpse of brilliance during the A64/X2 era was mostly because Intel made a mistake with Netburst. If Intel had brought Banias to market instead, AMD would have been behind in every CPU generation throughout its existence.

I am pretty sure the highest-end P3s beat out the K7s; those Tualatin P3s were pretty fast. Also, Pentium 4 is Netburst gone wrong, which is what I already mentioned as the only real time Intel royally messed up. Other than A64/X2, AMD was always behind and competed on price/performance and overclocking. CPUs like the XP 2500+ Barton were awesome for the $ and for gaming, but overall they still lost to the best Pentiums of that era. So even with Netburst, AMD was barely ahead until they launched A64, and that was mostly Intel's flop with P4. With A64/X2, AMD still had just 25% CPU market share despite having a superior CPU. In reality, Intel was never even behind in architecture, since it delayed C2D and massaged Banias. Had Intel launched Banias to compete with A64 instead of Pentium 4, AMD would have been behind as well.

My main point is that now AMD has no $ at all to compete with Intel in the high-end CPU space, and the people who keep saying AMD is a failure are id**ts, since it's like asking an auto company 75x smaller to compete with Toyota or GM or Ford. Delusional!

You know why VW Group loses $4 million on each Bugatti Veyron they sell? Because they can. Small car companies cannot afford to make the best supercars, just like small PC companies cannot afford to make the best CPUs. Why people still expect AMD to make a faster CPU than Intel is mind-blowing. The funny thing is that they offer no alternative strategies, but continue to live in their dream world where AMD should keep spending millions trying to beat Intel when it has failed to do so for six years in a row.

@BestJinjo
That is due to their own doing and wrong decisions, and most importantly bad management and the board that pulled the plug. What did they gain? "Failure." That is the simple, short story.

It should be noted that AMD's market cap is so small because they are failing to make money, not because the company itself is somehow tiny. When they were doing well in 2005, their market cap was close to $30B, while Intel's was at $90B.

That's right. Apple, for example, has a disproportionately large market cap compared to company size and current profits, so market cap also represents punters' bets on future market share, profits and company growth. AMD's smaller cap represents punters' bet that the company's profits will nosedive over the coming quarters through loss of market share, and also punters' loss of faith in its plan for future growth, which makes it difficult for AMD to issue shares (or the financial like) at a higher price to raise capital. That is why Read has to come out swinging with AMD's new plans, to shore up investors before it becomes a negative feedback loop. The "ambidextrous" plan he proposes today is, in other words, not to put all the AMD eggs in one basket when there are Intel piggies about. Creating more opportunities for business profit out from under the shadow of Intel will attract more investors.

@AvONbaCK - Like there's nothing certain in life? Can't fault your insights there. Btw, this does mean something to consumers, because micro servers are designed to run web services. And who are web services for? That's right. A quick shout-out to all the Yanks on the east coast preparing for Sandy. Good luck. Our thoughts are with you.

It's just how technology has worked in this world for thousands of years. The highest-grade technology, fresh out of research and development, goes to the highest echelons of power; in this case, that is server-grade use. Consumer hardware is older technology that has matured to the point where it is suitable for mass production and use in the consumer market.

All the developments here are a sort of preview of what we will see in consumer products in the future.

AvONbaCK, it seems you really don't understand x86 CPU business, nor the current trends in the marketplace outside of x86 CPUs.

AMD was the last company on earth that could produce an x86 processor competitive with Intel's. But even AMD has conceded that it is impossible to make a superior x86 processor with the fraction of the engineering and financial resources they have. As I told you in another thread, you also don't seem to get that Intel will have a full-node lithography advantage over AMD for at least 5-10 years.

Therefore, instead of wasting hundreds of millions of dollars it doesn't have to satisfy your specific CPU gaming needs, AMD is looking at the big picture of where the world is going and trying to invest in those opportunities. It may or may not work, but trying to compete with Intel in the high-end x86 CPU/server space is futile, and AMD knows it. Since you haven't offered any alternatives to what AMD's management should be doing, but only continue to criticize everything they have been doing, it looks like it's impossible to discuss this topic with you objectively, as you'll hate any direction AMD goes unless they magically create some Haswell-beating CPU.

Idiot, they don't need to compete in the high end, just the low/mid end. They need to focus on sub-10-watt APUs ACROSS THE BOARD. It's too late for AMD to do that NOW; IT WAS THEIR ONLY CHANCE TO SURVIVE. Now they are either going to die or become just another ARM maker, and the market is OVERCROWDED WITH those.

You are calling me an idiot?

I didn't invent market or consumer trends. Tell that to all the consumers who don't care about the traditional PC space anymore, or to the corporations who want more choices in the server market when it comes to efficient CPUs.

Last quarter the iPad sold more units than the entire desktop PC OEM market. Next year the iPad will sell more units than the entire laptop PC market. You can deny it all you want, but the # of PC gamers like us buying GTX 680s in SLI and a Core i7 3770K @ 5.0GHz is tiny; it's practically immaterial to making $ overall. Even for Intel, most of the $ is in servers/workstations, while for Nvidia it is in the professional GPU space. Less than 10% of NV's desktop Kepler revenue comes from their $300+ GPUs, while the entire consumer desktop GPU division for NV is less than 20% of the entire company (in other words, $300+ desktop GPUs are less than 2% of cash flows for NV). That shows you how small the market of PC enthusiasts is overall for these companies.

AMD never said they will stop making x86 products or stop focusing on APUs. The whole point of this deal is to integrate x86 and ARM CPUs into servers and allow them to work together using the SeaMicro fabric. No other company right now knows how to get x86 and ARM processors to work together. The x86 APU strategy is there to stay, but even with Trinity being a competitive all-around CPU vs. Core i3s, it will hardly sell enough to make a difference, since the traditional PC market is shrinking, not growing. The PC market is dying, as consumers no longer care about laptops or desktops. None of my friends want a new laptop or desktop. They are all buying iPads, tablets and smartphones every two years like clockwork, while their desktops/laptops are from the Core 2 Duo era.

The major growth in the next 5 years is in more efficient servers, smartphones and tablets. There's no need to waste $ designing $300-1000 CPUs if you know you can't beat Intel anyway. It's common sense to anyone except you, AvonX, and that guy jmlg or w/e his name is. AMD failed with Phenom I/II and Bulldozer back when they had even more $ than today. No company in the world will beat Intel in the high-end x86 CPU space for at least 5-10 years at this rate. AMD knows it can't do it unless the company grows to 5-10x its current size. And the only way to grow is to move into other markets where there is strong growth.

That's because your friends are idiots who just want their new Apple toy every time Apple comes out with an upgrade. Fool, the desktop is not dead just yet. I never said AMD could beat Intel at x86, but AMD is spreading its wings too far; I can't wait to see them GO DOWN. AMD doesn't need to grow to 5-10x its size; it needs to focus on APUs, AND ONLY APUs, below 10 watts that is. AMD is doomed as a company. Bye bye, Rory.

Well, that's your problem, not AMD's. Why should they focus only on APUs and lose out on all the opportunities and customers who want their other products? Haven't you seen their graphics card division consistently producing better graphics cards than NVIDIA year after year lately? I mean really, even in the workstation and scientific fields AMD graphics cards are heralded as the kings of performance. I get well over 3,000 work units of Help Conquer Cancer completed in a day on a single HD 7950, meanwhile the people with GTX 680s barely manage 400 work units in a day. A guy with dual GTX 590s could only barely manage 1,500 work units in a day.
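
To put those work-unit counts in perspective, here are the implied throughput ratios. These are the commenter's own anecdotal figures, not a controlled benchmark:

```python
# Daily Help Conquer Cancer work-unit counts quoted above (anecdotal).
work_units_per_day = {
    "HD 7950 (single)": 3000,
    "GTX 680 (single)": 400,
    "GTX 590s (dual)": 1500,
}

# Express each card's throughput relative to a single GTX 680.
baseline = work_units_per_day["GTX 680 (single)"]
for card, wu in work_units_per_day.items():
    print(f"{card}: {wu} WU/day ({wu / baseline:.1f}x a single GTX 680)")
```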

mmstick and linuxlowdown, you guys are clueless, to put it bluntly. Both of you are AMD fanbois with zero IQ points. It's fun to tick off AMD fanbois because they can't see the writing ON THE WALL. AMD IS DEAD; it's a thing of the past, and the glory days of AMD are LONG GONE and never coming back.

2.

This is pretty big news. ARM decided to throw its lot in with AMD, and if they can make their products worthwhile to AMD's and ARM's existing (and hopefully new) server customers, the future could look pretty good for both companies. The client space is getting less and less important as tablet technology and general computing mobility mature, so having more datacenter options to choose from, at a more energy-efficient level, is definitely something worth investing in.

4.

$BestJinjo$"(...) while for Nvidia it is in the professional GPU space. Less than 10% of NV's desktop Kepler revenue comes from their $300+ GPUs, while the entire consumer desktop GPU division for NV is less than 20% of the entire company (in other words $300+ desktop GPUs are less than 2% of cash flows for NV)"

I don't want to hijack a thread where you're competing with your friend, but please explain this nonsense.

Where do you think freaking Nvidia's money is coming from, if not the GPU market? Their failed-to-be-adopted Tegra 2/3 chips have sold how many smartphone units? Not to mention those came at a fraction of the price their CPUs are sold at.

Or are you implying that their GPU revenues were even more catastrophic than Tegra's?

"I am pretty sure the highest-end P3s beat out K7s. Those Tallatin P3s were pretty fast. Also, Pentium 4 is Netburst gone wrong, which is what I mentioned already as the only real time when Intel royally messed up."

It's Tualatin, and it's just a 0.13µm P6 derivative; it was nowhere near competitive with the 0.18µm Palomino (K7 with SSE), which was a furnace, by the way, but in those days the power/compute ratio didn't matter. It fell off because the synchronous FSB was a 1980s archetype, and because, even with a lot of resources, P6 lacked IPC until it was redone in the Core 2 incarnations. Banias/Dothan, a.k.a. P6-M, were just the same old power-sane P6 attached to a quad-pumped FSB, which was part of the NetBurst architecture. And that gave them a more appealing, modern look.

And when you feel like explaining the NetBurst architecture (P4) to us, please tell us why and where "it went so wrong", as you said. I explained above one part that was good, and there were many other good parts.

"AMD was the last company on earth who could produce a competitive x86 CPU processor to Intel's. But even AMD has conceded that it is impossible to make a superior x86 processor given the fraction of the engineering and financial resources they have. As I told you in another thread, you also don't seem to get that Intel will have a full node lithography advantage over AMD for at least 5-10 years."

AMD and its silicon manufacturing partners are already a full node shrink behind Intel, but that seems to have made them no worse at keeping their CPUs competitive than back when they were lagging by only a few months.

Why do you think AMD ever wanted a superior x86 CPU? To stay competitive with Intel?
AMD always offered the solution more appropriate to the PC market, on the same level as Intel's, just a year behind. In the days of the Pentium and K5 that wasn't so. Just like nowadays with their 32nm Bulldozer vs. Intel's 32nm Sandy Bridge. Intel this time even felt confident enough to delay its new Ivy Bridge CPU lineup, claiming some bogus issues with current/future chipsets (IIRC).

Why did AMD ever wish to implement the x86-64 (AMD64) instruction set on what was already Intel's home turf? And beyond that, every other instruction set AMD introduced was a spill-out, like their failed attempts to establish 3DNow! or the SSE5 SIMD subsets. The latter was cancelled during implementation in favor of the crappy AVX, with the result that the Bulldozer conceived on the proven 45nm SOI node never saw daylight, which delayed production for two whole years.

Why was AMD so eager to build its Bulldozer modular architecture?

(Btw, it's incomprehensible to read a thread where you two compete in your economic skills, mixing stock market caps with companies' products and so on. AvONbaCK and that $$unnamed$$ are freaking flamers telling us nothing.)

Idiots like "BestJinjo" think that just because AMD cannot compete in the desktop PC market, all of a sudden it's dying off. LoL
Actually, desktop PCs have always been the future, and these stupid consoles have been holding us back.
If they focused on desktop PCs, games and applications in general would be YEARS ahead. It's a shame, really, that idiots like this are ruining the market.

5.

ARM's CEO had to endure the incompetence of his travel staff, who booked him to fly to San Francisco through New York in the middle of the chaos surrounding Hurricane Sandy...

The same will happen with this new "Project Win #2", which will marry ARM with x86. I don't see a single benefit of ARM with x86.
A 14nm node in 2014 looks like the same kind of fiction; it's impossible from any angle.
Rory should have worked in marketing.

"Rory should have worked in marketing"
That is what he was doing at Lenovo, and he thought working at AMD would be the same thing. LoL
He ruined the company completely and brought it down to its knees. Instead of helping, he made things even worse. I don't see how they will get out of this, but one thing is for sure: it will not be easy.

6.

It's a great opportunity for AMD; at least they won't be hedged in by the Intel clause under which AMD's x86 share can't go higher than 35% for two consecutive quarters if they want to keep their x86 license.

The bad thing is that AMD is once again streaming fully toward servers, just like during the AMD64 introduction. I'd be more than pleased to see a few cheap-to-implement ARM cores in current x86-64-based PC solutions, and I don't like the feeling that I'd have to wait until AMD becomes charitable enough to implement them in PC-grade CPUs. As we don't need the Freedom fabric in those, it should be done sooner rather than later. An Excavator core on 22/20nm seems like the perfect candidate to me.

I certainly wouldn't like to see AMD once again consulting Intel on how long to delay their products in someone's favor, like they did with Bulldozer, favoring Intel and their AVX.

Seriously, that is the best decision AMD has made in recent years, so Rory looks like a wise man now!

After the initial deployment in servers, AMD's ARMv8 64-bit SoCs will definitely appear in tablets and smartphones, which will replace the current desktop PCs and laptops very soon.

In 2-3 years from now there will be no desktop PCs and no laptops; there will be only tablets with detachable or Bluetooth keyboards, and smartphones you'll be able to connect via wireless interfaces to Ultra HD TVs and monitors.

And 4-5 years from now we will not have to carry heavy laptops to work: we'll arrive at the office with a smartphone, and the wireless keyboard, mouse and monitor will connect to it automatically, with all data processed and stored on our smartphones.

I think it is apparent that Rory Read got the CEO gig at AMD by proposing the ambidextrous market plan back in 2011 during his job interview process. Pulling it off takes someone with good contacts in the industry. Having previously been CEO at Lenovo, he probably has Warren East on speed dial.

8.

AMD is too small at the moment. Intel is sleeping, waiting for AMD to grow 10x before it makes the move to take over. Intel does not worry about AMD making great improvements to compete in Intel's market segments; the more AMD improves, the better for Intel at the end of AMD's existence, because it only means Intel will not have to wait as long for AMD to grow 10x. Intel, at 75x the capital of AMD, would only need to spend a fraction of it to take over AMD, making Intel the GIANT of CPUs/APUs this time. Anyone else care to take down the giant, or the Hulk? It's unlikely anyone else would dare. Intel is waiting to collect all the processor junk; it's not fun to collect small things, so Intel is waiting for AMD to grow larger to make the challenge more fun (exciting).

9.

That collection was for Intel, but as for me, I was excited about those 8 cores at 4GHz for a little VirtualBox system. But it has been a long wait, since AMD's last-generation 8-core failed miserably. The new generation seems to be a bit better, but I'd need to skip lunch and dinner to get enough $ for the fun.

The only loser we see is the 'Intel fanboy' here. He thinks that just because a processor from brand X is 15% faster, it deserves the extra $200, and so he must type pointless drivel under AMD articles. Meanwhile, everyone who actually uses processors doesn't care about your opinion.

Yup, good luck with tablets, mobile phones, APUs, and the power-hungry AMD CPUs that pull nearly double what Intel's do with slower performance. LoL
In fact, I just received my Ivy today and it's already hitting 4.9GHz stable; I am ready to push this puppy to 5GHz. Seems I pulled off a good chip. Got lucky.

They are simply here to state their opinion that anything involving AMD will automatically 'fail', despite results actually showing otherwise. And from a business perspective, the death of AMD is very much impossible given how well rooted they are.

I'm not a part of your world, certainly. Your world is that of a delusion, a delusion I wish to be no part of. While we are enjoying our Linux and AMD, you can enjoy the sinking ship that is your wallet with your Wintel.