AMD today unveiled the most powerful member of the legendary AMD FX family of CPUs, the world's first commercially available 5 GHz processor, the AMD FX-9590. These 8-core CPUs deliver new levels of gaming and multimedia performance for desktop enthusiasts. AMD FX-9000 Series CPUs will be available initially in PCs through system integrators.

"At E3 this week, AMD demonstrated why it is at the core of gaming," said Bernd Lienhard, corporate vice president and general manager, Client Products Division at AMD. "The new FX 5 GHz processor is an emphatic performance statement to the most demanding gamers seeking ultra-high resolution experiences including AMD Eyefinity technology. This is another proud innovation for AMD in delivering the world's first commercially available 5 GHz processor."

"AMD continues to push the envelope when it comes to desktop capabilities and power performance," said Wallace Santos, CEO and founder of MAINGEAR. "In unveiling the world's first 5 GHz 8-core CPU, AMD continues to lead the way in innovation while providing our customers with a best-in-class experience. We are thrilled to be part of this exciting launch."

The new 5 GHz FX-9590 and 4.7 GHz FX-9370 feature the "Piledriver" architecture, are unlocked for easy overclocking and pave the way for enthusiasts to enjoy higher CPU speeds and related performance gains. Additionally, these processors feature AMD Turbo Core 3.0 technology to dynamically optimize performance across CPU cores and enable maximum computing for the most intensive workloads.

AMD was the first to break the 1 GHz barrier, in May of 2000, and continues to set the standard in technology innovation, including the first Windows-compatible 64-bit PC processor and the first native dual-core and quad-core processors. AMD also introduced the first APU (unifying CPU and Radeon graphics on the same chip) and the first x86 quad-core SoC, continuing forward with HSA architectures and programming models.

The new AMD FX CPUs will be available from system integrators globally beginning this summer. Two models will be available: the 5 GHz FX-9590 and the 4.7 GHz FX-9370.

147 Comments on AMD Unleashes First-Ever 5 GHz Processor

by: drdeathx
You make a good point, but do you realize less than 2% of users are enthusiasts, and all of this means nothing to everyone else? All people will see is 5 GHz, and by the way, the 5 GHz processors will give 20% more performance than the FX-8350. So that said, it is good marketing.

Think about it this way: Intel just launched a line of new CPUs. AMD has to answer this somehow, but releasing a similarly clocked Piledriver core like the FX-8350 is useless, so AMD had no choice but to increase the clocks. This is not just marketing; it was more of a necessity, IMO. Now if AMD were to release Steamroller cores instead, that would have been a different story. I don't see Steamroller cores clocked at 5 GHz; I don't think the initial CPUs will be that fast. If rumours are correct, Steamroller will beat Piledriver clock for clock by about 30% or more.

by: Vinska
What point are You trying to make, MLScrow-san?
You write all these things about Bulldozer being bad, how this new chip would underperform and so on. But what's Your goal? What are You trying to achieve by all this?

(doubleposts are frowned upon, you know?)

Well, seeing how he has six posts in this thread, and only this thread, since he joined, I wouldn't be surprised if he just so happened to be a little more Intel than any one of us. I wouldn't be surprised if his IP traced back to somewhere blue, if you will, considering his arguments seem to match the typical M$ toad.

by: RCoon
I'm all for opinions and discussions, but only when the user knows what they're talking about and has actually done some proper research. Implying the 8350 is bad I find amusing. I also find it amusing when people compare a £150 chip to a £270 chip, and then boast how the latter outperforms. Most of the people here who are capable of discerning benchmarks properly will know they are an extremely poor way of visualizing a component's actual real-world performance.
Also, I'm not entirely sure where this "people feeling burned" thing is coming from. This is a pre-overclocked chip with a warranty. There is a huge percentage of users who don't overclock, so this chip is an awesome way for them to grab hold of an excellent gaming processor without any worries about its performance.
I don't know if you just bought a 3770K, or if you're really sensitive about benchmark scores (you shouldn't be; it's like having low self-esteem, it just doesn't make sense), but if I were building a new rig for someone, be it a budget or a high-end single-GPU rig (I'll underline that so you can do research about single-GPU PCs and how processors perform; HINT HINT, GO TO ANANDTECH), I would gladly buy my third 8350.
There is nothing you can say that would otherwise change my opinion, and I would kindly request you don't bother quoting or replying to this post. Instead, go and find out what benchmarks actually mean, how they're done, and how the 8350 performs in terms of price/performance in every task. You might find out just why so many people still buy them.

You know, it's strange to me, the discourse regarding dual-GPU on AMD systems, as I have four GPUs in my main rig, and despite the alleged bottleneck I do get reasonable FPS in most games with sensible ultimate settings at 1080p, i.e. morphological AA plus 2x MSAA, nothing crazy; two for rendering, two for PhysX, by the way.
I can accept an Intel might run these better (socket 2011 only, though), but I've never owned an i-something-meaningless anyway, so my 8350 @ 4.9 runs very well, IMHO. I also have a PCIe SSD that's going back in when I get hold of a socket extender. There aren't many Intel setups with 4-5 usable PCIe graphics slots where the mobo and CPU cost under £300, as mine did.

*sigh* If anyone actually wants to have an intelligent discussion about this with me, again, please make intelligent comments and refrain from insults and ridiculousness. I know just about everything there is to know about AMD's chips and Intel's chips and how they compare to one another in just about every single category, including price/performance, in which AMD easily takes the crown, but we weren't discussing that, we were discussing performance alone. One may be priced more than the other, but they both offer 8 cores (physical or virtual) and are, for the general public, the best of each company (excluding the E parts from Intel).

And just for the record, the number of posts that I have in a thread has absolutely no relevance to my personal preference for one company's product over the other. If you read anything I wrote earlier in the thread, you'd realize that I fully admit to being an AMD supporter more than an Intel supporter. I like AMD; I'm just disappointed in Piledriver. It wasn't enough of an upgrade over Zambezi for me to want it. If it was for you, good for you, enjoy what you bought, but again, SR FX will be much better. I also don't think that SR FX will come out starting at 5 GHz, but one can hope. Considering it's being made with high-density libraries, which tend to cut down on clocks, I personally foresee it being released in the 4 GHz range, but only time will tell.

What's the price on this 5GHz booster gonna be? Surely it can't be super $$$ like some posts threw around last month. (Just curious -- I'm working on an A8-5500 mini-itx openSUSE 12.3 build presently.)

by: MLScrow
*sigh* If anyone actually wants to have an intelligent discussion about this with me, again, please make intelligent comments and refrain from insults and ridiculousness.

I mostly see people trying to have an intelligent discussion and hardly any "insults/ridiculousness" at all. *shrug*

by: MLScrow
I know just about everything there is to know about AMD's chips and Intel's chips and how they compare to one another in just about every single category, including price/performance

Meanwhile, every time I asked You about that, I was silently ignored. [yes, more than once. srsly, man, WTH]
If You claim to know that much (and You claim a lot, in nearly every post), why don't You share it? Nobody will care if You flail around claiming to know everything while not showing anything to back it up.

Interesting discussion... quite in-depth for the most part, but interesting all the same. Makes me wonder how many watts my 2500K pulls while sitting under load at 5 GHz. Obviously it's only got half the cores, and it was only £150 retail, but there is food for thought.

by: theoneandonlymrk
You know, it's strange to me, the discourse regarding dual-GPU on AMD systems, as I have four GPUs in my main rig, and despite the alleged bottleneck I do get reasonable FPS in most games with sensible ultimate settings at 1080p, i.e. morphological AA plus 2x MSAA, nothing crazy; two for rendering, two for PhysX, by the way.
I can accept an Intel might run these better (socket 2011 only, though), but I've never owned an i-something-meaningless anyway, so my 8350 @ 4.9 runs very well, IMHO. I also have a PCIe SSD that's going back in when I get hold of a socket extender. There aren't many Intel setups with 4-5 usable PCIe graphics slots where the mobo and CPU cost under £300, as mine did.

Don't get me wrong, the AMD series is excellent, and is fully capable of running dual-GPU setups. I ran two 570s and two 7950s on an 8350 with no complaints other than drivers for the cards and the occasional game that didn't have CrossFire support. The 8350 was and still is my favourite CPU of the last few generations. However, socket 2011 does actually produce more frames in multiple-GPU setups. I wouldn't want to come across as an AMD fanboy, so I have to openly and clearly state that the high-end Intels are far better at bleeding all the performance out of multiple-GPU setups. I just prefer the AMD feature set overall; Intel has PCIe 3.0, AMD seemingly has everything else, including multiple SATA 6 Gb/s ports compared to Intel's mere two on mid-range mobos, and PLX at a cheaper price.

by: RCoon
Don't get me wrong, the AMD series is excellent, and is fully capable of running dual-GPU setups. I ran two 570s and two 7950s on an 8350 with no complaints other than drivers for the cards and the occasional game that didn't have CrossFire support. The 8350 was and still is my favourite CPU of the last few generations. However, socket 2011 does actually produce more frames in multiple-GPU setups. I wouldn't want to come across as an AMD fanboy, so I have to openly and clearly state that the high-end Intels are far better at bleeding all the performance out of multiple-GPU setups. I just prefer the AMD feature set overall; Intel has PCIe 3.0, AMD seemingly has everything else, including multiple SATA 6 Gb/s ports compared to Intel's mere two on mid-range mobos, and PLX at a cheaper price.

Yeah, I'm a reasonable guy, I agree. I think a fair amount of it is due to memory channels; it would be nice to see AMD do similar. Damn phone, sorry.

by: Vinska
I mostly see people trying to have an intelligent discussion and hardly any "insults/ridiculousness" at all. *shrug*

Then you obviously aren't reading closely enough, because from my point of view, it has been exactly the opposite.

by: Vinska
Meanwhile, every time I asked You about that, I was silently ignored. [yes, more than once. srsly, man, WTH]
If You claim to know that much (and You claim a lot, in nearly every post), why don't You share it? Nobody will care if You flail around claiming to know everything while not showing anything to back it up.

First of all, to say that I "flail around" is an insult in itself. These are words on a screen; no "flailing" can take place. Unnecessary. And if I did, perhaps, ignore something you've said, it wasn't on purpose, and it was probably because of snide remarks such as that. I have no problem discussing anything, so what exactly have you asked me that I haven't answered? Please feel free to ask me anything and I will give you my answer as best as possible, and I have no problem admitting when I don't know enough to answer you. I'm not trying to be arrogant here; I'm just trying to explain that I understand what I'm talking about, as others have tried to insinuate that I do not.

Also, keep in mind my word choice, because I'm very careful with what I say, and I didn't say that I know everything, I said "just about", which leaves room for a lot of information. I am not a chip designer (though I'd like to be one day), so I can't quite tell you certain in-depth details, but I can provide what information is publicly available and provide reasonable predictions and speculation based on that information. Hardware is probably one of the only passions that I have in my life (sad perhaps, but true) and I make it a point to educate myself and search for any piece of new information (rumor or not) regarding this topic on a daily basis.

by: MightyMission
Interesting discussion... quite in-depth for the most part, but interesting all the same. Makes me wonder how many watts my 2500K pulls while sitting under load at 5 GHz. Obviously it's only got half the cores, and it was only £150 retail, but there is food for thought.

I ran a theoretical calculator, which came up with around 189 W; however, I found a review that Bit-Tech did that actually had their 2500K clocked to 4.9 GHz (not quite 5.0, but close enough), and theirs was pulling 221 W. Keep in mind these are total system power draw numbers, but you can get an idea of how much more power your system is drawing from the overclock alone, as their value for the stock 2500K running at 3.3 GHz was 148 W, so a 73 W increase. By the way, grats on getting a golden chip that'll hit 5.0 GHz. What kind of cooling are you running, if you don't mind me asking?
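For reference, the 73 W delta is just the difference of the two total-system readings quoted above. A trivial sketch (the figures are the cited Bit-Tech numbers, not my own measurements):

```python
# Total-system draw figures quoted above (Bit-Tech i5-2500K review).
stock_w = 148        # 2500K at stock 3.3 GHz, system under load
overclocked_w = 221  # same system with the chip at 4.9 GHz

delta_w = overclocked_w - stock_w
print(f"overclock adds {delta_w} W of system draw")  # prints 73
```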

by: Tatty_One
It would be a good idea if we get back to discussing the CPU as opposed to each other's personalities... thank you!

Thank you.

by: RCoon
Don't get me wrong, the AMD series is excellent, and is fully capable of running dual-GPU setups. I ran two 570s and two 7950s on an 8350 with no complaints other than drivers for the cards and the occasional game that didn't have CrossFire support. The 8350 was and still is my favourite CPU of the last few generations. However, socket 2011 does actually produce more frames in multiple-GPU setups. I wouldn't want to come across as an AMD fanboy, so I have to openly and clearly state that the high-end Intels are far better at bleeding all the performance out of multiple-GPU setups. I just prefer the AMD feature set overall; Intel has PCIe 3.0, AMD seemingly has everything else, including multiple SATA 6 Gb/s ports compared to Intel's mere two on mid-range mobos, and PLX at a cheaper price.

You're not coming off as an unreasonable, brainwashed AMD fanboy, but you admit to being an AMD fan. Me too. I do wish that AMD had included PCIe 3.0 in Vishera, but I suppose I don't really need it yet, since I haven't gone with my triple-monitor setup requiring multiple GPUs. PCIe 2.0 is good enough for a single card at the moment, but damn, will both AMD and NVIDIA hurry up and release their new mid-level cards already!? I don't like spending $500 on a single card. I'd rather spend $200 on two mid-level cards and run SLI/CrossFire, or just find an upper-mid-range card that can meet my needs (60 FPS average with 4x MSAA). I might actually need something twice as good in the near future, as I'm considering getting a 3D setup (I played Arkham Asylum in 3D Vision and it was fantastic; I've never tried AMD's version).

Anyway, back to the thread subject. I'm not saying it's a bad chip overall, just not one that I would buy, nor one that I would recommend for someone looking for a new rig right now. This new Piledriver doesn't add anything new, such as PCIe 3.0; it's simply the same old chip with better binning and a higher clock rate. For someone who is looking to purchase a new rig right now and is intent on AMD, I would advise waiting just a little while longer for a better overall processor.

I don't mean to rain on the parade of anyone who wanted to jump for joy at the news of a 5 GHz part. I honestly believe that's an incredible feat for AMD to have achieved. I mean, a 20-25% increase in clock from just one year of process maturity is pretty sick, and that jump will definitely narrow the gap between AMD and Intel in terms of overall performance, something I think AMD has needed to do for a while now; it looks like they are finally on track. YAY for a company that we all thought might have gone under. They really saved their asses, especially by capturing all the major video game console contracts.

The only thing I'm concerned about with SR is that they are moving from SOI to bulk for their process, and it will be early in terms of maturity. Hopefully the time they have between Kaveri and their next FX line will be enough to mature the process, so that we can perhaps see something similar to this 20% clock-speed increase over this PD 2.0 when SR FX comes out. Fingers crossed.

by: MLScrow
I ran a theoretical calculator, which came up with around 189 W; however, I found a review that Bit-Tech did that actually had their 2500K clocked to 4.9 GHz (not quite 5.0, but close enough), and theirs was pulling 221 W. Keep in mind these are total system power draw numbers, but you can get an idea of how much more power your system is drawing from the overclock alone, as their value for the stock 2500K running at 3.3 GHz was 148 W, so a 73 W increase. By the way, grats on getting a golden chip that'll hit 5.0 GHz. What kind of cooling are you running, if you don't mind me asking?

I wonder how that compares to the 8350? Out-of-the-box power consumption for the 8350 was 189 W for the full system. I wonder how much that shoots up while running at 5 GHz?
My i5 will also hit 5 GHz, using the Intel water cooler (effectively a rebranded Antec 620).

by: erocker
Well... Your posting strategy needs some help too. With a little less caps lock and a little less colorful language, people may actually take you seriously. :)

How were people meant to take that seriously? I was joking... jeez, nobody can take a joke anymore. A pretty bad one, I admit, but it sounded good before this hangover passed, I promise :laugh: :toast:

by: Am*
How were people meant to take that seriously? I was joking... jeez, nobody can take a joke anymore. A pretty bad one, I admit, but it sounded good before this hangover passed, I promise :laugh: :toast:

They always seem to sound good before you post them for some reason. :toast:

by: TheinsanegamerN
I wonder how that compares to the 8350? Out-of-the-box power consumption for the 8350 was 189 W for the full system. I wonder how much that shoots up while running at 5 GHz?
My i5 will also hit 5 GHz, using the Intel water cooler (effectively a rebranded Antec 620).

Actually, by Bit-Tech's numbers (they also ran an 8350 through its paces), it pulled (on their system) 213 W running P95 at stock, and overclocked to 4.8 GHz it was 364 W. That isn't 5 GHz, but you can use it to calculate what 5 GHz would theoretically be on that system, or find someone who actually posted that info somewhere. Link: http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/7
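One crude way to do that extrapolation is to treat system draw as linear in clock between the two Bit-Tech data points. The 4.0 GHz stock figure below is an assumption (the FX-8350's base clock), and real draw rises faster than linearly because voltage climbs with frequency, so treat the result as a lower bound, not a prediction:

```python
# Bit-Tech FX-8350 total-system draw under Prime95 (figures quoted above).
stock = (4.0, 213.0)  # (GHz, W) at stock; 4.0 GHz base clock is assumed
oc = (4.8, 364.0)     # (GHz, W) overclocked

# Linear extrapolation to 5.0 GHz; this ignores the V^2 term in dynamic
# power, so the real figure would likely be higher than this estimate.
slope = (oc[1] - stock[1]) / (oc[0] - stock[0])  # W per GHz over this range
estimate_5ghz = oc[1] + slope * (5.0 - oc[0])
print(f"~{estimate_5ghz:.0f} W at 5.0 GHz")  # ~402 W, lower bound
```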

Forum links aren't working with IE for WP8, something to know for anyone looking into WP8 for their next phone (it could be temporary, but still a problem since its release? C'mon now).

by: Am*
How were people meant to take that seriously? I was joking... jeez, nobody can take a joke anymore. A pretty bad one, I admit, but it sounded good before this hangover passed, I promise :laugh: :toast:

Here's a hint: turn off the computer, or hit it with a hatchet while you're drunk or hungover; it will make you feel better :toast::cool:

I can't understand all of the interest in this chip's power usage. It's going to be more than a comparable Intel; it's likely going to be a bit scary. If that concerns anyone, then I advise they don't consider this CPU. :cool:

I find it interesting that they feel confident enough to release a commercially available chip at these clocks for 24/7 use. Is it reasonable to assume it will be "common" in the not too distant future? Intel seems to be content with their clock speeds and is focusing more on power figures. I think it could get interesting to see where this ends up.

by: Relayer
I can't understand all of the interest in this chip's power usage. It's going to be more than a comparable Intel; it's likely going to be a bit scary. If that concerns anyone, then I advise they don't consider this CPU. :cool:

I find it interesting that they feel confident enough to release a commercially available chip at these clocks for 24/7 use. Is it reasonable to assume it will be "common" in the not too distant future? Intel seems to be content with their clock speeds and is focusing more on power figures. I think it could get interesting to see where this ends up.

I own a Bloomfield, so power consumption clearly isn't my overriding priority!

This TDP is actually quite impressive if you compare it to the first-gen Bulldozer CPUs. My chip under load is pulling in excess of 25 A over the 12 V CPU rail at 4.6 GHz, i.e. over 300 W. Compare that to a max power draw of 220 W at 4.7 GHz on all cores, and the TDP of these chips suddenly seems quite good.
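The 300 W figure above is just Ohm's-law arithmetic on the quoted rail current. A sketch (25 A is the poster's own reading, and this ignores VRM conversion losses):

```python
# CPU power from the quoted 12 V EPS rail current: P = V * I.
rail_volts = 12
rail_amps = 25          # "in excess of 25 A", so this is a floor
cpu_watts = rail_volts * rail_amps
print(cpu_watts)        # 300 (W), before VRM conversion losses
```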

by: xorbe
What's the price on this 5GHz booster gonna be? Surely it can't be super $$$ like some posts threw around last month. (Just curious -- I'm working on an A8-5500 mini-itx openSUSE 12.3 build presently.)

I certainly hope it won't be priced like the FXs of old. In my opinion, they should release these at the same price point as all of their recent flagship parts so far, ~$200 USD (about $30 less for the lower-clocked unit).

by: Mathragh
This TDP is actually quite impressive if you compare it to the first-gen Bulldozer CPUs. My chip under load is pulling in excess of 25 A over the 12 V CPU rail at 4.6 GHz, i.e. over 300 W. Compare that to a max power draw of 220 W at 4.7 GHz on all cores, and the TDP of these chips suddenly seems quite good.

Agreed, "if" the chip actually runs at the advertised TDP, which AMD has lowballed before. But even if it does run at 220 W+, that should be expected. It's still a PD core, after all; it takes disproportionately more power the further you clock beyond 4 GHz. The maturing of a process can only do so much, and with 8 physical cores, even if they share resources, it is reasonable for it to use twice as much power as a 4-core part. Why some are making it out to be such a big deal (beyond motherboard requirements) is somewhat silly to me.
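The "more power the further you clock" point follows from the usual first-order dynamic-power model, P ≈ C·V²·f: frequency enters linearly, but the voltage needed to hold a higher clock enters squared. A toy illustration (the voltages and the 125 W anchor are made-up round numbers for illustration, not measured Vishera values):

```python
# First-order dynamic-power model: P ~ C * V^2 * f, anchored at an
# assumed 4.0 GHz / 1.30 V / 125 W operating point (illustrative only).
def dyn_power(freq_ghz, volts, anchor=(4.0, 1.30, 125.0)):
    f0, v0, p0 = anchor
    return p0 * (freq_ghz / f0) * (volts / v0) ** 2

# Pushing to 5.0 GHz at a guessed 1.50 V: the 25% clock bump plus the
# voltage increase lands around 66% more power, not 25% more.
print(round(dyn_power(5.0, 1.50)))  # ~208 with these made-up inputs
```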