The design will feature an octa-core, unlocked Zambezi processor
dubbed "FX", reviving AMD's old enthusiast CPU branding. Zambezi,
codenamed after a river in Africa, is AMD's upcoming high-performance desktop
CPU, built on a 32 nm SOI process and based on the company's new Bulldozer architecture.

The new platform will also feature a Radeon HD 6xxx series graphics card from AMD and
a motherboard based on an AMD 9-series chipset (socket AM3+).

Leslie Sobon, AMD's vice president of worldwide product marketing, comments,
"AMD’s FX brand will enable an over-the-top experience for PC enthusiasts.
By combining an unlocked, native eight-core processor, the latest in chipset
technology, and AMD’s latest graphics cards, FX customers will enjoy an
unrivalled feature set and amazing control over their PC’s performance."

The obvious competitor of Scorpius will be Intel Corp.'s (INTC) Sandy Bridge, possibly paired with GeForce 5xx series GPUs from NVIDIA Corp. (NVDA).
With eight physical cores, Scorpius will arguably have the edge over
single-socket Intel designs, though, which currently only feature four cores
(eight threads). Intel will bump its core count to six cores in the near
future, but it remains to be seen whether that will be enough.

Performance numbers on Bulldozer are still lacking, so it
remains to be seen exactly how powerful this octa-core gaming rig will be.

Regardless of who comes out on top performance-wise, it's refreshing to see a
reinvigorated AMD challenging both Intel and NVIDIA in the CPU and GPU sectors.
A competitive market should push all three PC hardware makers to quicken the release of powerful new hardware that will delight PC gamers and
enthusiasts -- few as they may be, these days.

Comments


They're built on GF's 32nm process. I don't think that they've announced prices yet, but we'll have to wait and see where the performance puts them relative to Sandy Bridge. I would guess the upper-tier FX parts would target the same users as the 990X or the upcoming Sandy Bridge EX.

Gaming-wise, the 990 is currently beaten by the 2600K. So whatever replaces the 990 is most likely what it's targeted at, along with the current 2600K.

FX chips were historically $1,000 parts. The fact that they're reviving the FX name gives me a considerable amount of hope for their new CPUs. But I don't expect them to be $1,000 parts anymore.

That being said, I'm betting Sandy and Ivy will still be faster than AMD. But realistically, all AMD needs to do is get within 10-15% of Sandy and Ivy, keep the prices reasonable, and they will have an easy win with their more flexible setups.

Wasn't Intel saying that their new 22nm process would give Ivy a 20% speed improvement over Sandy? If I remember that correctly, then AMD has to be at least equal to Sandy to have a prayer against Ivy.

People buying a 990X aren't doing it for gaming, so the whole point is moot; you can't arbitrarily compare it with SB. That doesn't make it irrelevant, though: besides benching crowns, if you need 6C/12T and lots of memory bandwidth, it's a very nice platform.

Actually, Bulldozer will probably be a better competitor to the 990X than to the 2600K. AMD emphasized throughput this generation, not single-thread performance. Likewise, the 990X is more throughput-focused, while the 2600K has better performance per thread.

FX chips haven't been $1,000 parts (even the dual-chip 4x4 crapshoot) since Intel shot out the Core 2 Duo and then held AMD down while kicking them in the nuts for two years.

When it comes to gaming, AMD does okay against the Core i-whatever-the-F-it-is CPUs. Looking at costs, the AMD platform could be $150~300 cheaper, resulting in games getting 80fps vs 90fps with a 2600K. (Yeah, this is in general)... I'd rather pocket the money or put it towards an SSD or a better graphics card; it's about budget and balance.

For general use, AMD's Fusion Llano goes up against the i3/i5 CPUs quite well, no small feat.

These new FX chips MUST be at least equal to SB in performance, if not better, and hit a proper price target. An 8-core CPU is useless if it's still slower than a quad-core SB. And besides gaming, the rendering power of the new CPUs had better be good too.

The new Bulldozer CPUs are supposed to easily be faster than AMD's current top tech... with room for even greater performance. That's all we need.

PS: But obviously, TOP performance isn't everything. The market for low-powered yet capable CPUs that let people check their mail and do some simple gaming is what's selling like hot-cakes... not $1,000 chips.

PS2: AMD's new box art looks HOT. But we need those in our hands today.

$1,000 chips as in that's what they were charging for them. Not that they were worth the $1,000.

When C2 came out, the FX line pretty much vanished. I want to say the FX-74 was the last one? And up until at least the FX-60, AMD was charging $1,000 per CPU, never mind a $200 one that OCed to FX levels easily.

What many have missed is that Bulldozer has a new core design, while Llano still uses the K10.5 design used by the current Phenom 2 processors. That is still decent performance for those who would be using an APU with the built-in GPU.

AMD has really been acting as a platform company for a long time now, where the overall performance of CPU+GPU+chipset is more important than having the best in any one area and having garbage for the rest of the system. Low end Intel systems tend to have a LOT of annoying minor problems that you don't see on comparably priced AMD systems, even if the Intel systems are faster at certain tasks.

quote: Gaming-wise, the 990 is currently beaten by the 2600K. So whatever replaces the 990 is most likely what it's targeted at, along with the current 2600K.

I have a 980X, and I've had it for a while now. The 2600K and even the 2400 will outperform it in certain benchmarks, however it's worth noting that the difference is not substantial. There are also benchmarks where the 980X still outperforms the newer CPU...to say "beaten" isn't really accurate. It's more like "contending with". If you use your computer to do anything other than gaming, the 980X would still be a better choice for the money, and you can OC it to the 4GHz realm reliably to squeeze more life out of it if need be.

quote: That being said, I'm betting Sandy and Ivy will still be faster than AMD. But realistically, all AMD needs to do is get within 10-15% of Sandy and Ivy, keep the prices reasonable, and they will have an easy win with their more flexible setups.

No, I would say that they need to be equal to or better than Intel's offerings to compete. Once you cross a certain price point - the price point where you're really just interested in performance and not 'bang for the buck' - most people are going to go for the platform that offers the best performance.

AMD's prior success was due to them being able to match or exceed Intel's performance while offering a less expensive price point.

The fun thing with the 980X/Intel Extreme Edition CPUs is that you don't need to overclock them. Just set Turbo to 4.4GHz on all cores. This is a very elegant solution, since when people overclock, they use more energy and produce more heat.

With the turbo solution, the extra heat/energy is only used when it is needed.

BTW, how many people are using 100% CPU time with a 4+ GHz CPU? The only Windows programs I have used that maxed out my cores are video encoding/rendering. (And even with video encoding: most programs only use 2-8 threads. Not a single program was faster when I switched from a 4-core/8-thread Intel to a 6-core/12-thread Intel.)
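The diminishing returns described above line up with Amdahl's law: if only part of a workload runs in parallel, extra cores stop helping quickly. A quick back-of-the-envelope sketch (the 80% parallel fraction is an assumed figure for illustration, not a measured one):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the workload that parallelizes, n = core count.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.80  # assumed parallel fraction for a typical encode
for cores in (4, 6, 8):
    print(f"{cores} cores -> {amdahl_speedup(p, cores):.2f}x")
# 4 cores -> 2.50x, 6 cores -> 3.00x, 8 cores -> 3.33x
```

With an 80% parallel workload, going from 4 to 6 cores only improves overall speedup from 2.5x to 3.0x, which is consistent with seeing little benefit from the extra cores.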

quote: The fun thing with the 980X/Intel Extreme Edition CPUs is that you don't need to overclock them. Just set Turbo to 4.4GHz on all cores. This is a very elegant solution, since when people overclock, they use more energy and produce more heat.

That's what I do and it works great.

quote: BTW, how many people are using 100% CPU time with a 4+ GHz CPU? The only Windows programs I have used that maxed out my cores are video encoding/rendering. (And even with video encoding: most programs only use 2-8 threads. Not a single program was faster when I switched from a 4-core/8-thread Intel to a 6-core/12-thread Intel.)

Since most multicore chips are 2 cores / 4 threads (for Intel chips with HT), I'd guess that software developers optimize threaded software for the most widely deployed platform. I know that 3DSMax and other high-end rendering software can and does utilize all available cores. Studios will often customize the code of 3DSMax for their custom setup... but that is a far cry from the typical home user.

Anyway, the main reason I bought a 980X is not only because I want the best performance I can get, but also for longevity. I like having a reliable system and each time I upgrade I do it from a "clean" install, so I have to reinstall all my programs and it's a pain in the ass...so if I can get by upgrading once every 2-3 years all the while having a high-performing system I'll take it.

Also, even if software is only using 4 of the 6 cores, that means you can still have a responsive system while rendering something in the background... without compromising the speed at which it renders.

They posted the prices a ways back. The highest-end octa-core is only going to be this.. I CAN NOT WAIT myself. I am using a 980X and am just about sick of Intel price gouging! SUCH dickheads. NOT to mention how STUPIDLY PRICED THE MOBOS ARE!.. I have always been an Intel guy, BUT times are a-changing!

Here are the first figures made public of the market prices of AMD's upcoming two lines of desktop processors. AMD will approach the desktop PC market with two platforms: the A-Series "Llano" accelerated processing units (APUs) and the FX-series "Zambezi" processors (CPUs). APUs are functionally similar to Intel's Sandy Bridge processors, in having processor cores, a graphics processor, memory controller, and PCI-Express switch packed into a single piece of silicon. AMD is apparently relying on its powerful GPU architecture to make Llano a more wholesome product. Zambezi functionally resembles Intel Westmere/Bloomfield, in having a number of processing cores, a high-bandwidth memory controller, and a large cache packed into a single die, making for a performance part. By mid-June, AMD will launch the FX-Series with a 4-core, a 6-core, and two 8-core parts. The series will be led by the eight-core AMD FX-8130P priced at US $320, trailed by the FX-8130 at US $290; the former is probably an "unlocked" part. Next up is the six-core FX-6110, priced at $240. Lastly there's the quad-core FX-4110, going for $220. You will notice that the price per core isn't as linear as it was in the previous generation.

I understand the need for AMD to bring something new to the table but they need to focus more on architecture power rather than power by numbers.

This whole thing seems to be aimed at gamers, but what game makes use of the cores? Barely any game stresses my 4-core. I read something a while ago, can't remember where (I think HardOCP), that stated most games run on 2 or 3 cores; only a few use all 4 cores.

Any game or application that makes good use of multi-threading will use a LOT of threads. AI...how many enemies are there on the field of battle, or NPCs wandering around? If every one had a different thread, then games could potentially make use of HUNDREDS of cores. The problem is that many games and applications are not coded to really make the best use of multiple cores. First person shooters where it is just arena type stuff with a bunch of human players wouldn't have a use for it, but AI...there's your application.
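The per-NPC idea above can be sketched minimally in Python (the function names are hypothetical; real engines use a fixed worker pool sized to the core count rather than one OS thread per NPC):

```python
from concurrent.futures import ThreadPoolExecutor
import os

def update_npc(npc_id: int) -> int:
    # Placeholder for one NPC's pathfinding/decision step;
    # here it just returns a dummy result.
    return npc_id * 2

npc_ids = range(100)
# A pool sized to the machine's core count fans the AI work out
# across however many cores are available -- 4, 8, or more.
with ThreadPoolExecutor(max_workers=os.cpu_count() or 4) as pool:
    results = list(pool.map(update_npc, npc_ids))
# map() preserves input order, so results[i] is NPC i's result.
```

The design point here is that the same code scales with core count automatically: on an 8-core chip the pool simply runs more NPC updates at once.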

quote: With eight physical cores, Scorpius will arguably have the edge over single-socket Intel designs, though, which currently only feature four cores (eight threads). Intel will bump its core count to six cores in the near future

......So I guess the 6-core, 12-thread 980X Intel CPU that I've been using for well over a year now was just me imagining things, Jason?

Intel also makes 10-core Xeons, obviously for servers, but still single-socket.

In all fairness to him, I think he's thinking of 1155 compared to AM3/AM3+, since the costs for the overall platforms should be comparable. You're right, though; in the server segment they have had single packages with 10-12 cores for a while now.

Actually I was just thinking that the AM3 socket was getting a bit long in the tooth. Believe me, I love that I can keep my motherboard and (hopefully) with a BIOS update, drop in the new chip, but at what point do they need to increase the pinout to increase performance?

I am surprised no one mentioned the Spider platform. AMD officially announced that Barcelona would outperform Intel's glued-together quad-core by 50%, but later, when they found the fastest Barcelona would be way slower than the slowest Intel quad-core, the Q6600, they put Barcelona together with an HD3800 video card and announced the SPIDER platform.

Just get the chips to market. All the delays and no performance leaks make me think that they are having problems getting sufficient performance. I can't imagine AMD not bragging and shooting off their mouth if Bulldozer were going to beat Sandy Bridge, much less Ivy. They even bragged up Phenom I, and we all know how that turned out.

And "octa-core" sounds impressive, but few games can take advantage of more than 4 cores, much less 8. Seems to me like AMD is making up for a lack of good architecture by just slapping on more cores, kind of like Intel did in the P4 days by trying to keep increasing clockspeed.

And I am not an Intel fanboy. I would love for AMD to become competitive again instead of just a low-end provider. I just am getting tired of AMD and their fans talking up such a good game without any hardware or benchmarks to back it up. Just get the hardware out already!!!

I certainly hope this new AMD offering is competitive with Intel. If AMD can pull this off at those prices, it would force Intel to scale back on their price gouging. $320 for the high-end chip? That would be great.

Another reporter apparently jumping on the bandwagon of bashing the PC? "...powerful new hardware that will delight PC gamers and enthusiasts -- few as they may be, these days."

I mean, if that statement is true, I guess all this hype is for naught...evidently, there's barely a market anymore for PC gamers and enthusiasts. Sure, we might be considerably smaller in size than the console peeps... but, hey, we're still alive and going strong.

These days there are quite a few people who own consoles comparatively speaking, but there are FAR more PC's floating around than 10-15 years ago as well.

If there wasn't money to be made in this market, both Intel and AMD wouldn't be spending time developing hardware for it...so do yourself a favor and quit "hastening" its supposed demise...;)

Multimonitor setups are for work, not play; unless you can magically make the bezels disappear from the displays, it will always be a niche market.

Also, just because the game "supports" it does not mean it takes advantage of it. Simply allowing the wider display is pointless.. now, if it took advantage of the game's FEATURES, that would be something to consider if you like a certain game a lot.

With 3 monitors and a good graphics card, you might as well just buy one big monitor instead, without the drawbacks.

I've heard it can provide an advantage in certain games -- particularly online ones. My information comes largely from a friend of mine who was very passionate about EVE Online and had a couple of allies who swore by multimonitor setups, bezels and all...

I'm with you. It's awesome and significantly more enjoyable and immersive. It's also significantly improved my gameplay ability.

Triple-monitor widescreen gaming is NOT the same as a single big screen. With triple-screen gaming you have a much wider field of view. With a large single screen it's like wearing blinders, with a limited field of view.

Multiple monitors are more natural like peripheral vision.

I overlap the monitor bezels to minimize them, but overall there is no missing information behind the bezels; it's not like a car, where the divider between the windshield and side window can cover an area.

You could combine these ideas if you could get a single display with the super-high resolution that a 3- or 6-monitor Eyefinity setup can provide. The real problem is that 1920x1080 is 1920x1080, no matter how large the screen is.

Just because you don't use it doesn't mean other people don't. Can't stand bezels? Use projectors, problem solved. Also, Eyefinity allows 24x the pixels of a single display, so unless you know of some magic uber display, that's not a substitute.

That's probably what he meant to say. The vast majority of displays in use around the world are in the 720p range, at least that's the most commonly USED resolution.

My only experience with multi-monitor gaming was a 3-way setup. Odd numbers seem to be the only way to go, at least for shooters, as losing the reticule is simply not an option. Personally, I'm just fine with a single monitor with super high resolution.

That "super high resolution" is the problem. Displays that go over 1920x1080 or 1920x1200 tend to be VERY expensive, so the less expensive solution is to go with Eyefinity. With one main display in front, and the two angled on the left and right, you get the super high resolution. Since these 23 inch displays take up the center of your field of view, you have a natural edge to your vision. Peripheral vision will let those two side displays do the job they are there for, giving you a greater field of view.

The second row of displays(above) is where many won't see much value in from a gaming perspective, though there may be games/situations that would make that useful as well.

I'd prefer to see large displays with higher resolutions, but three 1920x1080 displays is less expensive than one display that provides a 5760x1080(if one were ever made).
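For what it's worth, the pixel arithmetic behind that comparison:

```python
# One 1920x1080 panel vs a 3x1 Eyefinity wall of the same panels.
single = 1920 * 1080            # 2,073,600 pixels
triple = (3 * 1920) * 1080      # 5760x1080 -> 6,220,800 pixels
print(triple // single)         # 3: three times the pixels
```

Three panels triple the horizontal pixel count; what the extra pixels buy in gaming is field of view, not pixel density.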

Actually, the best game I've played with multi-monitor was EVE online. The real estate provided by 5760 x 1080 gives you lots of room for menus, readouts, etc. without impairing your view of your surroundings nearly as much. The bezels are there, yes, but it's like the door posts in a car. One monitor is the equivalent of a windshield, the two ancillary ones like your side windows. Or, if you prefer, looking through a large window or sliding glass door.

All top end Phenom II's be it x6 or x4 are 125W parts. So in making this an APU, they add 2 more CPU cores when very little takes advantage of 6 cores. Two more cores plus an ATI GPU all on the CPU die? Either these will be 140-150W monsters or be clocked incredibly lower than they should be. Regardless, K10 cores no matter how many cannot compete against the current Core i cores.

They should have invested in making K10 more efficient, not add more inefficient and/or power hungry cores.

Hopeful but skeptical. Not to mention in conjunction with a die shrink...

quote: All top end Phenom II's be it x6 or x4 are 125W parts. So in making this an APU, they add 2 more CPU cores when very little takes advantage of 6 cores. Two more cores plus an ATI GPU all on the CPU die? Either these will be 140-150W monsters or be clocked incredibly lower than they should be. Regardless, K10 cores no matter how many cannot compete against the current Core i cores.

He's talking about the new processors with the APU integrated. You know the ones you were talking about here.

"So in making this an APU, they add 2 more CPU cores when very little takes advantage of 6 cores. Two more cores plus an ATI GPU all on the CPU die? Either these will be 140-150W monsters or be clocked incredibly lower than they should be."

It even says in this article that they are 32nm. 45nm is the old ones without the APU; 32nm is the new ones with the APU, and the 8-core or whatever they are coming out with.

These are not the same cores in Phenom II. This is a new architecture. So it'll be 32nm already, and we don't yet know how power hungry Bulldozer is going to be. Seems like you are jumping to conclusions there, pal.

You are distorting the facts here. AMD measures maximum power, and the current Phenom II processors use a 45nm process. There is a good reason Llano was delayed, and that was the need for 32nm to keep the power/heat under control.

I expect the top end Bulldozer chips will be 125W parts, with most being 95W or lower. Still, that is a socket designation, not a pure "how much power does this chip draw". If a motherboard can support 125 watt chips, THAT is what is required. Intel markets things differently, which is why you never know if the new Intel chip you just bought will work in your old motherboard without doing some research.