The basic building block of Bulldozer is the dual-core module, pictured below. AMD wanted better performance than simple SMT (à la Hyper-Threading) would allow, but without the full duplication of resources found in a traditional dual-core CPU. The result is a module that duplicates the integer execution resources and L1 caches but shares the front end and FPU. AMD still refers to this module as dual-core, although that's a departure from the more traditional definition of the word. In the early days of multi-core x86 processors, dual-core designs were simply two single-core processors stuck on the same package. Today we still see simple duplication of identical cores in a single processor, but moving forward it's likely that we'll see more heterogeneous multi-core systems. AMD's Bulldozer architecture may be unusual, but it challenges the conventional definition of a core in a way that we're probably going to face one way or another in the not-too-distant future.

A four-module, eight-core Bulldozer

The bigger issue with Bulldozer isn't one of core semantics, but rather how threads get scheduled on those cores. Ideally, threads with shared data sets would get scheduled on the same module, while threads that share no data would be scheduled on separate modules. The former allows more efficient use of a module's L2 cache, while the latter guarantees each thread has access to all of a module's resources when there's no tangible benefit to sharing.

This ideal scenario isn't how threads are scheduled on Bulldozer today. Instead of intelligent core/module scheduling based on the memory addresses touched by a thread, Windows 7 currently just schedules threads on Bulldozer in order. Starting from core 0 and going up to core 7 in an eight-core FX-8150, Windows 7 will schedule two threads on the first module, then move to the next module, etc... If the threads happen to be working on the same data, then Windows 7's scheduling approach makes sense. If the threads scheduled are working on different data sets however, Windows 7's current treatment of Bulldozer is suboptimal.
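To make the pre-hotfix behavior concrete, here's a toy model of strict in-order placement on a 4-module, 8-core part, where cores 2k and 2k+1 share module k. This is purely illustrative pseudologic, not Windows' actual scheduler code:

```python
# Toy model of pre-hotfix Windows 7 thread placement on a 4-module,
# 8-core Bulldozer part. Cores 2k and 2k+1 share module k.
# Illustrative only; the real scheduler is far more complex.

MODULES = 4
CORES_PER_MODULE = 2

def naive_assign(num_threads):
    """Assign threads to cores in plain ascending order (core 0, 1, 2...)."""
    return [t % (MODULES * CORES_PER_MODULE) for t in range(num_threads)]

def module_of(core):
    """Return the module a given core belongs to."""
    return core // CORES_PER_MODULE

# Four independent threads end up packed onto modules 0 and 1, leaving
# modules 2 and 3 idle, even though sharing buys these threads nothing.
placement = naive_assign(4)
print([(core, module_of(core)) for core in placement])
# [(0, 0), (1, 0), (2, 1), (3, 1)]
```

The upshot: with four independent threads, half the chip's front-end and FPU resources sit idle while two modules are each split between two threads.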

AMD and Microsoft have been working on a patch to Windows 7 that improves scheduling behavior on Bulldozer. The result is a pair of hotfixes that should both be installed on Bulldozer systems. Both hotfixes require Windows 7 SP1 and will refuse to install on a pre-SP1 installation.

The first update simply tells Windows 7 to schedule all threads on empty modules first, then on shared cores. The second hotfix increases Windows 7's core parking latency if there are threads that need scheduling. There's a performance penalty you pay to sleep/wake a module, so if there are threads waiting to be scheduled they'll have a better chance to be scheduled on an unused module after this update.
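The first hotfix's preference can be sketched as a simple placement rule: put a new thread on a core whose module is completely idle before doubling up on a busy module. Again, this is a hypothetical model for illustration, not Microsoft's actual implementation:

```python
# Sketch of the post-hotfix placement preference: prefer a core on an
# empty module; only share a module once no empty module remains.
# Hypothetical model, not Microsoft's actual scheduler code.

MODULES = 4
CORES_PER_MODULE = 2

def pick_core(busy_cores):
    """Return the preferred core for a new thread, given the set of busy cores."""
    all_cores = range(MODULES * CORES_PER_MODULE)
    # First choice: an idle core on a module with no busy sibling.
    for core in all_cores:
        module = core // CORES_PER_MODULE
        siblings = range(module * CORES_PER_MODULE,
                         (module + 1) * CORES_PER_MODULE)
        if all(c not in busy_cores for c in siblings):
            return core
    # Fallback: any idle core, even if its module-mate is busy.
    for core in all_cores:
        if core not in busy_cores:
            return core
    return None  # every core is busy

# Two threads now land on separate modules (cores 0 and 2), so each
# gets a whole module's front end and FPU to itself.
first = pick_core(set())      # core 0, module 0
second = pick_core({first})   # core 2, module 1
print(first, second)
```

Only once all four modules have a thread does this rule start pairing threads onto shared modules, which is where the second hotfix's core-parking change matters: keeping unused modules from sleeping too aggressively makes that empty-module first choice available more often.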

Note that neither hotfix enables truly optimal scheduling on Bulldozer. Rather than being thread aware and scheduling dependent threads on the same module and independent threads across separate modules, the updates simply move to a better default course of scheduling on empty modules first. This should improve performance in most cases, but there's a chance that some workloads will see a performance reduction. AMD tells me that it's still working with OS vendors (read: Microsoft) to better optimize for Bulldozer. If I had to guess, I'd say we may see the next big step forward with Windows 8.

AMD was pretty honest when it described the performance gains FX owners can expect from this update. In its own blog post on the topic, AMD tells users to expect a 1 - 2% gain on average across most applications. Without any big promises, I wasn't expecting the Bulldozer vs. Sandy Bridge standings to change post-update, but I wanted to run some tests just to be sure.


79 Comments

One problem this CPU has had from day one is that it's been overpriced by Newegg and others. AMD wanted it sold for around $240. Because of the high demand and low volume, retailers jacked the price up because they could "get away with it." If you put its cost where AMD wanted it to be, the picture becomes a bit clearer. I keep getting the feeling that all anyone wants AMD to do is make an Intel clone that performs equal to Intel's CPUs. That is stupid and backward thinking. Innovation takes guts because there is more chaos attached to it, but that is how we step forward into new and better things. Without it we just stagnate. I am more inclined to go with AMD because they are willing to innovate, to take a chance on making a CPU from a different angle. And while this can be troublesome, as we now see, and will have growing pains, I'm going to keep up hope that when its software issues are worked out, and some of its latency is worked out in the next gen, at least they are trying to make a better product instead of just being an Intel drone company. And as some longer-range views have pointed out, while it's having issues in some areas, it excels in others like multitasking, or running more than one program at the same time. No more programs going crazy when something like Norton decides to run a virus check in the background, etc. If all people want is clones, then buy only Intel if you want. But if you manage to talk everyone into doing this and AMD goes away, don't come back here crying about how much your next computer will cost, because it will be a LOT more expensive, and new CPUs from Intel will slow to a trickle, as there will no longer be a need for them. That alone is a good reason to support AMD no matter what they make. The whole AMD-only or Intel-only thing is such a load of crap in the long run. Both work well with current software.
The real issue is whether AMD will be able to survive, and if not, whether they will take ATI with them as well. If so, the whole computing landscape will change to a very dark and nasty place. Intel and Nvidia would love it; the rest of the computing world would suffer total despair. It's easy to lose a business; it's next to impossible to start it back up again, so if AMD keeps getting trashed by the media and people fall into that line, the outcome will not be a pretty one. It's almost too obvious that the bulk of the people here weren't around, or old enough to remember, when it was just Intel in the 286 and 386 days. They cost a ton of money back then; computers were not cheap. It gave Apple and even Atari some business, and AMD started to come along as well, and the cost slowly came down because they could survive in that kind of setting. But times have changed way too much for that to be repeated. It's too bad more people don't think along the what-if line so they'd know what to look forward to in a best- or worst-case scenario. It's one reason why I'll use an AMD CPU. I can run anything I want just fine and do it at a reasonable price. And it's mixed with a little thank-you for getting prices down to affordable levels. That is worth supporting. I keep getting the feeling that people think of these as "I have the fastest computer on the block, look at me," the keeping-up-with-the-neighbors scenario that is funny in the best of times and horrible in the worst of times. And I don't care where that takes them in the long run, because they never want to think that far ahead. Heck, I seldom see any long-range thinking at all on the internet, and that is scary. And I always thought that was just a politician's disease. If you could really stop and think, you'd know just how much better off we are today because of AMD's willingness to take chances and try something new; without them we might still be running 32-bit single-core CPUs. I'm not being an AMD fanatic, just a realist.

AMD first introduced a lot of the features that Intel used in the Core architecture, and Intel didn't mind copying them. So if even a company as big as Intel can copy, why can't AMD? The cross-license agreement between them exists for this very reason. AMD should take the best of every feature set used on x86 processors and make a good CPU with them. It seems to me that AMD might have too much pride for that, but speaking from a business standpoint, it's the best thing they can do until they've gained more money.

The cross-license agreement only covers the use of the x86 instruction set and its extensions. Specific implementation details such as Intel's caching, Hyper-Threading, manufacturing process, and whatnot don't fall under that license.

I was around in the 'Intel only' days as well... And I find it funny you say "It gave Apple and even Atari some business". Apple and Atari had been around in the home/personal computer market before IBM. It was the IBM PC that was trying to take business away from them, not the other way around.

I'm not so sure how much AMD contributed to getting cost down, if the cost went down at all... A high-end 486 CPU cost about the same when AMD released the Am386 (its first x86 competitor) as a high-end Core i7 costs today. I think the main differences are that the rest of the computer became a lot cheaper (monitors, PSUs, HDDs, motherboards, memory, etc.), and that these days the mainstream and low-end ranges are often based on the same architecture, whereas in the old days the 'mainstream' alternative to a 486 would be a 386, and low-end would be a 386SX or 286. So only high-end was state-of-the-art; if you were on a budget, you bought older architectures at reduced prices. I don't think AMD had much of a share in either development.

There are things that BD does well, and things it does badly (like most games). It's not good for many Windows users, but actually not bad for many Linux users. It's a pity we didn't get a chip that performed better than Sandy Bridge at a lower price, but that was too much to hope for.

It's sad that BD performs poorly in single-threaded applications; AMD didn't quite get the mix right in this design and will hopefully improve it in subsequent versions of the chip. I like the fact that they don't keep changing the CPU socket, while Intel has recently released 1155, 1156, 1366 and 2011 sockets!

For my current applications, a cheap 990FX motherboard (which all seem to have working IOMMU support) and a cheap Bulldozer can do much the same job as an i7-3820. It's also a nightmare to find an Intel board for a 3820 chip that supports VT-d properly in both the motherboard and BIOS.

So for things like Xen, AMD isn't a bad choice.

We are still lucky that AMD is competing with Intel. Competition and innovation benefit us all and help keep prices reasonable. (With WD and Seagate buying Samsung and Hitachi, competition in hard drives has all but disappeared. No wonder we pay so much for hard drives now!)

I have been using AMD CPUs for years. I bought an FX-6100, stuck it on my 890FX motherboard, and noticed a great improvement in gaming. I don't know why people say it's 'useless' or voice other negative views, but I can play Battlefield 3 in Ultra mode at 1920x1080 and get between 43 - 48 fps, which is fine; that's only in the high-action parts of the game, and it sometimes jumps up to 60 fps absolute max. I only have an HD6950 2GB, not unlocked or overclocked, and 8GB of memory on an MSI 890FX. I paid £114 all in for the CPU, so I'm not complaining; if gaming had been worse than on my unlocked Phenom II 550, I would have sent it back. The hotfix did not make any difference that I can notice; my machine is for gaming. It might not be as great as the i7, but so what? As long as my machine does the job I need it to, that is all that matters. All these benchmarks that aren't doing the Bulldozer CPUs any favours haven't put people off; we can't all afford Intel.

I know the architectures are very different, but the entire scene with AMD's Bulldozer architecture reminds me very much of the scene with the CELL processor when the PS3 launched, and the aftermath. The CELL processor had so much going for it at the beginning. It had extremely strong computational potential, and pre-launch, the computational-powerhouse aspect of the PS3 was its biggest argument. In terms of raw power, it was the king of the consoles. But look at what happened. CELL is effectively dead in anything that isn't a niche server market, as far as I'm aware or concerned. It didn't matter that the architecture had incredible potential; it required a radical change in the way software was developed (by game developers for the PS3). And the game developers never really took the bit between their teeth. Perhaps they've gotten better at utilizing the CELL architecture now, but that doesn't matter, as the rumor mill is all pointing towards x86 processors for -all- of the new consoles. CELL isn't even in contention, and I highly doubt that any console company is going to try to force a radical architecture change on the market anytime soon.

My point with this is that we can't assume Microsoft will ever code properly for the Bulldozer architecture, ever. Unless Bulldozer can sway developers towards looking at it with future prospects in mind, it wouldn't surprise me to see the software industry modify their software the bare minimum to work well enough with Bulldozer, then abandon it at the first chance. I know that comparing the processor market with the console market is like comparing apples to oranges, and even when saying the next generation of consoles are all going to have x86 processors, the market there has changed a lot in six years as well. I just want to bring up this comparison, and perhaps some other people who remember the pre-launch PS3 hype, the seemingly overwhelming advantage the console had in potential, and how that never panned out as hoped, can bring some new points to the Bulldozer conversation.