AMD detonates Trinity: Behold Bulldozer’s second coming

AMD’s Cayman – Trinity’s Linchpin

In most literature, AMD refers to Trinity’s GPU as a “Northern Islands” class part without bothering to explain whether it’s based on Barts (a modest step forward from the old 5000 series) or Cayman (the high-end GPU that AMD confined to the 6900 series). Officially, it’s branding the new core as part of the Radeon 7000 family, which isn’t accurate either.

AMD's branding has become murky at best. The 7000M brand is now polluted with three types of GPUs: 40nm rebrands of 6000-series parts based on Barts/Turks, 28nm parts based on GCN (Graphics Core Next), and 32nm APUs based on Cayman. Hashing out which GPUs are the best match for the APU is a task better left to saints and madmen than poor journalists; we'll leave the topic of paired graphics for another day.

Unlike Llano, whose integrated GPU was nearly identical to AMD's discrete "Redwood" part, there's no easy point of comparison for Trinity's Cayman-derived GPU. The new GPU features an array of six SIMD clusters of 64 cores each; Llano had five SIMDs of 80 cores. The new GPU is slightly smaller than its predecessor, with 384 cores instead of 400, but one of the features AMD introduced with Cayman was a VLIW4 architecture that was significantly more efficient than the VLIW5 designs that preceded it. AMD has also increased the maximum number of texture units to 24, up from Llano's 20. The total number of ROPs remains the same, at eight.
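The arithmetic behind those core counts is simple enough to sketch. A minimal back-of-the-envelope comparison follows; the function name and layout are illustrative, not anything AMD publishes:

```python
def shader_count(simd_clusters: int, cores_per_cluster: int) -> int:
    """Total stream processors = SIMD clusters x cores per cluster."""
    return simd_clusters * cores_per_cluster

# Figures from the article: Llano's VLIW5 design packs 80 cores per SIMD,
# while Trinity's Cayman-derived VLIW4 design packs 64 per SIMD.
llano_cores = shader_count(5, 80)    # 5 SIMDs x 80 = 400
trinity_cores = shader_count(6, 64)  # 6 SIMDs x 64 = 384

print(llano_cores, trinity_cores)  # 400 384
```

Trinity gives up 16 cores on paper, but the VLIW4 arrangement keeps its lanes busier than VLIW5 typically managed, which is why the smaller array can still come out ahead.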

When it comes to game performance, AMD is more willing to share the goodies.

All game tests run at 1920x1080

The company has a bad habit of switching back and forth between desktop parts and laptop parts, and there's no Ivy Bridge comparison data. Still, things look good. This was very nearly a given; Cayman made a number of efficiency improvements that were logical fits for Trinity, and all of AMD's APU demos in the run-up to launch focused on the GPU.

Trinity moves AMD forward, buys time for 2013 launches

After talking with AMD and reading over the company’s presentations, our educated bet is that Trinity is a qualified success. Piledriver may not move the bar very much on the CPU side of the equation, but between power consumption, temperature, and performance, AMD had to fix the first two to have any chance of launching a mobile part based on the architecture. If Piledriver can match Llano clock-for-clock (or within the same TDP), that’s still significantly more than BD managed when compared to Istanbul/Thuban.

Will it compete effectively against Ivy Bridge? No. But it was never intended to. AMD’s goal with Trinity is to position the CPU as a successor to Llano, a further fulfillment of the company’s “Fusion” vision, and as an anchor in the popular $400-$700 segment. Based on what we’ve seen and a few educated guesses, it’s got a fair chance of pulling it off — short term.

No matter how successful Trinity is in 2012, it doesn’t change the fact that AMD has no traction in tablets or sub-10W designs at a time when companies like Qualcomm have given notice that they intend to move into PCs. That’s fine for the moment, because Windows 8 won’t drop until the latter half of the year, and it’ll take 4-6 months past that point for some of the traditional smartphone/tablet players to make moves into the low-end PC space.

AMD needs a quick jump to 28nm Brazos and a fast refresh on Trinity. In theory, the new chips — Kabini and Kaveri — will be ready in 2013. The company has yet to put a quarter on that date, or even to comment on where the parts are being made. Trinity may be a good beginning, but it's only that; AMD has a long way to go when it comes to carving out its own territory in between Intel at the top of the market and an onslaught of ARM-based hardware at the bottom.

Are we at war ? :D This morning nVidia carpet bombed the ground with GT 670 and now AMD detonates trinity!!

http://www.mrseb.co.uk/ Sebastian Anthony

I love the smell of GPU in the morning.

http://www.facebook.com/profile.php?id=1223563048 Angel Ham

It smells like burnt thermal grease and toasted copper wires.

Charlie Staats

I know it’s been said before, but it almost seems like too little, too late. I’m sure the same company that brought us the first 64-bit architecture can pull through this “tick tock” fire fight, but man Trinity had better rock or they will be scrambling all the way through 2013 and maybe beyond. HP just came out with Sleekbooks that start at $699 and for an extra $50 or so can be fitted with a 2GB AMD Radeon 7670m. Trinity has its work cut out.

Joel Hruska

“I’m sure the same company that brought us the first 64-bit architecture can pull through this ‘tick tock’ fire fight”

Well, I’m glad someone is. I’m not.

Scali

The first 64-bit architecture? Uhhh, are you talking about AMD? IBM had a 64-bit supercomputer back in 1961.
64-bit CPUs were common among RISC microprocessor architectures in the early 90s.
IBM delivered the first 64-bit CPU for mainstream desktop systems: the PowerPC 970, aka G5.
Intel had a 64-bit architecture before AMD as well: Itanium, aka IA-64.
If anything, AMD was one of the LAST to bring us a 64-bit architecture. It just happens to be the most popular one in use today, in a true Betamax vs VHS scenario…

Joel Hruska

*yawns*

I agree he’s not being precise, but AMD built the first 64-bit architecture that was successfully adopted for consumer use. No, that’s not the same as being first. Absolutely true that the objectively “best” designs often don’t succeed due to other factors.

Scali

Not even that. As I said, IBM’s PowerPC 970 is that one. Apple sold millions of G5 systems to consumers before AMD ever sold a single 64-bit consumer system.

Joel Hruska

Apple launched the G5 in June 2003. AMD launched Opteron in April 2003 and followed with Athlon FX and Socket 754 in September of that year. Given Apple’s low market share at the time and the fact that AMD’s 64-bit architecture was known as the first mainstream option even if the part launched first in the server space, I think both your “millions” comment and your statement that Apple hit the target first are incorrect.

Scali

Not so fast there:
1) AMD launched the Opteron CPU in April 2003. That is not equal to people buying systems with those CPUs. The PowerPC 970 was launched in October 2002 for example. Especially in the server space it generally takes a few months before systems start selling, because of design validation requirements.
2) Opteron is not for consumer systems but for servers/workstations. We were specifically talking about consumer systems, as the G5 is, and the Athlon64/FX would be the equivalent of that.
3) Even though Apple has a low market share overall, in the absolute sense this market is still many millions of units large. Granted, they may not have quite reached a million in the ~6 months they had before the first AMD consumer systems started selling, but I don’t think it is too far off. The G5 sold incredibly well from day 1.
4) How is it a fact that AMD’s 64-bit architecture was known as the first mainstream option? This is *exactly* the issue that is under question.

Bottom line: the PowerPC G5 was the first mainstream 64-bit system. That is a fact.
It’s about time that AMD fanboys got educated.

Joel Hruska

The first Apple G5 started at $1999. That’s a Power Mac. That’s also absolutely a workstation-class product and Apple positioned it as such, equipping it with PCI-X slots.

If we count workstations as part of the consumer market, AMD wins. You could buy an Opteron + motherboard on launch day, in April.

If we go by pricing, AMD wins. Apple’s first 64-bit system for under $1500 was an iMac for $1299, introduced in August, 2004. AMD launched Socket 754 / Athlon 64 in September, 2003.

If we go by the arbitrary fact that some regular consumers bought Power Macs for $1999 and completely ignore the idea that some consumers bought or built Opteron workstations, then Apple wins. Hurrah for Apple!

Scali

We are talking mainstream here.
Yes, some, in fact, many regular Apple consumers bought Power Macs for $1999. Apple never really had the notion of ‘workstation’ anyway. Apple systems generally were relatively expensive compared to the budget PC systems… which meant Apples also came standard with things like SCSI, fast expansion buses and network cards, which generally cost extra on PCs, or were only on-board on ‘workstation-class’ systems.
You can’t compare them that way. Only clueless people who’ve never used any platform other than pathetic x86-based PCs would make such useless comparisons.

Likewise, yes, you *can* buy your own motherboard and CPU, but most regular consumers just buy a complete system, and these consumer systems are never sold with Opteron CPUs. I have no doubt that there are more regular PowerPC G5 consumers than there are regular first-generation home-made Opteron system consumers.

Something is ‘mainstream’ when a considerable number of regular consumers buy the product. And the PowerPC G5 certainly fits that description. Opteron systems never did, and never will.

Joel Hruska

Scali,

You have a serious problem when it comes to assuming people who disagree with you are stupid. I’ve used other platforms than x86-based PCs. I owned a Power Mac G4 (2x867MHz) and a Macbook Pro.

You want to disagree with me, that’s fine. Certainly Apple laid claim to the title of first consumer 64-bit platform. Also true that Apple doesn’t define Power Macs as workstations. It’s significant, however, that Apple had both a line of iMacs and Power Macs in the $1200 price range. These were its mainstream desktop models, not the $1999 G5.

Further evidence of this:
The G4 FW800 series that Apple introduced in January 2003 had 4 DIMM slots, AGP 4x, and four standard PCI slots. It used standard IDE and had two USB slots. Audio outputs were 16-bit, with no native optical SPDIF support.

The G5 packed the aforementioned PCI-X, AGP 8x, more USB 2.0 ports, 24-bit audio (both input and output), optical SPDIF, 8 DIMM slots, and SATA. Apple’s high-end system from January 2003 qualifies as a consumer PC. The G5 was a de facto workstation whether Apple called it that or not; it carried workstation-class features that went beyond simply having PCI-X slots. Claiming that this sort of thing was standard on Apple products at the time ignores the fact that previous G4 systems — including the highest-end variants — didn’t carry them.

You are more than welcome to disagree, but please do so without resorting to ad hominem attacks.

Scali

Not stupid, but ignorant, to be exact. Just because you may have owned a Mac at some time doesn’t mean your level of thought has risen above the x86 world. Not an ad-hominem attack, just an observation. Comparing two completely different platforms on price just doesn’t make sense. Apple has never competed with PCs on price. Not even today, when they are essentially using the same hardware. So bringing up the price difference shows a glaring lack of understanding of the market (FYI: when the G4 was introduced in 1999, it cost $2499).

Not sure what your point is. There were cheaper Macs than the G5? Obviously. That is usually the case with new high-end models. It’s no different from AMD, where lower-end 32-bit Opterons, Athlons and Semprons were sold for years after their 64-bit CPUs debuted. What is the significance of that? Very little.

“Certainly Apple laid claim to the title of first consumer 64-bit platform. Also true that Apple doesn’t define Power Macs as workstations.”

Indeed, end of discussion.

Joel Hruska

Ahh, I see. You’re arguing that Apple’s internal definitions only have relevance to Apple products and therefore aren’t applicable anywhere else. By that logic, Apple didn’t introduce the first mainstream, consumer-oriented 64-bit processor — Apple introduced the first mainstream, consumer-oriented 64-bit processor for Apple customers, who aren’t the same as regular customers, and can’t be evaluated with the same metric or by the same standard.

Well, that’s certainly true.

Scali

I’m not arguing anything. You’re the one trying to shoehorn ‘workstation’ in here, not me.
The fact remains that since the G5 was not marketed as a workstation (regardless of whether you think it is or it isn’t, because that is irrelevant), it was also bought by plenty of regular consumers. Which means it was the first 64-bit system to be adopted by the mainstream market, where 64-bit systems were limited to the professional server/workstation market until then.

Apple’s G5 brought 64-bit computing to regular desktop users, such as graphics artists, photographers etc. Not exactly the kind of audience that would have ever have bought server or workstation systems. They aren’t computer specialists.

James Barry

If AMD follows Intel’s naming scheme then “Trinity” will have 5 cores. Their upcoming Quadro will have 7 cores, etc. in some random and worse than meaningless pattern.

Barry_Thompson

Trinity is a river, the same as Llano.

Paul Bahre

AMD needs to get their Mojo back. And they need to do it soon. The price for Intel chips is through the roof. So the world needs a decent competitor to Intel.
