
88 Comments

Correct me if I'm wrong, but shouldn't this be the very first AMD GPU with *full* Linux driver support from day one? I remember long ago it was promised to have the very same feature support on Windows and Linux, starting with the 8xxx family.

Like that is ever going to happen. As long as they don't share most of the code base for their drivers, this won't happen. And I am not aware of any plans to support Enduro on Linux. Nvidia at least has a plan and a working community-made workaround.

No company in their right mind that is facing changing industry trends and experiencing financial problems has extra resources to spend on an OS that comprises just 1.25% of the entire market. Your expectations for Linux support at this point in time are completely unrealistic. The payoff for the investment is just not worth it.

Actually, with Steam coming to Linux it wouldn't be a terrible idea to bet on Linux's gaming market share going up a small amount. If they could become the de facto GPU in the Linux space, I'd say that's worthy of putting effort in. AMD needs to be making bets right now; playing it safe isn't going to save them from bankruptcy in the future.

Think about this: NVIDIA historically has been terrible at catering to the Linux community. If AMD were to do something incredibly bold like open-source their (currently) proprietary drivers, that would all but guarantee them exclusivity in the Linux world. Competing for 1.25% ain't much, but having a monopoly over 1.25%? That's a different story.

Imagine if they did that? Gamers, and other enthusiasts building their own systems from scratch, might then completely overlook Windows, and use the free option. Who knows what skills the community could offer to them in improving drivers?

TRUST ME, I'm no Penguin hugger, I FIGHT with Linux, not use it.

With the arrival of Steam, the uptake of Linux would be seriously reconsidered if the drivers went open source... and given the Win 8 release (stealing my start menu, when everyone seems to want it!), it might force MS to actually listen to customers once again.

I am surprised that there is so little porting going on from PlayStation 3 titles to Linux, given the fact that the PS3 already runs a Linux derivative.

At the same time, about 80% of big game titles coming to the PC are already ports from, or co-developments with, the Xbox 360.

Of course, a market share of 1.25% for Linux probably means that only 2.5% of all users care one way or another what OS they have. Whatever comes with the PC (or tablet) and runs their software is fine for most people.

PS3 games aren't built for Linux or any derivative thereof -- Early PS3s could *run* Linux, and I suppose it might even be possible that the dash interface might be derived from Linux, but games themselves take complete control of the system, much like an OS does. The PS3 has an optional, OpenGL-like API, but I'd wager that nearly all AAA games are writing commands directly to the GPU hardware, at least for the meat of the rendering. At minimum, porting a PS3 game to Linux would require substituting Linux API calls for any bare-metal coding, porting any PPC/Altivec assembly to x86/SSE, porting SPU code to SSE or OpenCL as appropriate, substituting OpenGL calls for any GPU control statements (and similarly for audio), and finally auditing the code base for any statements that rely on how the PS3's particular CPU operates, because the behavior of certain operations (floating-point accuracy and rounding mode, shift behavior, others) can be different, and this is visible even to high-level languages like C and C++. Then you might have fundamental differences between memory models (strongly vs. weakly ordered) and cache coherency... All that just to get it working, and *then* you get to re-tune things for performance, because consoles tend to have a few other tricks up their sleeves that PCs simply don't -- there's quite a few reasons why a PS3 or 360 game can hold its own against PC games with far better hardware.
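The point above about floating-point behavior being visible even in high-level code is easy to demonstrate. A minimal sketch (Python here for brevity; the same effect bites C/C++ ports): computing the same sum at single precision, as a console's SPU path might, gives a different answer than double precision, so code that compares or accumulates floats can diverge across platforms.

```python
import struct

def f32(x):
    # round a Python double (IEEE-754 binary64) to single precision
    return struct.unpack('f', struct.pack('f', x))[0]

a, b = 0.1, 0.2
double_sum = a + b                 # computed in binary64
single_sum = f32(f32(a) + f32(b))  # same math forced through binary32
print(double_sum == single_sum)    # -> False: the results diverge
```

The divergence is tiny (it shows up around the 8th significant digit here), but in a physics loop or a hashed replay system it compounds quickly.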

The hope is that the gains from releasing their driver source would outweigh the losses to NVIDIA, if there were to be losses.

At this point, I'm sure the majority of the software tricks either party uses are well known by both sides. If this were 10 years ago I'd say the trade secrets should be kept secret, but I doubt there is more than just minor things that NVIDIA doesn't already know.

In one of the more public secret-keepings (if that makes sense), Eyefinity, AMD/ATI managed to keep it secret right up until launch day. It didn't take long for NVIDIA to do the same thing in software. So even keeping a secret closed up until launch day only gave AMD a few months (IIRC) of a head start.

That said, even if there were major trade secrets in the code, I doubt they would be of much use. The two companies use very different hardware and quite likely have very different requirements.

It's amazing how stupid the amd fan nuts are - they whined for years without PhysX and screamed proprietary, pretended nVidia didn't offer the port through NGOHQ and support it while amd killed it, then when amd finally put some money where their mouth was on open source, they LOCKED UP WINZIP tight into their little nazi proprietary bundle, becoming the most gigantic hypocrite possible.

Now, the little fluttering brainflies want another liars show - perhaps AMD should just tell them they will soon open source everything, then direct the idiot minions to attack nVidia because they have not done so already...

Yes, that will work - then when amd files bankruptcy and caves in from their cheap penny pinching idiot worshipful bagbrains, they can lock up the code forever in some deep and mysterious hostile takeover agreement... and their tinpot dictated believe any lie ever fans can blame the evil "BAIN!" that did freedom and open source code in...

There, that should be sent directly to amd HQ, they'll probably accuse me of corporate espionage for stealing their PR campaign outlines.

You troll every single AMD article with meaningless BS. How does it feel when nvidia releases products after AMD and has shitty, shitty 'physx' (very little resemblance to actual physics and *nothing* that couldn't be run easily on a modern CPU) titles like borderlands 2 ? It looks like it was drawn by a kid with a smudgy marker. This wondrous display of fidelity and 'pisics gaise!' was going on while AMD was working on superior and innovative products with the larger developers, all while consistently outperforming nvidia for less.

Then...the compute comparison comes into play...

Sorry, little boy. The green-with-jealousy team brought nothing to the table this year except over-priced, no overhead, green-light crap.

Get an education, N0vida will not be able to pay you for being part of their water-army-comment-ridiculously-deluded-little-child group much longer...AMD now owns the console market. PS4: AMD APU. xbox720: AMD APU. Steambox: AMD APU, possibly with a small AMD discrete card. Wii: AMD APU.


Actually, I just switched my laptop to Linux, and it's a lot less resource intensive. I imagine I would get better performance in games, so I see more people using Linux for more 3D-intensive applications.

Last time I tried Linux, I actually found it felt every bit as resource intensive as Windows 7. It also took longer to boot and delivered worse battery life. If you're running a stripped down flavor of Linux, as opposed to Ubuntu, I'm sure it can boot more quickly, but part of what makes a modern OS desirable is also what makes it resource intensive.

As for performance being better in games, I doubt it -- the GPU is usually the big bottleneck, and the best way to improve GPU utilization is through optimizing the drivers. AMD and NVIDIA have been doing Windows drivers for years with every last ounce of effort put into improving performance; there's no reason to suppose that they would somehow get better results from their Linux drivers.

Open source Linux drivers would be a different matter, but I expect it will take years for the Open Source community to match what AMD has done in terms of optimizations -- unless AMD truly were to release every piece of low level information on their hardware that's necessary in order to properly optimize performance, which isn't going to happen.

I guess YMMV. My laptop gets about 3 hours 30 minutes of battery life under Ubuntu 12.04 while it does only 3 hours under Windows 7. I've no comment about gaming performance as there aren't any games worth playing on Linux until Steam arrives.

I am using the closed source AMD Catalyst drivers as the open source drivers do not appear to support power management, so it eats through battery power like it's going out of style.

I think you might want to wait for the massive 100% totally paper launch to "proceed to the era where hardware is actually available secretly to the reviewing e-reporters," before declaring a pure Linux support win...

You know ?

Like maybe some actual hardware in someone's lab before launching the great celebration of amd keeping its word....LOL

Why can't the series names be consistent across desktop and mobile? You think 7000 series, and you think GCN architecture, but really the mobile parts are on the previous one. And then you hear 8000 series and you think the next architecture, but really they're just finally moving the mobile parts to GCN. Why? What knucklehead thought this was a good idea?

7700M, 7800M, and 7900M are all GCN, but yes, it's a bit of a pain. The reality is that making a new architecture and then executing that across all market segments is very difficult. NVIDIA's solution with 600M was to only do two chips initially, GK107 and GK104 -- and even then they still had a few Fermi-based 600M parts. Considering Cape Verde is totally design complete and needs no further work to ship parts, what would be the incentive to design a replacement that doesn't substantially change performance or features? They did make a replacement for the lower-tier 7500M/7600M, and maybe we'll see 8400M get "retired" -- the 7400M is really not much better than Intel's HD 4000 -- so now everything likely to be used is on 28nm GCN.

I'm still curious to see what AMD (and NVIDIA) have planned for their next generation GPUs on desktops. CI (Sea Islands) might improve efficiency in a few areas, but the only way to really make it faster while staying on 28nm is to go bigger, and Tahiti is already plenty big. NVIDIA will no doubt have GK110 based high-end parts, but I'm not sure any of that will touch the mobile market. I suspect we'll be waiting until the next process node before AMD and NVIDIA really get a major generational improvement -- so we're experiencing something similar to how the GeForce 400 and 500 parts weren't all that different, or Radeon 5000/6000.

I still think it's unacceptable from a marketing point of view to launch an HD8870M part that happens to be slower than HD7870M. That's going to be extremely confusing/unfair to the consumers who don't follow the GPU industry as we do. That is worse than rebadging. That's similar to how HD6850/6870 replaced HD5850/5870 but unless you followed the GPUs closely, you might have thought they were actually superior parts.

I think AMD should have just gotten rid of the extra transistors used for double precision compute performance for HD8900 series to make room for more graphical functional units. They could always reintroduce the double precision performance back on 20nm node once transistor space/power consumption permits this. NV's strategy of focusing on lean gaming chips with Kepler paid off and knowing that AMD was going to deal with 28nm node again in 2013, as you said it's hard to imagine how they can squeeze a lot more performance than 15-20% (GTX400 --> 500 or HD5870 --> HD6970 series) without running into die space/power consumption constraints.

I'm a bit confused here. Both the 7700 to 7900 ranges for desktop and mobile were GCN, with the lower parts VLIW5. As most of the complaints about AMD's rebranding seem to be targeted at their mobile parts, I find this baffling, especially considering NVIDIA's attempts at confuddling everybody with the GT 640.

Here, be less confused - amd has HACKED their prior core into a tiny bit, and claimed it is releasing a new gpu...

"SI chips (“London”) have 640 cores max for Cape Verde, 1280 max for Pitcairn, and 2048 for Tahiti... AMD has likely created a fourth SI derivative that drops down to two CU arrays, each with three CUs. (You can read more about the GCN/SI architecture in our earlier GPU coverage.)"

Oh, in other words, we've seen this chip before, and the prior derivatives are fuller chips and faster. Good job amd, cut your garbage in half or thirds or quarters or one sixth, and "release a new 8000 screamer !"

Not so confusing when the truth is told flat out. I guess amd had some serious binning issues with massive chip failures, so they just hacked off a tiny corner of the gpu that still "worked", kinda, and made it a new release...

" Performance is something of a wildcard with the new 384 core parts, and the choice of DDR3/GDDR5 memory "

Love that, the in cahoots vendors can release the new "screamer 8000 series!" and dump on you with ddr3 for the gpu and NEVER TELL YOU. Then, as a sick amd penny pinching bankruptcy producing fanatic, you get your ddr3 1/6th the former core piece of crap, and you "sing the praises of your immensely brilliant master builder amd!"

That's going to be another wonderful amd fanboy ride. I guess the lies spewed forth will have to up it another several notches.

Both companies have been using these shortcuts for years, labelling old generation as next gen, mixing and matching 64/128-bit buses, DDR3 with GDDR5, etc. Next time if you are going to pretend to be an informed fanboy, at least do some research so you don't make a fool out of yourself on a tech website.

The 7000 series is a mixture of last gen and GCN parts for mobile, which is annoying and confusing, but not new. Both companies have been doing this for years now...heck, really going all the way back to the Geforce 4 vs. Geforce 4 MX that was kind of a Geforce 2.

The 4000 and 5000 series were self-contained; however, there are times when utilising a previous generation's technology to fill certain price points makes sense. The G92 was recycled enough, let's be honest.

We can go back and look at both companies in more detail if you like as regards rebrands and redesigns. The Radeon 9000 was a redesign whilst the 6770, besides the newer UVD block, was just a 5770 rebrand (decent card that sold well, so it's not really that offensive).

If AMD can behave as regards their branding then that's one major complaint out of the way, leaving a few more to go. NVIDIA need to sort themselves out in this area too.

You people raged for what seemed a decade over the G80 + all its derivatives, the difference being it was the king chip at the time, and for a long time, and all its derivatives brought that king performance crown on down to the masses for a very long time. G80, aka the G92 derivative, is still a decent gaming chip to this very day. LOL

I loved listening to the endless lies from you very same people, who now have a statutory nuanced view with all sorts of caveats and cannot HELP themselves to definitely mention the not-bankrupt competition when it's nowhere in sight - not even any tiny bit of the article (except of course the required pathetic rabbiting about nVidia, since that's who you people are.)

I will thoroughly enjoy amd's coming bankruptcy, I will relish every moment, you can count on it, I won't be able to stop laughing and celebrating. I will also blame it on people like you, as that's where the blame lies. :-) Great job fanboy, destruction is thy fare.

"Performance-wise, AMD expects the new GPUs to be substantially quicker than their predecessors. The company's internal benchmarks show gains of around 20-50% from the Radeon HD 7590M to the Radeon HD 8690M, and about as much from the 7670M to the 8770M. In other words, folks should be getting a lot more performance—and likely increased power efficiency—at the same price points."

No, these aren't rebrands - *unless the 8800m is*. Instead, these are the replacements to the old 5xxx/6xxx cards that were rebranded and called anything under 7690m.

It seems TSMC has ramped up 28nm production enough, and wafer availability + pricing have gotten to a point where it makes sense to salvage some potential scrap and make a lower-end GPU out of it. If these are priced right they could make for good laptop chips, provided AMD can actually get Enduro working.

It wasn't spelled out but to me it was pretty clear 8500/8600/8700M series are GCN parts in contrast with the older lower-end 7xxxM parts, which were a rebrand themselves.

What should be made more clear is how *ABSOLUTELY RIDICULOUS* this all is. After Nvidia rebranding Fermi parts to make them look like Kepler 6xxMs, now probably every single one of the new AMD mobile lineup is going to have lower performance than the respective 7xxxM SKU. Although we don't know the details yet, if they're comparing 7670M to 8770M and 7590M to 8690M it is clear that they're pulling off the same 58xx->69xx stunt. It seems pretty clear, by projecting what the gimped & rebranded 8800M is going to be.

I could look the other way when it was limited to entry level SKUs, but rebranding relatively high-end 78xxM GPUs to 88xxM while also substantially lowering their performance (don't tell me 500 more MHz on the RAM are ever going to make up for 100-150MHz on the core, this is all done to lower power consumption) is absolutely unacceptable. They're trying to make people think "hey look, not only the 8800M is newer, it also has a lower TDP!" as if they were completely new generation parts.

Also I can't wait to see 8700M parts clocked as low as 8500M, as the graph suggests it will be possible. Disgusting.

It costs a pretty penny for a wafer at 28nm. If the yields are good, and all signs show that they are, it makes little sense for either company, nVidia or AMD, to make an entire product line using the architecture and process.

Why bother to bin something like a GT610m or 7490m if it's going to sell for pennies anyway?

This approach isn't seen only in the laptop area but in low-end discrete desktop GPUs as well.

But this isn't a rebranding at all. In fact, this looks more like a refresh of the entire mobile lineup that was previously rebranded. nVidia will likely follow suit as well and hopefully we'll see both companies on 28nm throughout all of their discrete GPUs for both mobile and desktop - assuming the others reach EOL (end of life) sometime soon.

So what's up with the tone? And not just you, but Jarred as well. If you're sick of refreshes and rebrands, this is exactly the type of news you should be happy to see...

"We’ll have more on that in a moment, but while we were expecting a rebranding prior to the call, it appears we are at least partially mistaken—you can see the previous generation high-end 7000M announcement from April 2012 for reference."

You weren't partially mistaken; you seem to be completely mistaken.

Which part of the discussion of 8800M being a rebrand did you miss? So everything 8700M and lower is using a new GCN chip, but at the same time we've just compressed the performance difference between 7500M, 7600M, and 7700M. The 8500M series is 384 cores at 650MHz (maybe lower on some models), the 8700M is 384 cores at up to 850MHz, so the whole range now spans only about a 30% increase.

The second part of the discussion is that we're looking at new parts at the lower end of the market, so no that's not nearly as exciting as real improvements in the midrange and high-end markets. One performance comparison shown in the slide deck of AMD vs. AMD is 8770M vs. 7670M, rather than 8770M vs. 7770M. It looks like the 8770M is around 30% faster, give or take, but the 7770M is 512 cores clocked at 675MHz while 7670M is 480 cores clocked at 600MHz. So, 7770M is around 20% faster than 7670M, and 8770M would thus only be a ~10% upgrade to 7770M.

Another comparison is 8690M vs. 7590M, where 8690M looks to be about 20% faster in most games. Problem is, 7590M is 480 cores at 600MHz while 7690M is 480 cores at 725MHz -- that's a 20% difference right there! In my book, you compare the same family to show improvement, so 8690M goes up against 7690M, 8770M goes against 7770M, and 8590M would go against 7590M. That's not what AMD is showing right now.
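The relative-throughput arithmetic behind the two comparisons above can be sketched quickly. This uses only the core counts and clocks quoted in the comment and treats cores x clock as a rough proxy for performance; real results also depend on memory bandwidth, ROPs, and drivers, so these are ballpark figures only.

```python
# Rough theoretical shader throughput: cores x clock (MHz)
def throughput(cores, mhz):
    return cores * mhz

hd7670m = throughput(480, 600)
hd7770m = throughput(512, 675)
hd7590m = throughput(480, 600)
hd7690m = throughput(480, 725)

# 7770M is ~20% above 7670M, so AMD's ~30% claim for 8770M vs 7670M
# shrinks to roughly 8-10% over the same-family 7770M.
print(round(100 * (hd7770m / hd7670m - 1)))          # -> 20
print(round(100 * (1.30 * hd7670m / hd7770m - 1)))   # -> 8
# 7690M vs 7590M: the ~20% clock bump the AMD comparison glosses over.
print(round(100 * (hd7690m / hd7590m - 1)))          # -> 21
```

In other words, comparing across families inflates the apparent generational gain by roughly the in-family clock spread, which is the point being made above.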

Granted, these are all preliminary results and we won't have hardware for a month or more I'd guess. We don't know power or other aspects of performance that are important either. What we really need to know to judge value and performance is: how does 8870M compare to 8770M, how does 8870M compare to GTX 660M and GTX 670MX, etc. Right now, we only know a few base specs, and I'm not going to spend too much time praising an apparently new GPU at the low end of the market. The 384 core AMD parts are going to be a nice addition for budget laptops, but until we have hardware and independent benchmarks in hand there's not much to say.

I'd like to reference my above reply: it's irrelevant. It really is. Enthusiasts buy GPUs in a competitive market, where everyone looks at the GPU and overall price tag. Noobs buy looking at the CPU, price tag, and MAYBE the AMD graphics sticker (though usually they don't look), requiring AMD to be just as competitive with pricing as if they made their model numbers match up with where you'd like them.

Also, it's beneficial that they're compressed: we no longer have crap (equivalent to) the GT 610, HD 7350 and other junk that barely deserves to be on a CPU die, let alone one dedicated to a GPU.

You're being pissy because AMD is providing incremental upgrades and bumping up the entire lineup to 28nm?

"Right now, we only know a few base specs, and I'm not going to spend too much time praising an apparently new GPU at the low end of the market. "

And when a majority of discrete GPUs in laptops are low end anyway, where does that leave us?

Right. The same place we started. You being completely unaware of what this new generation of products is meant to replace. Instead of realizing that a vast majority of mGPUs are sub GT660m - FFS, Jarred, have you seen the amount of Kepler rebrands "new" laptops have? These NEW GPUs are meant to replace everything that was lower than a 7730m, and even the first gen GCN parts might get a small architectural upgrade.

If you're sick of rebrands and want more efficient GPUs, whether it's lower end or not, this is the most welcome news we've seen in 2-3 years for mobile GPUs from AMD. And, as a writer for Anandtech, you COMPLETELY missed that point.

So well done. Really. Spot on article. While BSN* and TR actually at least partially addressed this point, you seem to ignore it altogether and go to the comments to defend your stance.

Just to be clear, you completely ignored all my commentary on the fact that the only benchmarks AMD has provided show performance comparisons between different classes -- 8690M vs. 7590M and 8770M vs. 7670M. We also have the lopsided HD 8870M vs. GT 650M comparison, which is like comparing the desktop HD 7870 with a desktop GTX 650 Ti and crowing about how AMD is over 50% faster. That's a big part of why this announcement isn't really impressing me, and until we know three things there's not much more to be said:

1) We need to know the real performance numbers under a wide range of games.
2) We need to see how much the laptops with the various GPUs cost.
3) We need to know how much power the chips draw under various loads.

Until we know those items, to speculate and say that they're substantially faster and/or more power efficient is taking things too far.

Furthermore, regarding power use AMD and NVIDIA have been targeting (with their OEM partners) roughly the same mobile GPU TDPs for years: ~35W, ~45W, 60~75W, and ~100W (with that last being a more recent addition). Anything less than 35W is typically reserved for the stuff that no one actually recommends -- unless you think GT 620M and HD 7470M are great products? Right now, there's no reason the new parts would open up a lower power class than what we already have. I expect the new 384 core GCN parts will go head-to-head against NVIDIA's GK107 parts, but at the same time I don't expect to see any major improvements in overall gaming performance.

More important with regards to power, nearly all laptops are now using switchable graphics, so I don't really care how much power a dGPU uses at idle. It's either running full tilt to play a game -- in which case I'm plugged in and want as much performance as I can get -- or it's powered off and HD 4000 (or the appropriate Trinity iGPU) is doing the work. So at the lower end of the performance range we have GT 640M/650M/GTX 660M going against HD 8500M/8600M/8700M, and probably trading blows depending on the title. Competition is good, and NVIDIA really needs some in the mobile sector right now as they seem to be outselling AMD by a huge margin in laptops (until you add in numbers from Apple, but that's not entirely fair as Apple is a separate class and has been swapping between AMD and NVIDIA on a regular basis depending on who gives them the better deal), but going back to my three points above, it's way too early to be crowning a victor.

Finally, there's speculation -- and only speculation -- that the new GPUs will get into Ultrathins and Ultrabooks. I'm highly skeptical that will really matter, because right now I can name two Ultrabooks with dGPUs: ASUS UX32A/UX32VD, and Acer TimelineU M3/M5 -- and if you really want to stretch the definition, you have Razer's "first real gaming laptop" Blade, which is full of hyperbole, or the Apple MacBook Pro 15 Retina. Everything else is iGPU only, mostly because the majority of people aren't really playing games on Ultrabooks/Ultrathins. Plus, even if the new dGPUs use 25W, that's more than double what the rest of the system would use, and we already run into throttling and heat issues on many Ultrabooks with just a 17W CPU/APU. Considering the only viable and affordable gaming Ultrabook is a 14" Acer M5 that really pushes the limits of Ultrabook (it weighs about 4.5 pounds, for instance), I don't know that we'll see any real gaming class Ultrathin/Ultrabook laptops until the next process node -- which will bring Haswell and 20nm GPUs into the picture.

So go ahead and call this "the most welcome news we've seen in 2-3 years for mobile GPUs from AMD". I didn't completely miss that point; I simply disagree with your (IMO unwarranted) optimism. I took a more or less neutral tone, as we don't have the information necessary to take a stronger stance. We have a Johnny-come-lately smaller Southern Islands GCN part, and we have a rebrand. If that's the most exciting news in 2-3 years, that speaks more about the unimpressive execution of AMD's mobile GPUs during the past 2-3 years than anything. And really, I'd say Turks at its launch was at least as impressive as the Mars "Solar System" GPU announced here.

The 8800 is competing with the 7700, and the 8600 is actually competing with the 7400 40nm parts as far as pricing and segmentation go.

We don't know the performance figures, but considering every GPU generation in recent history has improved 10-30%, I'd say it's pretty darn safe to conclude that this will be along those lines. Furthermore, given the segments of the market these are intended for, we can also deduce what they'll likely cost as well. This isn't Bulldozer we're talking about, this is ATi. They're not going to go backwards, Jarred. We know how GCN performs and we know the performance of the older 40nm parts. Is it really that difficult for you?

What we CAN conclude with near certainty is:

- They'll perform better than the 40nm parts. I think to assume otherwise would be asinine.
- The perf-per-watt will be substantially better.
- The parts will likely cost the same (maybe a bit higher) as those they replaced. AMD isn't going to refresh their entire lineup and move it to GCN + 28nm if they don't plan on it selling, which means reasonable pricing.

Despite what you think, the low end GPUs outsell the upper tier stuff by a very large margin, Jarred. Opting to ignore the improvements here - and you can argue % all you want, but I think we both KNOW that we'll see improvements in every single metric - is ridiculous.

And that's what you did. You ignored it, and then went to the comments to defend why you ignored it. The reason? Because apparently nobody gives a damn about lower/mid tier GPUs despite the fact that they outsell the other high end stuff many times over.

I don't care about AMD. They're not going to survive the next two years anyway. What I do care about, though, is that this article was exponentially worse than the others I've read on other sites and that this article included some seriously backwards thinking.

I don't care if stuff sells like hotcakes if it doesn't make sense. There are a bunch of business laptops that ship with GPUs like the HD 7470M, but there's not really a good reason to put such a GPU into a business laptop. Either you need more performance than HD 4000, and in that case you need HD 7600M or above, or you don't need it at all. The reason it sells is because the OEMs push it, not because it makes sense.

Quit being obtuse and suggesting that I've ever stated Mars will be worse than Turks or the other 40nm parts. It's not a question of whether it will be better, but rather how much better. 28nm parts will be faster than the existing 40nm parts, they'll likely use a bit less power under load for the improved performance, and they're going to cost a lot less for AMD to manufacture. But they'll cost about the same as the current parts, and they will still be a large step down from the true midrange parts. OEMs will push them in their crappy laptops that sell thousands of units but are built of plastic and fall apart in a year or two, and in 90% of those laptops they won't even matter because the owners won't play games.

Your hyperbole is so far out of line that I have to think that this is more of a personal vendetta than a reasonable discussion. "Exponentially worse"? Oh, wait: it's the Internet. You should be happy I even took the time to respond to your attack-laden message, as really you're just spouting off with little useful data to back it up. You're extrapolating based on what you think will happen, I'm saying "we'll see when product ships." If you want a good gaming GPU, the new Mars chips are about as low as you can go without having to dial way back on details. Personally, I'd rather go with the already shipping Cape Verde/Chelsea GPUs.

Based on what I've discussed in the above posts, we'll see a 10-20% increase in performance, relative to Turks, and that's good but not earthshattering. AMD on slide 25 shows a performance increase of roughly 30% going from 7670M to 8770M. If they're the same price, that's great, which is what slide 28 might suggest; but slide 28 also suggests that there is no currently slated replacement for HD 7800M or 7900M. 7900M I can understand, but 8800M is 7800M in every way we can find, probably a bit slower thanks to the lower clocks, so I don't know what AMD is thinking there. Will we have 8970M as the top-tier part and 8950M as the 7870M replacement? Maybe 8970M-X2 for the top with 8970M as the next step down?

You're reading this marketing chart as though it's the bible and is not to be questioned; my job as an analyst is in fact to look at stuff like this and question it. So I look at the charts and say, "Gee, 8800M is a rebrand of 7800M, which AMD refuses to admit; AMD pulled this launch in three weeks (from the Jan 7 official launch) and gave us a bunch of fluff. Why?" Well, MrDude says it's because the new low-end GPUs will be awesome, but my opinion -- and the sentiment of many other readers -- is that the low-end GPUs are only interesting for OEMs. At the higher end segments where having a dGPU really matters, we're getting a lateral rebrand at best. The OEMs win, AMD wins, and the sheep buying entry level to midrange laptops lose. Congratulations!

Never mind the drivers, which are still in need of work (though thankfully getting better), or the fact that NVIDIA has offered performance that is around 40% better than Turks with lower power requirements for over six months. Yes, GT 640M has been kicking the crap out of anything using HD 7670M and below since it launched. Meanwhile, 7730M is in one laptop I can find right now (Dell Inspiron 15R Special Edition) and the 7750M is also in one laptop (HP ENVY 15), while 7770M is MIA. And naturally, Enduro hasn't really worked properly/adequately until the Catalyst 9.01 beta. So AMD now has an SI chip coming in Q1 that will hopefully be able to match GK107 on performance and power, and maybe even at a lower price. But until we can buy the laptops, we can't really say for sure.Reply

A crappy mobile GPU that isn't HD 4000 on a ULV machine also means no stuttering, which any Intel ULV will do, and that's without question. The only question is whether you're going to stutter due to TDP constraints or whether it will overheat before it gets there.

And practically nothing of what you've written in the comment is implied or referenced in the article. And I'm not being obtuse; I'm simply pointing out your ignorance here. I'm not the one who claimed that anything 7670M and lower isn't worth mentioning. You did. You chose to focus only on the higher end parts and said nothing regarding the performance bump, which we both agree we'll inevitably see. The price points, too, were completely ignored. Because, as you said, only high end matters.

And lastly, you're confusing me with someone with a sentimental attachment to a corporation which won't see the light of day in 2014. I wouldn't take an AMD GPU in a laptop if given to me for free due to the driver issues, but if labeling me an AMD fanboy makes you feel better for a horrible article and unbelievably ignorant comments then keep up the good work. Reply

I said "Performance is something of a wild card depending on the use of DDR3/GDDR5" -- and I could add 128-bit or 64-bit interfaces (wouldn't be surprised to see some 64-bit stuff come out still, though I hope that's not the case), and the "up to xxxMHz" clauses always make me leery. I expect an improvement on the 7670M and lower GPUs ranging from about 0% to as much as 30%, depending on the title. But as I've been trying to make clear, it depends on the comparison you make.

Will 8570M be faster than 7670M? It depends on final clocks, but probably a difference of less than 10%. I draw the line of "useful improvement" from a hardware update at around 20%, because a 10% improvement just doesn't matter enough in most cases. So when AMD shows 8690M vs. 7590M and gets a 30% average improvement, I have to point out that they're skewing the numbers by intentionally using a higher-ranking 8000M part vs. a lower-ranking 7000M part, which makes me wonder what else we're not seeing.

Also note what I say in the rest of the paragraph, which is why we're really waiting for performance details on real, shipping hardware: "We'll find out in the coming months how the 8500/8600/8700M stack up to NVIDIA's midrange "GT" offerings, which interestingly are also using 384 cores."

You can bellyache all you want about me not talking about how AMD is improving on AMD, but that's not the question. If HD 8500/8600/8700M can't outperform NVIDIA's (now six months old) GT 640M/650M/GTX 660M, then this update is definitely too little, too late. And yet again, the slides from AMD compare the 8870M with the GT 650M, which is a completely lopsided match.

So, I'm not ignorant (as in, there's not something useful in this conversation that I don't know), but you *are* still being obtuse (slow to understand). If you weren't being a typical Internet troll and thumping your chest, we could have agreed pages ago and not worried about the trivial differences of opinion. You have your opinion that this is a major update; I understand that opinion and I disagree that it's a big deal -- at least from the perspective of the users. It's great for AMD (cheaper to make parts at similar prices means better profits -- something AMD desperately needs), but as I have repeatedly stated, if you want a GPU for something, you likely want more than HD 7600M/8700M hardware. I can "get by" with GT 650M, but it's not an awesome experience for the times when I need a GPU (gaming); it's at best tolerable.Reply

" Never mind the drivers, which are still in need of work (though thankfully getting better), or the fact that NVIDIA has offered performance that is around 40% better than Turks with lower power requirements for over six months. Yes, GT 640M has been kicking the crap out of anything using HD 7670M and below since it launched. "

LOL - can you say amd epic fail, over and over?

Here, let me show you how nice the author was to you, and your fanatic fanboy stance:

" Today is a soft launch of high level details, with more architectural information and product details scheduled for January 7, 2013 at CES"

WUT!?? A "soft launch"? No, this is a complete PAPER LAUNCH, 100%.

See how that goes: with amd, after endless years of them screaming "paper launch" at nVidia, it's casually announced as a "soft launch". Not a single freaking chip in sight, but it's all soft and lovely when amd does it...

Why weren't you praising the author for covering amd's butt?

" Granted, these are all preliminary results and we won't have hardware for a month or more I'd guess. We don't know power or other aspects of performance that are important either. "

Hey wonderful ! Not a chip in sight, but it's a "soft launch", landing so happily in the breast of amd fanboys...

Note that "soft launch" and "paper launch" are nearly the same in my book, and don't forget that we've pretty much never had a hard launch of mobile GPUs. You don't buy mobile GPUs separate from a laptop, so when the hardware is launched and available (e.g. the new 8000M chips), that's when notebook manufacturers are going to start putting chips into laptops. It takes a few months of testing and validation before they start releasing those to the public, and thus we have a "soft launch".Reply

The whole point of the lower-end parts is to pair with Kaveri for Hybrid XF, in which case you want it to be on the same architecture (VLIW5 with VLIW4 not being a major issue). Meanwhile, Intel laptops will use high-end ones which *are* improvements or the same (such as 88xx or 89xx) or none at all (in OEM fashion).

" The 8000M series doesn't appear to boast any unique architectural refinements, though. AMD told us that, while these parts do feature new silicon, they're still based on the original iteration of the GCN architecture, just like the desktop Radeon HD 7000 series. "

I'll give amd this much - they spin and lie and spew so much CRAP that they have their little reporter minions spinning circles around themselves... and thus immense confusion. Many hopeful and deceived-once-again amd fanboys start their little hearth and happy fires burning, and the direct readership is left spinning in circles... not certain what just occurred...

This is of course amd's plan, and it works wonderfully.... so of course sanity must be injected...

You do realize that Nvidia is equally guilty for botched naming schemes and rebranding as well?

Anyway, the arguing should stop. I thank Jarred for sharing his opinions and letting us know about the soft launch.

On both fronts, I find the naming schemes from both AMD and Nvidia equally convoluted for mobile GPUs. In my opinion, it's partially the names that help turn people off from PC gaming. Unless one is well versed beforehand, how is one supposed to know a Radeon 7800 chip curb stomps the 8500M? Certainly a salesman won't tell you which is what, and Joe Average would probably buy his aspiring gamer kid the laptop with an 8500M since it's the higher number.

Retailers should step up to the plate here and help the customer differentiate the numbers, but they seem more content on keeping things pretty cryptic as well. How I wish I owned my own retailer...Reply

Frankly, all the whining about rebranding makes me LOL. We have all these pinheads who pretend they are techs or intelligent, but a few card numbers throw them into a hellspin they cannot handle.

They aren't bright at all, so of course they always claim they are complaining to protect the "general consumer" they never met and never will meet. LOL. I've never seen such pathetic liars in my entire life.

How's that for some reality? You like that? Here's some more: the idiots always complaining no doubt go out half cocked or doped up and buy the wrong thing, near constantly. LOL

There's reality. It's not the ephemeral consumer, it's their own stupid butt that's in a sling - they won't be able to remember much of anything once it's time to buy.

The first, low-end 7000M parts were rebadges of last-generation 40nm VLIW5 GPUs (specifically Turks and Caicos). These were introduced right at the start of 2012 so that AMD would have a fresh product line for OEMs (who hate selling the same thing 2 years in a row). So the 7400M, 7500M, and 7600M were all rebadges.

A few months later AMD introduced mobile parts based on their new 28nm Southern Islands GPUs; the 7700M, 7800M, and 7900M. These existed alongside the earlier VLIW5 GPUs. So it is wrong to say that the 7000M was composed of just rebadges, as it contained some GCN parts.

The 8000M series will shape up similarly to the 7000M series. It will be a mix of 2012 GPUs (today's announcement) and 2013 GPUs that will be launched a couple of quarters from now.

As for the second generation comment, that appears to be AMD being generous to itself. We don't have the full details on the new GPU being used to make these low-end parts, but so far it appears to functionally be a very late member of Southern Islands rather than an SI successor.Reply

Thanks, that makes much more sense, and I recall now there were indeed 28nm 7xxx mobile GPUs... yes, I forgot and mixed things up.

But unlike the previous announcement, where the 7400 and 7600 used lower numbers for previous-generation parts and reserved the 7700-7900 for newer parts, this time around AMD uses rebrands all the way up to the 8800.

Maybe I have to wait for more details to come. But that makes today's announcement TOTALLY pointless, and far too early. They will disappoint (again) if they don't really have an SI successor, and I seriously doubt they do.Reply

"But, there’s nothing inherently wrong with rebranding—AMD and NVIDIA have both been doing it for some time now." Yes, it is inherently wrong, because it misleads the customer. Just because they have been doing it for a while does not make it better or acceptable. Reply

There's a subtle difference between AMD and NVIDIA here - AMD's higher end parts are all GCN whereas their lower parts are the previous mainstream generation. This has advantages (simplified product stack - you know what series you're getting above or below a certain point) and disadvantages (older tech at lower price points - less current tech or higher power usage). At no point are there two AMD products with the same number and different architectures, whereas there are two instances of this with NVIDIA - the 640M LE and the 640. Also of note is the 630, which is a rebranded 440 - TWO product lines back.

Let's face facts here - both companies bring their previous generation over to fill their current product lines. If you're IT literate enough, you won't get too confused. I just happen to think that it's a little clearer which series you're getting with AMD, but it's debatable whether you're getting better performance with comparable model numbers from one series to the next in some cases. NVIDIA does seem more consistent with offering better performance between, say, x60 products, though far more so on the desktop.

Buy what you want. Right now, comparing GCN to Kepler, NVIDIA has the more efficient gaming architecture, however they started a revolution in compute that they look to have backed away from.Reply

Hey, since you're such a raged-out amd fanboy, why didn't you provide us all with the list for amd? LOL. Like I said, it's the amd stupid paper launch, and you rager amd people cannot stop pointing the finger away - you act like political parties. I'm not surprised you wasted your life collecting your nVidia hate post info there. ROFL - fanboy exposure extraordinaire.

"Hey, since you're such a raged out amd fanboy, why didn't you provide us all with the list for amd ?"

Somewhere in the first pages of these comments, I believe I did actually say that the 7700 to 7900 desktop series were GCN with the other parts carried over from the 6000 series (albeit not the 6900 series - VLIW4 only reappeared in Trinity; I don't class APUs as belonging to either desktop or mobile ranges as they seem to transcend both, and we won't see it again with any luck). A little later, I mentioned that the 4000 and 5000 series were self contained, BUT I also said that the 6770 was a 5770 with an updated UVD block. If it serves to help you, the 6350 was the 5450, albeit with numbering that actually makes sense, unlike what Jarred is correctly pointing out here. It is wrong for AMD to compare a lower model number to a higher one and highlight how much faster the latter is, so everybody has to wait to see what the real deal is.

I'll repeat myself for clarity as regards architecture carry overs - AMD's x600 and below lines on the desktop are generally the previous generation, and as for mobile, with the 7xxx series it's the same deal. With the 6xxx mobile series, it goes as follows:

Now THAT particular series is messy and thankfully not likely to be repeated, however there's a very important point to make again - there's no instance of two separate architectures being employed for the same model number. The 7xxxM series is far cleaner, and happily the 8xxxM series doesn't back away from this as far as we can see. I'd be far happier with a fully new series but as we've seen from both companies, it just doesn't seem to happen. If yields have been as bad as we've been led to believe, I can't blame either firm for wanting to reuse silicon or chop off non-working parts. They both do it, and it's pointless arguing otherwise. The 7xx mobile series will most likely feature Kepler parts - I'm not going to rage about it because it makes sense, but what I WILL latch onto is if we end up with the GT 640 all over again. If AMD do it, you can bet your bottom dollar that I'll be all over them like a rash.

The G80 was a great card, you're spot on, but so was the RV770. It was the first architecture that ATi/AMD had done right in a good few years. Equally, Fermi mark 1 was unrefined and greedy, and the HD 2xxx series was a lame duck. There's an essence of balance here that I'm really hoping you latch onto, because you haven't done so in the past.

There's nothing "rager" about any of my posts. Why rage about something I cannot hope to change? If I'm going to make a purchase, I'll research it first so I don't get stung - how can this possibly be an issue? If I get something bad, that's my fault. If AMD brought out something really good, who here has the right to tell me I made a mistake? If I opt for a Kepler-based setup, do you really think I'm going to want to listen to "serves you right for buying rubbish in the past"?

Are AMD this big, evil corporation that you want everybody to believe? Yes, sometimes their product lines make no sense, and yes, sometimes their drivers are all over the place, but have you noticed that people don't usually come out and say "these drivers are top-notch" UNLESS it's in defence of comments such as "xxxx's drivers are buggy and slow"? It's a thankless job, I'll wager, and you have to deal with the complaints on a daily basis as opposed to basking in praise.

A couple of years back, I needed a spare card in a hurry after my ATi 9800 Pro died (it'd been overclocked for 4 years with an Arctic Cooling solution), and I unpacked my old Ti4200-8x, only to find that its shader quality was through the floor with the latest drivers. I didn't rage about it because it was a very old card, and I'm surprised that people make drivers that work for cards released more than 5 years ago (big kudos to NVIDIA for their LTS). I looked for a replacement and found a 1650 Pro, which was a dog but more or less matched the 9800 Pro. Was it a good card? No way - 128-bit DDR2 memory, low pipe counts - but it did the job for 6 months. When I went to build my next PC, I looked at an Intel-NV combo, but neither had anything within my price range, so I ended up with an AMD setup. That XFX 4830 of mine has done remarkably well even though I couldn't overclock it despite its huge cooler - bad batch I guess. It was cheaper (over here in the UK) and faster than the competing 9800 GT in most things, and I simply wasn't interested in CUDA nor PhysX.

I will upgrade to a new build eventually and as things stand, there'll be an Intel CPU in there, but I want to see how the 8xxx vs 7xx battle pans out before I make a purchase. Either company could pull out a real cracker.

What do you think would happen to the GPU market should AMD disappear? The enthusiast and higher end NV models would be a fair deal more expensive for a start. There might not even be a need for a yearly product cadence. Believe it or not, consumer choice is a very good thing, and I hope it's here to stay.

Now, there's something I'd like to ask you. Why did you spend your Christmas Day throwing out wild and untrue accusations?Reply

You spent your life doing nothing but obsessing and lying about amd. Now they're almost dead, and you still whine. You know why I'll be deliriously happy when amd is dead and gone? You should - you CRYBABY amd losers. You cannot even afford your crybaby cheap-as-dirt amd cards, and you're such cheap losers amd can't pay for itself, hasn't been able to for years. So, let's say your insane whine comes true at one tenth your claimed disaster - GUESS WHAT? YOU STILL WON'T BE ABLE TO AFFORD THE NVIDIA CARD! LOL. Plus, you're such a sour-pussed, hate-filled amd fan, you won't even want one. ROFL. Man, I can hardly wait till they QUADRUPLE in price.Reply

For all your vitriol against the AMD fans, you're even worse on the NVIDIA side. We've called out both companies for rebranding for years, and we'll continue to do so. I guess I can feel I did my job well when you're here implying I somehow sugar-coated this story to make AMD look good while MrDude is complaining that I missed an amazing announcement.Reply

You do a fine job, I have zero complaints with you. You work under certain pressures and realities. You have lots of work in a hectic schedule, and it's not easy doing loads of benchmarks or coming up with something that has to keep certain powerful parties above in decent shape. No complaints with you; you're doing your job.

I'm just doing my job helping the little posting amd fanboys face reality, something they never want to do, and it looks like they soon won't ever have to if amd falls over and croaks.Reply

What makes you think they're AMD fanboys? Most of us gave up complaining about rebranding after the venerable 8800 went through so many name changes. The thing for reviewers to do is make note of it in their reviews, and right now... this isn't a review and little is known, just a lot of guesswork.Reply

The G80 and its derivatives were the greatest set of gaming cards for the entire world community of ALL TIME. They brought cards in at every price point and a hundred different combinations. They breathed LIFE into computer gaming. The amd fanboys, the sour little turds that they are, did nothing but complain. They deserve every retort and rebuttal they get, and the YEARS of archives here prove EXACTLY who they are. Thanks for not being sentient.Reply

You need to step it down a notch. Not everyone claiming NVIDIA abuses rebranding worse than AMD is an AMD fanboy. I wouldn't be caught dead with a current-generation discrete AMD mobile graphics chip, as NVIDIA currently provides a better user experience if you need discrete graphics. However, I stand by the claims of others that NVIDIA has an especially bad rebrand track record. What chip is the desktop 640 based on? There are multiple versions that come in both Fermi and Kepler.

Personally, I'd just stick with Intel graphics or AMD Fusion, as they're just about as fast as, if not faster than, a fair amount of the discrete chips the OEMs throw in there as a marketing tick box even though there's no benefit to them.Reply

Since most of us here on the site/forum are gamers to some level or degree, all I/we care about with this new announcement is just HOW on Earth AMD is going to compete with Nvidia on the high-end gaming GPUs, if they can at all at this point, due to the latest Enduro fiasco that we all as unfortunate AMD customers can attest to. From what I am seeing in this announcement, there are no substantial improvements coming from AMD in terms of improved gaming (hint: Enduro side-stepping), and Nvidia is still the way to go for 2013 and beyond. As one who is well-acquainted personally with the Enduro fiasco, I am now more than ever looking at swapping out my 7970M and going back to Nvidia in 2013. As the old saying goes: "Once burned, twice shy..."Reply

This looks like the I-can't-believe-it's-not-a-rebrand rebrand. Shift the specs slightly, tweak the die/process to reduce cost/increase yield, fix a couple of bugs, maybe add a couple of nothing 'features', maybe add enough clockspeed to actually differentiate the old from the new on a bar graph; enough to let the Marketing Division get away with calling it "New"... actually, kind of exactly like the car industry does.

Nvidia did it with their 8xxx series, all the way through 9xxx desktop and mobile, then as no GT200 part trickled down to mobile, that continued with the odd GT1xxM, the whole 2xxM series, then a few repeated and tweaked parts in the 3xxM. The GTX260M I had in my last laptop was basically a 55nm version of a desktop 9800 GTX.

TL;DR: It's all just the same old game; the players just shift their tactics slightly in their media dealings.Reply

Battery life varies tremendously based on distro, hardware support, and customization.

For the most part you should expect excellent battery life from ThinkPads, as those are heavily used by developers.

As for game performance, Gabe posted fairly recently about how surprised they were that Ubuntu posted higher fps than Windows (even after using the OGL drivers on Windows). I wouldn't generalize from that, but I wouldn't generalize in the other direction either.Reply