The specifications of AMD’s next generation of graphics cards seem to have leaked. The HD 7900 series (HD 7950, HD 7970 and HD 7990) will most likely launch in Q1 2012 on a 28nm process. The GPU, codenamed Tahiti (PRO and XT), will use a new architecture called GCN, for Graphics Core Next (the current HD 6900 is based on a VLIW4 architecture). The other important change in the HD 7900 series is the presence of XDR2 memory in place of GDDR5; XDR2 is reportedly twice as fast as GDDR5. The Radeon HD 7970 will have 2048 shader cores (versus 1536 for the HD 6970), 128 texture units and 64 ROPs, and its power consumption will be 190W, 60W less than the HD 6970.

But before releasing the HD 7900 series, AMD will produce the GPUs of the HD 7800 series (HD 7870, HD 7850 and HD 7650), planned for the end of 2011. The Radeon HD 7870 will be powered by the Thames XT GPU (28nm, VLIW4 architecture, 1536 shader cores) and will be available with 2048MB of GDDR5 memory.

@Anon – I already have one 😉 And the point of money is not just to spend it… it’s to spend it wisely 😛 Of course the GTX 580 > HD 6970 in pure performance at stock clocks, but not everywhere and not always 😉

These numbers look too good to be true for the low-end cards. The Radeon 7570 with 16 ROPs and a power consumption of 50W?!?! Similarly specced cards in past generations (the 4600 and 5500 series) used 8 ROPs, with NVIDIA going down to 4 ROPs on their 430, 440 and 520 series. Sure, it is possible that this improvement can be attributed to the 28nm process, but I’d be surprised if these cards end up priced in the $80–100 range.

According to documents circulating around the Internet, only the high-end GPUs will be manufactured on the 28nm process. The lower-end GPUs will be manufactured on the 40nm process. Thus, expect low-end cards with specs similar to AMD’s past two generations (definitely not 16 ROPs).

@pr0or1337 – other people report the opposite experience, so it’s probably something on your end or a defect with that particular card, not the whole line from AMD (or even NVIDIA, for that matter). Also, game devs don’t always really optimize for AMD…

Thanks for posting, I had looked for an AMD roadmap and didn’t find anything. I’m going to upgrade my Radeon 4850, but the Radeon 6950 and 560 Ti, and even the 6970 and GTX 580 which I looked at, are weak in today’s games like Metro 2033, so I want something more powerful. I’m gonna wait for the 7900s.

@Nuk3d – most “optimizations” in games for a specific vendor aren’t done by tweaking the game engine but by tweaking the drivers. Thanks to its partnership program, NV gets early access to game builds and can optimize its drivers to run a title smoothly even BEFORE it’s released. Then, a few months after release, AMD ships tweaked drivers with a performance boost in that title, while NV can’t boost it any further. Of course, each side can throw obstacles at the other… AMD is no good at tessellating a whole scene with high factors – kill performance on Radeons by using it. NV is slightly worse at pixel-shader post-processing – apply enough effects and the 6970 pulls ahead of the GTX 580… That’s how it works nowadays 🙁

It looks like a nice GPU. Too bad that ATI has had so many bugs and so much incompatibility in games due to NVIDIA owning the whole market. Anyway, I think it is a nice step forward in performance, and this will push NVIDIA to make even nicer GPUs. And finally they are taking power consumption into account!!

And for all the fanboys… stop fighting. Everyone can buy the GPU they like most. I use NVIDIA because most of the games are optimized and made for NVIDIA, besides all the features I get in 3D design with NVIDIA’s CUDA, PhysX, etc. But if ATI offered something better, for sure I’d choose ATI. ATI is nice and cheap, good for gamers without too much money. NVIDIA is more for gaming enthusiasts and people who want to use the GPU for more than just playing games.

And remember: the more GPU-vs-GPU competition there is, the cheaper and better the products we get. So let them fight it out and give us some nice GPUs next year 😀

Wow, the kind of stuff people spew about cards is ridiculous. I have used both companies’ products and they’re both great. That being said, I am currently using AMD because after 6 NVIDIA cards dying and 0 ATI/AMD cards dying, I figured I was wasting my money. Not saying NVIDIA is to blame, because none of them were reference cards, but still. Also, I have had driver issues with both brands and lost several cards to those infamous 196.75 ForceWare drivers that were known to kill cards. Never had that issue with ATI/AMD drivers. Both companies have failed their customers in some way or another, but sadly it’s a duopoly, so pick your side and move on. They all perform similarly at the same price points, with give and take on features. Several things stand out from reading everyone’s comments:
1) Both companies have been known to reuse previous-gen architecture, so don’t even try to use that as an excuse against the other.
2) Most cards can’t handle DX11 worth a damn. Unless you’re at the high end or running multiple cards, you’re not getting ideal performance.
3) Games like Metro 2033 shouldn’t be used as a baseline for evaluating your next purchase unless you plan on playing it all the time. It’s a great benchmark and looks amazing, but it’s not realistic to use it to judge the future of performance in DX11 titles.
4) PhysX is a zombie; NVIDIA is moving away from it, at least in the form we know it now.
5) The argument about NVIDIA being better for things other than gaming makes me assume people are talking about OpenCL or general-purpose computing. Both companies have strong showings in different applications; for example, Bitcoin mining favors AMD very strongly, while Folding@home favors NVIDIA very strongly.
6) Be happy there is more than one player in the graphics card market. If there weren’t competition, consumers would suffer. You like NVIDIA? Great! You love AMD? Fantastic. Just realize that both companies have their strengths and weaknesses.

Yay! Now if only game developers could release some games that could actually utilize the speed of these GPUs. Since every AAA game is a console port these days, I feel I will be safe with my 5870 for quite some time to come. Even my old 9800GTX can max out most new games, even with antialiasing. The only reason you would want one of these is for multi-monitor setups with really huge resolutions like 5760×1080.

lol, the NVIDIA 500 series was the only proper series in years from that company. And now to correct some fanboyism:
Game optimization has nothing to do with drivers; it has to do with the game itself. That’s why so many multiplatform games run like crap on the PS3, and then people say “lulz, the PS3 sucks” because it has less processing power. I’m not even bothering to compare the specs – they’ve been public for years, and it’s more than clear from both the FLOPS and the CPU specs which one actually does better. Also, if you stop for a minute and think about how dirty NVIDIA plays – even disabling features such as AA in NVIDIA-sponsored games like Batman as soon as an ATI card is detected – I think it’s clearer than ever: NVIDIA is a lot of hype and a lot of crap talk. Same drill on Android.
NVIDIA does indeed have higher performance with the 580 vs the HD 6970, but NVIDIA’s AA and AF are still some sort of cheap blurring filters compared to ATI’s AA & AF.
The only card I actually disliked from ATI was the very first version of the 4870 available on the market. It died on me so quickly because the VRM couldn’t keep the card’s power demand in order, reaching 93°C on the VRM while the GPU was at 57°C; it actually died during a Windows 7 installation.