Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way

AMD's newest R680 graphics processor might look a whole lot
like the ill-fated R600 GPU, but the reality couldn't be more different. Instead of one 80nm behemoth of a GPU, the R680 consists of two
55nm processor cores.

Representatives from AMD would not confirm that the R680 is essentially two
RV670 GPU cores on the same board, though the company did confirm that each
core has the same specifications as an RV670 processor.

The RV670 graphics core, announced last November alongside the Phenom processor, is
the first 55nm desktop graphics adapter. AMD does not position the card as
a high-end part, though reviewers were quick to herald the RV670 as AMD's
best product of 2007.

The company also made quick mention of the RV620 and RV635 GPU cores.
These cores are nearly identical to the previous RV610 and RV630 processors,
but will be produced on the 55nm node instead.

All three of AMD's new GPUs are scheduled to launch next month.

Dual-GPU technology is not new. 3dfx's flagship Voodoo 5 family also
resorted to multiple processors to achieve its relatively high
performance. ASUS, Gigabyte, Sapphire, HIS and PowerColor all
introduced dual-GPU configurations of just about every graphics processor on
the market, though these were never "sanctioned" ATI or NVIDIA
projects. Ultimately, all of these projects were canned due to long
development times and low demand.

Cross-state rival NVIDIA isn't sitting idle,
either. The company publicly announced plans to replace all 90nm
G80 graphics cores with G92 derivatives by the end of the year. G92's
debut, the GeForce 8800 GT, met with wild support from reviewers and
analysts alike. G92's second outing, the GeForce 8800 GTS 512MB, was met
with similar but less enthusiastic acceptance during Tuesday's launch.

NVIDIA's newest roadmap claims the DirectX 10.1 family of 65nm processors will
also hit store shelves this spring. The chipsets -- codenamed D9E, D9M
and D9P -- are architecturally different from the G80/G92 family.

Comments


Lots of people forget that ATI video cards used to be complete garbage. The entire Rage line was trash compared to 3dfx's Voodoo cards and even NVIDIA's TNT line.

So why did ATI live to fight another day? Because all sorts of Gateways and Compaqs and Macs had ATI video cards in them. They were king of cheap video for the mass market.

Incidentally, for those who aren't aware, the Rage Fury MAXX was a dual-GPU video card in the days when people were getting away from the original SLI (Voodoo2 supported SLI), in favor of bigger/better/faster single chips. I don't remember whether the Fury MAXX was competing with the TNT2 Ultra or the GeForce (the first one), but whichever it was, that card blew the Fury MAXX (and all other available cards) out of the water, because although multiple GPUs pushed a lot of pixels, the MAXX didn't have any innovative technology.

The MAXX was actually released after the GeForce 256 to try to compete (even though it had no T&L support), and it did actually beat the GeForce in a few tests, but godawful driver issues basically made the card an overpriced piece of crap.

Also, what the GPU handles in the rendering pipeline has increased with each generation. Saying that something cannot compete because it lacks one feature is a crass oversimplification, like claiming that the Radeon X1950 XT is outperformed by the Radeon HD 2400 PRO because it lacks DirectX 10 support.