The budget-priced trio features full DirectX 10 support, including pixel and vertex Shader Model 4.0. NVIDIA has yet to reveal shader counts or shader clocks, though. Nevertheless, all three support NVIDIA SLI and PureVideo technologies.

At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz, interfacing with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU; add-in board partners will have to purchase separate EEPROMs with HDCP keys. However, all GeForce 8600GTS-based graphics cards support HDCP.

Supported video connectivity includes dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support a native HDMI output, though manufacturers can adapt one of the DVI outputs for HDMI.

NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration offers more flexibility, letting manufacturers choose between 256MB and 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional: the GPU and reference board design support the required HDCP key EEPROM, but the implementation is up to NVIDIA’s add-in board partners.
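For context, the peak memory bandwidth these specifications imply follows directly from bus width and effective data rate. Here is a back-of-the-envelope sketch (Python purely for the arithmetic; the function is ours, and the 2x factor assumes GDDR3’s double data rate):

```python
# Peak memory bandwidth implied by NVIDIA's stated specs.
# GDDR3 transfers data on both clock edges, so effective rate = 2x clock.

def peak_bandwidth_gb_s(bus_width_bits, memory_clock_mhz):
    """Theoretical peak bandwidth in GB/s for a double-data-rate interface."""
    effective_mt_s = memory_clock_mhz * 2       # MT/s after DDR doubling
    bytes_per_transfer = bus_width_bits // 8    # 128-bit bus -> 16 bytes
    return effective_mt_s * bytes_per_transfer / 1000

print(peak_bandwidth_gb_s(128, 1000))  # GeForce 8600GTS -> 32.0 GB/s
print(peak_bandwidth_gb_s(128, 700))   # GeForce 8600GT  -> 22.4 GB/s
```

In other words, the 8600GT’s lower memory clock costs it roughly 30 percent of the 8600GTS’s peak bandwidth on the same 128-bit bus.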

GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The physical board is smaller as well, measuring in at 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power; NVIDIA rates the maximum board power consumption at 43 watts, 28 watts less than the 8600GTS.

The GeForce 8600GT supports similar video outputs to the 8600GTS; however, the 8600GT does not support video input features.

NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.

Both visually and in specs, this card looks to be a great buy for the 200 USD or so MSRP (if performance holds up, of course). With its estimated/rumoured 64 shaders and 1 GHz (2 GHz effective DDR) memory clock, it looks to trump the X19x0 GT/Pro and the 79x0 GS/GT, possibly even the XTX and GTX models, which would be great. It also features DX10, but let's be realistic: by the time DX10 arrives, the refresh DX10 cards might already be out, and by the time DX10 goes mainstream, these cards will be antiquated.

A lot of you are worried about the 128-bit 'fiasco', but as many others have pointed out, 256-bit PCBs aren't cheap. If you jack up the clocks enough you can hide that deficit, even destroy 256-bit cards, all while maintaining a lower cost. Don't knock 128-bit cards; see the 7600 GT vs. the X1800 GTO.
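That trade-off is easy to verify with the same bandwidth arithmetic as above: bus width and effective data rate multiply, so a 128-bit bus at twice the transfer rate matches a 256-bit bus exactly. A quick illustration (the numbers here are hypothetical, not the specs of any particular card):

```python
# Illustrative only: peak bandwidth = bus width x effective data rate,
# so a narrow bus at high clocks can match a wide bus at low clocks.

def peak_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return effective_mt_s * (bus_width_bits // 8) / 1000

print(peak_bandwidth_gb_s(128, 2000))  # 128-bit @ 2000 MT/s -> 32.0 GB/s
print(peak_bandwidth_gb_s(256, 1000))  # 256-bit @ 1000 MT/s -> 32.0 GB/s
```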

My only question is how this will stack up against the X2600/RV630 high-end offering... My bet is that the ATI offering will cost slightly more while offering slightly better stock performance but less overclocking potential... At least that has been the trend... (See X1650 XT vs. 7600 GT, X1900 GT vs. 7900 GS)

"It looks like the iPhone 4 might be their Vista, and I'm okay with that." -- Microsoft COO Kevin Turner