The budget-priced trio features full support for DirectX 10, including Shader Model 4.0 pixel and vertex shaders. NVIDIA has yet to reveal shader counts or shader clocks, though. Nevertheless, the trio supports NVIDIA SLI and PureVideo technologies.

At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz; the memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU, so add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards support HDCP.

Supported video outputs include dual dual-link DVI, VGA, and SDTV/HDTV; the cards also accept analog video input. G84-based GPUs do not support a native HDMI output, though manufacturers can adapt one of the DVI outputs for HDMI.

NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration is more flexible, letting manufacturers choose between 256MB and 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional: the GPU and reference board design accommodate the required HDCP-key EEPROM, but the implementation is up to NVIDIA’s add-in board partners.
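Since both cards share a 128-bit interface, peak memory bandwidth follows directly from the memory clock. As a rough back-of-the-envelope sketch in Python, assuming NVIDIA's quoted figures are base clocks and that GDDR3, being double data rate, transfers on both clock edges:

    # Peak memory bandwidth = memory clock x 2 (DDR) x bus width in bytes.
    BUS_WIDTH_BITS = 128  # both the 8600GTS and 8600GT use a 128-bit bus

    def peak_bandwidth_gbs(memory_clock_mhz):
        """Return theoretical peak bandwidth in GB/s."""
        transfers_per_sec = memory_clock_mhz * 1e6 * 2  # two transfers per clock
        bytes_per_transfer = BUS_WIDTH_BITS / 8         # 128 bits = 16 bytes
        return transfers_per_sec * bytes_per_transfer / 1e9

    print(peak_bandwidth_gbs(1000))  # 8600GTS: 32.0 GB/s
    print(peak_bandwidth_gbs(700))   # 8600GT:  22.4 GB/s

That puts the two cards at 32.0 GB/s and 22.4 GB/s of theoretical peak bandwidth, respectively.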

GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the 8600GTS's eight-layer PCB. The physical board is smaller too, measuring 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts, 28 watts less than the 8600GTS's 71-watt rating.

The GeForce 8600GT supports the same video outputs as the 8600GTS; however, the 8600GT does not support video input.

NVIDIA has revealed very little about the GeForce 8500GT beyond its support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA, and TV outputs as well.

May I suggest that we refrain from judging performance until there are reliable benchmarks?

Nobody here has hardware ESP (and that's NOT the Electronic Stability Program on your Saab); you can't judge performance based on incomplete technical specs. That would require intimate knowledge of the G80 architecture that we just don't have.

Furthermore, nobody from NVIDIA ever said that the 8600s would have earth-shattering performance. Historically, the top-end midrange card of the new gen (e.g. 7600GT) is right on par with the high-end cards of the last gen (e.g. 6800GT) but with more advanced shader technology. So if you're disappointed that the 8600GTS won't trounce the 7900GTX, you've got only yourself to blame.

Previously that was true, but the G80 architecture was touted as such a huge step forward that it wasn't unreasonable to assume the midrange would offer a bigger performance boost than in the past.

There is zero reason to buy these cards now, though; you're just shooting yourself in the arse, paying a premium for DX10 that isn't even usable yet, without getting a significant boost in DX9.

Dammit, DAAMIT, hurry up and release R600 so maybe the 8800GTX price drops out of the stratosphere.

So $550 to $570 is in the stratosphere for a top-performing 8800GTX? Or $300 is in the stratosphere for a great-performing 8800GTS 320MB? If that's the case, you're in the wrong business being a computer enthusiast or PC gamer. Why does everyone expect a MIDRANGE card to perform so well? If you're worried, buy the 8800GTS 320MB, which can be had for a good price. If you can't, wait for numbers from the midrange, because I can bet they will work JUST FINE at 1280x1024 on most new and upcoming games.