
How does H.264 video decoding work? Do I need a special PowerDVD with special nvidia drivers for this to work, or can I simply get the latest nvidia drivers and use whatever media player I want to take advantage of this?

There should be an 8700 series to bridge the gap, since the 8600 is a disappointment. Gone are the days when you could get a $150 card and get the performance of a $300+ card: the Ti4200 (overclock to Ti4600), the Radeon 9500 (softmod to 9700), etc.!

Does anyone else think that we enthusiasts are expecting too much out of nVidia? The 8600 line clearly beats its 7600 counterpart; I think it might be expecting too much for the G84 to handily outdo a 7900 or an X1950. These are the BEST of last year, people! The best their engineers could do, only a year ago or less! Why should we expect a MID-RANGE card to wallop the HIGH END of last year?

quote:Traditionally, successful performance-mainstream parts at $199 price points offered a performance level similar to former flagship offerings released a year or a little more before. This was the case with the GeForce 6600 GT, which could easily outperform the Radeon 9800 XT; the same was true for the GeForce 7600 GT, which could offer the performance of the GeForce 7800 GT at a much lower price point; the Radeon X1950 Pro outperformed even the Radeon X1800 XT in certain cases, while the GeForce 7900 GS provided the same level of speed in games as the GeForce 7800 GTX. When it comes to the GeForce 8600 GTS, we cannot see it leaving the GeForce 7950 GT behind, not to mention the more powerful GeForce 7900 GTX.

quote:At the same time, we wouldn't recommend against the 8600 series, as it does provide best-in-class video decode performance that will enable more computer owners to experience HD content without dropping frames.

Have you actually tested this? Seeing as it was bloody madness to get it working even half-assedly on 6xxx and 7xxx cards (not even counting the fact that nvidia broke support for it in every other driver release, without disclosing that fact or even acknowledging that they had broken it), it seems not so smart to just take it at face value.

Please explain to me why I need 100% h.264 decoding in hardware when there are free decoders that already do that (and nvidia charges for these cards as if you were buying a rocket). The most important thing is that you'll presumably be using these cards in relatively new PCs that can already play h.264 without a single dropped frame. See, DX10 is for Vista and h.264 is for HD video, so it's likely the user already has a really good PC.

Are you going to include any heat, power usage, or noise levels in this eventually? Or is there a future article coming soon?

Very disappointed in the 8600 :( But still interested to see how it compares in heat, power, and noise to the 8800. Also, why no 8800 GTX or GTS 640 in the benchmarks? It would be interesting to see how much more you get in games for spending twice the money.

There are already Anandtech articles comparing the GTS 320MB to the other 8800 cards. Plus, an 8800GTX would have squashed the scale of the cheaper cards too much to see the differences in the graphs.

Normally Anandtech will talk about heat, power, and noise when they review a specific manufacturer's card, rather than a reference design. We all know most retail parts will use the same cooler nVidia supplies with the reference part, but some might not, and that (or overclocking) would change the results.

How much longer until we get video benchmarks? Also, PLEASE include an 8500 in that roundup. I've no interest in any kind of 3D capability from the 8500; it will go in an HTPC running only Vista MCE, with no games or anything else on it.

I've been holding off on buying one until I can see how one performs in an HTPC environment. Please make that happen, even if you have to buy one off the Egg or something.

I paid £72 for a Sparkle 8600GTS, which after tax and conversion is ~$122. Admittedly I ordered an 8600GT and they sent me the GTS by mistake ... but graphics cards have been notoriously overpriced in the UK, so it was by far the best option regardless.
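For the curious, the pound-to-dollar figure above roughly checks out. A quick sketch, assuming the 2007 UK VAT rate of 17.5% and an exchange rate of about $1.99 per pound (both are my assumptions, not figures from the post):

```python
# Rough check of the ~$122 figure: strip UK VAT from the sticker
# price, then convert pounds to dollars. The VAT rate and exchange
# rate below are assumptions, not numbers from the post.
VAT = 0.175          # assumed 2007 UK VAT rate
GBP_TO_USD = 1.99    # assumed 2007-era exchange rate

price_gbp_inc_vat = 72.0
price_gbp_ex_vat = price_gbp_inc_vat / (1 + VAT)  # remove the tax
price_usd = price_gbp_ex_vat * GBP_TO_USD         # convert currency

print(round(price_usd))  # ~122
```

Under those assumptions the ex-VAT price lands right around $122, matching the poster's estimate.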

As I'm upgrading from a Radeon 9500 system, I don't think I'll be too upset, and the video offload will be very much appreciated.

But yes, the prices should have been lower, and the suffixes different.

The 8600GT should have been a GS @ $129, the 8600GTS should have been the GT @ $169, and there should have been a 256-bit memory bus GTS at $199 (with 48 or 64 shaders).

Regardless, I think the cards will be at these price points within a couple of months, with either an 8800GS or an 8600GTX card added. It's just a complete non-upgrade option for 7600 owners.

With ATI lacking a new high-end card, their existing high-end card has been pushed into the mid-range pricing model.

The bottom line is the 8600GTS is a replacement for the 7600GT in Nvidia's lineup for the mid range. Both cards debuted in the same price range, and both performed nearly as well as the previous generation's high end.

In a perfect world the X1900 wouldn't be in that price range, but this is all they have to offer. Comparing the 8600GTS to the 7600GT, which it was designed to replace, it's a no-brainer which one wins. The market situation will correct itself once the 7900s flush out of the channel and AMD comes out with their own mid-range card and drops the X1900s from their lineup.

That being said, I think Nvidia most likely has a card to fill the performance gap between the 8600GTS and the 8800GTS 320.

The X1900XT 256 has been available for a little over $200 since even before the release of the 8800 series, never mind the 8600 series. Nvidia must take its competitors' products into account, because DX10 capability alone will not sell these cards.

The reason the 6600GT was such a success is not that it compared favorably to the mediocre FX5700, but mainly that it offered better performance than anything else in its price range, even more expensive cards like the 9800 Pro. This is clearly not the case with the 8600GT and GTS. The fact that it beats a 7600GT is nothing exceptional.

AMD/ATI did the same thing the last time around: they left a huge gap between midrange and high end that nvidia handily filled with the 7600 series, and one they barely filled (X1600XT) before moving on to a whole new generation. It really hurt AMD/ATI last time around, and if nvidia isn't careful, they could get burned this generation too.

Here's to hoping AMD/ATI and nvidia both have great ~$130 DX10 cards later this year...

Thank you for confirming the mediocrity of these cards. It is good to see a nice unbiased review that does not sugar-coat the pathetic results these cards are posting in some games and the so-so to just-okay results in other games.

The fact remains a new generation of cards should outperform the previous generation at the same price/performance level.

The 8600GT needs to CONSISTENTLY outperform the 7600GT, the 8600GTS needs to CONSISTENTLY outperform the 7900GS/7900GT, and neither of these cards can manage that because they are cut down too far. A 256-bit bus and 64 stream processors on the GTS at the same price point would have made it a success and ensured NVidia victory across the board.

Which would basically nullify any production cost advantage the 8600 line has. If you add a 256-bit PCB plus the increase to 64 SPs, you have a card that costs more than the 7900 line to make. There wouldn't be any point: Nvidia would have to eat the costs, still sell at the mainstream price points, and make less money in the end.

I'd love to see some video benchmarks using CPUs that the users of these cards would most likely have. If I were building a system with an 8600-based card, I would probably match it with an E4300, E4400, or 5000+.

I got your back. If you examine hardware surveys (e.g. Steam's, which uses over a million unique samples), most people aren't using top-dollar processors or 2GB of Corsair RAM (pricey stuff). I realize the obvious benefits of using this kind of hardware for benching, cross-standardization being one, but most of us can't really afford these kinds of systems.

It'd be nice to see "modest" benchmarks on an entry-level AM2 machine, or even better, a socket 939 one? Pentium equivalents as well?

The number one reason nVidia has had more success than ATI in the past few generations is their superior midrange cards. (The number two reason is product delays with new high-end part introductions, but that isn't as severe, since the real high-end market segment will just buy the new cards anyway.) nVidia has clearly dropped the ball here, and these new benches confirm what the last review seemed to indicate: that unless AMD is totally asleep, they can nail nVidia with the Radeon 2600 line.

"The problem is that there is a huge performance gap between the 8600 GTS and the 8800 GTS 320MB." - This is the essence of the problem: the much better card is just priced much too close to these parts. The overall bottom line might have actually been better if nVidia hadn't released the 320MB part at all.

"We also have multiple cases where NVIDIA's new offerings perform lower than similarly priced hardware from their own previous generation hardware." - This is when it becomes outright embarrassing. It's been a long time, if there ever was a time, when this was true for the boys in green.

I'll say it again: a 64-shader part at $200 will come quickly if AMD's competing parts are any good.

Actually, me too. Now that the 8000 series is on a uniquely (to PCs, anyhow) unified-shader architecture, it seems that nVidia has a chance to re-invent SLi.

Imagine an SLi engine that didn't simply split the workload into half-frames or every other frame. What if it simply pooled the shader resources for each frame? DX10 seems to give programmers a high degree of freedom (threading physics to the GPU, storing entire programs in onboard memory, etc.); maybe nVidia could fashion a special version of SLi geared for DX10?

I dunno. Just an idea. I realize such engineering is much easier said than done.

Pooling the shader resources of a pair of 8600GT/GTS cards would still only give 64 shaders in total, compared with the 96 of a single 8800GTS. No amount of improving pixel-shader efficiency in SLI is going to make a pair of 8600s faster than the 8800GTS.
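The shader-count argument above can be put in rough numbers. A back-of-the-envelope sketch, assuming the commonly cited specs of 32 stream processors at a 1450 MHz shader clock for the 8600 GTS and 96 at 1188 MHz for the 8800 GTS (those clock figures are my assumptions, not from the thread), and ignoring SLI overhead, memory bandwidth, and ROP differences:

```python
# Back-of-the-envelope shader throughput: two 8600 GTS pooled in SLI
# vs one 8800 GTS. Shader clocks below are assumed spec figures; this
# ignores SLI overhead, memory bandwidth, and ROP count entirely.
def shader_ops(num_sps: int, shader_mhz: int) -> int:
    """Relative shader throughput: stream processors x shader clock."""
    return num_sps * shader_mhz

pair_8600gts = 2 * shader_ops(32, 1450)  # 64 SPs pooled across two cards
one_8800gts = shader_ops(96, 1188)       # 96 SPs on a single card

print(pair_8600gts, one_8800gts)   # 92800 vs 114048
print(pair_8600gts < one_8800gts)  # True: the pair still trails
```

Even with perfect pooling and a higher shader clock per card, the pair comes up short of a single 8800GTS on raw shader throughput, before SLI overhead is even counted.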

It would be much worse than the 8800GTS it would be priced against, plus it would require a more expensive motherboard, show a lack of performance improvement in some games, and probably make more heat and noise. SLI is ideal for the top end, not the midrange.

The thing I thought of was: 'wow, it took this long for nVidia to make a card that performs on par with the ATI X1950s?'

Yes, I understand the NV 8800 series is top dog, but look at the price difference right now.

Anyhow, I would have to agree: compared to the older 8800s these are much weaker, but there is a niche for everyone/everything, as not everyone can afford $300+ for a good video card, and these seem like they would fill the general-purpose niche very well, not to mention play back HD content decently.

After seeing how many NV 6200s have come through our shops here, I have very little doubt that Dell/eMachines owners nationwide will be gobbling these up left and right . . .

The MSI 8600GTS OC was in stock @ $189.99 on ZipZoomFly the day of release and is still available from stock. (The MSI 8500GT was also in stock at ZZF the day of release of these cards @ $99.99, but is now out of stock.)