
26 Comments

I don't know about a 7800Ultra, but it looks like the Quadro 4500 (based on the 7800) might be on for a SIGGRAPH launch. Since the 4400's a 512MB card, I doubt the 4500 will be a 256MB one. And hopefully *that* will bode well for a 512MB consumer card.

Fingers crossed.

Mind you, if the 4500 is a 7800GTX-based card (as the 3400 is a 6800GT-based card), perhaps there'll be a 5500 (7800 Ultra-based) in the manner of the 4400. By which point, presumably, people will have stopped selling GeForce 5500 cards, or it's going to get confusing (other than the factor of a hundred in the price).

I understand the eVGA 7800GTX card (unusually) has a dual-link DVI connection. Since this was a feature which seemed to cause a lot of confusion among 6800-series card manufacturers, I just wondered if the reviewers (or anyone else) had a chance to test it? If it *is* dual-link, is an external TMDS transmitter used? And what's the quality of nVidia's TMDS transmitter implementation this time round (reports on the 6800 series were critical)?

The 7800GTX is probably the best card out there for rendering at the resolutions supported by an IBM T221 or Apple's 30" cinema display - although I'm inclined to wait for a 512MB version for my T221; it would be good to know whether it's capable of driving one.

Regarding the card's noise output and the way you measured the sound:
"We had to do this because we were unable to turn on the graphics card's fan without turning on the system."

Would it be possible to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with these voltages while the system is off, using a separate power supply such as a battery (which is silent) and a variable resistor.

It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach the maximum speed) on cards whose fan speed varies with load. Instead of trying to measure the noise with the system on, again log the fan voltage over time, then use your battery, variable resistor and voltmeter to recreate those voltages, and combine the results with the voltage/time data to produce noise/time data.
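The two-step reconstruction suggested above (log fan voltage over time in-system, then map voltage to noise using the silent battery rig) could be combined roughly like this sketch; all the calibration and log numbers are made-up placeholders, not measurements:

```python
from bisect import bisect_left

def interp(x, xs, ys):
    """Linear interpolation of y at x over sorted sample points (xs, ys)."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, x)
    t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

# Hypothetical calibration: fan voltage (V) -> noise (dBA), measured with
# the battery/variable-resistor rig while the system is off (i.e. silent).
cal_v  = [5.0, 7.0, 9.0, 12.0]
cal_db = [28.0, 33.0, 38.0, 45.0]

# Hypothetical in-system log: time (s) -> fan voltage (V) from idle to load.
log_t = [0, 120, 300, 600]
log_v = [5.0, 8.0, 11.0, 12.0]

# Combine the two: noise at each logged time via the voltage->dB calibration.
noise_over_time = [(t, interp(v, cal_v, cal_db)) for t, v in zip(log_t, log_v)]
```

The point of the indirection is that the microphone only ever hears the fan on the silent battery rig, never the rest of the system.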

Just look at the original review to see how the 7800GTX compares with older cards; it covered a lot more games and a wider range of settings.

This series of articles is a comparison of 7800GTX cards and is meant to focus on the differences between them. We all know a 7800GTX is faster than a 6800 Ultra, so there is no point including that in the graphs.

Any word from Derek or Josh(?) as to why AA was not enabled in the tests? It would certainly be a lot more meaningful than the current set of results at resolutions as high as 2048x1536 without AA, where the lowest average framerate in any game is over 70fps. The argument that 4fps more is worthwhile because it is an extra 240 frames per minute is one of the daftest things I've read in a gfx card review.

Unless you include minimum framerates (and ideally a framerate graph like [H] do) and comment on playability at different resolutions and AA settings, remarks about an overclocked card getting 76fps being worthwhile over the non-overclocked one only managing 72fps are ludicrous. I bet you couldn't even tell the two apart in a test where you weren't told which was which. Turn on 4x AA and let's see how they stand up. It may come down more to memory bandwidth, but that's okay. I'm sure some manufacturer will use Samsung's 1.4ns (1400MHz) chips, or at least their 1.5ns (1333MHz) chips, sooner or later, assuming the core and circuit board are up to handling those speeds.
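Those memory speed ratings come straight from the chips' access time: each cycle takes the rated access time, and DDR transfers twice per cycle, so effective rate ≈ 2 / access time. A quick sanity check (the rounding down to marketing numbers like "1400MHz" is the manufacturer's, not exact):

```python
def ddr_effective_mhz(access_time_ns):
    """Effective DDR data rate in MHz for a RAM access time given in ns."""
    cycle_mhz = 1000.0 / access_time_ns   # actual clock frequency
    return 2 * cycle_mhz                  # DDR: two transfers per clock

# 1.4 ns chips -> ~1429 MHz effective (sold as "1400MHz")
# 1.5 ns chips -> ~1333 MHz effective
```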

On page 5 (Heat, Power and Noise) it says under the first graph:
-----
As you can see in the graph, there's no difference in the temperature of the reference card and the EVGA e-GeForce 7800 GTX in normal mode, and it only went up by one degree when we overclocked it.
-----
...that's not quite right: in fact both e-GeForces were at 81C (overclocked or not), whereas the reference card was at 80C.

#9 and #17: If you want to see the numbers, maybe you should go read the original 7800GTX review. These are just vendor cards, and they are comparing the vendors' performance, which is always going to differ by a couple of frames here and there. It's useless to include 6800 and ATI cards in there.

I was reading this article http://www.digit-life.com/news.html?119317 and would be interested to know if AnandTech has any comments on it. According to that, if I understood properly, it is not possible to change the clocks in 1MHz steps, but only in 27MHz steps. And that different parts of the core work at different frequencies.

It will be interesting to see how this changes OCing. New challenges, yes :-) !!
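If the hardware really does quantize clocks in 27MHz steps, then the clock actually applied for any requested value would look something like this sketch; the step size and nearest-multiple rounding are assumptions based on the linked article, not confirmed driver behavior:

```python
STEP_MHZ = 27  # assumed quantization step from the digit-life article

def applied_clock(requested_mhz, step=STEP_MHZ):
    """Round a requested clock to the nearest achievable multiple of `step`."""
    return step * round(requested_mhz / step)

# Under this assumption, requesting 440 MHz would actually land on
# 432 MHz (16 * 27), and nothing changes until the request crosses
# the midpoint up to 459 MHz (17 * 27).
```

That would mean a lot of small overclocking tweaks in control panels do nothing at all until they cross a step boundary.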

Yeah, OCing is an addiction... I still OC my 6800s a bit to play CSS even though I don't have to every time I play (I never leave it set at startup, for stability reasons)... but I can't play if my cards aren't OCed.

Anyway, AWESOME article and idea for it, guys! Can't wait to see BFG's.

If I were going to spend about $600 on a gfx card, I think I would enable at least 4x AA in games that work correctly with it. If necessary, drop down to 1600x1200 from the tested 1920x1440 or 2048x1536; my monitor does support 2048x1536 at a good refresh rate (85Hz), but the jaggies are far more noticeable at 2048x1536 without AA than they are at 1600x1200 with 4x AA.

I can't believe they put up all those charts and failed to compare it to ANYTHING other than a 7000 series card. As #2 stated, exactly how many people are going to overclock this card? How many people care?

I can tell you a lot of people care to know how well it fares against a 6800.

#5, thanks for the comment on your card and its overclocking ability... some people just don't know how to comprehend the written word... maybe they need to get their shit together before they make themselves look silly.