
43 Comments

Sorry if this was already mentioned in the article, but is there a reason that there were no pictures of the actual card? Were there distinguishing marks/logos on the card that would have revealed to Nvidia which company had leaked the card, or was it just due to legal issues?

Anandtech listed the Radeon X800XL among the cards in this test (I saw this last week too, but didn't mention it then) and then didn't actually benchmark it. I'm getting really tired of it. It seems like a simple thing, but a lot of people purchased the X800XL as a performance/value solution, and the comparison is worth it to us. I'd like to see Anandtech test with it, but if you decide not to, REMOVE IT FROM THE LIST OF CARDS IN YOUR TESTING SO THAT NO ONE IS LED ON. Don't you guys have ANY PROOFREADERS?

(pauses, takes several deep breaths) Rant over. Carry on.

P.S. On looking closer, it appears that your tests include the X1800XL, which is not listed in the test specs. Perhaps the specs have a typo, but if so, it's rather an egregious one... maybe you could correct it?

If a hardware/software combination has a problem, isn't it even more important that we report it? This is an accurate measure of the performance of the different parts with equivalent settings. We checked and rechecked our benchmarks this time, and there's no doubt that R4xx hardware runs B&W2 better than R5xx hardware with the latest ATI drivers.

This is a problem and we hope ATI or Lionhead will take notice and fix the issue.

The problem is that both a lower product and a higher product have the exact same memory bandwidth of 32GB/s, so it doesn't look like they have much flexibility with the bandwidth. Unless you want to pull an ATI and have memory bandwidth at different levels across the entire line, like the X800 series.

It's a tough sell to even claim that the 7800 GS will ship with this pipeline and clock speed configuration, let alone to say that retail units will function similarly to this random early sample we happened to get our hands on.

Up to 8 pipelines could be unlocked on this unit with RivaTuner, but if NVIDIA starts building G70-based silicon with only 16 pipes, that capability could go away. And just because something can be unlocked doesn't mean it will work.

Finding out how overclockable this part will be is a big reason we want to see this card make it to market.

My opinion on the subject is that the truth doesn't change if someone decides to believe something else. And if you can't take us at our word in our articles, then there's nothing I can say that will change your mind in our comments.

I would hope that our track record has proven us to be trustworthy and plain-spoken with our readers. But that doesn't mean I want people to stop questioning us and keeping us honest.

One thing I've been wondering about your benchmarking: some time ago there was a question of whether average framerate should be the sole basis for comparison (IQ issues aside). The argument was that if a card really chokes on certain parts of a scene, having a 10% higher average framerate isn't exactly great. Would it be possible to do a graph of the frames so we could see how much they vary? Or is this not really an issue today?

It does matter to some extent, but it is very difficult to do a good job of representing this data. Graphs of instantaneous framerates over time are next to useless in our opinion. It's too hard to actually tell what's going on, and way too difficult to properly compare a number of cards without totally destroying readability.

I keep voting for boxplots but there's still some debate about whether teaching our readers about statistical analysis is a good idea :-)

I think something that can show outliers, min, max, lower and upper quartiles, and median in as much space as a single bar in a bar graph gets my vote as a good thing. The huge problem is that our graphing engine doesn't support anything nearly this robust. We have been trying to augment our bar graphs with some line graphs generated using a spreadsheet program to show resolution scaling, but I still haven't found a good easy way to generate nice looking boxplots.
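For what it's worth, the five-number summary behind a boxplot is easy to compute from raw per-frame data with nothing but the Python standard library. A minimal sketch (the framerate samples below are invented purely for illustration, not real benchmark numbers):

```python
import statistics

# Invented per-frame framerate samples for one hypothetical card, in FPS.
frames = [22, 35, 41, 44, 48, 52, 55, 58, 63, 95]

# statistics.quantiles with n=4 returns the three quartile cut points.
q1, median, q3 = statistics.quantiles(frames, n=4)

summary = {
    "min": min(frames),
    "lower quartile": q1,
    "median": median,
    "upper quartile": q3,
    "max": max(frames),
}
print(summary)
```

The default `method='exclusive'` interpolates between samples, which is fine for per-frame data; with these values the lower quartile lands around 39.5 FPS even though the single worst frame dipped to 22.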

Boxplots are an EXTREMELY useful idea, especially the lower-quartile framerate. In fact, the lower quartile may be more useful than the average framerate.

The lower quartile shows you how smooth the game is in the "slower" half of the time -- which makes or breaks a game's playability and enjoyment.

Suppose card A averages 60FPS, but during the worst half of the time it only averages 20FPS. Card B averages 50FPS but is more efficient in relatively complex scenes: 35FPS in its worst half. Most likely, you want card B, despite its lower "average" framerate. What good is card A when its best 50% of scenes average 100FPS while its worst 50% average 20FPS? A better choice is probably card B, which might average 65FPS in the best 50% and 35FPS in the worst 50%. But average framerates don't show this!

Posting min and max framerates (the box whiskers) is useless, since they will be meaningless (and probably random) freak outliers. Min SUSTAINED framerate is good, and it happens to be closely related to the lower-quartile number.
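To put rough numbers on the card A vs. card B argument above, here's a minimal Python sketch. The per-frame samples are invented to match the spirit of the example (card A: high peaks, bad lows; card B: steadier):

```python
import statistics

def lower_quartile(fps_samples):
    """First quartile: the framerate the card stays above roughly 75% of the time."""
    return statistics.quantiles(fps_samples, n=4)[0]

# Invented samples. Card A chokes half the time but posts huge peaks;
# card B is slower on average but steadier in complex scenes.
card_a = [15, 18, 20, 22, 25, 90, 100, 105, 110, 120]
card_b = [32, 34, 36, 38, 45, 55, 60, 65, 68, 72]

for name, fps in (("A", card_a), ("B", card_b)):
    print(name, "average:", statistics.mean(fps),
          "lower quartile:", lower_quartile(fps))
```

Card A wins on average FPS, but its lower quartile sits far below card B's, which is exactly the gap a bar of average framerates hides.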

I reckon if you started to introduce boxplots alongside your standard graphs, readers would catch on pretty fast. People who want the extra information will take the time to work out what it means, and people who don't care can just look at the standard graphs. Or just stick a massive arrow pointing to the mean and leave the rest of the diagram a little more faded.

My only complaint about graphs like these, against cards like these, is that they make a part like the 7800GS seem downright midrange and mainstream... and I guess for those enthusiasts who shift their gaming budget towards the video card, that is probably realistic. But I'd like to see graphs that better reflect what is actually out there being used by gamers today, i.e. if most of us have 6600GTs, then that would be a good comparison point!

So basically, AnandTech is now reduced to being NVIDIA's marketing arm for gauging public interest in a new card. Nice "leaked" card from "unknown" sources... my ass. Let's not kid ourselves; we're not that stupid.

Despite what you might think, I can guarantee that this article is not at all sanctioned by NVIDIA. (I'd go so far as to say they're probably really unhappy that the info got leaked to us.) I wouldn't say the same for whoever provided the card, though - some company like ASUS, Leadtek, etc. (no, I don't know where it came from) might have slipped us the card to see the reaction. Very little information comes directly from NVIDIA, ATI, etc. without an NDA. We get most of the "secret" stuff through numerous other channels.

We're hardware enthusiasts, and many people have been wondering about lower-priced G70 parts. Whether or not this card actually sees the light of day soon (if at all) is most likely going to depend on chip yields for the G70. If a decent number of cores are failing tests on two quads, a 16-pipe part like this makes more sense.

The 7200 and 7600 (or whatever they end up being named) are due out in the next few months. Depending on price/performance, there may or may not be a niche for this product. If it's too fast for the price, it would hurt sales of the 7600. We'll see what happens.

The product line's getting even more cluttered, but good info nonetheless. I'm surprised AT would be able to post this info without running into legal difficulties, even if an NDA wasn't actually signed... since the card is now "hot property", I'd be happy to take it off your hands ;)