I guess the real issue for me is that this card is a beast, but I'll never have it in my SLI rig... I want all settings maxed at playable resolutions, that's just me... and I will not go back to CRT... lol, CRT was lame, dude.

#122 The problem with your solution regarding "all of us just getting two 6800U's" works perfectly for those with an SLI-capable board, yes? Some of us, like myself, anticipated the next generation of GPUs like the 7800 series and opted to simply upgrade to one of those when the dust settled and prices slid back a bit.

Additionally, telling someone to "STFU" isn't necessary. We can't hold a conversation if we're all silent. Knowhuddamean, jellybean? Hand gestures don't work well over the internet, but here's one for you...

LCD gamers shouldn't be bothering with new graphics cards, they should get new monitors.

kidding, I have nothing against LCDs. The real advantage of showing the card run at 2048x1536 is that it lets you see how well the card scales to more stressful scenarios. A card that suddenly gets swamped at higher resolutions will probably not fare well in future games that need more memory bandwidth.

On a side note, you can get a CRT that will run 2048x1536 @ a reasonable refresh for about $200 shipped (any Sony G520 variant, such as the Dell P1130). The only things that would actually be small in games are the 2D objects that have set pixel sizes, everything else looks beautiful.

#121
lol ty for your insight... anyway, like I said, this card is not for LCD gamers, as most have a 12x10 or 16x12... so what purpose does this card have? Answer me this, Batman, and you have the group that should buy this card. Otherwise, the rest of us should just get two 6800U's... this card is geared more for workstation graphics, not gaming... unless you game on a hi-def CRT, and even then max res would be 1920x1080i... or something like that...

#116, if people in the comments thread are allowed to give their opinions, why shouldn't #114 give his too? Surely even an illiterate like you should realize that arguing that everyone is entitled to his or her own opinion means that the person you're arguing with is entitled to his too.

#119, some people have different requirements than others. Some just want no visible blur, others want the best contrast ratio and color reproduction they can get.

Well, I for one think 1280x1024 is pretty valid as that is what a 19" LCD can do. I'd just want to see a maxed out 12x10 chart to see how it does - I know a 6800 can't do it for every game with full AA and AF. Otherwise I agree - a 12x10 with no options isn't going to show much with current games.

See, I'm considering swapping my two 21" CRTs for two 19" LCDs - and they won't do more than 12x10. I'd love to do two 20-21" LCDs but the cost is too high and fast panels aren't to be found. 19" is the sweet spot right now IMO - perhaps I'm wrong?

#114
Dude, just STFU... we are here to comment on what we want and say it freely... minus threats and name-calling... As I said before, this card is not for gamers, maybe elite gamers that have a monitor that does these resolutions, but most gamers I know have gone to LCD, and I have yet to see any LCD [I'm sure there are some] do these resolutions, so this card really is a card for CRT elite gamers... lol, with those resolutions on a 21-inch monitor you would need binoculars as glasses to play the game... the tanks in BF2 would be ant-like small...

Those who are complaining that they should have reviewed at lower resolutions should think for a minute. First of all, you are talking about a 600-buck card; most people who have that kind of money to spend on a card also have a monitor capable of 1600x1200 or better. Also, benchmarking at any lower resolution on a card like this in today's games is almost pointless, as you are almost entirely CPU-bound at those resolutions. Do you really want to see page after page of 1024x768 charts that differ by only 4-5 percent at the most?
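To put rough numbers on the CPU-bound point above: GPU fill-rate and memory-bandwidth demand scale with the pixel count per frame, while CPU work per frame stays roughly constant. A quick back-of-the-envelope sketch (illustrative only, not figures from the review):

```python
# Per-frame pixel counts at the resolutions discussed in the thread.
# GPU pixel throughput demand grows with this count, which is why
# only the higher resolutions separate a card like the 7800 GTX
# from its predecessors.
resolutions = {
    "1024x768":  (1024, 768),
    "1600x1200": (1600, 1200),
    "2048x1536": (2048, 1536),
}

base = 1024 * 768  # the CPU-bound baseline resolution
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels/frame "
          f"({pixels / base:.2f}x the 1024x768 workload)")
```

Note that 2048x1536 pushes exactly four times the pixels of 1024x768, which is why the spread between cards only opens up at the top end.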

Also, give the editors a break when it comes to writing these articles. As others have said, this is not a subscription site, and given the number of visitors and the quality of the articles, I'm amazed, and gratified, that the people of AnandTech keep putting out article after long article despite all the whining that goes on over spelling mistakes and graph errors that more often than not are corrected within a few hours.

"Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now."

"I do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half"

Dear Mr. Fink,
I am sorry to hear about the problems you have had with your vehicle breaking down. You see, our quality inspector was on vacation that week, so we just shipped our vehicles straight off the assembly line. Please cut us a little slack, as we usually build much better vehicles.

"It's taken three generations of revisions, augmentation, and massaging to get where we are, but the G70 is a testament to the potential the original NV30 design possessed. Using the knowledge gained from their experiences with NV3x and NV4x, the G70 is a very refined implementation of a well designed part."

Oh, please... nV30 was so poor that it couldn't even run at its factory speeds without problems of all kinds--which is why nVidia officially cancelled nV30 production after shipping a mere few thousand units. JHH, nVidia's CEO, went on record at the time saying, "nV30 was a failure" [quote, unquote]. nV30 was [i]not[/i] the foundation for nV40, let alone the G70.

Indeed, if anything could be said to be foundational for both nV40 and G70, it would be ATi's R3x0 design of 2002. G70, imo, has far more in common with R300 than it does with nV30. nV30, if you recall, was primarily a DX8 part with some hastily bolted-on DX9-ish add-ons done in response to R300 (fully a DX9 part), which had been shipping for nine months prior to nV30 getting out the door.

In fact, ATi owes its meteoric rise to #1 in the 3d markets over the last three years precisely to the R3x0 products which served as the basis for its later R4x0 architectures. Good riddance to nV3x, I say.

I'm always surprised at the short and selective memories displayed so often by tech writers--really makes me wonder, sometimes, whether they are writing tech copy for their readers or PR copy at the behest of specific companies, if you know what I mean.

98 - As far as I know, the power was measured at the wall. We use a device called "Kill A Watt", and despite the rather lame name, it gives accurate results. It's almost impossible to measure the power draw of any single component without some very expensive equipment - you know, the stuff that AMD and Intel use for CPUs. So under load, the CPU and GPU (and RAM and chipset, probably) are using far more power than at idle.
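A sketch of why wall-side measurement exaggerates differences: the PSU wastes a fraction of whatever it draws, so a swing measured at the wall overstates the swing in actual component draw. The 80% efficiency and the wattage figures below are hypothetical placeholders, not measurements from the review:

```python
def component_draw(wall_watts, psu_efficiency=0.80):
    """Estimate the DC power actually delivered to the components
    from a wall-side ("Kill A Watt" style) reading.  The efficiency
    figure is an assumed placeholder, not a measured value."""
    return wall_watts * psu_efficiency

# Hypothetical wall readings for an idle and a loaded system:
idle_wall, load_wall = 150.0, 300.0
idle_dc = component_draw(idle_wall)   # ~120 W delivered at idle
load_dc = component_draw(load_wall)   # ~240 W delivered under load

# A 150 W swing at the wall is only a ~120 W swing in component
# draw -- the remaining ~30 W is lost as heat inside the PSU.
print(load_wall - idle_wall, load_dc - idle_dc)
```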

I agree, starting at 1600x1200 for a card like this was a good idea. If your monitor can only do 1280x1024, you should consider getting a better one before buying a card like the 7800gtx. As a 2070/2141 owner myself, I know that a good monitor capable of high resolutions is a great investment that lasts a helluva lot longer than graphics cards, which are usually worthless after four or five years (along with most other components).

I'm surprised that no one has moaned about the current lack of an AGP version, to go with their Athlon XP 1700+ or whatever ;)

I think it was spot on to have 1600x1200 as the minimum resolution. Given the power of these cards, I think the 1024x768 no-AA/AF results for 3DMark2003/2005 that have been thrown around are a complete waste of time.

Good review... And re: the NDA deadlines and the sleepless nights - don't sweat it if a few mistakes are published. The readers here have their heads screwed on the right way and will find the issues soon enough. And for everyone that does not do 16x12 or 20x15, the answer is simple - U Don't Need The Power!! Save your hard-earned money and get a 6800GT instead.

Maybe if you could save the game, change the settings, and reload it, you could obtain images from exactly the same positions. In one of the fence images, the distance to the fence is quite a bit different between screenshots.

Jeez...a couple spelling errors here and there...who cares? I'd like to see you type up a 12-page report and get it out the door in a couple days with no grammatical or spelling errors, especially when your main editor is gone. Remember that English study that showed the human brain interpreted words based on patterns and not spelling?

I did read the whole review, word-for-word, with little to no trouble. There was not a SINGLE thing I had trouble comprehending. It's a better review than most sites have done which test lower resolutions. I love the non-CPU-limited benchmarks here.

One thing that made me chuckle was "There is clearly a problem with the SLI support in Wolfenstein 3D". That MS-DOS game is in dire need of SLI. (It's abbreviated Wolfenstein: ET. Wolf3D is an oooold Nazi game.)

Derek-
Please post benches with resolutions that are commonly used, or this article becomes a workstation graphics card article and not one for gamers... I mean, really, 2046x3056 or whatever the hell... lol... #1, who games at that res??? While this card is powerful, it should be mentioned that unless you use a res over 1600x1200 this card is unnecessary... lol, those were some ridiculous resolutions though... and again, post some benches with 1280x1024 for us LCD'ers...

#94: I guess you missed the part where they said that all resolutions below 1600x1200 were essentially identical in performance? If you only play in 1024x768, why are you reading a review about a $600 video card--go buy a 6600GT instead.

The excuse of your Web Editor being on vacation is, in reality, an admission of improper planning.

A major hardware site that is dedicated to cutting-edge technology should have planned better. NVIDIA launches new high-end GPUs only about 2-3 times a year at most.

This was one of the HUGE launches of the year, and it was messed up because the team didn't feel it was important enough to get some help for the article. There was damage done to AnandTech today due to the article errors and due to the casual admission in post #83 about not caring to properly cover a "Super Bowl" type of product launch.

86 - Trust me, most of us other editors saw the article, and quite a few of us offered a helping hand. NDAs are a serious pain in the rear, though. Derek was busy pulling all-nighters and functioning on limited sleep for several days, and just getting the article done is only half the battle. Getting the document and results into the document engine for a large article with a lot of graphs can take quite a while and is an error-prone process.

The commentary on the gaming benchmarks, for instance, was written in one order and posted in a different order. So please pardon the use of "this is another instance" or "once again" when we're talking about something for the first time. Anyway, I've got a spreadsheet of the benchmarks from early this morning, and other than non-functional SLI in a few games, the numbers appeared more or less correct. The text also didn't have as many typos. Crunch time and getting the final touches put on a major article isn't much fun.

Thankfully, I'm just the SFF/Guide man, so I'm rarely under NDA pressure. ;)

I would love to see someone start benchmarking in widescreen resolutions! 1920x1200 begs for a fast video card like this. As was pointed out, the only real benefits of the 7800 come at high resolutions, and many people buying high-resolution monitors these days are getting widescreen LCDs.

and btw, my 2405fpw is sitting in a box right next to me in the office, begging me to open it up before I get home... this thing will be fun to get home on the subway

As a player of EVE Online, I can tell you that the game is entirely CPU-dependent. On that matter, it will max out any CPU you have at 100%. I mean ANY CPU. Also, for the testing, you should use 1600x1200 max AA and AF and go into an area with many player ships in EVE Online. I guarantee you will not get 60 FPS. Impractical and unscientific, but it would still give better results than this review.

Face it, this launch isn't gonna hurt anyone except people with minds too small to accept that there is simply one more option than there was before. If you liked PC gaming yesterday, then there is no reason why this launch should make you stop liking it today. Unless you're a buttbaby who can't handle choices. In that case please get a console and stop coming to this site.

Well, that sucks that y'all have lost your web editor for a while. Especially when there is so much cool hardware coming out around now. In our research lab, we pass around our publications and conference posters to others in the group so that a fresh pair of eyes sees them before they go live or to the journal editor. But of course, everyone else at AT is also busy, so oh well.

Good work guys and I look forward to the "new CPU speed bump" article (or FX-57 for those not under NDAs).

Mark

PS. If y'all have an opening for another web editor, you should hire #84 (ironchefmorimoto). I hear he can cook really well.

Nicely corrected, Derek. I think there are just a few typos left, like this one (**):

Page 20
Power Consumption
We measured power consumption between the power supply and the wall. This multiplies essentially amplifies any differences in power draw because the powersupply is not 100% efficient. Ideally we would measure power draw of the card, but it is very difficult **determine** to determine the power draw from both the PCIe bus and the 12V molex connector.

AND a few double "Performances" in the title (Performance Performance) starting with page 10.

Nice card nVidia!!! I hope ATi isn't too far behind though. Crossfire --> cheap SLi ;-) I need a nice midrange product out by September when it'll be time to upgrade to a nice E6 stepping S939 A64 and something to take the place of my sweet old GF2 MX (I'm not kidding, I sold my 6600GT AGP, and now I'm waiting for the right time to move to PCIe).

Derek was too modest to mention this in his comments, but I think you should know all the facts. Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now.

The next thing is NDA's and launches. We are always under the gun for launches, and lead times seem to get shorter and shorter. Derek was floating questions and graphs last night at 3 to 4 AM with an NDA of 9AM. Doing 21 pages of meaningful commentary in a short time frame, then having to code it in HTML (when someone else normally handles that task), is not as easy as it might appear.

I do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half. If you see errors please email the Editor of the article instead of making it the end of the world in these comments. I assure you we will fix what is wrong. That approach, given the short staff, would be a help to all of us. We all want to bring you the information and quality reviews you want and expect from AnandTech.

Thanks for the refresh, Derek. I went back and took a peek at the revised graphs. I have a couple of comments on this article before you move on to the next project.

>> When the Splinter Cell page was refreshed, the graph for 20x15x4 apparently disappeared.

>> When you removed the SLI's from the Guild War page, it looks like the 7800GTX changed from 50.5 to 55.1 (which is the score previously given to the 6800 Ultra SLI).

>> Several of the pages have scores for no AA benches listed first, while other pages have scores for the 4xAA listed first. While the titles for the graphs are correct, it's a little easier to read when you stay consistent in the ordering. This is a pretty minor nit-pick, though.

"They priced themselves into an extremely small market, and effectively made their 6800 series the second tier performance cards without really dropping the price on them. I'm not going to get one, but I do wonder how this will affect the company's bottom line."

The 6800s were "priced into an extremely small market." How'd that line turn out? I don't imagine they've released this product with the intention of losing money overall. Why do you think retailers bought them? Because they know the cards won't sell and they're happy to take the loss? It's already been proven that people will pay for you to develop and sell a $300, wait $400, wait $500 video card. It's already been proven that people will pay a $100+ premium for cards that are incrementally better, not just a generation better. Sounds like this target is a natural, especially knowing it'll eventually fall into everyone else's purchasing ability.

Being able to say you have the bar-none best card out there by leaps and bounds is certainly worth something. Look at all the fanboys that are out there. Every week or month you're able to stay on top of the benches means you get more people who'll swear by your products no matter what for years to come. Everyone you can entice into buying your card who sees it as a good product will buy your brand in the future as a preference, all other options being equal. I could be wrong, but I suspect Nvidia's going to make money off this just fine.
-----------------------
"I am proud that our readership demands a quality above and beyond the norm, and I hope that that never changes. Everything in our power will be done to assure that events like this will not happen again."

I wanted to offer my utmost thanks for the inclusion of 2048x1536 numbers. As one of the fairly sizeable group of owners of a 2070/2141 these numbers are enormously appreciated. As everyone can see 1600x1200x4x16 really doesn't give you an idea of what high resolution performance will be like. As far as the benches getting a bit messed up- it happens. You moved quickly to rectify the situation and all is well now. Thanks again for taking the time to show us how these parts perform at real high end settings.

#72 - Totally agree. Some Rome: Total War benches are much needed - but primarily to see how the game's battle performance with large numbers of troops varies between AMD and Intel, more so than NVidia and ATi, considering the game is highly CPU-limited currently, in my understanding.

I would like to personally apologize for the issues that we had with our benchmarks today. It wasn't just one link in the chain that caused the problems we had, but there were many factors that led to the results we had here today.

For those who would like an explanation of what happened to cause certain benchmark numbers not to reflect reality, we offer you the following. Some of our SLI testing was done forcing multi-GPU rendering on for tests where there was no profile. In these cases, the default multi-GPU mode caused a performance hit rather than the increase we are used to seeing. The issue was especially bad in Guild Wars, and the SLI numbers have been removed from the offending graphs. Also, on one or two titles our ATI display settings were improperly configured. Our Windows monitor properties, ATI "Display" tab properties, and refresh rate override settings were mismatched. Rather than push the display at the pixel clock we expected, the ATI card defaulted to a "safe" mode where the game is run at the resolution requested, but only part of the display is output to the screen. This resulted in abnormally high numbers in some cases at resolutions above 1600x1200.

For those of you who don't care about why the numbers ran the way they did, please understand we are NOT trying to hide behind our explanation as an excuse.

We agree completely that the more important issue is not why bad numbers popped up, but that bad numbers made it into a live article. For this I can only offer my sincerest of apologies. We consider it our utmost responsibility to produce quality work on which people may rely with confidence.

I am proud that our readership demands a quality above and beyond the norm, and I hope that that never changes. Everything in our power will be done to assure that events like this will not happen again.

Again, I do apologize for the erroneous benchmark results that went live this morning. And thank you for requiring that we maintain the utmost integrity.

I have to say, while I'm extremely pleased with nVidia doing a real launch, the product leaves me scratching my head. They priced themselves into an extremely small market, and effectively made their 6800 series the second tier performance cards without really dropping the price on them. I'm not going to get one, but I do wonder how this will affect the company's bottom line.

I'm not trying to be a butthole, but can we get a benchmark that's an RTS game? I see 10+ game benchmarks and most are FPS titles; the few that are not might as well be. Those RPGs seem to use a similar type of engine.

#60 - it's already to the point where it's turning people off to PC gaming, thus damaging the company's own market of buyers. It's just going to move more people to consoles, because even though PC games are often better games and much more customizable and editable, that only means so much, and the trade-off versus price to play starts to become too imbalanced to ignore.

I have to say I'm rather disappointed in the quality of the article. A number of apparently nonsensical benchmark results, with little to no analysis of most of the results.

A complete lack of any low-level theoretical performance results, no attempts to measure any improvements in efficiency or what may have caused such improvements.

Temporal AA is only tested on one game, with image quality examined in only one scene. Given how dramatically differently games and genres utilize alpha textures, you're providing us with an awfully limited perspective on its impact.

#44
And I thought paying $350 for a video card was too much then; even before that there was the $200 high end, and before that the $100 high end. I balked at all of those prices, but I understood why they were priced as such and didn't bitch every time the costs went up. The bar keeps being raised and the prices go with it. Inflation, more features, and the fact that most of us here can afford $350 video cards push the cost of new PREMIUM cards higher by the year. It's only going to go up unless either people quit buying the high-end cards or the manufacturers find a magical process to reduce costs dramatically.

You're quite right, there's always a premium for the best; I don't see any difference here - no one is being forced to buy this graphics card. As usual, I'll wait until something offers me a better price/performance ratio over my current X850XT/6800 Ultra duo.

Seems to be a problem with the last Knights of the Old Republic 2 graph. Both 7800GTX setups are "performing" worse than all the other cards benched. Despite all the mistakes, it still seems like I was right in that this card is made for those who play at high resolutions. Anyone with an R420 or NV40 based card that plays at 16x12 or less should probably not bother upgrading, unless they feel the need to.

"In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last. Dont make me switch to consoles damnit."

Funny, I can't afford the very best TVs the minute they come out. Same for stereo components. But I don't cry about it and threaten "Don't make me switch to learning the ukulele and putting on my own puppet shows to entertain myself!" Every time a better component comes out, it means I get a price reduction and feature upgrade on the items that are affordable/justifiable for my budget.

Seriously, where does the sense of entitlement come from? Do these people think they should be able to download top-of-the-line graphics cards through BitTorrent? Do they walk around Best Buy cursing out staff, manufacturers and customers for being so cruel as to buy and sell big-ass plasma TVs?

On second thought, get your console and give up PC gaming. That way you can stop being miserable, and we can stop being miserable hearing about your misery.

Impressive, but I'm still happy with my X800 XL purchase for only $179. From the looks of it, with a 1280x1024 display, I won't need the kind of power this card delivers for a very long time. And less than $200 compared to $600, with still-excellent performance for now and the foreseeable future? Hmm, I'll take the former.

I would have liked some 1280x1024 benchmarks with 8xAA from the nVidia cards and 6xAA from ATi, to see if it's worth getting something like a 7800GTX with 17/19" LCDs to run some super-high-quality settings in terms of AA/AF.

I'm not disappointed. For one thing, the price of current cards will likely drop now, and there will also be mid-range parts to choose from soon. I think the transparency AA is a good idea for, say... World of Warcraft. The game is loaded with alpha textures, and too often can you see the blockiness of trees/grass/whatever.

#44 - Actually are you new to the market? :) I remember when early "accelerated" VGA cards were nearly $1000. Or more.

Everybody lambasted NVIDIA last year for the lack of product (6800GT/Ultra) to the market, so them actually making a presence this year instead of a paper launch should also be commended. Of course, now what is ATI gonna pull out of its hat?

In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last.
Dont make me switch to consoles damnit.

Hell, for the same price as an SLI setup I can go out and get a 23-inch Cinema Display... And since these cards can't handle the 30" native resolution anyway, it's a win-win. And yeah, what's up with the quality control on these benchmarks! I mean really, I almost decided to wait for the ATI next-gen part when I saw this (GeForce man since the GeForce2 GTS!)

The prices I've seen here in the UK for the 7800s are around 400 pounds; the 6800 Ultras are currently around 300 pounds. So quite an increase over the NV40s, but not unacceptable given the performance. I'm sure they'll come down in price once the early adopters have had their fill.

#26 - You must be new to the market, relatively speaking. I remember quite well the days when high-end new videocards were at MOST $400, usually $350 or less when they debuted. It was more than a year or two ago though, so it might have been before your time as a PC gamer.

"What no Crossfire benchies? I guess they didn't wany Nvidia to loose on their big launch day."

Ummm... maybe because CrossFire was paper-launched at Computex, and no one (not even AT) has a CrossFire rig to benchmark? nVidia is putting ATI to shame with this launch and the availability of the cards. Don't you think if ATI had anything worth a damn to put out there, they would?

All that aside... I was as freaked out as the rest of you by these benchmarks at first (well, more so than some, actually, because I just pulled the $600 trigger last night on an eVGA 7800GTX from the egg). However, these graphs are clearly messed up, and some appear to have already been fixed. I guess someone should have cut Derek off at the launch party yesterday.

Very disappointed at the fit and finish of this article. AnandTech is supposed to have the best one, not a half-baked one :( I even liked HardOCP's better, even with their weird change-the-levels-of-everything approach - at least it has a very good discussion of the differences between MS and SS AA and shows some meaningful results at high res as well.

"Maybe this one of the factors that will lead to the Xbox360/PS3 becoming the new gaming standard as opposed to the Video Card market pushing the envelope."
Yeah, because the graphics components in consoles don't require anything but three soybeans and a snippet of twine to make. They're ub3r and free! Wait, no, you pay for them too eventually even if not in the initial console purchase price. Actually I think the high initial price of next gen graphics cards is a sign of health for PC gaming. There are some folks not only willing to pay high dollars for bleeding edge performance, they're willing to pay even higher dollars than they were in the past for the top performers. Spurs ATI/Nvidia to keep the horsepower coming, which drives game devs to add better and better graphics, etc.

"They only reveresed a couple of labels here and there, chill out. It's still VERY OBVIOUS which card is which just by looking at the performance!"
Eh, I use benchmarks to learn more about a product than what my pre-conceived notions tell me it "ought" to be. I don't use my pre-conceived notions to accept and dismiss scientific benchmarks. If the benches are wrong, it is a big deal. Doesn't require ritual suicide, just fixing and maybe better quality control in the future.

These benchmarks are pretty clearly rushed out and wrong, or at least attributed to the wrong hardware. SLI 6800s show up faster than SLI 7800s in many benchmarks, in some cases much more than doubling single 6800 scores. I understand NDAs suck with the limited amount of time to produce a review, but I'd rather it not have been posted until the afternoon than have the benchmarks section ignored.

Okay, allcaps=obnoxious. But I do have a question. How was system power consumption measured? That is, was the draw of the computer at the wall measured, or was the draw on the PSU measured? In other words, did you measure how much power the PSU drew from the wall or how much power the components drew from the PSU?

Wow, I'm simply amazed. I said to someone as soon as I saw this "Wow, now I feel bad that I just bought a 6800GT ... but at least they won't be available for 1 or 2 months." Then I look and see that retailers already have them! I was shocked to say the least.

But my question was "who" was buying them. I'm a hardware goon as much as the next guy, but everyone knows that in 6-12 months, the next gen is out and the price is lower on these. I mean, the benches are presenting comparisons with cards that, according to the article, are close to a year old. Obviously some sucker lays down the cash, because the "premium" price is way too high for a common consumer.

Maybe this is one of the factors that will lead to the Xbox360/PS3 becoming the new gaming standard, as opposed to the video card market pushing the envelope.

The initial 6800U's cost lots because of price gouging.
They were in very limited supply, so people hiked up the prices.
The MSRP of these cards is $600, and they are available.
MSRP of the 6800U's was $500, the sellers then inflated prices.

I know this article must have been rushed out, but it needs EXTREME proofreading. As many have said in the comments above, the results need to be carefully gone over to get the right numbers in the right place.

There is no way that the ATI card can go from just under 75 fps at 1600x1200 to over 100 fps at 2048x1536 in Enemy Territory.
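A quick sanity check on why that number can't be right (a back-of-the-envelope sketch, assuming the card is fill-rate or bandwidth-bound at these settings): frame rate should scale at best inversely with pixel count, so the higher resolution ought to be slower, not faster.

```python
# Naive scaling estimate: if the card is pixel-throughput bound,
# fps at a higher resolution is at most fps * (old pixels / new pixels).
pixels_16x12 = 1600 * 1200          # 1,920,000 pixels per frame
pixels_20x15 = 2048 * 1536          # 3,145,728 pixels per frame
fps_16x12 = 75.0                    # the reported 1600x1200 figure

expected_20x15 = fps_16x12 * pixels_16x12 / pixels_20x15
print(round(expected_20x15, 1))     # about 45.8 fps -- nowhere near 100+
```

So even the most generous estimate puts the 2048x1536 result well below the 1600x1200 one, which is why the published 100+ fps figure has to be an error.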

Also, the Final Words heading is part of the paragraph text instead of a bold heading above it.

There are other grammatical errors too, but those aren't as important as the erroneous data. Plus, a little analysis of each of the benchmark results for each game would be nice, but not necessary.

#19
Are you new to this market, or do you have a short memory? Don't you remember that the initial 6800 Ultras cost around $700-800? I sure as hell do. Why is everyone complaining about pricing? These are premium video cards, and you will pay a premium price to buy them.Reply

Yeah, not a single comment on any of the benchmarks, what is up with that?

There were a lot of weird scenarios there. Why is there NO performance increase from SLI some of the time?
And why is 6800 Ultra SLI faster than 7800GTX SLI??

A lot of weird stuff, and not a single comment or analysis about it. I always read most new tests here on AT first because it's usually the best, but this review was a double bogey to say the least...Reply

To #18 - I have to admit, I didn't bother looking closely at them; seeing the X850XT supposedly beating all the other cards by such a margin at those resolutions showed they were completely screwed up! I hadn't noticed that the performance increases as you go up in resolution; maybe it's something I missed on my own X850XT? ;) I wish...that would be a neat feature, your performance increasing as your resolution increases.

I agree it needs pulling down and checking. Not to be harsh on AT, but this isn't the first time the bar graphs have been wrong. I would rather wait for a review that has been properly finished and checked than read a rushed one; as it stands it's of no use to me, because I have no idea whether any of the performance figures are genuine.

To #14, the X850XT's performance INCREASED by roughly 40% from 1600x1200 to 2048x1536 according to the graphics, so to me that just screams BULLSH!T.
I think the review needs taking down, editing, and then putting back up.
Or fixing VERY quickly.
AT IMO has let people down a bit this time round; not the usual standard.Reply

Oops, posted before I wrote anything. Some of the results are impressive, others aren't at all. In fact, the results seem to be all over the board; I suspect drivers are the culprit. Hopefully, as new drivers come out, we'll see some performance increases, or at least a more uniform distribution of good results.Reply

Derek, get cracking, the graphs are all messed up! And the Transparency AA Performance section could use some info on which game it was tested with, plus some more comments. I also think each benchmark warrants some comments, for those of us who have a hard time remembering two numbers at the same time. Keep it simple, folks….Reply

I agree something is wrong with these results. I thought they were odd, but when I got to the Enemy Territory ones they seemed completely wrong: at 2048x1536 and 4xAA the X850XT is apparently getting 104 fps, while the 6800 Ultra gets 48.3 and the SLI 6800 Ultras get only 34.6 fps! Especially odd bearing in mind it's an OpenGL game.

The benchmarks all seem to be a load of crap.
Check the Wolfenstein benchmarks.
The X850XT goes from 74fps @ 1600x1200 w/4xAA to 103fps @ 2048x1536 w/4xAA.
That's nearly a 40% increase when the res gets turned up. Good one.
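That jump is easy to quantify as a plain percent change. A minimal sketch (the fps values are the ones quoted above from the review's graphs; `pct_change` is just an illustrative helper, not anything from the article):

```python
# Hypothetical sanity check of the suspicious Wolfenstein numbers quoted above.
def pct_change(old_fps, new_fps):
    """Percentage change going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100.0

fps_1600x1200 = 74.0   # X850XT @ 1600x1200 w/4xAA, as read off the graphs
fps_2048x1536 = 103.0  # X850XT @ 2048x1536 w/4xAA

change = pct_change(fps_1600x1200, fps_2048x1536)
print(f"{change:.0f}% change going UP in resolution")  # prints "39% change going UP in resolution"
# Any positive number here is the red flag: fps should fall, not rise,
# as the resolution increases.
```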

There also seem to be many other similar things that look like errors, but they could just be crappy nVidia drivers, or something wrong with the SLI profiles.

Who knows, but there are definitely a lot of things here which look VERY odd/suspicious.Reply

From what I'm seeing, the 6800U SLI beats the 7800GTX SLI at most normal resolutions. I don't know, but usually when a new generation comes out it should at least beat the previous generation. Sure, it works wonders at huge resolutions, but very few people actually have monitors that can display them. Most people don't have monitors above 1280x1024, much less 1600x1200.Reply

From what I see, the 7800GTX really only benefits you if you have a monitor that runs higher than 1600x1200. Fairly impressive though, I must say. I also wasn't expecting double the performance of the 6800's, since it only has 50% more pipes. I can't wait to see the 32-pipe cards!Reply