So do they still use a dongle between the cards? If you had two CrossFire cards, they wouldn't be connecting to a DVI port. Is there an adapter? I guess what I'm asking is: are you REALLY sure I can run two CrossFire edition X1950s together? I'm about to drop a grand on video cards, so that piece of info may come in handy.

The CrossFire card is not the same as the normal one. The normal card also has the extra video-out options. So there is a reason to buy the one to team up with the other, but only if you need to output to composite, S-Video, or component.

...I haven't read the article, but I did want to just make a comment...

Having just scored a brand new 7900 GTX for $330 shipped, it feels good to be able to see the headlines for articles like this, ignore them, and think "...whew, I won't have to read any more of these until the second generation of DX10 cards comes out..."

I'm guessing NVIDIA will skip the 8000s and 9000s and go straight for the 10,000s, to signal the DX10 and 'uber' (in hype) improvements.

Either way, it's nice to get out of the rat race for a few years.

Yes, all tests are performed with at least 8xAF. In games that don't allow selection of a specific degree of AF, we choose the highest quality texture filtering option (as in BF2, for instance).

AF comes at fairly little cost these days, and it just doesn't make sense not to turn on at least 8x. I wouldn't personally want to go any higher without angle-independent AF (like the high quality AF offered on ATI X1K cards).

Looks like there's no HDCP support or HDMI connector added like I'd expect with a brand new top-end card. And they didn't add the new quieter cooler to the X1900 XT. Pity. I doubt it would cost ATI more, and it'd boost card sales, since people hate the noisy fan ATI has been using.

The established English expression is "you can't have your cake and eat it too", even if it doesn't make logical sense. There are many words and expressions that don't make sense in English (driveway, football, highway). I'm guessing you're not a native English speaker, but that's the way the language is. Now, please post about technology and not the logic of English expressions.

Can anyone confirm whether those power consumption tests are for the entire system or just the video cards? The highest figure was 267 W: a high-end system that consumes 267 W under load is sweet! Can you confirm that is indeed for the entire system (CPU, mobo, HDD, video card... everything)? Thanks.

I'm pretty sure that this is power use for the entire system, but Derek's results are quite a bit lower than what I got on the ABS system I tested last week for X1900 CrossFire. Of course, the water cooling and extra fans on the ABS system might add a decent amount of power draw, and I don't know how "loaded" the systems are in this test. I would guess that Derek ran Splinter Cell: Chaos Theory for load conditions.

The review states that power consumption was measured at the wall with a Kill A Watt during a 3DMark run.

In addition to the water cooling, it could be that he's running a more efficient PSU. In a powerful system, components drawing 220 watts from the power supply would pull about 275 watts from the wall with an 80% efficient PSU (like a good Seasonic) and about 314 watts with a 70% efficient PSU. That's a pretty decent difference right there.
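To make that efficiency math concrete, here's a rough back-of-the-envelope sketch (plain Python, with the wattage and efficiency figures taken from the comment above; treat the results as approximate):

```python
# Rough sketch of the PSU efficiency math above: the wall (AC) draw a
# Kill A Watt sees is the DC load divided by the PSU's efficiency.
def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    """AC power pulled from the outlet for a given DC load."""
    return dc_load_watts / efficiency

dc_load = 220.0  # watts actually delivered to the components
for eff in (0.80, 0.70):
    print(f"{eff:.0%} efficient PSU -> {wall_draw(dc_load, eff):.0f} W at the wall")
# 80% -> ~275 W, 70% -> ~314 W: a ~40 W gap from PSU efficiency alone
```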

While average frame rates are interesting, I _really_ care about minimum frame rates - a 300 fps average is useless if at a critical moment in a twitch game the frame rate drops to 10 fps for 3 seconds - and this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU-bound in some portion), but they might not be.

A lot of games have instantaneous minimums that are very low due to HDD accesses and such. Oblivion is a good example. Benchmarking also over-emphasizes minimum frame rates, since in regular play those dips occur less frequently. Basically, you run around an area for a longer period of time in actual gaming, as opposed to a 30-90 second benchmark. If there are a couple of seconds at the start of a level where frame rates are low due to the engine caching textures, that doesn't mean as much as continuous low frame rates.

More information is useful, of course, but it's important to keep things in perspective. :)
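To illustrate why a single instantaneous minimum can be misleading, here's a rough sketch with made-up frame-time data (not from the article) comparing the average, the raw minimum, and a percentile-style "low":

```python
# Hypothetical frame times (ms) for a short benchmark run: mostly smooth,
# with one brief hitch such as a texture load from disk.
frame_times_ms = [16.7] * 300 + [100.0] * 3 + [16.7] * 300

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / seconds
min_fps = min(fps_per_frame)                                    # single worst frame
# fps value at roughly the 1st percentile of frames (a crude "1% low")
low_1pct = sorted(fps_per_frame)[max(0, len(fps_per_frame) // 100 - 1)]

print(f"average: {avg_fps:.1f} fps, instantaneous minimum: {min_fps:.1f} fps, "
      f"~1% low: {low_1pct:.1f} fps")
# The single 300 ms hitch barely moves the average (and doesn't even show up
# at the 1st percentile), but it completely dominates the raw minimum.
```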

We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance on ATI hardware.

The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower-power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
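For reference, a quick sketch of the memory bandwidth math behind that last point, using the commonly quoted memory clocks for the two cards (the figures are approximate):

```python
# Back-of-the-envelope memory bandwidth: effective clock * bus width / 8.
def bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Commonly quoted clocks: X1900 XTX ~1550 MHz effective GDDR3,
# X1950 XTX ~2000 MHz effective GDDR4, both on a 256-bit bus.
print(f"X1900 XTX: {bandwidth_gb_s(1550, 256):.1f} GB/s")  # ~49.6 GB/s
print(f"X1950 XTX: {bandwidth_gb_s(2000, 256):.1f} GB/s")  # ~64.0 GB/s
# ~29% more bandwidth for a much smaller real-world gain suggests the
# X1900 wasn't badly bandwidth-limited to begin with.
```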

I think it's obvious why ATI is selling their cards for less now, and that reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read to the contrary, that Direct3D 10 won't be released until after Vista ships (sometime). Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*

In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI CrossFire-only board and could not run two NVIDIA cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual 8x PCIe and not dual 16x... at high resolutions, could this be a limiting factor, or is that not a concern for another year?

Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

As we all should know by now, NVIDIA's default driver quality setting is lower than ATI's, and matching the quality settings in the driver makes a significant difference in framerate. Your "The Test" page does not indicate that you changed the driver quality settings to match.

Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

At the same time, during our Quad SLI follow-up we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering or in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by ATI's gamma correction).

If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

Could you run each card with the quality slider turned all the way up, please? I believe that's the default setting for ATI, and the 'High Quality' setting for NVIDIA. Someone correct me if I'm wrong.

I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

Some people seem to think it necessary to run AA at high resolutions (1600x1200+), but I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass...

When are we going to see a good number of Core 2 Duo motherboards that support CrossFire? The fact that AT is using an Intel-made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.

Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from the lack of SLI support) is that we have not been recommending Intel processors for the past couple of years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.

Did anyone else notice that the specs for some of the NVIDIA cards are wrong? For example, the core clock of the 7900GTX is supposed to be 650 MHz, not 700 MHz, and the core clock of the 7900GT should be 450 MHz, not 470 MHz. Also, the pipeline configuration for the 7300GT (according to Wikipedia anyway) should be 8 pixel & 4 vertex.

This many mistakes really makes me question the accuracy of other specs I read on Anandtech.

(And by the way, would somebody please inform the DailyTech writers that it's "Xbox 360", not "XBOX 360". And yes, I'm aware of the conventions that punctuation goes inside quotes and you shouldn't start sentences with "and".)

The 7900 GTX/GT clock speeds that were listed were actually vertex clock speeds, not general core clock speeds, so they were technically correct (parts of the GPU do run at those frequencies), just not comparable to the other numbers. They have been corrected.

The 7300GT is indeed 8 pipes, that was a copy/paste error. Thanks for the heads up.

From the looks of the pricing structure for ATI's cards on the first page, and especially after they simplify their lineup, it looks like ATI is giving up on midrange cards from $100-$200. The 7600GT and the upcoming 7900GS are alone in that price range (about $150 and $200, respectively), with no competition from ATI, so it seems they really are ceding that price range to NVIDIA.

I'm wondering the same thing. Are they going to stop making X1800s? They should be dropping into this price range nicely. Not sure how competitive they are with the 7900s, and now that the 7900GS is coming out the X1800 might be just too outclassed. (You guys just missed a great deal on Woot: a 7900GS for $145.)

I hope the X1800 is still being made, and I hope it drops to the $150-180 range to fill that gap.

Interesting. The first review I read was at HardOCP, where the X1950 XTX beat or equalled the 7950 GX2 every time, yet here the reverse is true. I think I'll have to read more reviews to decide what is going on (it certainly isn't CPU limitations). Maybe HardOCP's focus on optimum quality settings rather than raw framerate is the reason they favoured ATI, and another is the clear fact that when it came to minimum framerates instead of average framerates (HardOCP posted both for all tests) the X1950 XTX was especially good.

In other words, the 7950 GX2 posted great average numbers, but the X1950 XTX was playable at higher quality settings because the minimum framerate didn't drop so low. Hopefully some other sites will also include minimum framerates along with graphs to clearly show how the cards perform.

I remember a few years ago when AT's graphics card articles included image-quality comparisons and all sorts of other reports about how the cards compared in real-world situations. Now it seems all we get is a report on average framerate with a short comment that basically says "higher is better". Derek - I strongly suggest you look at how HardOCP tests cards and the informative and useful comments that accompany each graph. There may only have been three cards in their comparison, but it gave a much better idea of how the cards compare to each other.

Anyway, I'll not be getting any of these cards. My 6800GT has plenty of performance for now, so I'll wait until Vista SP1 and the second generation of DX10 cards, which hopefully won't require a 1 kW PSU :)

It seems the comments system here uses the brackets in HardOCP's abbreviation as some sort of marker. Apologies for making the rest of the text invisible; please amend my comment appropriately. I was talking about HardOCP, by the way, when I said they use minimum framerates and optimum quality settings for each card.

Actually, if you look at all of the reviews from the various sites a bit more closely, the scores depend on which processor is being used for the test. It appears that NVIDIA cards tend to run better on Conroes (which probably just means the games are slightly less CPU-bottlenecked at the resolutions being tested), while ATI tends to run better on AMD systems (or when the CPU is slowing things down). Of course, that is IIRC from the 5 reviews I skimmed through today.

For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-Video and composite out are basically becoming seldom-used items in my experience, though the loss of component out is a bit more of a concern.

So if I use DVI out and attach a DVI-to-HDMI adapter before connecting to a projector or HDTV, will I get a properly encrypted signal to fully display future Blu-ray/HD DVD encrypted content?

The loss of component is a bit of a concern, as many HDTVs and projectors still produce amazing images with component, and, in fact, I gather that some very high resolution/refresh rate combinations are possible over component but not DVI due to certain bandwidth limitations with DVI. But please correct me if I am wrong. I take AnandTech's point on the CrossFire card offering more, but with a couple of admittedly small question marks, I see no reason not to get the standard card and add the CrossFire edition later if you decide to go that route...

I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI.
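As a rough sanity check on the DVI bandwidth point, here's a sketch comparing approximate pixel-clock requirements against the 165 MHz single-link DVI limit (the total timings including blanking are standard-ish values, not exact for every display):

```python
# Rough pixel-clock check against the single-link DVI limit (165 MHz).
# Total (active + blanking) timings below are approximate, conventional values.
SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(total_h: int, total_v: int, refresh_hz: float) -> float:
    return total_h * total_v * refresh_hz / 1e6

# 1080p60 with CEA-861-style timings: 2200 x 1125 total.
print(f"1920x1080@60: {pixel_clock_mhz(2200, 1125, 60):.1f} MHz")  # ~148.5 MHz, fits single-link
# 2048x1536@60 with conventional GTF-style blanking: ~2800 x 1589 total.
print(f"2048x1536@60: {pixel_clock_mhz(2800, 1589, 60):.1f} MHz")  # ~267 MHz, needs dual-link
```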

I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.

Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.

I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?

Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking CrossFire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.

Agreed. I'm still running a 6800GT and have not seen much of a reason to upgrade with the current software I run. Perhaps if I saw that newer cards run games 3x faster I might consider an upgrade, so how about it?