Playback is already fine at 1080p with a single X1800 of any sort, so I don't think that needs improvement. Crossfire hardware-assisted encoding might be a really good thing, though; I imagine a dual-core Crossfire setup could become a real encoding/rendering powerhouse.

While I'm still here, I thought I'd point out what seemed to be a strange anomaly in the Quake 4 benches to see if someone can provide an answer.

Under 4x FSAA, the GTX 512 cards are listed as performing better in 1920x1440 than in 1600x1200. Oddly enough, the results are almost right in the middle of the 1280x1024 and 1600x1200 scores, though if you re-plot the graph with the 1600 and 1920 results reversed it doesn't match the trends set by any other hardware in the list.

Is this a typo, or something more sinister? And, more curiously, why didn't Derek make any mention of it at all?

One thing I would be curious to see is how the ATi cards fare with a small tweak done under B&W2. There's a setting which can be changed in one of the .INI files which makes the game run dramatically better on most hardware I've seen it "trying" to run on - including my own X800 Pro AGP, plus two mates' cards, a 6600GT AGP and a 5800 Ultra AGP.

I believe the file is called "graphics.ini", in the data subdirectory - change the detail settings to be 3 1 3 instead of 3 0 3. It does disable two of the options in the in-game graphics menu (and I have heard it can result in "squares" under fields and such), but the performance increase is substantial, to say the least. Oddly enough, just disabling these two options on their own doesn't make anywhere near as much of a difference.
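I don't have the file in front of me, so treat this as a sketch under the assumption that the detail triple sits on its own line in graphics.ini - a tiny Python snippet that makes the swap without touching anything else:

```python
# Hypothetical sketch: flip the B&W2 detail setting from "3 0 3" to "3 1 3"
# in graphics.ini. The exact file layout is assumed, not verified.
from pathlib import Path

def patch_detail(ini_text: str) -> str:
    """Replace the '3 0 3' detail triple with '3 1 3', line by line."""
    out = []
    for line in ini_text.splitlines():
        if line.strip() == "3 0 3":  # assumed: the triple sits on its own line
            line = line.replace("3 0 3", "3 1 3")
        out.append(line)
    return "\n".join(out)

# Usage (path assumed; back the file up first):
# ini = Path("data/graphics.ini")
# ini.write_text(patch_detail(ini.read_text()))
```

Back up the original file before writing, since the exact format may differ between game versions.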

Sadly, once it's running well you quickly find out that it wasn't worth all the effort, but I would still be curious to see the results from tests under such conditions. NVidia apparently fixed this bug with one of their post-release drivers (hence the disparity of scores), and there's also a 1.2 patch being prepared as we speak which will hopefully level things off somewhat, but in the meantime this is the best we've got.

...card-of-the-week mentality. So I finally decided to do some research to see what the B.F.D. was with having one or more $700+ video cards in a PC. I went out and bought the Lanparty UT SLI Mobo, (2) FX57s so I could find the fastest O/C'ing one, (2) Asus 7800 GTX 512s, (2) 520W OCZ Power Stream PSUs, 2 x 1024MB OCZ EB Platinum 4800 modules, a Corsair ice water-cooling system for the FX57 and Nvidia chipset (until I get to vapor cooling), an Antec P160 Performance case and an HP L2335 23" display.

Everything went together fine and I spent several days overclocking the two FX57s until I was able to run almost stable at 3.9 Gig @ 1.625V w/ 34-degree cold water. And to my surprise my 3DMark 2005 showed an incredible 18,240 score!!! WOW, I was just blown away. I was starting to understand what the enthusiasm was all about for the latest-greatest-trick-of-the-week PC hardware. After several weeks of tweaking I now have my system stable most of the time and it simply flies!!! Not only that, but the blue LEDs look so cool at night, and my friends are impressed as H*LL that for less than $6,000 I have a PC that will cook my breakfast, bring in the newspaper, make the utility company rich, heat my house, make Nvidia rich, clean my car, wash my clothes and even do word processing. I can even log on to the Net .00000000000000001 seconds faster than my old dumbazz Athlon 939 3000 that I spent $1,000 on total and which runs rock stable at 2.4 Gig. And at a resolution of 1920 x 1200 I'm able to get a frame rate of at least 60 in any video game. This allows me to sit 6'-8' away from my monitor to minimize eye strain when I play video games for 18 hours or more at a time.

Without a doubt I am one broke but very happy camper. NOW - now I understand the point of spending $700 or more on a vid card and $1,000 on a CPU, and hundreds on memory, PSUs, trick PC cases, etc. And my friends think I am the coolest guy they know 'cause I got this BLING machine. Whatta life!!! If only I had known years ago...

D3D benches are different from real-world performance - and for just about everything (if not everything, correct me if I'm wrong), the GTX 512 blows away the GTX 256 and X1800 XT. The X1800 XT PE, or whatever's next in line, is *supposed* to compete with the GTX 512. Almost seems like Nvidia caught ATI flat-footed on this one.

Sorry, maybe I was a bit unclear,
but the thing is that the Asus X1800XT-TOP IS the X1800 XT PE, indeed. And as you've just said, that's the real competitor to the GTX 512, according to the article I referred to.
As to real-world performance, it's still unclear to me what you mean.
Maybe I'm wrong, but don't the majority of modern games use D3D? Even if not, I think these results are proof enough that the GTX is no longer the fastest.
Of course, this has to be confirmed further by other reviewers.

Come on! There are over 100 people in EVGA's step-up queue waiting for the 7800GTX 512MB, but you have a problem with ATI's availability?!

Nvidia got LUCKY with the 256MB 7800GTX in that it was ready to launch with no real competition. Nvidia was able to sit on it until sufficient quantities were ready. ATI (sort of) launches the X1800XT and Nvidia falls back to the same old launch tricks. If you're going to hold one company accountable, you have to hold them all accountable!

I'm saying it's retarded because when Nvidia released their cards, you could buy them that day. Unlike ATI, which even SAYS it will be different this time around and STILL fails to deliver. If neither company produces enough to meet demand, then they underestimated demand, and that's something different entirely.

So ATI could have shipped about five cards to the top 10 retailers, and ATI would have completely fulfilled your expectations. They just would have "underestimated demand".
That's a huge load of crap you're shovelling there. Both companies are still more interested in appearing to be in a leadership position than in actually making deliverable products. I just can't understand why so many "journalists" have their heads shoved so far up Nvidia's ass they can count their fillings.

Has anyone ever compared SLI and Crossfire performance using a dual-core CPU versus just a single-core one? I mean, if there's enough overhead from SLI or Crossfire, a dual-core chip could improve performance.

I don't know if that dual-core thing would work. I mean, it might, but two slower CPUs would not help, in my opinion. Games are single-threaded, so the extra core wouldn't take off the overhead... at least that's my understanding of it.

See, I thought that was a big deal with one of the latest Nvidia driver releases: that it was made multithreaded, so that in a situation such as when you have SLI, or any other kind of driver overhead, it would be taken care of by a second core if one existed. I don't know; it was just a thought I had never seen discussed, so I thought I would ask.
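Just to make the idea concrete (this is a toy sketch, not how any real driver is written): the claim is that per-frame driver work can be pushed onto a second thread, which the OS can then schedule on the other core while the game thread keeps going.

```python
# Toy illustration of offloading "driver overhead" to a worker thread.
# All names and workloads here are made up for illustration.
from concurrent.futures import ThreadPoolExecutor

def driver_submit(frame):
    # Stand-in for driver overhead (command validation, state setup, etc.).
    return sum(range(1000)) + frame

def render_frames(n):
    results = []
    with ThreadPoolExecutor(max_workers=1) as driver_thread:
        for frame in range(n):
            # The submit runs off the game thread, potentially on a second core.
            fut = driver_thread.submit(driver_submit, frame)
            # ... game thread would continue simulation work here ...
            results.append(fut.result())
    return results
```

Whether this helps in practice depends on how much of the frame time is actually spent in the driver, which is exactly the open question in this thread.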

We shall soon find out whether Crossfire is serious or just an ATi marketing straw-grabbing ploy to get some suckers (er, "enthusiasts") not to buy SLI. If the compositor is fully integrated into EVERY R580 GPU (thus never requiring a master board, and implementing the board communication via a passive bridge a la nVidia), then we shall finally know that ATI is serious about Crossfire. It was probably a stupid cheese-paring management decision not to integrate the Crossfire functionality fully into the R520 GPU, or else Crossfire does not have enthusiastic support from ATI engineering and is purely an ATi marketing ploy anyway. The R580 details will reveal the truth.

As far as I know the only thing that has changed along the way are the addition of BF2 patches (according to the overclocking the Athlon X2 article, they are up to using the 1.03 patch) and newer nvidia drivers. I believe they are still creating a demo and running it with the timedemo option. With this being such a popular game (BF2), it seems like it would be worthwhile to confirm whether SLI/Crossfire does or does not offer significant improvements for BF2.

So let's all throw a shit fit about every company that ever announced a product only to have availability weeks to years from that announcement.

The Anandtech staff is just as bad as its two-year-old readers who tie emotions in with silicon. Grow up and learn some patience - if you can't wait, buy someone else's product and stop your whining.

Crossfire does great in DoD: Source and Black & White 2, where it beats Nvidia 7800GTX 512MB SLI. However, it doesn't do great on the Doom 3 engine. In FEAR and Chaos Theory it manages to defeat 7800GTX 256MB SLI easily, and at high quality in Chaos Theory it keeps up with the 7800GTX 512MB. X1800XT Crossfire has only one problem, and that's that it sucks in the Doom engine benchmark, but overall it's great.

I think the R580 chipset and video card will be Crossfire's real shot at SLI. This feels a little like early-generation SLI to me, and they seriously need to get rid of the dongle, LOL. I can't wait to see the R580 in relation to the 7800GTX 512 market edition card.

Problem is that the current ATI Crossfire setup is way inferior to NVIDIA SLI, despite the big "ATI MultiGPU done right" headline, and I do not believe this will change much with the hyped R580.

With NVIDIA I can:

- mix any similar cards, be it 6800GS/GT, 7800GT, 7800GTX, 7800GTX-512, thus saving costs
- get much better drivers
- get support for user-defined game profiles, so when I purchase a new game I do not have to wait a month or two for the card maker to come up with new drivers supporting that game (i.e. with ATI I will play new games using SuperTiling with a 0% to -50% "improvement", while with NVIDIA I can create a new game profile using e.g. AFR2 and get an immediate 90% improvement)
- there is no huge external dongle, and I guess picture quality should be better as well with SLI
- there is much better availability of NFORCE4-SLI chipsets / motherboards
- I can connect up to 4 monitors to NVIDIA SLI, but I cannot with Crossfire
- I can switch NVIDIA SLI on/off without a restart
- I get better performance with NVIDIA SLI
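To illustrate the SuperTiling vs. AFR point: here's a toy Python sketch of the two work-distribution schemes (purely illustrative scheduling logic - no real driver or GPU API, and the function names are made up):

```python
# Toy sketch of two multi-GPU work-distribution schemes.

def afr_assignment(num_frames, num_gpus=2):
    """Alternate Frame Rendering: frame i goes entirely to GPU i % num_gpus."""
    return [frame % num_gpus for frame in range(num_frames)]

def supertile_assignment(tiles_x, tiles_y, num_gpus=2):
    """SuperTiling: tiles within a single frame alternate between GPUs
    in a checkerboard pattern."""
    return [[(x + y) % num_gpus for x in range(tiles_x)] for y in range(tiles_y)]
```

The practical difference being argued about: AFR can nearly double throughput when frames are independent, while tiled splitting of one frame duplicates per-frame work (geometry, etc.) on both GPUs, which is one plausible reason a forced SuperTiling mode scales worse in some games.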

ATI Crossfire still seems like an afterthought, nothing else, while NVIDIA SLI is a technology incorporated into the core. If I want to game at 1600x1200, the only real option is to get NVIDIA SLI.

I understand that Anandtech cannot bash Crossfire too much, to keep good relations with ATI (free products, shows, trips, etc.), but I believe that the superiority of NVIDIA SLI is very clear here.

I don't know what you're talking about.
- you can't mix any of the cards you listed with each other, except a GTX with a GTX 512 (and you've gotta use only 256MB from each, and run the 512 model at the normal GTX speeds);
- don't see how the drivers are better;
- dunno if a profile tool/editor for ATI exists or is in the making, so I can't comment on this; but you're overestimating the improvement on Nvidia's side;
- it's just a cable; and what grounds does your guess about picture quality come from? Guess what - you guessed wrong;
- point taken, unless you go Intel; but I seriously hope both ATI and nVidia will release unlocked Crossfire drivers that work on any dual-x16 board;
- last I checked, you can't use the outputs on the secondary card in SLI mode - you have to switch SLI off first; I might be going on outdated information, but I doubt it;
- yeah, and how often do you do that? I bet it's much fun and enjoyment... no, seriously, you're right that this is a convenient feature, as occasionally you'd have to switch it, but it doesn't seem too important, does it? Plus there's no telling that Crossfire can't do the same eventually;
- again, an unfounded claim.

Indeed, Crossfire seems like an afterthought to me too, and nVidia's superiority here is (at least was, until recently) clear. But that's the good part of competition - nVidia forced ATI to "afterthink" something out, and it will get better with time. Crossfire is just as serious an option as SLI, and if you stick with your "the only real option is to get NVIDIA SLI" you're just purposely closing your eyes.

Think what you want, I don't really care. But don't preach unfounded fanboyism without expecting to face differing opinions.