65 Comments

To actually reply -- the CrossFire card can work alone. You don't need another card. I know this because it was running by itself while I was trying various methods to coax it into submission :-) ... Also, even if Gigabyte's hardware is close to complete, ATI is in charge of the driver for the mobo and the graphics card, as well as getting a solid video BIOS out to their customers. These things are among the last in the pipe to be fully optimized.

I don't think I could give up the F520 even for desk space. I'd rather type on an LCD panel any day, but if I am gonna play or test games, there's just no substitute for a huge CRT.

Want to know why 1280x1024 always looks wrong on CRTs? It's because IT IS wrong! 800x600, 1024x768, 1600x1200 -- all are 4:3 aspect ratio, but 1280x1024? That's 5:4! To get 4:3, you need to use 1280x960! That should be benchmarked too, if possible.

Whoever invented that resolution should be forced to watch a CRT flickering at 30 Hz interlaced for the rest of his life! Or at least create more standard resolutions with the SAME aspect ratio!
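For anyone who wants to check the ratio math themselves, a quick Python sketch -- just reducing each resolution to its simplest width:height ratio:

```python
from fractions import Fraction

# Fraction automatically reduces width/height to lowest terms,
# which gives the aspect ratio directly.
for w, h in [(800, 600), (1024, 768), (1600, 1200), (1280, 960), (1280, 1024)]:
    r = Fraction(w, h)
    print(f"{w}x{h} -> {r.numerator}:{r.denominator}")
```

Every entry comes out 4:3 except 1280x1024, which reduces to 5:4.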

There are a great many LCD panels that use 1280x1024 -- both for desktop systems and on notebooks.

The resolution people play games on should match the aspect ratio of the display device. At the same time, there is no point in testing both 1280x960 and 1280x1024, because their performance would be very similar. We test 1280x1024 because it is the larger of the two resolutions (and by our calculations, the more popular).

I'd have to agree with Derek. Most people who play at 1280 use 1280x1024. But it is also true that 960 is the true 4:3... While I don't know why there are two versions of 1280 (probably something related to video or TV), I also know that the performance difference is almost zero -- try for yourself. Running Counter-Strike showed no difference on my computer.

Well, multi-GPU technology to me makes sense simply when you can't get the performance you need out of a single card. I would rather have the 7800 GTX than the dual 6800 Ultra setup, for reduced power consumption.

You gotta hand it to NVIDIA, being able to market their SLI setup to the 6600 GT, and soon the 6600 LE line.

I have to say I'm running an SLI setup.
I went from an AGP-based (nF3, 754 A64 3200+ Hammer) 6800 GT system to the current SLI setup (nF4 SLI / s939 3200+ Winchester) with dual 6600 GTs.

To be honest, SLI ain't all that. I miss the fact that to use my dual desktop (I have 2x 20-inch CRTs) I have to turn SLI off in the drivers, which involves a reboot, so that's hassle. Yes, the dual GTs beat even a 6800 Ultra at non-AA settings in Doom 3, but they get creamed with AA on. Even the 6600 GTs I have are feeling CPU strangulation, so you need a decent CPU to feed them. And not all games are SLI-ready. To be honest, apart from the nice view of two cards in the case (looks trick), I would rather have a decent single-card setup than an SLI setup.
CrossFire is looking strong though! ATI needed it and now they have it; the playing field is much more level! karlos!

Also, as far as XFire competing with SLI, it seems to offer nothing really different from SLI, and in that sense only competes by being essentially the same thing but for ATI cards. However, NV is clearly concerned about it (as evidenced by their AA improvements, the reduction in SLI chipset prices, and the improvements in default game support). So ATI's entry into the dual-card arena has really improved the features for all users, which is why this heated competition between the companies is so good for consumers. Hopefully ATI will get their XFire issues resolved quickly, although I agree with the sentiment that the R520 is far more important than XFire (at least in the graphics arena -- in the chipset arena ATI probably can't compete without XFire).

All these CRT users need to wake up and join everyone else. 99% of all gamers using CRTs? Phhf. Only gamers who also have record players because they swear they can hear the difference between a record and a CD -- that difference is noise. Whereas almost everything in your PC is digital, you guys choose to live with the noise, crosstalk, and other anomalies that plague your CRTs because of the D/A conversion and the non-error-correcting communication.

Meanwhile I will enjoy my perfect geometry, with no need to keep all magnets miles away, no whining at resolutions set too high, no monitor flickering when the resolution changes. And the fact is that 99% of gamers do not spend the big $$$. I mean, hell, like 70% of the video cards out there are onboard, and I'm guessing only like 5% are high end and can go above 12x10 res. At 12x10 most people can play perfectly on their LCDs.

Grow up, CRT fanboys, and wake up to what is really happening: your CRTs are dying, our LCDs are getting better. Who really wants a 100+ box with a 4 ft footprint on their desk? Only gamer snobs like you.

CrystalBay (and anyone else who cares) ... you want the bf2 demo we use? Email me and I'll send it to you.

It is difficult to set up bf2 to get repeatable results, and I'd recommend that you spend some time inside EA's demo.cmd for benchmarking the game and read various forums on the subject.

Just to let everyone know, we actually have to take the CSV file bf2 generates with the frametimes for each frame, and we average the last 1700 frames. DICE actually records frametimes for the load screen, and it's more accurate for us to use their recorded times than to use something like FRAPS.

That's all the "support" I'll give for the demo file, but if you still want it, feel free to ask.
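For the curious, the averaging described above can be sketched in a few lines of Python. This assumes a simple one-column CSV of per-frame times in milliseconds, which may not match the exact layout bf2 writes -- treat it as an illustration of the method, not the actual benchmark script:

```python
import csv

def avg_fps_from_frametimes(path, last_n=1700):
    # Read per-frame times (milliseconds, one value per row -- an assumed
    # layout) and average only the last last_n frames, so the load-screen
    # frames at the start of the log don't skew the result.
    with open(path, newline="") as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row]
    tail = times_ms[-last_n:]
    avg_ms = sum(tail) / len(tail)
    return 1000.0 / avg_ms  # mean frametime in ms -> average FPS
```

Averaging frametimes (then inverting) weights each frame equally, which is why it differs slightly from averaging instantaneous FPS values.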

There's nothing on the market that beats a CRT for gaming. You have to be smoking some seriously badly cut ghetto crack to consider that an LCD is anything besides prettier, thinner, and easier on the eyes than a high-quality CRT. And really, it is only easier on the eyes for doing 2D work -- well, except for when the pixels get in the way of the font being perfectly round and clean; then it sucks there too.

I have been a skeptic of SLI/XFire and the like. I guess I just don't see the need for two cards, although I'm sure there are one or two reasons. I bought the 850 PE earlier this year and I have to say I still don't regret it, even if it was overpriced. I'm thinking I will just keep using this card until the R520 reaches the surface. Then I will be able to decide whether to buy the X850 master card, or just do what I've always done and get the next single card in either the mid or high end.

"#23, I think you may be the only person out there who uses an LCD for gaming without scaling the resolution up to native."

I never scale a lower res. I'd rather play the game with black bars around it than look at a nasty scaled image. So the guy is not alone; there are lots of us who don't like scaling lower resolutions to fit the native resolution of the screen.

#39, You make a good point about people reading the article. But it is important to make it clear that spending more than $900 on graphics hardware may not be worth it unless the user wants to run 1600x1200 or higher.

-----

We've updated the article to include percent increase tables on the last page. CrossFire ends up being pretty competitive with SLI in terms of % increase.

Just so most people without a bazillion dollars to spend on these systems get a clue, how about posting 6800 GT, 9800 Pro, and 6600 GT numbers? Maybe then we could decide to sell the house and buy the XFire solutions.

Why the moaning about not running above 1600x1200? What percentage of people reading this article will be running above those settings, I wonder?

====and how can you not see the drawback of shrinking the screen size to get a lower resolution?=====

I'm not shrinking anything, I'm putting it at its natural size. And again, I only do it to run certain games smoothly on my aging system. I see no drawback to running a 1024x768 windowed game on my desktop for games like UT2k4. Sorry, I just prefer that. I also like to be able to Alt-Tab without all the drama (time wasted) of reloading the screen that happens when you run a game in full-screen mode anyway. I'm different that way I guess. OH NOES, DIFFERENCES R BAD!

======Why turn a 19" display into a 14" display?====

Except I'm not? I have a 17" display and I'm using it all, I just put some games in a window on the desktop - you know, like you do with many other pieces of software that you don't want to take up your entire view, or in the case of some games that run better at a lower res on my system.

=======I'm sorry but my 21" crt can do any resolution and still use all of the screen space and not interpolate.====
And mine can do any resolution up to its maximum just fine as well, and in a manner I am happy with that does not "interpolate" or distort. Thanks for playing.

My system's not spec'd enough to run 1280x1024 in the latest games. Since I'm GOING TO BE PLAYING IN 1024x NO MATTER WHAT, why not do it in the most efficient manner where I can also keep an eye on other tasks? And so that's what I do. :)

======LCDs can't period and that's a big deal for 99% of gamers.======
Most gamers have machines that can run the native res of 17" or 19" LCDs. I don't anymore, since some games are too much for my system. I was simply answering the faux complaint of "OMG IT STRETCHES TEH GAMES IN NON-NATIVE RES" because it certainly doesn't on mine, nor on most modern LCDs that have more than simply "stretch image" as a way to display lower resolutions! *gasp* Imagine that! Technology changes and improves! Ignorance must be bliss though.

#23, I think you may be the only person out there who uses an LCD for gaming without scaling the resolution up to native.

#28, we used an XT / XT combo for SLI, and the Far Cry benchmarks are not a typo. Nor are the XT PE labels. Sorry for the confusion, but it had been a while since I had run Far Cry numbers and the game favored CrossFire enough to make it worth a few quick tests on a couple other cards.

#30, at this point no, we can't test on other chipsets. This product is still in development and has a hard enough time getting enabled on all ATI hardware.

It depends on the person. I can notice a considerable amount of ghosting in a fast paced game like UT2004 even on an 8ms LCD.

I would never go with an LCD for any kind of gaming for a number of reasons, but anyway I got a top notch CRT a month ago and don't need to worry about LCD limitations for a couple of years at least.

I'm a laptop gamer. I've got an older Mobility Radeon 9700 Pro, so I can't run everything at the native screen size of 1400x1050. As such there are many games that I play at 1024x768, or even 800x600 in the more extreme cases.

Games look just fine with modern LCD scaling. That is to say, a 1024x768 game looks very good scaled up to 1400x1050. When ATI was designing their scaling algo, they obviously focused on making the pixels look square, rather than just doing a simple bilinear transform.

The point of scaling on LCDs may be moot to serious gamers though, as modern desktop cards often don't need to run games at below native res. And when they do, I can report they still look good scaled up.

I can also report that the response rate ("motion blur") is totally overblown. I've got a laptop, which obviously means I've got a pretty high response time (Probably 25ms to 30ms). Motion blur is noticeable, but isn't really distracting except in very bright areas. Desktop LCDs have improved a great deal beyond this with significantly lower response times to the point where it isn't an issue at all. I understand contrast ratios have also improved. All that is really left is colour saturation/accuracy (as a thing that CRTs are better at).

Most gamers I know with modern PCs have LCDs now. CRTs are dying, slowly but surely, much like DVDs slowly replaced VHS. LCDs already outsell CRTs, and adoption among gamers is possibly even higher than the regular computer buying public.

#23, no serious gamer would play in a window, and how can you not see the drawback of shrinking the screen size to get a lower resolution? Why turn a 19" display into a 14" display? I'm sorry, but my 21" CRT can do any resolution and still use all of the screen space and not interpolate. LCDs can't, period, and that's a big deal for 99% of gamers. Oh, and #21, CRTs will be around until there is a technology that is fit to take their place. LCD panels are not even close to the quality of a good CRT display.

This is not quite a paper launch, but more like a paper review. Almost useless in helping me decide what my next video purchase will be other than to show that Crossfire does in fact work.
How about the important stuff? When will it ship? How much? And where is the next generation ATI card and when will it ship?
I want more competition so I can retire my 9800 Pro and get a PCIe card, motherboard, and processor without having to mortgage my house or go without eating for a month...

So the point of this is that CrossFire with two last-gen GPUs (X850 XTs) doesn't perform quite as highly as SLI with two current-gen GPUs (7800s)? That makes sense. If/when the next-gen (current-gen) ATI cards are released, I would expect them to do as well or better when CrossFire'd compared to SLI'd 7800 GTXs. We shall see. :)

#17 - what native resolution "thing"? I quite often play at lower resolutions than my monitor's 1280x1024 and it windows it quite nicely on my over-two-year-old LCD (meaning the technology has been around for a while now to make it look good). It shows it at the actual resolution with black around it like a letterboxed movie on a TV screen. Or I simply play it in Windowed mode if the game supports that (most do) and that way I can keep an eye on my instant messenger messages while playing the game. :)

I'm glad to see Splinter Cell: Chaos Theory in the benchmarks, because that game alone could justify someone's purchase of SLI/CrossFire cards. As can be seen in all the SC:CT benchmarks at AnandTech, it's quite the GPU-taxing game. Only with CrossFire or SLI'd 7800 GTXs are you looking at smooth gameplay of 60+ fps at 1600x1200 4xAA (although since AnandTech doesn't post "Min", "Max", and "Average" fps numbers along with a graph of the fps throughout the test demo, it's hard to know for sure that it never dips into slower territory).

#15, 16, 17 -- I just retired a 22" Diamondtron CRT earlier this summer. The REAL screen size was 20" and it did 2560x1600. This summer from hell made me realize how hot CRTs get. My computer room is at least 8 degrees cooler now.

CRTs do fine in gaming, but they are declining. Most new sales are LCD. They may have to pry your CRT from your dead hands, but LCDs are what people -- and gamers -- are buying. You don't have to like it, but it is still a fact.

[quote]SLI is a waste of money when you can buy a 7800GTX for $550 at newegg. The increase in power requirements, setup headaches, AA headaches is simply not worth it when you get basically the same kind of performance from a single card.[/quote]

Nope, you're wrong re: price. I got my 6800 GT when it was new, I'll soon be buying a second 6800 GT, and I will have played the whole time and still spent less.

[quote]Finally, it's stupid to buy this setup just so you can run at 2048/1536 with 4X AA. No one with that kind of cash, runs on an old crt. Everyone with that kind of money has an LCD that's either 1280/1024 or 1600/1200 and for those, a single setup based on the 7800GTX is the cheaper and more reliable solution...unless you're an ATI fanboy. If so, then go waste your cash. [/quote]

BS -- I have that kind of money and I'm using my old CRT (EIZO F58) to go to 16x12 @ 85Hz. So far I have not seen one LCD that stood up to my standards re: color, response time, and resolution. So sorry, you're just plain wrong.

I will be replacing my current PC this winter with a new one. I do not see enough bang for the buck to go with XFire, especially since a single 7800 GTX and presumably a single R520 will run within 5% of the XFire setup. Plus, dealing with two cards = twice the heat / fan noise. Also, XFire using 800-series GPUs is no bonus.

I think this setup only makes sense if you currently own an 800/850 card. It "may" cost less to upgrade then. Wasn't the XFire 850 card going to retail for $550 (minus the $100 rebate which ATI has going on) = $450... Plus $100+ for a new mainboard and we are back at $550... or you can buy a single 7800 GTX for the same amount and have an easy upgrade... Which would you rather do?

Most likely I'll go with a single R520. They should be out by xmas! :)

#11
"the performance increase from a single X850 XT to CrossFire is up 43% as opposed to SLI's benefit of 34% over a single card"
I just can't wait till the R520 comes out; then ATI and nVidia will have a real showdown.

I always laugh at "gamers" with LCDs... the whole native resolution thing makes real gaming a pain in the ass. My 19" high-end ViewSonic CRT can do 2048 (but I only have a 6800 GT, so there aren't many games I can push up that high). I play almost all games at 1600, but if I got Battlefield 2 (I'm not interested in EA crap like that, though) I'd have to switch to 1280, which would screw up your LCD. Besides, it costs a hell of a lot of money to get a 1600 LCD, much less a higher-res one. And honestly, the picture just doesn't look as deep as a good aperture grille CRT. And unless you're paying tons of money for some 8ms LCD, you'll get ghosting at good FPS levels. I will never buy an LCD as long as there's a good CRT out there. They just suck for gaming.

Agreed, #15 -- LCDs are nice, and getting nicer. But for gaming I don't feel you can beat a quality 21" CRT. And with the way prices are falling (I just got an off-lease 21" for $100 + $50 shipping) you'll have even more money to drop on a nicer video card, or even a second monitor.

Nothing really that surprising. Games where the X850 XT beat the 6800 Ultra are the same games where X850 XT CrossFire beats the 6800 Ultra SLI, and vice versa. The 7800 GTX of course does well, as it's a next-gen part; once the R520 comes out I'd expect to see some competition. I have to agree with those above: ATI really should have switched priorities and launched the R520 now and launched CrossFire in Aug/Sept.

It seems silly to moan that you can't test higher than 1600x1200. The reality is 19" flat panels are reasonable and a hot buy right now -- and they do 1280x1024. To get higher res you have to go at least 20" at TWICE the price for a 1600x1200 maximum res. Except for the widescreen blips of 1920x1200 in the 24" category, the next big jump is at a 30" LCD, with the Apple Cinema 30 doing 2560x1600.

The point is 1600x1200, or the slightly higher 1920x1200 widescreen, is the realistic limit for current LCD displays. It might be nice to know 7800 SLI works more efficiently at 2560x1600, but this is mostly academic since most readers here don't have a 30" Apple Cinema (we all know Anand does).

For me, I was very impressed that the ATI chipset and ATI CrossFire more than held their own compared to nVidia even in this preview setup. I'm looking forward to CrossFire chipset tests and overclocking tests to see if this is my next AMD chipset for an nVidia 7800. The nF4 still seems way too flaky to me, and everything I read says the new ATI chipsets run and overclock like a dream. Bring on some chipset results.

This reminds me of when ATI released the 9700 Pro, except they are taking the beating this time. They will recover; these companies always seem to. I think the R520 is going to surprise many people -- they are keeping awfully quiet about it. Also, remember that ATI is an expert at staying alive with sub-par high-end cards (Radeon 8500 and before). The low-end cards are what keep these companies alive anyway; look at how many pre-built systems have an X300 in them.

Why didn't the article look at how much of a relative improvement X-Fire gives compared to SLI?

Yes, CrossFire beats the 6800 SLI in HL2, but what does that mean? A single X850 card beats a single 6800 in that game too.
But 6800 SLI gives a 22% improvement over a single card, while X850 CrossFire only gives roughly half that (in the no-AA HL2 test; too lazy to work out all the figures).

Isn't that where it gets interesting? It's no big surprise that two fast cards beat two slightly slower cards, but knowing which multi-card technology provides the biggest boost over single-card speeds would be a lot more relevant.
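The % increase math here is just (dual - single) / single. A tiny Python sketch, using made-up fps numbers purely for illustration (not figures from the review):

```python
def pct_increase(single, dual):
    # Percent speedup of a dual-card score over a single-card score.
    return (dual - single) / single * 100.0

# Hypothetical example: a single card at 90 fps and a dual setup
# at 110 fps works out to roughly a 22% gain.
print(round(pct_increase(90.0, 110.0), 1))
```

Comparing these scaling percentages across vendors is what normalizes away the raw speed difference between the card generations.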

XFire does give a good % increase over a single card, which is promising for ATI's next generation, assuming that it will be competitive with the 7800 series. Shame that the current performance doesn't look that good compared to 7800 SLI.

SLI is a waste of money when you can buy a 7800GTX for $550 at newegg. The increase in power requirements, setup headaches, AA headaches is simply not worth it when you get basically the same kind of performance from a single card.

Still, it's good that they have implemented the technology; then again, it's completely asinine to require a special, more expensive card to have it running. SLI is cheaper.

Finally, it's stupid to buy this setup just so you can run at 2048x1536 with 4X AA. No one with that kind of cash runs on an old CRT. Everyone with that kind of money has an LCD that's either 1280x1024 or 1600x1200, and for those, a single setup based on the 7800 GTX is the cheaper and more reliable solution... unless you're an ATI fanboy. If so, then go waste your cash.

Nice. Don't know what to say... It's nice to see AnandTech's involvement in the discussion!

I'd like to point out that as a halfway gamer I did get rid of a Sony F520 which did 2048x1536 at 85Hz, and it was beautiful and HUGE. It ate all the space on my desktop, so when transitioning to an LCD, which uses almost no space, there are a few issues to consider. Like response time, colors, artifacts, yadda yadda yadda. I found a nice companion in the NEC 2080UX+, which is a 20" 16ms panel, and I have no smearing, no artifacts, vivid colors, and last but not least NO lag whatsoever.

Apart from that back to topic:
I feel that ATI has made a good effort; while a bit late, it's still nice from a consumer's standpoint to have competition. What makes me a bit worried is the way ATI made their solution. As a consumer I'd really not want to get stuck with a card that REQUIRES a slave card to work at all (correct me if I'm wrong...), that is, you cannot run a CrossFire card by itself in a system. Whereas nVidia's solution is more flexible (you can run your system with either one of the cards used in SLI).

I wonder also if CrossFire will work with cards from different vendors, as that was one of the quirks Nvidia was bothered about (same vendor and bios).

I'm also concerned about how much trouble AnandTech had getting the setup to work properly. OK, it's a pre-release, but Gigabyte said it was close to finished... should we be worried? Who knows... I've had bad encounters with high-end ATI cards in the past, whilst my friends have had few or none. It makes me wonder if CrossFire is having the same trouble with drivers as standard ATI cards had with the "old" drivers....

Lastly, I find it annoying as a consumer to have to buy a more expensive card (in the case of XT -> CrossFire) than the one I already have, because the new one also supports the faster XT PE... (correct me if I'm wrong regarding the cost and so forth...)...

Last... Despite all my reservations regarding ATI's answer to nVidia's SLI, I'm still very interested in the development, and not least: will CrossFire also come for the R520? (Probably, but if they're having this much trouble with the X850, how will the new cards fare?)