Introduction

This week, we were very lucky to get our hands on a CrossFire motherboard and a CrossFire master card from Gigabyte.

We have previously covered CrossFire, so please check out that article for more details. In short, it is ATI's answer to SLI: a master/slave card combination in which the master card combines the frames rendered by both cards and outputs the result to the display. Communication between the cards happens over PCI Express and over a dongle that plugs into the slave card's DVI port. And today, we have the pleasure of talking about performance.

While we did get the chance to take an early look at CrossFire during Computex, we recently learned that what we saw wasn't actually full CrossFire. This time, we have an actual master card in our hands and we'll put ATI's answer to SLI to the test. Of course, due to the very prerelease nature of these products, our tests were not without some bumps and detours.

We had some trouble getting CrossFire up and running due to a combination of factors. The first monitor we tried wouldn't work through the master card's dongle once the drivers were installed. We weren't sure which slot the master card needed to be in (we hear that it shouldn't matter once the final product ships), and we didn't know which DVI port on the slave card the dongle should plug into. After a bout of musical monitors, slots, and ports that finally produced a functional setup, we still had to spend some time wrestling the driver into submission.

After getting the driver issues squared away, we got down to testing. Our first disappointment came when we realized that the CrossFire AA modes were not quite finished. Enabling these modes drops performance much more than we would expect, and it looks as though the frames each GPU renders are out of sync with those of the other card. We can't be totally sure what's going on here, but it's clear that some work remains to be done.

One thing that works well right now is SuperTiling. Except for some random display corruption when switching modes, SuperTiling looked alright and ran with good speed.

Note that each GPU will render 32x32 pixel blocks (256 adjacent quads for those keeping track).
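To illustrate, here is a minimal Python sketch of how a frame could be divided into 32x32 supertiles between two GPUs. The checkerboard assignment is our assumption for illustration; ATI's actual tile distribution logic may differ.

```python
TILE = 32  # SuperTiling block size: 32x32 pixels = 16x16 = 256 2x2-pixel quads

def gpu_for_tile(tx, ty):
    """Assumed checkerboard pattern: alternate tiles between GPU 0 and GPU 1."""
    return (tx + ty) % 2

def tiles_for_gpu(width, height, gpu):
    """Pixel origins of the full tiles one GPU would render in this scheme."""
    return [(tx * TILE, ty * TILE)
            for ty in range(height // TILE)
            for tx in range(width // TILE)
            if gpu_for_tile(tx, ty) == gpu]

# A 640x480 frame forms a 20x15 tile grid (300 tiles), split 150/150.
```

The point of tiling this finely is load balancing: adjacent 32x32 blocks tend to have similar rendering cost, so neither GPU gets stuck with the whole expensive half of the screen.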

The only quirk typical of prerelease hardware that actually got in our way is that setting standard AA modes in the control center didn't seem to actually enable antialiasing. Games that expose in-game AA settings worked well. Without access to a good CRT monitor, we had to resort to our 1600x1200 LCD for testing, which limited our maximum resolution. Below 16x12, many games are not GPU limited under CrossFire, and even at 16x12 with no AA, we see some games that could be pushed much further.

This brings up a point that we will be making more and more as GPUs continue to gain power. Large LCDs are very expensive and CRTs are on their way out. Buying more than one 6800 Ultra, and soon adding a CrossFire master to an X850, doesn't make much sense without being able to run more than 1280x1024. And we would really recommend being able to run at least 1600x1200 for these kinds of setups.

All these CRT users need to wake up and join everyone else. 99% of all gamers using CRTs? Pff, only gamers that also have record players because they swear they can hear the difference between a record and a CD. That difference is noise. Whereas almost everything in your PC is digital, you guys choose to live with the noise, crosstalk, and other anomalies that plague your CRTs because of the D/A conversion and the non-error-correcting communication.

Meanwhile, I will enjoy my perfect geometry, with no need to keep all magnets miles apart, no whining at resolutions set too high, and no monitor flickering when the resolution changes. And the fact is that 99% of gamers do not spend the big $$$. I mean, hell, like 70% of the video cards out there are onboard, and I'm guessing only like 5% are high end and can go above 12x10. At 12x10, most people can play perfectly on their LCDs.

Grow up, CRT fanboys, and wake up to what is really happening: your CRTs are dying and our LCDs are getting better. Who really wants a 100+ box with a 4 ft footprint on their desk? Only gamer snobs like you.

CrystalBay (and anyone else who cares) ... you want the bf2 demo we use? Email me and I'll send it to you.

It is difficult to set up bf2 to get repeatable results, and I'd recommend that you spend some time inside EA's demo.cmd for benchmarking the game and read the various forums on the subject.

Just to let everyone know, we actually take the CSV file that bf2 generates with the frametimes for each frame and average the last 1700 frames. DICE records frametimes for the load screen as well, and it's more accurate for us to use their recorded times than to use something like FRAPS.
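For anyone trying to reproduce this, the averaging step amounts to something like the following Python sketch. It assumes the per-frame times (in milliseconds) have already been parsed out of the bf2 CSV into a list; the exact column layout of that file is not shown here.

```python
def avg_fps(frametimes_ms, last_n=1700):
    """Average FPS over the last `last_n` frames of a frametime log.

    Keeping only the tail of the log skips the load-screen frames
    that bf2 also records in its CSV output.
    """
    tail = frametimes_ms[-last_n:]
    # total frames rendered divided by total seconds elapsed
    return 1000.0 * len(tail) / sum(tail)

# e.g. a run logged at a steady 20 ms per frame averages out to 50 FPS
```

Averaging frametimes this way (rather than averaging instantaneous FPS values) weights every frame by the time it actually took, which is the standard way to get an honest mean framerate.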

That's all the "support" I'll give for the demo file, but if you still want it, feel free to ask.

There's nothing on the market that beats a CRT for gaming. You have to be smoking some seriously badly cut ghetto crack to consider that an LCD is anything besides prettier, thinner, and easier on the eyes than a high-quality CRT. And really, it is only easier on the eyes for 2D work. Well, except for when the pixels get in the way of the font being perfectly round and clean; then it sucks there too.

I have been a skeptic of SLI/CrossFire and the like. I guess I just don't see the need for two cards, although I'm sure there are one or two reasons. I bought the 850pe earlier this year and I have to say I still don't regret it, even if it was overpriced. I'm thinking I will just keep using this card until R520 reaches the surface. Then I will be able to decide whether to buy the X850 master, or just do what I've always done and get the next single card in either the mid or high end.