It's the new Nvidia GTX 690, which is basically a dual 680: 3072 CUDA cores on one board. Price ... a whopping $1k. So now you can buy two of those and go for quad-SLI at only 600W TDP. Sounds pretty reasonable. $2k is a lot of scratch, though, for two boards.

If you mean 2x dual GPUs, it's nothing very new; in fact, you can go "sextuple" if you want to, with 3x dual GPUs. The drivers to enable it came out around 2006 with the 7900GX2 (although prior to that there were, I think, 2 other dual-GPU cards from Nvidia alone - see http://en.wikipedia.org/wiki/Scalable_Link_Interface).

Also, while these setups do scale over and above 2 cards or GPUs, going beyond that is a serious trade-off. Those quad 690s, I reckon, will give you maybe 10-15% more speed than a single 690 for 100% more cost, or about 2 grand. In contrast, just using dual GPUs can give you more like 50-80% more speed, depending on the settings, the game, and the resolution used.
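
If you want the perf-per-dollar arithmetic spelled out, here's a trivial sketch. Note the speedup figures are only my ballpark estimates from above (1.65 is just the midpoint of 50-80%), not measured benchmarks.

Code:

// Back-of-the-envelope perf-per-dollar comparison. The speedup numbers
// below are rough guesses from the post above, not benchmark results.
#include <cstdio>

int main() {
    // Everything normalized to a single card: 1.0x speed at 1.0x cost.
    double quad_speed = 1.15, quad_cost = 2.0; // two 690s vs one: ~+15%
    double dual_speed = 1.65, dual_cost = 2.0; // two GPUs vs one: midpoint of 50-80%

    printf("quad 690s: %.2fx speed / %.1fx cost = %.2f perf per dollar\n",
           quad_speed, quad_cost, quad_speed / quad_cost);
    printf("dual GPU:  %.2fx speed / %.1fx cost = %.2f perf per dollar\n",
           dual_speed, dual_cost, dual_speed / dual_cost);
    return 0;
}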

Other than custom CUDA programming, is there any commercial application, apart from games, that could possibly use 4 or 6 GPUs? Does rendering, video editing/encoding, or simulation (FEA, CFD, etc.) actually utilize GPUs yet? To me, that is far more interesting than getting a few more FPS in a game...

I'm certainly interested in throwing them at bioinformatics tasks, but I'm not sure I trust the consumer versions. 2GB per GPU might be a bit limiting also. Then again, I'm scared what the 'Tesla' version of the GK110 is going to cost, so I may have to make do.

I had to wiki bioinformatics, but it's pretty cool to know that GPUs can be used for R&D in that way. Is the software all custom code, or is there actually a commercial application for the field?

I'm still waiting on a commercial FEA solver... I remember reading about ANSYS working on it years ago, but I've yet to see anything come out of the test labs. Of course, I use Autodesk products at work, and they will probably be years behind the big names in integrating a feature like that into their simulation suite.

Quote:

I had to wiki bioinformatics, but it's pretty cool to know that GPUs can be used for R&D in that way. Is the software all custom code, or is there actually a commercial application for the field?

There are commercial players, but the field is fairly wide and rapidly changing, so the most useful stuff is usually open source, though quality varies. There's also the issue of 'once it's published, it's abandoned'.

In terms of GPU support, it's really only started to happen recently, and then only with widely used, established tools - there's no point in spending 6 months porting something to a GPU if it's going to be obsolete by the time you get it done. Even tools which properly exploit multi-core CPUs are surprisingly rare. But when you're dealing with datasets measured on the order of 100GBs, you really, really want every performance trick you can get - and I'm happy enough to implement them in my own code, when appropriate.
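
For anyone wondering what "custom CUDA programming" looks like in practice here, below is a minimal, hypothetical sketch: a toy kernel that counts G/C bases in a DNA string in parallel. Real bioinformatics workloads (alignment, assembly, etc.) are far more involved; this just shows the basic offload pattern.

Code:

// Toy example: count G/C bases in a DNA sequence on the GPU.
// Build with: nvcc -arch=sm_20 gc_count.cu -o gc_count
#include <cstdio>
#include <cstring>
#include <cuda_runtime.h>

__global__ void gcCount(const char* seq, int n, unsigned long long* count) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n && (seq[i] == 'G' || seq[i] == 'C'))
        atomicAdd(count, 1ULL); // contended only when a thread matches
}

int main() {
    const char* host_seq = "ATGCGGCCATTAGCGC"; // stand-in for a real dataset
    int n = (int)strlen(host_seq);

    char* d_seq;
    unsigned long long* d_count;
    unsigned long long gc = 0;
    cudaMalloc(&d_seq, n);
    cudaMalloc(&d_count, sizeof(unsigned long long));
    cudaMemcpy(d_seq, host_seq, n, cudaMemcpyHostToDevice);
    cudaMemset(d_count, 0, sizeof(unsigned long long));

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    gcCount<<<blocks, threads>>>(d_seq, n, d_count);
    cudaMemcpy(&gc, d_count, sizeof(unsigned long long),
               cudaMemcpyDeviceToHost); // implicitly syncs with the kernel

    printf("GC count: %llu of %d bases\n", gc, n);
    cudaFree(d_seq);
    cudaFree(d_count);
    return 0;
}

In real code you'd obviously stream chunks of the dataset through the card rather than one tiny string, which is where the 100GB-scale tricks come in.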

After several years of fiddling around with SLI and then Crossfire, buying high-wattage overkill PSUs, and then finding that NV and ATI almost never have driver profiles ready for SLI/CF support in new games, I have reached the same conclusion as BFG.

"Keep it simple, stupid" is the way to go with gaming PCs. Just pay extra for the best single GPU card you can get. I am back to single GPU now (Radeon 6970).

Definitely. I used to drool over this kind of stuff years ago, but nowadays you can run most games at a standard resolution (1080p/1200p) with mostly maxed settings on a computer you can build for under $1,200. Maxing the remaining 2-3 settings at the cost of another grand and a half just isn't worth the trade-off.

I cannot remember the last time I had a game that did not work in SLI. Tri-SLI has been a disappointment, but it's impossible to know how much of that is lack of Tri-SLI support versus a CPU that can't keep up. Not without dropping a few K more into finding the answer, anyway. But dual SLI I can easily recommend.

Quote:

I'm certainly interested in throwing them at bioinformatics tasks, but I'm not sure I trust the consumer versions. 2GB per GPU might be a bit limiting also. Then again, I'm scared what the 'Tesla' version of the GK110 is going to cost, so I may have to make do.

True, but a full-performance compute-focused version of GK110 oughta be a monster...

Regarding SLI, I've had pretty much the opposite experience - never really had significant troubles with it, and I'd never be afraid to consider it. I currently have a 6870 Crossfire setup, which did give me some initial trouble with its HDMI/DVI connectors that I was never able to fully diagnose (using the HDMI out on the first card, there was image corruption whenever Crossfire was supposed to kick in, and Crossfire itself wouldn't work).

After experimenting with various cabling (HDMI>HDMI, DVI>HDMI, and even VGA), however, the weirdest thing is that it looks its best when using the VGA input on my monitor (a 1080p LCD TV). Connected up this way, I get a very clear, pixel-sharp display. It shouldn't make sense (analogue in to a digital display), but it undoubtedly gives me the best-quality image, comparable even to when my previous GF295 was connected HDMI>HDMI all the way. I can't figure out why, but it's easily the best option for me.

Quote:

I cannot remember the last time I had a game that did not work in SLI. Tri-SLI has been a disappointment, but it's impossible to know how much of that is lack of Tri-SLI support versus a CPU that can't keep up. Not without dropping a few K more into finding the answer, anyway. But dual SLI I can easily recommend.

I can: Crysis 2 on release. The devs just said to turn off SLI/CF - it didn't work, caused massive flickering, and actually gave you lower performance if you used it.

Quote:

I'm still waiting on a commercial FEA solver... I remember reading about ANSYS working on it years ago, but I've yet to see anything come out of the test labs. Of course, I use Autodesk products at work, and they will probably be years behind the big names in integrating a feature like that into their simulation suite.

3DStudio has had GPU-accelerated rendering (through Mental Ray) for a while, and Vray has developed VrayRT, which works fairly decently, though it has trade-offs that mean I haven't used it all that much yet.

I meant Autodesk engineering software... I wouldn't even know how to begin to use the graphics/rendering software, even if I had a use for it.

Quote:

I can: Crysis 2 on release. The devs just said to turn off SLI/CF - it didn't work, caused massive flickering, and actually gave you lower performance if you used it.

For one, Crysis 2 was a complete clusterfuck on release, and anyone who bought it at full price deserves what they got; they were handing sales to a dev who reneged on every point of their pre-sales advertising.

I can also tell you another game that was flawed with Crossfire - BF3. However, it was merely a driver issue, and a couple of weeks later, when the next iteration of the Catalyst drivers came out, all was well again.

Okay, so for Nvidia users, one game. AMD has had somewhat more trouble getting CF drivers ready in time. But of the dozens of games I've played over the last couple of years, stretching back to KOTOR II and God knows what, I've had zero issues on Nvidia SLI setups aside from that brief hiccup with pre-DX11 Crysis 2.

Non...sequitur? Who gives a shit about one bug in one game that was patched? SLI scales almost 100%, CF scales almost 100%, and both of them work more than 90% of the time. There is no substitute for supreme graphical performance, period.

The OP's four bullet points... that's exactly my experience, and that's after blowing $1K on video cards alone. That's why I don't even think about SLI/CF anymore:

Quote:

The problem is not necessarily that SLI support is lacking in many games, but that you often have to wait a long time for SLI support. Recently I bought Alan Wake - and again I forgot that, of course, I could not play that game without SLI (using the FEAR 2 profile caused artifacts). The same thing happened with Need for Speed: The Run. And Rage still lies in my drawer - simply too slow without SLI. So the main problems with SLI:

1) Not all games are supported - with only one card you do not have this problem
2) Micro-stuttering in SLI - with only one card you do not have this problem
3) Some games do not work correctly in SLI (The Witcher, Rage, etc.)
4) While one-card owners are playing big new game releases from day 1, you just sit there and wait for your SLI profile

I'm pretty sure Rage does support Crossfire, and it works. I played through it recently with good framerates and everything cranked on dual 6870s. It required a driver update (although since I only bought and played it back in May, I think, I was up to date anyway), but that was down to ATi, while Carmack allegedly said it had nothing to do with him when it was first released.

I've personally avoided NVidia for a while because of heat issues, and of course now I'm back looking at it.

My current setup is 2 6970s in CrossFire, and honestly, once profiles are available, I have very little issue beyond the occasional micro-stuttering. I've played Rage and The Witcher with it and had no issues, though obviously CF doesn't really kick in until there's a profile; there are ways of forcing it to use a different profile, but that rarely works in my experience. CF profiles used to take quite a while, but in recent years they've usually been out soon after a game's release, or before it. I honestly believe it's partly a developer issue that some games do not have profiles ahead of time (at least, that's what Nvidia and AMD developer support people claim).

I'd prefer not to use dual GPUs, but unfortunately I can't really stand games that dip below 60FPS at 1080p. Even with dual GPUs, some games still can't manage this.