Posted
by
Hemos
on Monday June 05, 2006 @09:11AM
from the that's-one-hot-stick dept.

Bender writes "What would happen if you took NVIDIA's multi-GPU teaming capability, SLI, and stuck it onto a single graphics card? Probably something like the GeForce 7950 GX2, a 'single' video card with dual printed circuit boards, dual graphics processors, dual 512MB memory banks, and nearly twice the performance of any other 'single' video card. Add two of these to a system, and you've got the truly extreme possibility of Quad SLI. We've seen early versions of these things benchmarked before, but the latest revision of this card is smaller, draws less power than a single-GPU Radeon X1900 XTX, and is now selling to the public."

Nothing yet, probably. But that doesn't mean there won't ever be any. Also, in most cases, these boards are used by people like John Carmack to come up with proofs of concept for new ideas/technologies, or whatever cool thing he's cooking up...

While they may be overkill for your average user, for (game) developers these things will be goldmines.

Yes. But the market will consist of idiots. Because, as you say, today there is literally nothing you can use such a setup for.

Sure, 3 years down the road there'll be games that look noticeably better with such a setup, but here's the thing: 3 years down the road you can have this graphics performance for 1/8th the price and power consumption.

It's fine though; those "early adopters" (aka idiots) pay a large fraction of the development cost for the rest of us.

I thought the same about my 7800 GTX till I tried playing Oblivion with all settings maxed... and the FPS I get is mostly below 30.
You get the hardware, and getting software to saturate that piece of hardware is not so hard...

Was that sarcastic? Hz is a measurement of something per second. In the case of video cards, that would be frames or refreshes per second, and in monitors, refreshes per second. If you have a game running at 120 frames per second but the monitor only has a 60Hz refresh rate, the monitor's refresh rate is essentially the limit on your FPS. It doesn't matter if you turn vsync off, though doing so will let the video card spew more frames to the monitor than it can handle (leading to tearing and a higher FPS measurement than you can see).

"the monitor's refresh rate is essentially the limit on your FPS. It doesn't matter if you turn vsync off, though doing so will let the video card spew more frames to the monitor than it can handles (leading to tearing and a higher FPS measurement than you can see)."So you're saying your monitors refresh rate is the limit of your FPS but if you turn off vsync you will get more FPS. Yah that was what I was saying.

I don't care about tearing, I care about 100+ FPS. I can see flickering and stutter with 75FPS or...

No. No no no. If your framerate is above your monitor's refresh rate, you will not see any more frames. That's it. The refresh rate is the hard limit on the visible framerate (it's refreshes per second, and since each refresh shows one frame, it is essentially your monitor's maximum framerate). If you turn off vsync, the video card will start spewing out a lot of frames, but you won't see a framerate any higher than your monitor's. It's a placebo. Now, IIRC Doom 3 is locked at 60fps, even if your monitor has a higher refresh rate.
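For the doubters, here's a minimal sketch of that arithmetic in Python, with hypothetical numbers (a 60Hz monitor and a card rendering 120fps); it just illustrates the refresh-rate cap, nothing more:

    # Hypothetical numbers: a 60Hz monitor and a GPU rendering 120fps.
    refresh_hz = 60   # monitor refreshes per second
    render_fps = 120  # frames the card produces per second

    # vsync on: the card waits for each refresh, one frame per refresh.
    visible_vsync_on = min(render_fps, refresh_hz)   # 60

    # vsync off: the card swaps buffers mid-refresh, but the monitor
    # still scans out only refresh_hz times per second; extra frames
    # appear as tearing within a refresh, not as additional frames.
    visible_vsync_off = refresh_hz                   # still 60

    print(visible_vsync_on, visible_vsync_off)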

Which game do you need to run to take advantage of the equivalent of 4 graphics cards?

Oblivion at 1920x1200? Good thing I don't have an Apple Cinema Display. Personally, I think Oblivion's game engine is a bit overrated. OK, it's pretty, but not *that* much prettier than the other freeform 3D games that don't kill my GFX card. Right now I'm working on a HOMM5 addiction instead...

Less power consumed than the high-end Radeon, and take into consideration that the heat is going to be coming from two GPU cores instead of one. If you're already on an ATI setup, this will surely take your temps down a couple of degrees.

I think in the future we'll see multi-core GPUs, with PhysX and media co-processors (maybe IBM's Cell) and lots and lots of memory for HD textures and HD content.
Or maybe I just need some coffee.

You know, there are people who have this thing called "work", where they have to put up with a "recently corporate-purchased Dell, we won't feel it necessary to change them for 2 years" crappy machine that is a slug compared to what a geek assembled in his garage 4 years ago out of spares.

On the other side, this thing called "work" comes with nice stuff called a "pay-check" that enables you to buy even more ultimate-leet gear (and also buys bait for girlfriends, such as "dining in a nice restaurant").

I'm not the only one that thinks "great, just what we need", am I? I only just upgraded my graphics card recently from a 5900-series to a reasonably priced 7600-series, and since doing so, reviews of CrossFire and SLI keep popping up, and now quad is appearing. This time next year, can I expect my graphics card to not even be considered minimum spec to run new games on the PC, games that are going to run just fine on the Xbox 360 and PS3? Who truly, honestly needs this much horsepower for personal use? Seems like...

This is the nature of the "hardcore" (or "enthusiast", or whatever they call it these days) PC game market. Unless you spend several hundred dollars every few years, you get way behind the curve. It's really unfortunate, as I'd love to play more PC games, but the total cost of upgrades (versus what you get out of it) is way too much.

Unless you spend several hundred dollars every few years, you get way behind the curve.

What's wrong with staying way behind the curve? It's the same tech, the same games, the same everything over time, except that the people who think there's some important value in being at the leading edge of the curve finance your gaming for you.

The best value in a car is a two-year-old used one, third year of the model, but avoid the models favored by teenage street racers. Those are innately overpriced for what you get, and no matter how shiny the paint, the internals have had the shit beat out of them.

I bought a shiny 6800GT over a year ago for $400. I'll never spend close to that for a video card again. If I can play the same games on a next-gen console, I'll pass on any PC upgrades in the future. It's a shame... the PC industry is only hurting itself for a long-term user base. Hell, you can't even get a decent baseball sim on the PC anymore... it's all going to pot.

I built my newest PC about a year and a half ago for under $800. It replaced my previous PC, which I had used for about 3-4 years. My year-and-a-half-old PC is still doing fine with most newer games; I've played HL2-based games with most options turned on with no problem. I've been playing a lot of NFS:MW lately, with the graphics cranked up, and it runs smooth as silk. As for a baseball sim... you've gotta be kidding. I mean, I can understand going out to a game, the atmosphere, the popcorn and hot dogs, the c...

Well Rick, some people still enjoy a good simulation of baseball. The PC used to be the king of baseball sims, but since developers are making more money developing for the console, they abandoned the PC (thus my reason for mentioning it). My point is that if they can make games like HL2 and Doom 3 better on consoles, and they have come pretty damn close (go read the Xbox reviews @ GameSpot), then why put any more money into a PC?

HL2/Doom 3 are better on newer consoles than their predecessors were on earlier consoles. They are still weak compared to their PC-based rivals. ;)

As great as consoles are, they are still specialized machines, which limits their adoption. My PC can do everything consoles can do and much more that consoles cannot. And as long as PCs have that advantage and a widespread adoption rate, there will continue to be a market for PC-based video games.

"640KB ought to be enough for anybody." -Bill GatesThis is actually pretty cool... I'm starting to feel like the computer industry is warming up to the prospect of modular parallelization "at home".We are reaching a point where quantum tunneling could become a real problem and frankly, I was hoping this would happen sooner... The industry always focused on things getting smaller, but we're running into a barrier in that direction.Now we're starting to see the opposite: instead of buying a brand new system a

Seems like a case of making the product long before any real demand for it actually exists.

That's often how progress happens. Products are developed when the existing demand is a very limited niche; then, once the technology exists, more uses for it are developed and demand increases.

But then, I don't think that's really the case here; it seems to me that polygon-pushing horsepower on GPUs is something developers have plenty of uses for, as much as anyone can make available, and that plenty of...

Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

So they are going to alienate the majority of the market that would spend the money on a Quad SLI setup, in order to keep it exclusive to system builders for some period of time.

Seems like a bad business decision to me, at least until (and if) Nvidia comes to its senses.

The review states:
Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.
Did it mention anything about having to have a direct supply of electricity from your local Three Gorges Dam?

I'm probably going to lose even more karma for posting with that title and subject - but I'm on a karma--; roll lately.

Graphics card innovations for the past several months/year with SLI seem to me to be mostly "I have a dual SLI system!", "Yeah? Well I have a QUAD SLI system!" - so much performance that goes unused; it's pointless. Furthermore, for the price of one of these brand new cards in the article, I can build a decent gaming computer or an HDTV MythTV box.

I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.

I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.

So spend your $600 on more useful things with the rest of us, and let the fanatics keep driving the very high-end video card market, so that we can all benefit from it when it's in the $100 bin in, what, 2 or 3 years.

One word: Oblivion. Three more words: Unreal Tournament 2007. I have an Athlon 64 3800+ with SLI 7800GTs, an X-Fi, etc. etc., and Oblivion still grinds to a halt if I push the settings up much beyond their medium levels. Even FEAR only just runs at a decent rate at full whack on my rig. I don't even want to think about the horsepower UT2007 will need.

You want a game that looks like crap and runs like crap, fine. Buy an X1300 or 6600GT. Those of us who want a better-looking, faster-responding high-end game can use all...

And just doing a quick back-of-the-envelope, that rig probably cost you well over $3000. That's a heck of a lot of money to spend and still have Oblivion "grind to a halt" at max settings. Today's gaming market has just gotten ridiculous. Whatever happened to the days when you could get good performance from the latest games on only $1000-$2000 worth of hardware?

Really? 'Cos I could have sworn it was just around 2001/2002 when even the best video cards still cost less than $400 and I could crank Deus Ex or ST:Voyager Elite Force up to max settings on a GeForce 4 and get playable framerates. Now I have an ATI X1600 Pro and BF2 runs at a crawl on all but the lowest settings, and it doesn't even look that pretty.

Like the original dual Voodoo cards, multiple video cards are just one of those things that keep going out of style (but, like old fads, make their appearance every decade or two).

The cost to implement and manufacture multiple video cards is ridiculous. Who honestly would spend $1400 just to have two video cards, and then only get at most a 20% performance improvement?

With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.

It just makes sense to keep a video card as a single card. You don't have to duplicate the production costs and all the other components that are wasted in a dual-card configuration, and you don't have to duplicate the bus technology on the motherboard in order to implement dual video cards. Overall, this will be a much cheaper configuration that will actually bring high-performance video technology into the realm of being practical.

Eventually, 4-way GPU cards will be released, and eventually nVidia and/or ATI will start to dual-core their GPUs; those spending money on their expensive dual- or even quad-based SLI configurations will have just wasted a bunch of money.

With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.

Actually, the G71 processor used in that beast has 32 pixel pipelines already, which in this context are similar to cores on a CPU. (Sure, they form a SIMD architecture, unlike CPU cores, but so do SLIed GPUs, sort of, as I understand it.) When CPUs get more cores, G...

Well, at least it isn't this one, because in this one they have screwed two boards together. The GPUs don't share any memory bus or memory. It is basically just two cards SLIed on the same PCI Express slot, unless I am mistaken.

You can go RTFA yourself. I read TFA, and TFA says "A single GX2 plugs into one PCI Express slot, but it actually has a pair of printed circuit boards". Last time I checked, a "pair" means two. From what I can tell, they have basically just screwed two cards together in an SLI configuration and made them share the PCI Express slot.

"Did you know that 93% of statistics are made up on the spot?"5950 to 7675 (3dmark scores) is over 28%. There were better and worse scores than that, but since that was the overall 3dmark core, I figured it would be good to go with.

Yes, there are individual tests that come in lower than 20%, but to say "at most 20%" when there are no games designed to USE that kind of hardware and the current benchmarks ALREADY show higher results... that's just wrong.
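Just to show the arithmetic on the quoted scores, a quick sanity check in Python using only the numbers above:

    # 3DMark scores quoted above: 5950 before, 7675 with the GX2.
    improvement = (7675 - 5950) / 5950.0
    print(round(improvement * 100, 1))  # 29.0 -- i.e. "over 28%"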

...is that it doesn't work for GPUs. Instruction parallelization was never a problem there, so the cores are inherently as parallel as the die size allows. If you could squeeze twice as many transistors onto a chip, your GPU would have 64 instead of 32 pixel pipelines, for example. Plus, dual core does nothing for the bandwidth problem... (and no, going to 1024-bit memory or something isn't an option).
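To put that point another way: pixel shading is data-parallel by construction, so a GPU scales by adding pipelines rather than by going "dual core". A toy Python sketch (shade() and the pixel values are made up purely for illustration):

    # Toy illustration: each pixel is shaded independently of every
    # other pixel, so the work splits cleanly across N pipelines.
    def shade(pixel):
        # stand-in for a real pixel shader program
        return pixel * 0.5

    framebuffer = [shade(p) for p in range(1024)]

    # 32 pipelines each take 1/32 of the pixels, with no inter-pixel
    # dependencies to synchronize; double the pipelines and you double
    # the throughput, until memory bandwidth becomes the bottleneck.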

Eventually, 4-way GPU cards will be released, and eventually nVidia and/or ATI will start to dual-core their GPUs; those spending money on their expensive dual- or even quad-based SLI configurations will have just wasted a bunch of money.

You are missing the point that GPUs are highly parallel processors. What you call "dual-coring their GPU" has been done for the past 5+ years in the graphics industry; they call it a new product. Every new generation has had more pixel pipelines. What do you think those are? You can...

1. I have an SLI motherboard and GPU. I'm only running one card so I can upgrade it down the line (3+ years) when the card hits 50 bucks.
2. Who honestly would spend $1400 just to have two video cards, and then only get at most a 20% performance improvement?
Actually, SLI can be had for as little as $300 (mobo not included). Also, you will see much more than a 20% performance boost. Check your numbers next time.
3. Yeah, I saw it merging onto one card...

Nice card... err... cards. I would buy one if I had the $$$. But if you look at the price point of the 7900GTX and this new card, the price difference isn't that big. Still, though, that's a pretty penny just to make games look better. My 6800GT is still hanging in there.

I'm still getting by with my ATI RADEON 9700 PRO. Still plays just about anything I can throw at it. Oblivion gives it a hard time, but it's still adequately playable.

I'm going to hold off as long as possible, until the card can't play the latest games, at which point I may get one of these quad SLI setups. By that time, we'll have DDR3 memory and quad-core CPUs too.

I just bought the budget edition of "Deus Ex" the other day. What I really like about it is that I needn't think twice about whether it will run smoothly or not. I have an Athlon XP 2100+ and a GeForce 4 Ti 4-something; I can crank up the graphics to full and needn't worry about lag or anything. That's always the more fun way to go, IMHO.

...because for the large number of us with laptops, it's really hard to upgrade our video cards, given space constraints, but quite easy to pop in a "stick" video card so we can run the latest graphics apps.

Sigh.

See, if I'd bought the "latest" computer, I'd already be out of date - by choosing to just buy a cheap $500 laptop, I'm just as out of date as I was a month ago.

With so much of the highest-level CPU design going into GPUs, and so many of the most wily consumers of the fastest GPUs going to any lengths possible to trick them out, I'm surprised there's not a lot more development of GPGPU [gpgpu.org], harnessing these processors for general purpose computing.

Given the qualifications and interests of that joint community, I'd expect to have seen a "PCI network" that parallelizes MP3 encoding on GPU hardware with much cheaper MFLOPS by now.
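To make the MP3 example concrete, here's a CPU-side Python sketch of why encoding looks like a candidate for that kind of parallel offload: the bulk of the per-frame work is independent, so it maps onto a data-parallel device. encode_frame is a hypothetical stand-in, not a real codec API, and real MP3 frames aren't fully independent (the bit reservoir couples them), so treat this as the idealized case:

    from multiprocessing import Pool

    def encode_frame(samples):
        # Hypothetical stand-in for the per-frame MDCT + quantization.
        return sum(samples) & 0xFF  # dummy "encoded" byte

    def encode(pcm, frame_size=1152):  # 1152 samples per MP3 frame
        frames = [pcm[i:i + frame_size]
                  for i in range(0, len(pcm), frame_size)]
        with Pool() as pool:  # farm independent frames out in parallel
            return pool.map(encode_frame, frames)

    if __name__ == "__main__":
        print(len(encode(list(range(11520)))))  # 10 frames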

You know, I've been stuck behind the kbd for many, many years and I'm ready for a change that seems obvious to me as needing to be done.

Get the GFX "card" out of the computer. Add a GPU socket to the motherboard and expandable video-RAM slots.

I could spend an hour on why I think this solution would be better, but here are a few of my reasons:
1) As fast as PCI-E is, a direct motherboard interface would be faster.
2) Directly upgradeable memory allows you to afford the better chips and expand the RAM as you have the money, instead of "settling" for a lower card because the higher-memory version doubles the price.
3) The ability to keep the same memory and JUST upgrade your GPU, since many revisions happen to cards while the memory stays the same.
4) You could use standard CPU cooling on the GPU for much more efficient cooling, instead of adding more weight to a relatively flimsy PCI-E connector, saving the occasional card/motherboard damage.
5) A forced standard: all chipmakers would have to produce chips under the same interface standard for new boards, and motherboard manufacturers as well as CPU manufacturers would have to be on the ball too. A GFX chip that you could buy one year would still plug into new boards 5 years later, as would the vid-RAM, CPU, and system RAM. Also, once any of them are upgraded, the BIOS would need to auto-set to handle the faster speeds... so I want them to predict the speed of the GPU/CPU/RAM 10 years from now and at least try to make motherboards that can support the changing times for a realistic amount of time.

Sure, have boards with dual GPUs or more, but it's time to get off the slot and move to a better format.

I know, the motherboards would cost more, because the expectation would be that you could use the same motherboard for 10 years with frequent upgrades to the CPU/GPU/RAM/GFX-RAM, but I'd pay more for a board I didn't have to keep freaking changing, while still being able to keep my game on and upgrade only the pieces that need upgrading, as I can AFFORD them.

On one side, that's something that we'll be seeing in the near future thanks to the HyperTransport format. Slashdot recently announced programmable chips (FPGAs) that could be plugged into dual-Opteron motherboards and that could use the HT bus. Also recently announced on Slashdot: the development of a standard HyperTransport connector [slashdot.org] (as part of the HT 3.0 revision).

So maybe in the near future you'll see motherboards featuring HyperTransport connectors, into which you could directly plug a CPU/DDR board, a GPU/GDDR board...

Right on the first page of TFA, it says that it is HDCP-compliant, so you need the latest HDTV "set" in order to run it. So it's not like there was much of a chance of me purchasing one of these in the first place, but I'm not going to buy a DRM-crippled product.

If you're in Firefox and it's just the text size that's bothering you, do CTRL-+ (press the Control and plus keys). It will increase the font size, but not change the font. (You can use CTRL-- (Control and minus) to change it back.)