
54 Comments

Ashu Rege is coming this February to India's first independent annual summit for the game development ecosystem, the India Game Developer Summit (http://www.gamedevelopersummit.com/), to talk about novel uses of GPU computing for solving a variety of problems in game computing, including game physics, artificial intelligence, animation, post-processing effects and others.

SonicIce has left links that, according to the WOT Firefox extension, redirect you to a malware website called jord.nm.ru. I wouldn't go there, especially if you're using IE. Just thought I'd warn you. There's nothing of interest in his links anyway, in my opinion.

If this is the chicken needed to get us the eggs we need, and it increases stereo/3D adoption and support to critical mass, wonderful. However, a truly immersive experience--something that puts it beyond a novelty--is going to require much more, such as HMDs with decent resolutions, FOV, binocular overlap, etc. The price/performance/quality of those has improved significantly as OLED and related microdisplay technology advances, so we may be close... *IF* the game/app support is there. Unfortunately, I don't see much compelling about NVIDIA's offering, but if nothing else, they deserve an E for effort. IMHO this will at best be a waypoint; in a couple of years we'll have truly immersive HMD/VR systems that are affordable and will provide a compelling improvement in experience (again, IF the game/app support is there).

As I recall, NVIDIA has made a lot of reference cards with stereoscopic outputs, going back to the earliest Quadro. And who can forget all those ASUS TNT(2) cards that came with stereo glasses options? Considering that virtually ZERO game developers have cared about making games for 3D glasses since the days of Descent, I fail to see how NVIDIA will suddenly be able to convince game developers to make games that require people to buy a $200 accessory on top of a $400 video card for the optimal experience. That kind of market is too small to be viable for any commercial game developer.

I can understand NVIDIA wanting to branch out to the gamer with 3D rendering, but what I don't understand is why they don't leverage their strengths and do a joint venture. What I'm getting at is this:
1) NVIDIA is exceptional at graphics rendering processors, but not so good at developing physical optical systems or understanding the human eye.

2) Vuzix is an expert in HMDs, and also has a viable (competitive) commercial headset for watching movies. Speaking with a rep last year, I learned they were also interested in stereoscopic displays but could not pursue it since there was not a lot of market support or venture capital.

These two sound like a great match to me. Toss in the fact that Vuzix is in Rochester, NY, home to the University of Rochester's Institute of Optics and RIT's imaging science center.

Have I painted a decent enough picture yet Nvidia? I can flowchart it out for the corporate suits if you like.

That is a picture of the driver panel from my nVidia drivers before they dropped support for real 3D.

I would love to spend $1000 on these glasses, and a new system, Vista, new video cards, and of course a new monitor to use them on.

I find this stupid because the quickest way to make this product a success is to appeal to the people who have been vocal about their previously good 3D support, not pull the rug out from under them with no warning, no comment, and no incentive.

I believe Gary turned us all on to ginger root -- taking a good bit of it before playing Left 4 Dead with its disorienting Source engine FOV is the only way I can survive normally... actually, you know what? I was able to play the game with the glasses without taking anything, and I didn't even think of that till now. It seems 3D Vision may have actually fixed my nausea with tight-FOV games... I'll have to do some more testing to see if this pans out...

So how is this different from my ELSA 3d shutter glasses from 1999? The glasses I paid $50 for back then are just as good as this $200 setup in 2009? Great job re-inventing the wheel and charging more for it nVIDIA.

There is a reason shutter glasses didn't catch on. Ghosting being the worst problem, along with compatibility, loss of brightness/color accuracy, performance hits, the need for high refresh rates, etc. etc. etc.

If you are thinking of buying these, don't. You will use them for a few weeks, then just toss them in a drawer due to lack of game support and super annoying ghosting.

I'm perfectly satisfied with the current refresh rate of LCD panels (60Hz). However, what you forgot is this: if the shutter for each eye opens and closes 60 times per second (on a 120Hz panel), the old flicker of CRTs is effectively back. Raising the refresh rate of the monitor to 240Hz would bring the per-eye rate up to an acceptable 120Hz. The monitor itself is not the culprit here; the 3D glasses reintroduce flicker like in the old days of CRTs (and they are directly dependent on the refresh rate of the monitor).
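The arithmetic above is simple but worth making explicit: with frame-sequential shutter glasses each eye only sees every other frame, so the per-eye rate is half the panel's refresh rate. A trivial sketch (the function name is illustrative, not from any real API):

```python
def per_eye_refresh(panel_hz):
    """Per-eye refresh rate with alternating-frame shutter glasses:
    each eye sees only every other frame."""
    return panel_hz / 2

for panel_hz in (120, 240):
    print(f"{panel_hz} Hz panel -> {per_eye_refresh(panel_hz):.0f} Hz per eye")
```

This is why a 120Hz panel gives each eye only the 60Hz that CRT users found flickery, and why 240Hz would be needed for a comfortable 120Hz per eye.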

Wouldn't a ray traced image work far better for stereoscopic viewing? From what I understand the rasterizing technique used by today's graphics cards uses all kinds of tricks and effects to create the perception of a "real 3D world". That's why the drivers have to be customized for every game.

Ray tracing uses a far simpler algorithm to get good results. Every light ray is calculated separately and every game that uses ray tracing should therefore - in principle - easily be customizable for stereoscopic viewing.

I'm thinking of the announced Intel Larrabee which will maybe offer ray tracing acceleration for games and could therefore be much better suited for stereoscopic viewing.

Not sure if I'm right about this, but it would be interesting to see whether games that are already available in a ray-traced version (like Quake 4) could be easily adapted to support stereoscopic viewing, and what the result would look like.
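The intuition above can be made concrete: in a ray tracer, primary rays start at a camera origin, so rendering a second eye is just generating the same rays from an origin shifted by half the interocular distance. A minimal sketch of that camera model (the names `IPD` and `primary_ray` and the 0.064 m value are illustrative assumptions, not from any actual engine):

```python
# Interocular distance in metres; 0.064 is a typical adult value (assumption).
IPD = 0.064

def primary_ray(eye_x, px, py, screen_z=1.0):
    """Ray from an eye at (eye_x, 0, 0) through point (px, py) on a screen
    plane one unit away. Purely illustrative pinhole camera model."""
    origin = (eye_x, 0.0, 0.0)
    direction = (px - eye_x, py, screen_z)
    return origin, direction

# Same pixel, two eyes: only the ray origin shifts; scene and shading
# code are completely untouched.
left_origin, left_dir = primary_ray(-IPD / 2, 0.25, 0.1)
right_origin, right_dir = primary_ray(+IPD / 2, 0.25, 0.1)
```

Nothing in the scene intersection or shading code has to know about stereo, which is the sense in which ray tracing makes stereoscopy "easy" compared to a rasterizer full of screen-space tricks.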

Apart from that I also think we would need faster LCD-panels (240Hz) to get non-flickering pictures for each eye.

Check out some of the other initiatives, notably iZ3D, who have offered a free driver for all AMD products and XP support (double check the nVidia support for XP, non-existent much?)

nVidia's idea is too little, too expensive, too late. I have built my own dual-polarized passive rig that works great with $3 glasses, unfortunately nVidia has dropped all support (the last supported card is from the 7 series, so "gaming" isn't really an option.)

Thankfully iZ3D has stepped up to provide drivers, but thanks to nVidia's lack of support I have lost so much money on unsupported 8 series hardware that I haven't looked at a game in a couple years.

nVidia has killed my will to game. Dropping support of 3D is not the way to sell 3D (do some research: nVidia has dropped XP, supports only Vista, and doesn't support any of the cool displays you can cobble together yourself for less than the $200 this stupid package costs).

I had one of these... and I had it with the 3D glasses!!! It was an 8-bit console with bad-looking games; the 3D glasses were connected to the console via a cable, and the pace of alternating between the eyes was so slow you could see it if you paid enough attention. But it worked, and it worked with any plain TV out there. It was only fun, though, no good images in reality... it's nice to see this technology come back to life!

60Hz should be the MINIMUM, not the STANDARD. Even at 60Hz you tend to get a good bit of eye strain. I don't know how the monitor/TV industries get away with a mere 60Hz. I for one STILL get headaches. Doesn't anyone else?

The 120Hz LCD panel is probably where your testing went wrong and where your problems with ghosting and other issues began.

You must use a display with near-instant native response, and no LCD panel to date can provide that (regardless of how much overdrive is given to nasty poor-quality but fast-response TN panels). You should have gone old-school and used a high-quality CRT at a 120Hz refresh rate, like many pro gamers still use, or, if available, an OLED display, as those would also be able to cope properly with a 120Hz refresh. Hell, I've got an old 15" CRT sitting on my desk which is capable of 640x480 @ 120Hz and would almost certainly have done a better job of testing your 3D goggles than whichever LCD panel you used.

Ghosting would almost certainly have been a non-issue with a CRT running at 120Hz, and not having part of the other eye's image still visible in each frame (because of LCD response time) would almost certainly have made it look a lot better.

Not that kind of ghosting... it didn't have to do with the panel -- everything looked fine on that end. I'm using the Samsung 5ms 120Hz 1680x1050 monitor; the image looked smooth.

After talking with NVIDIA, it seems the ghosting issues were likely from the convergence being off (where the look-at points for each eye's camera are set). Convergence can be adjusted with hotkeys as well, but I didn't play with this.

Eye strain didn't appear to be from flicker either -- it's more about the exercise of focusing on things that aren't there. Tweaking the depth (separation) and your distance from the monitor can make a big difference here. A CRT would not have made a difference. I do have a Sony GDM-F520, but it's just not at the lab...

Anyone interested in this should also check out and compare it with the competitor solution from iZ3D (http://www.iz3d.com/). The two solutions each have their pros and cons, but iZ3D is significantly cheaper (MSRP $400 versus $600 ($200 glasses + $400 120Hz monitor)). iZ3D works with both ATI and NVIDIA video cards, and ATI users get an extra $50 rebate.

This looks very promising; if nvidia really wants to push this rather old technology forward again, I'm sure they can do so.

OpenGL has had built-in support for the buffers you need to create stereoscopic images for years, in fact since version 1.1 if I'm not mistaken, so there is really no excuse for developers not using it.
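Beyond selecting the left/right back buffer, the per-eye work is a slightly shifted camera and a horizontally skewed (off-axis) frustum. Below is a rough, pure-Python sketch of the standard asymmetric-frustum construction a developer would feed to something like glFrustum; no actual GL calls are made, and the function name and parameter values are assumptions for illustration:

```python
import math

def stereo_frustum(fov_y_deg, aspect, near, convergence, eye_sep, eye):
    """glFrustum-style (left, right, bottom, top) bounds for one eye.

    eye is -1 for the left eye, +1 for the right; `convergence` is the
    distance to the zero-parallax plane; `eye_sep` the interocular distance.
    """
    top = near * math.tan(math.radians(fov_y_deg) / 2)
    half_w = top * aspect
    # Skew the frustum horizontally so both eyes converge at `convergence`.
    shift = (eye_sep / 2) * (near / convergence) * eye
    return (-half_w - shift, half_w - shift, -top, top)

left_eye = stereo_frustum(60, 16 / 9, 0.1, 5.0, 0.064, -1)
right_eye = stereo_frustum(60, 16 / 9, 0.1, 5.0, 0.064, +1)
```

The two frusta are mirror images of each other (the left eye's left bound equals the negated right eye's right bound), which is what keeps objects at the convergence distance at zero parallax on screen.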

And as for the suggestion that nvidia should just make a 3D monitor: what technology are you referring to here? As far as I know there is no technology capable of creating 3D images on a traditional flat 2D monitor.

One of my roommates in college had a VR helmet he used to play Descent, and was interning at a company designing (then) state-of-the-art updates to it. It was pretty wild to try, and hysterical to watch the person in the chair dodging and moving as things flew at them. It was really dodgy on support though, and gave most of us a headache after about 10 minutes. Now it's over 10 years later, and it doesn't sound like much has changed.

Am I the only one getting a feeling this is a start of something designed to suck up more GPU power and/or sell SLI as a mainstream requirement? After all, resolutions and FPS increases can't alone fuel the growth Nvidia and ATI would like.

I think you are being hopelessly negative about why nVidia would be doing this.

What advantage do they gain by a move towards stereoscopic 3D glasses? Okay, increased 3D rendering power is needed, as each frame has to be rendered twice to maintain the same framerate, but GPU power is increasing so quickly that's almost a non-issue, so SLI is irrelevant... NOT.

The main problem with stereoscopic rendering is that each consecutive frame has to be rendered from a different perspective, and only every second frame is directly related to the one before it. That seems so nicely matched to what SLI AFR mode provides that it is too good to be true. One card does the left eye in SLI AFR, the other the right eye, and with suitably designed drivers you get all the normal effects which rely on access to the previous frame (motion blur etc.), but in a "3D graphics system" they sell twice as many cards, since one card is doing each eye. They're not daft-- stereoscopic display is going to make dual-GPU cards not just a luxury for the high-end gamer, but a necessity for normal gamers who want a satisfactory 3D experience.
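The pairing described above can be sketched in a few lines. This is an illustrative model of the scheduling idea only (AFR alternating GPUs, frame-sequential stereo alternating eyes), not NVIDIA's actual driver logic:

```python
def afr_assignment(frame):
    """Which GPU renders this frame under alternate-frame rendering, and
    which eye it belongs to under frame-sequential stereo (even frames
    left, odd frames right)."""
    gpu = frame % 2                               # AFR: GPUs alternate frames
    eye = "left" if frame % 2 == 0 else "right"   # frame-sequential stereo
    return gpu, eye

schedule = [afr_assignment(f) for f in range(6)]
# Because both alternations have period 2 and are in phase, GPU 0 ends up
# rendering only the left eye and GPU 1 only the right.
```

Since each GPU sticks to one eye, every frame a GPU renders really is related to its own previous frame, which is why history-dependent effects like motion blur keep working.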

... for nvidia to co-operate with monitor manufacturers and implement 3D in the display itself instead of these half-baked attempts at depth. Nobody really wants to wear special glasses so they can have 3D depth perception on their computer.

The only way you are going to standardize something like this (because people are lazy and ignorant, let's face it) is to do it at the point where everybody gets it: everyone needs a monitor with their computer, so it would make sense to work towards displays that either:

1) Are natively 3D or
2) Build the costly stereoscopy into the monitor itself, thereby reducing costs through economies of scale.

I really think current shutter-based stereoscopic 3D is a holdover until we start to get real 3D displays. If I were nvidia I'd want to do it on the monitor end, not as an aftermarket part targeted towards gamers at a $200 price point.

Could an SLI system be used to get rid of these visual quality problems? That is, would it be possible to let each graphics card do the calculations for one eye, so you would get the same quality per eye as a single card rendering normally?

"What we really need, rather than a proprietary solution, is something like stereoscopic support built in to DirectX and OpenGL that developers can tap into very easily."
Would be nice, but whatever works and is actually implemented.

Nvidia could come up with a "3D glasses thumbs-up" seal of approval for games that get it right, and it could be displayed on the packaging. This would further encourage developers to get on board. Heck, NV could have traveling demo rigs that sit in a Gamestop/Best Buy for a week, playing games that have superior compliance. Good for sales of the game(s), good for sales of the glasses.

I've done the old shutter glasses, was a neat novelty, but wears thin as Derek says. Sounds like these are currently only a bit better with current titles in most cases. *IF* they get this right and all major titles released support the system well, I'd buy the glasses right away. The new monitor too. But they have to get it right first.

This might work for the next generation of consoles too, though possibly only if hooked up to a high-refresh monitor. Great selling point, and another reason to get this right and off the ground. Of course Sony/Nintendo/MS might just make their own solutions, but whatever gets the job done. If only one of them implemented this feature well, it could be a big tie-breaker in winning sales to their camp.

Been waiting for the next revolution in gaming and after all the bugs have been worked out, this looks like it could be a contender. I'm typically an early adopter, but I'm fairly happy with a physical reality LCD at this point. Will wait in the wings on this one, but I applaud the Mighty nVidia for taking this tech to the next level.

Although I am a great supporter of the 3D-ing of virtual worlds, there are really huge drawbacks in this technology nVidia presented.

First of all, the reason LCDs did not need as high a refresh rate as CRTs is that an LCD's intensity doesn't swing from wall to wall -- 100% intensity down to 0% before the next refresh, the way a CRT's does. This intensity fluctuation is what hurts our eyes. LCDs keep their intensity much more stable (some say totally stable, though I have seen text describing some minor intensity dip with LCDs as well; can't find it though). Back on topic... we either went 100Hz+ or LCD to save our eyes.

Even if we ignore software-related problems, there is still a problem... the flickering is back. Even if the screen's intensity is stable, these shutter glasses make the intensity each eye sees swing between 0% and 100%, and we are back to the days of old 14" screens -- a good way to wreck your eyes sooner or later. Even with 120Hz LCDs, each eye gets 60Hz, pretty much the same as old CRTs. This just won't work. For longer use (gaming etc.) you really need 85Hz+ per eye to avoid damaging your eyes.

Another point I am curious about is how GeForce 3D Vision handles screen latency. Not that long ago AT presented a review of a few screens with some minor complaints about S-PVA latency going all the way up to around 40ms. The thing is, latency like that could very easily cause the frame intended for the left eye to be received by the right eye and vice versa. I can imagine nausea setting in superfast from that (the kind of effect when you drink too much and those stupid eyes just can't both focus on one thing).

I believe this stereoscopy has a future, but I don't believe it will be with shutter glasses or any other scheme that switches between a "seeing" eye and a "blinded" eye.

The answer is simple, move from LCD technology to something faster, like OLED or SED (whatever happened to SED?).

Both of those technologies are quite capable of providing a true 200hz refresh that truly changes the display every time (not just changes the colour a bit towards something else). A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye).

I do think 120Hz split between two eyes would quite quickly give me a serious headache: when I used a CRT monitor in the past and had to look at the 60Hz refresh before installing the graphics card driver, it was seriously annoying.

"A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye)."

Almost is the key word here. I'm okay with 75Hz CRTs unless I'm staring at a blank white screen (Word), and by 85 I'm perfectly fine.

However, my roommate trained as a classical animator (which means hours of flipping through almost identical drawings) and could perceive CRT refresh rates up to about 115Hz. (She needed expensive monitors and graphics cards....) Which would demand a 230+Hz rate for this technology.

That's because the CRT blanks while the LCD stays on. With an LCD panel, on every refresh the color changes from what it was to what it is. With a CRT, by the time the electron gun comes around every 60Hz, the phosphor has dimmed even if it hasn't gone completely dark. 60Hz on a CRT "flashes" while 60Hz on an LCD only indicates how many times per second the color of each pixel is updated.

Yes, but LCDs have ghosting; unless you can purge the image completely, the right eye will see a ghost of the left eye's frame. If you've ever looked at stills from LCD ghosting tests, you'll see that even the best LCDs' ghosts last for two frames.

The best TV I can think of to use this with is the $7000 Laser TV from Mitsubishi.

Why can they not use dual video cards for this? Have one frame buffer be the left eye and the other the right; then even if one card has yet to finish rendering its image, just flip to the last fully rendered frame.

I think the result would be quite bad. You could easily end up in a situation where one eye's card runs at 50 FPS while the other eye's card is at 10 FPS (even with the same models: the different camera placement might invalidate a big part of the octree, causing the FPS difference). Not sure how the brain would handle such a difference between the two frames, but I think not well...
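One way to see the cost: a stereo pair isn't displayable until both views exist, so with one GPU per eye the effective paired rate collapses to the slower card. A trivial sketch, using the hypothetical 50/10 FPS numbers from the comment above:

```python
def stereo_fps(left_fps, right_fps):
    """Effective stereo-pair rate when both views must be present before
    a pair can be shown: the faster card ends up waiting on the slower one."""
    return min(left_fps, right_fps)

print(stereo_fps(50, 10))  # the 50 FPS card mostly sits idle
```

This is why AFR-style schemes that keep both GPUs alternating over the whole frame stream, rather than pinning each GPU to one eye with independent pacing, are easier to keep in sync.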

They're skinny because it's easier to do if the image isn't too wide. These pics are for PARALLEL viewing (left eye sees left image, right eye sees right image). If you use the CROSS-EYE technique you need to flip the images.