Fire up your GPU: VR takes hold at Nvidia’s GTC 2016

This site may earn affiliate commissions from the links on this page. Terms of use.

As all of us who are excited about the launch of Oculus and Vive are learning, virtual reality is all about GPUs. While many PCs have enough CPU horsepower and memory to handle a VR workload, very few have GPUs that are up to even the minimum suggested specs for VR playback — let alone development. This nearly insatiable need for GPU horsepower makes VR a natural area of focus for Nvidia, as it showed at this year’s GPU Technology Conference (GTC). With 37 VR-related sessions, dozens of demos, and a good portion of CEO Jen-Hsun Huang’s keynote dedicated to VR, it was, along with deep learning and autonomous vehicles, one of the three biggest themes at the conference.

Look at me: Eye-catching “real world” VR experiences

Much of the excitement around VR is built using amazing demos of virtual worlds. Perhaps fittingly, virtual worlds are actually easier to portray than real worlds in VR, because it is possible to model exactly what a person would see from any point of view, and with any lighting. Creating stereo views is also relatively simple — basic stereo camera support can be added in game engines such as Unity merely by checking a box. Now, though, we’re starting to see full-fidelity experiences based on modeling real-world locations. Huang showcased two of the most elaborate during his keynote: Everest VR and Mars 2030.
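To make that "check a box" concrete: all a stereo renderer fundamentally has to do is draw the modeled scene twice, from two eye positions offset along the head's right vector. Here is a minimal sketch in Python (the `eye_positions` helper and the 64mm IPD default are illustrative, not any engine's actual API):

```python
import math

def eye_positions(head_pos, yaw_radians, ipd=0.064):
    """Given a head position and yaw, return (left_eye, right_eye) positions.

    The eyes sit half the interpupillary distance (IPD, roughly 64 mm for
    an average adult) to either side of the head, along the head's local
    right vector. In a fully modeled world, rendering the same scene from
    these two points is all it takes to produce a stereo pair.
    """
    x, y, z = head_pos
    # Right vector for a camera looking along -Z, rotated by yaw about +Y.
    rx, rz = math.cos(yaw_radians), -math.sin(yaw_radians)
    half = ipd / 2.0
    left_eye = (x - rx * half, y, z - rz * half)
    right_eye = (x + rx * half, y, z + rz * half)
    return left_eye, right_eye

# Head at average standing height, looking straight ahead:
# the eyes differ only in x, by exactly one IPD.
left, right = eye_positions((0.0, 1.7, 0.0), 0.0)
```

Real engines also apply per-eye projection matrices and lens-distortion correction, but the two-offset-cameras idea is the whole reason stereo is nearly free in a synthetic world.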

Everest VR was put together by Solfar and RVX using 108 billion pixels of actual photographs, which were turned into 10 million polygons that in turn drive an essentially photorealistic VR experience. Game-like physics generate drifting snow for additional “reality.” What makes immersive experiences like Everest much more powerful than an ordinary 360-degree video is that the user is not limited to a particular location, but can move through the environment.

Similarly, Mars 2030 is largely based on the massive numbers of photographs our spacecraft have sent back, allowing NASA, with help from Fusion VR, to model eight square kilometers of the planet’s surface. The studio then went to work enhancing the model, including creating 1 million hand-sculpted rocks and 3D versions of mile-long underground lava tube caves. Hoping for some star appeal, Huang brought Apple co-founder Steve Wozniak up on screen so we could see him be the first ever to experience Mars 2030. Everything went well for the first couple of minutes, until Woz said he felt dizzy and needed to stop before he fell out of his chair. That was definitely awkward, and symptomatic of the “queasiness” issue that continues to intermittently trouble VR rollouts.

Making fantasies seem real: iRay VR and iRay VR Lite

Nvidia’s iRay is already the ray-tracing package of choice for many of the top 3D modeling packages. This year the company will extend it with additional capabilities to support the unique requirements of VR. Before VR, producers of computer-generated content could choose between time-consuming photorealistic rendering, like that used for feature films, and the real-time, merely plausible rendering needed for interactivity in video games. VR experiences are creating a demand for the best of both — realistic, immersive experiences that are high-quality, 3D, and allow the user to move around. That means they can’t be entirely pre-rendered. Unfortunately, moving from a 3D model to a photorealistic experience is too processor-intensive to do entirely in real time, so views of the model need to be rendered ahead of time, typically using ray tracing to mimic lighting and reflections in a physically accurate way. Traditional applications like movies or print only require high-resolution 2D images to be produced, but immersive VR requires the creation of an interactive experience.
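The core operation behind all that ray tracing is geometric: fire a ray per pixel (and per bounce) and test what it hits. As an illustrative sketch — not iRay's actual implementation — here is the classic ray-sphere intersection test in Python:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along a ray to a sphere, or None.

    Solves |o + t*d - c|^2 = r^2 for t, a quadratic in t. This single
    intersection test, repeated per pixel and per light bounce across
    millions of primitives, is the workload that makes physically based
    rendering so processor-intensive.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # `direction` is assumed normalized, so a == 1
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # hits behind the origin don't count

# A ray from the origin straight down -Z toward a unit sphere centered
# at z = -5 hits the near surface at distance 4.
hit = ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
```

A production renderer layers importance-sampled materials, acceleration structures, and denoising on top, but every one of those refinements still bottoms out in tests like this one — which is why the GPU demand scales so brutally with scene complexity.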

That’s where iRay VR comes in. For those willing to buy, or rent time on, a high-performance computing cluster (like Nvidia’s own DGX-1), iRay VR can generate a navigable model of a scene. The user can move around in the scene, and turn their head, while getting physically accurate lighting, shadows, and reflections as they move. Nvidia demoed this with an interactive VR model of its planned headquarters that was quite convincing. Unfortunately, even the viewing computer needs to be pretty massive — requiring a 24GB frame buffer, like the one in the Quadro M6000, to run.

Even with a supercomputer at hand, rendering for VR requires some compromises. Lucasfilm’s Lutz Latta explained that, for example, the Millennium Falcon model used for Star Wars is made up of over 4 million polygons. That’s perfect for the ultimate in cinematic realism, when it can be rendered one frame at a time, but too complex for a Star Wars VR experience. In addition to simplifying the model, the studio has worked on a unified asset specification, so that models can be built once and then used across a variety of media, like film and VR.

For those of us on slightly more limited budgets, iRay VR Lite will allow you to upload a model to Nvidia’s servers, where they will generate a photorealistic 360-degree stereo view — but one that you can’t walk around. For a full-fidelity experience, even the Lite version will take advantage of 6GB or more of frame buffer, but the experiences will also be viewable on low-end devices like Google’s Cardboard. Nvidia expects to have iRay VR Lite available by mid-year, with iRay VR to follow.

iRay VR is only one part of Nvidia’s VRWorks set of tools for VR development and delivery. Other parts of VRWorks also had updates announced at the show: in particular, VRWorks SLI will provide OpenGL rendering across multiple GPUs, and there is now VR support in GameWorks.

Realities.io: Immersive environments made practical

Everest and Mars are both very expensive, long-development-cycle efforts, more or less the equivalent of making a feature film. That limits their creation to large organizations, and their subjects to those likely to attract millions. Startup Realities.io has developed a system that allows it to create photorealistic environments from “everyday” locations relatively quickly and inexpensively. Using around 350 photographs of a scene, it can use a process called photogrammetry to create an interactive model that the user can walk around.
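The geometric heart of photogrammetry is triangulation: once the same surface feature has been matched in two photos taken from known camera positions, its 3D location is (approximately) where the two viewing rays cross. Repeated over millions of matched features, that point cloud becomes a textured mesh. A simplified Python sketch — the `triangulate` helper is illustrative, not Realities.io's actual pipeline:

```python
def triangulate(p1, d1, p2, d2):
    """Estimate a 3D point from two viewing rays p1 + t*d1 and p2 + s*d2.

    Real rays almost never meet exactly (pixel noise, calibration error),
    so we return the midpoint of the shortest segment between them —
    the standard closed-form two-view triangulation.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w0 = tuple(x - y for x, y in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only when the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = tuple(p + t * v for p, v in zip(p1, d1))  # closest point on ray 1
    q2 = tuple(p + s * v for p, v in zip(p2, d2))  # closest point on ray 2
    return tuple((x + y) / 2.0 for x, y in zip(q1, q2))

# Two cameras a meter apart both see a feature at (0.5, 0, 1).
point = triangulate((0, 0, 0), (0.5, 0, 1), (1, 0, 0), (-0.5, 0, 1))
```

A full pipeline adds feature detection/matching and bundle adjustment to recover the camera poses themselves, but this two-ray intersection is what turns ~350 overlapping photos into walkable geometry.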

The level of detail is pretty amazing. You can stoop down and see trash on the floor, or walk over to a wall and see the brush textures in the graffiti. Realities also captures scene lighting, using light probes, so reflections change realistically as you move. If you’re one of the lucky few who have a Vive, you can download Realities for free via Steam.

Even more exciting, Realities founders David Finsterwalder and Daniel Sproll hope to democratize the creation of immersive experiences even further by enabling others to go out and capture images that Realities can then process and publish.

Orah 4i camera: A stitch in real time

A more common way of creating VR experiences is using 360-degree camera rigs. There are plenty of those on the market, ranging from consumer units with a couple of fish-eye lenses to studio-quality rigs like the ones from Samsung and Jaunt VR. However, all of them require a lot of post-processing to accurately stitch the images together. One company, VideoStitch, has made a good living providing stitching solutions, but at GTC it went a step further, introducing its own turnkey camera-plus-stitcher, the Orah 4i. Available for pre-order for $1,800, it has four high-quality cameras with 90-degree lenses, which feed synchronized data to a small processor box that stitches them and streams a live 360-degree image. In addition to the obvious use for event coverage, the system will also allow excellent real-time previewing for those doing high-end 360-degree production work.
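To get a feel for what a live stitcher does, consider just the horizontal seams: every output direction in the 360-degree panorama must be assigned to one of the four 90-degree cameras, with blending near the boundaries. A simplified Python sketch, assuming a hypothetical four-camera layout facing 0, 90, 180, and 270 degrees (the real Orah 4i calibration is certainly more involved):

```python
def camera_for_yaw(yaw_deg):
    """Map a panorama longitude to (camera index, angle within that camera).

    With four cameras facing 0, 90, 180, and 270 degrees, each with a
    90-degree lens, stitching means deciding for every output pixel
    direction which camera saw it, and how far off that camera's optical
    axis it landed (near +/-45 degrees, neighboring views get blended).
    """
    yaw = yaw_deg % 360.0
    cam = int((yaw + 45.0) // 90.0) % 4            # nearest of the four headings
    local = ((yaw - 90.0 * cam + 180.0) % 360.0) - 180.0  # offset from camera axis
    return cam, local

# Looking straight ahead falls in camera 0, dead center; looking at
# 100 degrees falls in camera 1, ten degrees off its axis.
front = camera_for_yaw(0)
side = camera_for_yaw(100)
```

A real stitcher then warps each fish-eye frame through its calibrated lens model and feathers exposure across the seams — the hard part the Orah's processor box does in real time.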

As an aside, the Orah, like most 360-degree camera rigs, does not produce a stereo image. And because it only has one camera facing each direction, you can’t really generate one after the fact. Higher end rigs with more cameras, like Jaunt’s, do allow post-processing to generate depth maps and stereo views.

However, almost all of these units — whether mono or stereo — are bundled together under the umbrella of “VR.” Similarly, many “VR” experiences are static 360-degree photos that don’t allow you to move around the scene (but may or may not be stereo). It’d be great to start to get some standardization on terminology here. In my case, I’ve started to use immersive only for experiences that are both stereo and allow movement within the scene. I think it’d also be helpful if we only used VR to refer to experiences with a sense of depth (e.g. stereo), but the term is too popular as a marketing tool for that to be likely to happen.

VR gets down to business

VR at the show wasn’t all about entertainment. Two of the demos I experienced were all about commercial applications. ZeroLight was showing off an amazingly realistic Audi in VR — part of a customer-focused sales experience that Audi will start rolling out later this year — complete with Vive headsets in dealer showrooms that allow you to configure and virtually experience your car.

WorldViz has extended standard VR functionality to make it ideal for many industrial and commercial applications. It allows users to see each other and work together on a task in a virtual environment — even if they have different brands of headsets — and it supports much larger “room scale” environments. For example, one client created a virtual hospital in a gym, so doctors could test out the work environment before it was built. One of the scenarios I ran through involved working on a helicopter rotor. It really brought home the power of the Vive’s touch controllers, which are a dramatic step forward from trying to use a small remote or gaming controller to manipulate objects in 3D. I hope Oculus gets its version out soon.

Is VR right for you?

VR at GTC was deliberately about nearly every application other than games — after all, we just had a VR frenzy at GDC last month — but for most individuals, VR in 2016 will either be about gaming (I’m a racing sim fan, so my favorites available so far are Project Cars and Dirt Rally, but there are tons of others) or involve fiddling around with 360 experiences using a Gear VR or Cardboard. For those who do buy a high-end headset for gaming, sure, the other experiences are cool, but I don’t see anyone upgrading their computer and plunking down another $800 just to walk around a virtual Mars for a bit. Beyond that, I’m hoping Google announces something amazing under the Android VR banner in May that can bridge the gap between the current low-end mobile phone offerings and the current crop of gamers-and-hackers-only PC-driven headsets.

Sigh. Probably not. I think it is almost identical to one of the ones I just ditched as part of upgrading the GPUs here at home. I only bought one VR-ready card (an Nvidia-based 970, which is barely enough; figuring there might be a 1080 by the end of the year, I didn’t want to spend the money on a 980 Ti and then worry later), and a couple of others just to keep up with modern video streaming.

Shigura

Thanks for the reply, while I have your attention, any advice for someone looking to build a new system right now on what to go with? I’ve got a budget of around $1000 or so (possibly slightly more if I HAVE to.) Already have a case to recycle, don’t need a monitor, etc etc. I’ve been doing my research but trying to figure out the best balance of price/performance/future compatibility.

Wow, I’m not nearly up to speed enough on all the options to opine, but if you want VR-ready, our sister pub PC Mag just did a walkthrough on exactly this. Unfortunately, I can’t find the link right now, but it featured original ET and uber-PC-builder Lloyd Case.

That does look like a good deal. FWIW, I’m running my main VR machine with a 970, and it is okay for what’s out there today, but I’m guessing we’ll see 120fps / 4K headsets in 12-18 months and I’ll be wishing I had more. However, my thought was that instead of spending double on a 980 Ti now, I’d get the 970 and then splurge on a next-gen card around when the next round of headsets come out.

Jason

Well, the next gen card will be out far sooner than the next round of headsets. There’s supposed to be a MAJOR jump in GPUs when the 1070/1080 drops in July. That’s why I got the 970 now.

Cruddy Bapz

You should have gone for a 390; it is generally agreed that Radeons are better at VR and have a longer lifespan.

Mitt Zombie

AMD cards tend to die an early heat related death though in my experience

e92m3

Actually, LuxRender already had functional VR support over a year ago. A cluster of GPUs (4-8) with decent OpenCL support and fp32 compute performance was quite capable of rendering some very high-quality environments.

No idea about the comparative latency, but the basic principle was possible years ago and didn’t require a ‘supercomputer’.

Sounds like they used photogrammetry for the Everest model and textures. That would be a pretty cool thing to see.

Good to know about VR in LuxRender. Several researchers I know have been using pbrt as an open-source ray tracer, but it seems like maybe LuxRender has picked up where pbrt left off (GPU support, etc.). Is that correct, and do you think it’d be straightforward to switch?

On the supercomputer bit, yes, depends on the model of course. The model of Nvidia’s upcoming HQ was very large & complex. Same for the interior demo they used. Then again, Nvidia calls anything bigger than a breadbox (or even smaller in the case of the Drive PX 2) a supercomputer, so we’re not always talking room-size Crays:-)

onstrike112

So? VR isn’t going to magically make all gaming different. It’s going to be a slow, awkward, annoying crawl that a lot of people, like myself, won’t care about. I want more realistic graphics, but I don’t want to wear a fugly head-mounted display. The Galaxy Gear VR, Oculus Rift, HTC Vive, etc. all look fugly, and not like something I’d want to be seen wearing once, never mind actually use on the daily.

Jason

Have you actually tried a Rift or Vive? I think you should before being so sure.

onstrike112

I’ve looked at the Gear VR in person, and OSVR, Vive and Rift online. They look stupid, and I’d feel stupid wearing them. No thanks. I’ll play my games like a normal human being, on an LCD screen.

Jason

Looked at, or WORN? Because it’s a totally different ballgame (speaking specifically of the Rift/Vive). The sense of presence is extremely compelling, but you can’t really feel it until you try it. It’s a pretty big deal to ACTUALLY be able to feel like you’re exploring the surface of Mars (see the recent Mars 2030 demo), something you can’t even come close to with a non HMD game.

onstrike112

Who cares when you look stupid wearing them and have to have someone babysit you while wearing it?

Jason

You. You won’t care.

It’s kind of like sex that way.

onstrike112

Don’t make me laugh. They’re stupid, and there’s no way you’ll convince me that they’re worth using. The Virtual Boy had better looks than these stupid hmds.

Jason

Yeah, there is a way – try it. Not trying it proves nothing, just like not reading my comments would. Nothing could convince you if you just turned off your computer now and never read another internet post (that might convince you). ;)

onstrike112

Okay, tried a Gear VR at Best Buy. It (a) looked stupid, and (b) didn’t feel more immersive than my LCD panel at home or the BlackBerry Priv I’m typing this on held inches from my eyes.

Vzguy

The Gear VR is the worst VR headset… and shouldn’t be compared to the Rift or the Vive…

In some defense of the Gear VR, if you put a good phone in it, and use good content, it can be pretty reasonable (IMO). However, since it doesn’t track head position (although it does track rotation), it is definitely not “immersive” the way the Vive and Rift are. It also isn’t running at the same frame rate, so I’ve found it most useful for content like 360 photos and video that don’t move around quickly. Trying to do racing games on it made me pretty queasy, pretty quickly, while they don’t on a Rift or Vive. I think the market for the Gear is pretty much folks with a nice Sammy phone who either get it bundled, or (like me) figured $100, sure, why not?

PS I also really like that it is not tethered. I use mine with Bluetooth ear buds so it is totally wireless. But as you point out, the quality certainly isn’t as good as a Rift or Vive, and it doesn’t let you do any room scale or similar stuff.

Vzguy

I’m not 100% positive but doesn’t the gear vr only work with new Samsung phones? Your argument is valid it’s definitely the best at that price.

For the vive and rift: I’m interested in seeing how they can make it wireless while maintaining a relative light weight design and also maintaining fluidity and power.

New and recently new:-) The S6 and S7 families plus the Note 5. So none are “bad,” but the S7 > S6, and both have better detail resolution (PPI) than the Note 5, though the Note 5 has a wider FOV because of its larger screen. So there is some variation in experience.

However, the Gear VR Innovator Edition, of which quite a few are still in use (it was sold at Best Buy for $200), worked with older phones, so if people aren’t really careful, it is easy to confuse the consumer Gear VR with the Gear VR Innovator (just as there have been several versions of the Oculus “Rift” that get confused).

As to untethered, all of the wireless headsets I’ve seen have the processing on board (Gear, HoloLens, ODG, etc.). That has its own issues, of course, including cost (HoloLens dev kits are $3,000, for example).

Good point. You can actually plug the Gear-plus-phone into a microUSB cable for charging while in use, but unless that cable runs to a small battery pack, you’re now tethered. Of course, it isn’t such an awesome experience that I see people using it for longer than it takes to overheat the phone:-) OTOH, I’ll be curious what the battery life winds up being on HoloLens when it ships. I wouldn’t be surprised if it too winds up often getting used with an external battery pack.

Jason

A friend of mine who is very active in the VR scene was talking about some bit of wireless spectrum that has gone unused because even though it’s great for high bandwidth applications, it’s pretty much limited to line of sight. It’s being looked at for wireless video for VR, where you’re either staying put in a chair or walking around a room with no objects in it. So you keep the processing off the HMD and all you need is battery for driving the display, doing tracking and receiving the wireless video.

That’d be cool. There are certainly some solutions for wireless HDMI, but I have no idea whether they’d be applicable here.

onstrike112

It’s also indicative of how un-immersive these HMDs are, as well as how fugly they are. The only one I come even close to caring about is OSVR, and it’s still fugly and stupid looking.

Vzguy

Why do you care what it looks like lol it’s not like you’re going out in it….”hitting up the club in the vr headset like where my ladies at”

onstrike112

So you admit they’re fugly. Good to note. Even VR suckups admit they’re fugly.

Vzguy

Cause it doesn’t matter lol its what’s inside that counts….if you base your buying decisions on just appearance you’re missing out on a lot. ….on a unrelated note same goes for people.

onstrike112

Right, what’s inside the fugly shell that counts. Hmds make you look stupid, and when you’re gaming out and about you’ll look REALLY stupid to everyone around you. Add to the fact that you have to have someone babysit you while wearing such a fugly hmd.

Vzguy

You don’t need someone to babysit you; I own a Vive and play by myself all the time. Again, it doesn’t matter what it looks like. I think the PlayStation looks fugly; that doesn’t mean I shouldn’t play it. Also, for a first-generation mass-release device, I think it looks exceptional. Obviously virtual reality isn’t for you, but I support innovation and change in the gaming environment; gaming was starting to get very stagnant. I do think you should try either the Vive or the Rift before you go shouting about how “terrible” or “fugly” it is.

onstrike112

If you call copying the Virtual Boy “innovation,” then it’s not something I’m for. lol

Jason

Basically, I’d have to guess he can’t afford it so is trying to find some post hoc reasoning why it isn’t good. Even if he’d be wearing it alone in a room with no one seeing him, how it looks is incredibly important? Yeah, hard to believe anyone would actually expect someone to take that argument seriously. He/she could also just be a troll. Never discount that possibility.

Trollish-ness aside, VR currently has a real problem of economics (kind of like early electric vehicles). You’ve got to pony up for a pretty spiffy system, and then drop $800 more on a first-gen headset (FWIW, even though the Rift is $600, I’d include the cost of 1-2 of the upcoming Touch controllers), most of which will be out of date or superseded in 12-18 months. So you have to be pretty committed to take the plunge. That said, both the Oculus and Vive are massively back-ordered, so clearly lots of folks want to get in now.

Jason

Yeah, I hear you. I broke down and ordered the Vive and just ordered a system for it. Luckily, my friend sent me a link to a sale HP is having where I could get a very nice system with a 970 in it (with a 500-watt PSU, so I can still jump on the higher-end Pascal GPUs in July) for $1,000. Otherwise, it was looking like the $1,500 range.

Not sure I agree with you on the 12-18 month thing, though. I think if anything they’ll have some iterative changes, but considering the market is so small and the price so high, I’m not sure who they’d sell most of those new headsets (that made the old ones out of date) to in 12-18 months. Even if they sliced the cost in half, you’re still looking at in the $1k range. That’s STILL in the very early adopter stage. I’m guessing it will be at least a couple of years and maybe three or four before they release new headsets that will truly supersede the old ones. I just don’t see the market otherwise.

Enjoy your Vive. It is an amazing product! Sounds like you’ve got a good strategy. As to the 12-18 months, I do think there will be an update/next gen, but like with GPUs & smartphones, that doesn’t make the current ones obsolete. They’ll just be “more” for the same price, and probably some that are the same as what we have now for less money.

“all look fugly,” – actually they don’t. But you wouldn’t know, seeing that you never even touched any of these devices.

onstrike112

Not even close!

No, I just don’t want to look stupid. That’s how they look.

Yes, they do. They look fugly and stupid.

Robb49

It has lots of potential that won’t be achieved until the Internet is upgraded to handle the massive amount of bandwidth generated by VR systems. It seems like all of these tech companies live in denial and expect someone else to fix it. In the meantime, the Internet continues to bog down more and more. A lot of potential customers for VR systems will probably not invest until the Internet is fixed.

Jason

They don’t send the VR environment down the pipe. Much like with MMORPGs, most of the data and processing is on your own machine and you relay very little data (positional and actions) to the server.

Robb49

It will depend upon the amount of activity and how many people are sharing a virtual reality. For instance, if you want to have a family reunion in virtual reality, you might have to represent hundreds of people. Or if you share a reality that has a lot of action, that will require more bandwidth. Since the Internet is the way much of this information is shared, people will be passing around virtual reality data packages, which are likely to be much larger than even HD videos. Right now, I can barely download a video from YouTube at various times of the day without it freezing up. VR has plenty of potential. It can do far more than make for better video games. But our overcrowded Internet will limit the applications and the distribution.

This is definitely a “red alert” research area for VR. Right now, encoding and transmission are not well optimized, since the medium is so new. Fortunately, for single-person stuff, it is usually possible to download first (if you plan ahead!). As you point out, though, anything requiring real-time rendering on the back end, or sharing, needs to deal with streaming (live events, for example).

