muterobert writes "The Oculus Rift has competition, and it's incredible. The InfinitEye has a 210 degree field of view (compared with the Oculus Rift's 90) and completely surrounds your peripheral vision in the game. Paul James from RoadToVR goes in-depth with the team behind the new device and finds out how high-FOV Virtual Reality really works. Quoting: 'At the present time, we are using 4 renders, 2 per eye. Left eye renders are centered on the left eye; the first render is rotated 90 degrees left and the second looks straight ahead, building two sides of a cube. Right eye renders are centered on its position; the first is rotated 90 degrees right and the second looks straight ahead, two sides of another cube. We then process those renders in a final pass, building the distorted image.'"
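The four-render layout in the quote can be sketched as follows. This is a minimal illustration, not the team's actual code; the IPD value and all names are assumptions.

```python
# Sketch of the InfinitEye-style four-render setup quoted above: per
# eye, one render looks straight ahead and one is yawed 90 degrees
# outward, giving two faces of a cube per eye.  The IPD value and
# every identifier here are illustrative assumptions.

IPD = 0.064  # interpupillary distance in metres (a typical adult value)

def eye_views():
    """Return (eye, yaw_degrees, x_offset_m) for all four renders."""
    views = []
    for eye, sign in (("left", -1), ("right", +1)):
        x_offset = sign * IPD / 2                 # render centred on that eye
        views.append((eye, sign * 90, x_offset))  # side face of the cube
        views.append((eye, 0, x_offset))          # front face
    return views

for eye, yaw, x in eye_views():
    print(f"{eye:5s} eye: yaw {yaw:+4d} deg, offset {x:+.3f} m")
```

A final pass would then resample these four faces into the distorted image sent to the panels.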

Oh come on, I thought we'd gotten over the whole "strapping-two-screens-to-your-head-like-we-did-in-the-70s" thing

This has far more potential and could actually be mounted in glasses you could wear, unlike screen-strapping tech like the Oculus and its clones, which hasn't progressed much since the 70s (complete with Fresnels, lol)

First, for the apples-to-apples portion of the discussion: that display technology has a 45 degree FoV. Given that the article is about a project largely of interest because it was ambitious enough to reach 210 degrees, much higher than the still-respectable 90 degrees of the Rift, bringing a 45 degree FoV product into the discussion isn't immediately helpful. Now, you *could* be suggesting that the technology could do better if they wanted, but until that's demonstrated it would be risky to assume so. The most optimistic reference material I could find about that sort of design said '100 degree FoV could be possible' based on designs that achieved 60 degrees (though that's not exactly an apt comparison, since the material predates DLP and so isn't quite talking about the Avegant solution). In short, Avegant is aiming squarely at private consumption of video content rather than immersion.

Second, a significant driver for these new projects is a realization that HMD isn't a market that can drive a lot of custom, one-off design work right now. In order to get to a technology that people can actually *get* at an approachable price, they are working to leverage mass-market display technologies that are largely paid for by their use in tablet and similar form factor applications. DLP into the eye is a bit more custom and will probably not be as cheap.

Also, this discussion is solely about the display technology, but a very large part of the work that Oculus is focused on is motion tracking, which is a pretty critical component.

Finally, at least that prototype doesn't exactly look like the poster child of 'glasses you could wear'; it's still pretty bulky.

I'm not saying Avegant should pack up and go home; it could be very promising. But that's not a good reason to tell Oculus and InfinitEye that they are on a dead-end path either. Avegant doesn't waste available resolution like the alternatives do, but currently no solution leverages the full resolution of the underlying technology while also providing an immersive FoV, and the former point might be moot if the tablet manufacturers continue their one-upmanship to the tune of 3840x2160 7" displays.
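The resolution-vs-FoV trade-off can be made concrete with a back-of-the-envelope pixels-per-degree figure. The panel resolutions and FoV numbers below are illustrative assumptions (a DK1-class 1280-wide panel vs a hypothetical 3840-wide one), not vendor specifications.

```python
# Angular resolution drops as FoV grows for a fixed panel: the same
# pixels get stretched over more degrees.  All figures are rough
# illustrative assumptions, not measured specs.

def pixels_per_degree(px_per_eye: int, fov_deg: float) -> float:
    """Average horizontal pixels per degree of field of view."""
    return px_per_eye / fov_deg

print(pixels_per_degree(640, 90))    # 1280-wide panel split per eye: ~7 px/deg
print(pixels_per_degree(1920, 90))   # 3840-wide panel split per eye: ~21 px/deg
```

Tripling the panel width triples the angular resolution at the same FoV, which is why the tablet one-upmanship matters here.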

Curved LED displays are possible. There are a few demo mockups of watch-form-factor displays out there. It's still going to be tricky to build a lens system that projects the image as if it were further away.

Curved displays will be the future, yes. The *far* future. As others in this thread have said, the Oculus and InfinitEye projects try to make the technology cheap, affordable and mass-produced, so they concentrate on the cheapest and most available hardware. Currently, 5" to 7" *flat* displays are ubiquitous in tablets, and thus very easy for Oculus and InfinitEye to source (and thanks to the "Retina" fad launched by Apple, these screens have high resolutions too, so they still look nice even when blown up to the full

The "accommodation reflex", i.e.: "how much your eye need to focus" is only used for depth perception at very short distance.

e.g.: If you read a book, or are busy hacking/repairing stuff on your bench, etc., your brain notices that the objects are near, not only because your eyes need to cross a lot to focus at that short distance, but also because your eye's lens needs to contract a lot to accommodate (well, at least for those who are still young enough to be able to do it. Older geeks with presbyopia [= rig

If an immersive effect is what you're going for, the potential of this technique is in fact severely limited. First, let's strip away some marketing mumbo-jumbo:

The "projecting directly onto the retina" pitch is bull.Unless you want to venture into eye surgery, you can't bypass the optics of the cornea etc ("lazers" or no "lazers"), so any light looking like it comes from a particular direction has to actually arrive from roughly that direction. It follows that and some part of the chain has to physically cover

Yes, among other things. They use 2 renders per screen, so you should expect about 16ms more latency than the Oculus. They also have double the vertical pixels compared to the Oculus, so you might expect more latency from the computation too. Latency comes down to several factors: rendering latency, screen latency, and everything in between. Pixel-to-pixel latency is very much dependent on the screen type. Oculus has said they're looking into OLED screens that have minimal latency (fe

Specifically, I'm more concerned with the lag between accel/gyro positional reading and screen update. If this lag gets too high, you get a tearing effect which can seriously degrade the overall VR experience.

Both the Oculus and this one seem to use ~1000Hz tracking, so that's not an issue; tearing comes from display refresh, not from the tracking. In a video I saw some guys at Oculus break down the latency, and it was something like 2ms for USB, 16ms for game engine/rendering, ~16ms for screen refresh at 60Hz, and 16ms for pixel switching.
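Summing that breakdown gives a rough motion-to-photon budget. The numbers are just the estimates quoted above, not measurements.

```python
# Rough motion-to-photon latency budget using the figures quoted
# above; these are the commenter's estimates, not measured values.
budget_ms = {
    "usb_tracking": 2,       # ~1000 Hz sensor read over USB
    "engine_render": 16,     # one frame of game engine / rendering
    "screen_refresh": 16,    # scan-out at 60 Hz (1000/60 ~ 16.7 ms)
    "pixel_switching": 16,   # LCD pixel response time
}
total = sum(budget_ms.values())
print(f"total motion-to-photon latency ~ {total} ms")  # ~ 50 ms
```

That lines up with the ~45-50ms total mentioned elsewhere in this thread, and shows why an extra render pass (one more frame at 60Hz) costs roughly 16ms.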

I remember the first VR fad in the 90s... it seemed like such a neat idea.
However, the graphics were horrible, frame rates sucked, head tracking was laggy, headsets were bulky, screens were blurry, FOV was too small, and people were still trying to figure out 3D movement control schemes.
I've felt that ever since around 2004 we've been ready to give VR another shot, now that we've fixed or have the technology to fix every single one of those problems. And it seems like a lot of different companies are go

So many things wrong. TFA is comparing horizontal and vertical FOVs; both sets have about a 90 degree vertical FOV.
The Oculus Rift has about a 110 degree FOV, and a human maxes out at around 180. Also, peripheral vision is not very sharp. The Oculus really isn't having much trouble with FOV anyway; the main visual-candy problem seems to be the DPI, among a few other things. This item doesn't even improve on the real issues of the Oculus, like motion sickness, positional tracking and others. TFA seems to be an advertisement,

Fully agreed. I'll be happy to trade off FOV and some PPI if it means we can keep screen refresh rate up, sensor latency down, and resulting rotational error low. Bonus if these things can be done on battery power, and not require being connected to a wall outlet all the time.

Yes, for the TDP in question. You are aware you're comparing a ~100W processor to a 4W SoC? Also, graphics is where the power is needed, and in this case it's needed in stereo, so just adding a second GPU to the SoC would do a lot.

You also seem to forget that CPU power hasn't been a real issue in desktops for quite a while.

Also, to further press the point, we had 3D games in the 386 era too; that's quite a few generations before the P4. It's more about optimisation and less about stupid remarks about what you could do with a P4. Not forgetting that the point was "they will be there a few years from now".

Why couldn't the video be wirelessly sent to the screen? The Wii U has a screen that gets its picture sent wirelessly by the console and runs off battery, and the Wii U console has to drive two separate video outputs at once: one for the TV and all the players using Wiimotes, and a separate video signal for whoever is using the GamePad. That doesn't seem too far off from a head-mounted display. Just send both pictures to the HMD.

It could, but the latency would be way too much on current systems. 60fps at 1080p is a minimum, and total latency is around 45-50ms right now (from movement to photon); it needs to get lower. Being wireless would add to that latency too.

Agreed. Some time in the not-too-distant future this concept could be a portable console: a beefy enough battery and future mobile chips will be able to run a decent game at 1080p in stereo. A couple of years and it will be so, if Oculus (or someone else) manages to break through in the PC market.

With these somewhat asymmetric FOVs, a single number doesn't provide enough information to understand what you're getting. What's needed now is the "inside angle" and the "outside angle", where:

- inside angle = how much either eye can see toward the other eye
- outside angle = how much either eye can see away from the other eye

(In either case, measure the angle from "straight ahead" over to the cut-off point where you can no longer see anything.) In a symmetric system, both of these numbers are the same (or pre
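Here's a tiny sketch of how those two angles would combine, under the simplifying assumptions that eye separation is negligible, both eyes are symmetric, and the outside angle is at least the inside angle. The example angles are made up for illustration.

```python
# Combine the "inside" and "outside" angles (as defined above) into a
# per-eye FoV, a total binocular span, and the stereo overlap region.
# Ignores the physical distance between the eyes, so the numbers are
# approximate; assumes outside_deg >= inside_deg.

def fov_summary(inside_deg: float, outside_deg: float):
    per_eye = inside_deg + outside_deg  # one eye's horizontal FoV
    total = 2 * outside_deg             # combined span of both eyes
    overlap = 2 * inside_deg            # region both eyes can see (stereo)
    return per_eye, total, overlap

print(fov_summary(45, 60))  # -> (105, 120, 90)
```

The interesting consequence: quoting only the 120 here would hide that stereo depth cues exist in just the central 90 degrees.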