iFixit shows what makes the VR headset tick, from its split display to its 1000Hz motion tracker.

iFixit posted a teardown of the Oculus Rift headset Wednesday to see what, exactly, the virtual reality headset is made of. The teardown reveals the types of screens and controllers the Oculus Rift uses, and though the score is preliminary, iFixit gave it a 9 out of 10 user repairability score—unusual in the glue, tape, and Torx screw times we now live in.

The Oculus Rift uses one 1280×800 LCD that is split down the middle to show a separate image to each eye, creating a 3D image. The display is an Innolux HJ070IA-02D 7-inch LCD panel, provided by the same distributor rumored to be Apple’s source for replacement iPad mini screens. A custom-designed Oculus Tracker V2 board reports the headset's motion at a 1000Hz update rate.

The chips inside the device include an STMicroelectronics 32F103C8 Cortex-M3 microcontroller with a 72MHz CPU and an Invensense MPU-6000 six-axis motion tracking controller that has both a gyroscope and accelerometer. There is also a chip named A983 2206, which iFixit suspects is a “three-axis magnetometer, used in conjunction with the accelerometer to correct for gyroscope drift.”
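The sensor mix iFixit describes—gyroscope, accelerometer, magnetometer—is a classic fusion setup. A minimal one-axis sketch of how an accelerometer reading can correct gyroscope drift (a complementary filter; the 0.98 coefficient and all sample values are illustrative assumptions, not Oculus's actual firmware):

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One-axis fusion: integrate the fast-but-drifting gyro, then pull
    the estimate toward the noisy-but-absolute accelerometer angle."""
    gyro_angle = angle + gyro_rate * dt                       # drifts over time
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))  # drift-free but noisy
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# A stationary headset whose gyro has a constant 0.5 deg/s bias,
# corrected at a 1000Hz update rate for one second of samples:
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_x=0.0, accel_z=1.0, dt=0.001)
# Uncorrected integration would drift a full 0.5 deg in that second;
# the fused estimate stays bounded near zero.
```

The same blend, run on all three axes with the magnetometer anchoring yaw, is presumably what the tracker board's microcontroller spends its 1000Hz budget on.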

In terms of how the headset connects to a PC, iFixit notes that the dev kit box includes HDMI, DVI, mini-USB, and DC-in ports. The teardown notes that the input to the headset is DVI, but the control box is capable of converting HDMI to DVI, and VGA is not currently supported.

iFixit states that the 9 out of 10 repairability rating is preliminary, since the dev kit is a preproduction unit; the consumer-facing unit may be more locked down.

51 Reader Comments

I did not realize the resolution on this was only 640 x 800 (per eyeball). That being said, the refresh rate is much more important than the resolution for creating an immersive 3D experience, so this should be aight.

120Hz isn't a standard refresh rate, it's an enhanced option some displays use to double up and make the motion appear smoother. Most videophiles immediately disable such "enhancements"

You are talking about TVs that interpolate frames in 60fps content to make it 120fps. That's terrible, but it has nothing to do with 120Hz displays any more than the other terrible post-processing options TV manufacturer give people. A 120Hz monitor connected to a PC can most definitely give you true 120fps, and it's a very noticeable improvement over the 60fps that traditional monitors can manage.

I did not realize the resolution on this was only 640 x 800 (per eyeball). That being said, the refresh rate is much more important than the resolution for creating an immersive 3D experience, so this should be aight.

This is the dev prototype. By the time a consumer version appears higher resolution screens will be available (look at the recently announced smartphones), and videocards will be powerful enough to drive those resolutions at the 120fps with no bumps necessary for good VR.

A custom-designed Oculus Tracker V2 board gives the screen a 1000Hz refresh rate, which is significantly higher than a normal TV (120Hz) and 3DTV (240Hz).

The tracker board controls the accelerometer/gyroscope/magnetometer, which reports the head motion at 1000 Hz, but doesn't control the LCD panel (which I believe has a refresh rate of 120 Hz). Besides, I can't even imagine the beast of a computer that could put out 1000 fps.

Also, most (not all) TVs have a refresh rate of 60Hz, and 3DTVs generally have a 120Hz refresh rate.


120Hz isn't a standard refresh rate, it's an enhanced option some displays use to double up and make the motion appear smoother. Most videophiles immediately disable such "enhancements"

Ok, 120Hz is a standard refresh rate, although uncommon. Videophiles immediately make USE of such enhancements. Why? Because film, which is typically 24fps, does not receive equal display time per frame at 60Hz; it requires 24/48/72/96/120Hz display rates. As for video games, the difference between 60Hz and 100Hz is very noticeable (100-110 is kinda noticeable, and 110-120 has a lot less wow factor). Simple and easy tools to overclock your monitor's refresh rate exist for this very reason; if it gets an extra 12Hz, you are at a point where movie jitter is gone and gameplay has the opportunity to be noticeably smoother.
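The judder argument comes down to divisibility: 24fps film paces evenly only on a refresh rate that is a whole multiple of 24. A quick check of the rates mentioned above:

```python
def refreshes_per_film_frame(refresh_hz, film_fps=24):
    """Each film frame gets refresh_hz/film_fps refreshes; a non-integer
    ratio forces uneven cadence (e.g. 3:2 pulldown) and visible judder."""
    ratio = refresh_hz / film_fps
    return ratio, ratio == int(ratio)

for hz in (60, 72, 96, 120):
    ratio, even = refreshes_per_film_frame(hz)
    print(f"{hz}Hz: {ratio:g} refreshes per film frame, even cadence: {even}")
# 60Hz comes out to 2.5 refreshes per frame, so frames alternate
# between 2 and 3 refreshes; 72/96/120Hz divide evenly.
```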


For TV (and sadly even monitors) advertising a 120/240/whatever Hz monitor, it may just be marketing trickery based on a 'technically' true feature.

The display device may be refreshing the LCD panel at 120Hz, so it is 'technically' a 120Hz monitor, although it may not be able to even accept a 120FPS signal; or it may accept 120FPS and simply drop/skip frames (there are simple checks for frame dropping); or it may only accept standard 60FPS signals and take 120FPS input for 3D only. Also, 3D TVs have several ways to combine frames together (checkerboard, interlaced, side-by-side), so they do not need a 120Hz refresh; not all 3D TVs (yes, even active-glasses ones) can be generalized to be 120Hz.

The manufacturers have a lot of wiggle room with the wording here, and it is quite sad that they make use of that wiggle room to deceive people.

For TV (and sadly even monitors) advertising a 120/240/whatever Hz monitor, it may just be marketing trickery based on a 'technically' true feature.

Bureaucrat masterbinky, you are technically correct -- the best kind of correct.

Actually, a viewer like this could easily show you a 3D image without ever refreshing. Unlike televisions or monitors you are actually looking at two different images the whole time ... like a ViewMaster.

I did not realize the resolution on this was only 640 x 800 (per eyeball). That being said, the refresh rate is much more important than the resolution for creating an immersive 3D experience, so this should be aight.

This is the dev prototype. By the time a consumer version appears higher resolution screens will be available (look at the recently announced smartphones), and videocards will be powerful enough to drive those resolutions at the 120fps with no bumps necessary for good VR.

Oculus have actually stated it will be a higher res screen in the consumer version.

The dev version really is just that: a dev version. It exists to get developers making games for it and is likely to be entirely different from the final version. Oculus have tried to discourage people from buying this one for consumer usage, telling them to hold out for the consumer version and constantly stating the differences between the two. Right now it has features like adjusting the distance of the screen from your eyes with a screwdriver; that's not exactly something you'd find in a consumer device.

It's an interesting side effect of such a public development process. Normally this stuff happens under NDAs and behind closed doors, but with such a public project you have consumers buying it, and teardowns of and opinions on tech that consumers were never intended to use.

Does anyone have a link to an analysis of the ocular health effects of these devices?

As an ergonomics geek, I wonder what the eyes are up to when using one of these. What is the focus distance? The convergence?

In other words, does the eye behave like it's looking at an object at the virtual distance intended by the image, or does it behave like it's looking at a screen a few inches away from your nose? This will make a big difference in eye fatigue.

I sorta answered my own question: the Oculus website says: "With the Oculus Rift, your eyes are actually focused and converged in the distance at all times. It’s a pretty neat optical feature." It'd be nice to have a more thorough explanation, though.

iFixit gave it a 9 out of 10 user repairability score—unusual in the glue, tape, and Torx screw times we now live in.

Glue and tape is certainly a problem for user repairability, but Torx? As long as they are not security screws, it is a plus: Screwdrivers are readily available, even low quality screwdrivers are less prone to stripping (and a stripped screwdriver doesn't destroy the screw as easily as Phillips and, to a lesser extent, Pozidriv does), and it is much easier to keep the screws on the bits, even without magnetic drivers.

Edit: Oh, and the sizes are usually proper standard sizes with exact match between screws and drivers, instead of the not-quite-fitting ranges of drivers to screws in Phillips/Pozidriv.

Would a Rift 2.0 benefit from the 802.11ac standard, or does this require too much bandwidth? One less wire can mean only good things. Imagine making the base unit a "card" that can be installed in a computer case, and doing away with wires altogether (sort of)...

They list the response time as being 15/20 (Typ.) (tr/td). Granted, they're listing off specs for the HJ070IA-02C instead of the HJ070IA-02D, but I'd be surprised if two models in the same family have radically different response times. I don't believe it's possible to reach anything even close to a 120Hz refresh rate with those kinds of response times. I'd love to be proven wrong, though.
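To put rough numbers on that skepticism: with the quoted rise/decay times, a complete transition takes about 35ms, which caps how many full rise+decay cycles the liquid crystal can finish per second. This is a crude upper bound (real content rarely swings full range, and overdrive can help), using the datasheet figures the commenter cites:

```python
def max_full_transitions_hz(tr_ms, td_ms):
    """Rough ceiling on complete rise+decay cycles per second
    for a panel with the given response times (in ms)."""
    return 1000 / (tr_ms + td_ms)

print(max_full_transitions_hz(15, 20))  # about 28.6: nowhere near 120Hz
```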

I suspect there will be howls over this, but I'm constantly amazed at how much hardware still comes with a VGA port.

Hell, my brand new work-issued laptop came with a 56K modem.

I'm not implying that there aren't people that need/want these things, but in 2013 it seems like these should be less than 1% of new equipment buyers.

It's the opposite problem to the usual complaint though. People complain about VGA ports not being on laptops to support peripherals (the VGA only projector at a university with a 400ft cable), but has a GPU powerful enough to support this ever been made that doesn't support a digital output? I put a 5 year old 8800GT in a PC the other day and was mildly surprised to not see a VGA port on it, only DVI/s-video (there's probably an adapter somewhere to go with it, but still).

I could see some usages where there might not be a digital port (it'd be a great replacement for having to occasionally plug a screen into a server or an old laptop with a broken screen) but it's definitely not the intended market, nor likely to be anywhere near 1% of users. There'll be cheap, low res, non-3D glasses filling that role within a few years anyway.


Most GPUs play very nicely with DVI - VGA adapters, though I know our work devices often come with VGA since a lot of conference rooms still use those hookups (we're just seeing them switch over to HDMI now).

The one thing I am a bit bummed about is the lack of display port plugs on monitors / TVs / etc (moreso than missing VGA). A lot of GPUs have more DP / mini DP ports than HDMI. And that's a bummer since DP can do full 1080P 3D while HDMI 1.4 doesn't have the bandwidth. (bit off topic for this discussion I suppose).

I am interested to see where the Rift winds up; it's a pretty neat concept and I've heard some great things, but it feels like a bit too much of a hassle from my perspective. Strap stuff on... get tethered up... I want to be able to just plop down on the couch and enjoy the evening.


Well I, for one, have been eagerly awaiting a viable stereoscopic video game system ever since I used a Virtual Boy demo in a store when I was a kid. I'm not really a gamer, but I expect I'll buy one of these when it comes out just to play around with it, especially if it looks like a decent game ecosystem will develop around it. I hope it does, because I'd like to see stereoscopic games become mainstream. I feel like the immersion and head tracking such a system provides would make most games so much easier and more intuitive (everything from FPS's and RPG's to Grand Theft Auto type games).

Gaming aside, it would be awesome if this could be set up to replace your desktop monitor for normal computing as well. Especially if you could add virtual extra screens to mimic various monitor sizes and layouts.

120Hz isn't a standard refresh rate, it's an enhanced option some displays use to double up and make the motion appear smoother. Most videophiles immediately disable such "enhancements"

Ok, 120Hz is a standard refresh rate, although uncommon. Videophiles immediately make USE of such enhancements.

Maybe he meant cinephiles. I was watching some TV shows & movies at a high resolution and refresh rate and it made me think of old daytime soaps. I think TV mfrs do some interpolation that causes this & thus a lot of people turn off the enhanced rate. Damn companies and their "value adds".

Would a Rift 2.0 benefit from the 802.11ac standard, or does this require too much bandwidth?

Bandwidth is not the primary problem (although it could be when the resolutions get pushed up), it's latency. Current PCs already have more latency built-in than is optimal for VR, and it's going to require some changes to drivers and how things are processed to get it down to acceptable levels. Wireless would add way too much latency on top. It's conceivable for sometime in the future, but not anytime soon. You also will need a wire for power, as weight is a huge concern, and a big heavy battery is unacceptable.
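The latency point can be made concrete with a back-of-the-envelope motion-to-photon budget. Every figure below is a made-up round number for illustration, not a measurement of the Rift or any real pipeline; the point is just that the stages add up fast, and a wireless hop would add its own stage on top:

```python
# Illustrative motion-to-photon latency budget (assumed values only).
stages_ms = {
    "sensor sample + USB transfer": 2.0,
    "simulate + render one 60fps frame": 16.7,
    "display scan-out": 16.7,
    "LCD pixel response": 15.0,
}
total = sum(stages_ms.values())
print(f"motion-to-photon: {total:.1f} ms")
# ~50ms end to end, before any wireless link is even in the chain.
```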

Quote:

Gaming aside, it would be awesome if this could be set up to replace your desktop monitor for normal computing as well. Especially if you could add virtual extra screens to mimic various monitor sizes and layouts.

I'm sure this will be one of the first apps created once the units start shipping to devs in volume. I don't think it will have any practical value with this prototype though, the resolution will be passable for games, but you aren't going to want to write a word document in it.

Does anyone have a link to an analysis of the ocular health effects of these devices?

As an ergonomics geek, I wonder what the eyes are up to when using one of these. What is the focus distance? The convergence?

In other words, does the eye behave like it's looking at an object at the virtual distance intended by the image, or does it behave like it's looking at a screen a few inches away from your nose? This will make a big difference in eye fatigue.

I sorta answered my own question: the Oculus website says: "With the Oculus Rift, your eyes are actually focused and converged in the distance at all times. It’s a pretty neat optical feature." It'd be nice to have a more thorough explanation, though.


They'll have the lenses set up to produce a virtual image. Your eyes will focus on this image, which will be located relatively far behind the actual location of the screen. It's exactly the same mechanism as when you look in a mirror - the object you see appears to be located behind the plane of the mirror.
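The virtual-image claim follows from the thin-lens equation: a screen placed inside a lens's focal length produces a magnified virtual image farther away. The focal length and screen distance below are hypothetical round numbers, not the Rift's actual optics:

```python
def image_distance(f_cm, do_cm):
    """Thin-lens equation 1/f = 1/do + 1/di (real-is-positive sign
    convention). A negative di means a virtual image on the same side
    as the object, i.e. it appears beyond the physical screen."""
    return 1 / (1 / f_cm - 1 / do_cm)

# Hypothetical numbers: a 5cm focal length lens with the screen 4cm
# away (inside the focal length, as in a magnifying glass).
di = image_distance(5, 4)
print(di)  # about -20: the eye focuses ~20cm out, not 4cm from the nose
```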

Would a Rift 2.0 benefit from the 802.11ac standard, or does this require too much bandwidth?

Bandwidth is not the primary problem (although it could be when the resolutions get pushed up), it's latency. Current PCs already have more latency built-in than is optimal for VR, and it's going to require some changes to drivers and how things are processed to get it down to acceptable levels. Wireless would add way too much latency on top. It's conceivable for sometime in the future, but not anytime soon. You also will need a wire for power, as weight is a huge concern, and a big heavy battery is unacceptable.


Unless you go with lossy compression, bandwidth would be a significant problem even under good 802.11ac conditions: even at 1280×800, that's just over a million pixels. If R, G, and B each get 8 bits and the refresh rate is 120Hz, each pixel eats 24×120 = 2,880 bits/second, so on the order of 2.8-2.9Gb/s.

Uncompressed digital video interfaces are very, very, very fast busses by the standards of anything that you don't need to call your rep for a price on and get a VP's signature on the PO...
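The estimate checks out; here is the same arithmetic spelled out (panel resolution from the teardown; the 120Hz refresh figure is the commenter's assumption):

```python
def uncompressed_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel bandwidth, ignoring blanking intervals and protocol
    overhead (both of which push the real figure higher)."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"{uncompressed_gbps(1280, 800, 24, 120):.2f} Gb/s")  # 2.95 Gb/s
```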


Thanks. I knew I was missing something.


Gaming aside, it would be awesome if this could be set up to replace your desktop monitor for normal computing as well. Especially if you could add virtual extra screens to mimic various monitor sizes and layouts.

I'm sure this will be one of the first apps created once the units start shipping to devs in volume. I don't think it will have any practical value with this prototype though, the resolution will be passable for games, but you aren't going to want to write a word document in it.

A redefined 3D "desktop" would be phenomenal, once the resolution is high enough. Forget multiple screens! (well...)

Don't forget that this fills your whole field of view, closing you off from the outside world. That is good for immersion, but you can't see the controls/keyboards/buttons/other peripherals you use. So using it for word processing (for example) is only going to work if you can type blind, and for games you'd better be able to find the buttons without looking, or things might get embarrassing (not a problem for FPS games, or even most MMOs, which are largely mouse driven, but virtual drivers or pilots with a cockpit mock-up might be in trouble).

They'll have the lenses set up to produce a virtual image. Your eyes will focus on this image, which will be located relatively far behind the actual location of the screen. It's exactly the same mechanism as when you look in a mirror - the object you see appears to be located behind the plane of the mirror.

Super interesting link. Thank you!

I can't wait to see what devs will do with these, outside of games (not that games are bad). Imagine being able to move inside a 3D rendering of an architectural project, and/or combining these headsets with a Kinect or similar device, allowing manipulation of virtual objects with hand gestures, or "pulling" yourself to different parts of a simulation. It would also be amazing to be able to virtually visit real-world locations that have been laser-scanned.

Don't forget that this fills your whole field of view, closing you off from the outside world. That is good for immersion, but you can't see the controls/keyboards/buttons/other peripherals you use. So using it for word processing (for example) is only going to work if you can type blind, and for games you'd better be able to find the buttons without looking, or things might get embarrassing (not a problem for FPS games, or even most MMOs, which are largely mouse driven, but virtual drivers or pilots with a cockpit mock-up might be in trouble).

Do people not learn touch typing anymore? Even if they don't know how yet, it's amazing how quickly some skills are learned once you are immersed in an environment where they are required...

I am fascinated by this product. I do have one question that I have yet to see answered: can people who wear glasses use this? Some of us are blind as bats without glasses, do not qualify for LASIK, and cannot wear contact lenses.


From what I recall, they have eyecups that are interchangeable & I think they provide prescription eyewear compatible versions of the eyecups.

The optics of an HMD generally have you focus at a distance of around 20 feet or so. Most professional HMDs actually let you adjust this somewhat, though this adjustment is generally meant to enable people with eyeglasses to use the device without glasses (i.e., it's just like the diopter adjustment on a pair of binoculars). That said, most HMDs provide enough room ("eye relief") to allow typical glasses to be worn.

As far as the convergence, that depends upon both the generated imagery as well as what you try to look at. This represents one of the problems with stereoscopic display devices, since normally, your eyes want to change focus as you change convergence. With a stereoscopic display, you have to "train" your eyes to not do this, assuming you wish to look at objects closer than the focus distance (and not have them be blurry). Some people can do this easily, and other people can't get the hang of it.

If you do this for a long time, then you actually need to retrain your eyes to work normally after you stop using the display and look at real world objects. This is also why such displays are not recommended for very young children, whose visual systems are still developing.

Like I said, though, the game content can make this a somewhat moot point, assuming it never shows you anything that's "too close", and always puts the stuff you want to look at near the focus distance. (This applies to 3D movies as well.)

As far as the question about 120Hz, it was not part of HDMI 1.0 - 1.2a. It became incorporated into the spec as of HDMI 1.3 (initially without 1080p, but with 1080p in HDMI 1.4). Probably not too many displays actually support 120Hz formats, but rather advertise a 120Hz feature based on subframes interpolated from a 60Hz signal.

But, as mentioned, the refresh rate is not as important as the image latency (computed as the time between user input--ie, head motion--until the visual effect of that input is seen).

In fact, the 120Hz feature that's based on 60Hz input is actually detrimental to latency, since the display needs to wait for a future frame to arrive before it can calculate and display the interpolated frame. In general, for gaming, you want to disable all such extra processing and enable the display's "game mode," which means "just display the pixels as-is when you get them."

Interesting post. Could a HMD track the user's eyes and change the focus distance of the virtual image according to what's being looked at, in order to provide convergence and focus that are naturally concomitant?
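The convergence side of that mismatch is easy to quantify: the vergence angle between the eyes depends only on the fixation distance and the interpupillary distance. A sketch assuming a typical 6.4cm IPD (an assumed value, not a Rift spec):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Angle between the two eyes' lines of sight when fixating a
    point straight ahead at the given distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

for d_m in (0.3, 1.0, 6.0):  # 6m is roughly the "20 feet or so" focus distance
    print(f"{d_m}m: {vergence_angle_deg(d_m):.2f} deg")
# The angle shrinks rapidly with distance, which is why near-field
# virtual objects force the strongest vergence/accommodation conflict.
```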