This is not news that gets me excited. This is a prototype, and a lot of people may be expecting far more from it than can actually be done with it.

I am waiting for the time when I can wear it as a normal pair of spectacles all day. Use it like a smartphone; get navigation assistance while driving without taking my eyes off the road; get CityLens-style information about buildings and monuments nearby; build or repair things at home using DIY tutorials and holograms; and so on and so forth. I really want it to be an everyday device. It should even act as normal spectacles for those who wear corrective lenses. One should be able to watch full movies on it while travelling, take DSLR-like pictures and share them online without even using one's hands, read books, and study every subject through virtual classrooms. The list is just endless for me.

That would be my ideal HoloLens. I don't know if all, or even some, of that will be done in the coming years. But this is the vision I have for such a device. (Did I forget to mention that it should be incredibly light, should run at least one full day without needing a recharge, and look cool too?)

It is unbelievable how many HoloLens "experts" are on this thread. A completely new platform, and an amazing number of folks "know" it needs more RAM, resolution, a 64-bit CPU, etc., etc. How do you know what the resolution of a hologram is? Or the efficiency of the HPU; do you even know what an HPU really is? Specs on a new device like this are totally irrelevant, since there is no basis for comparison. Comparing this to current tech on the market would be like comparing a P-51 Mustang to an F-22: yeah, they are both airplanes and fighters, but totally different technologies.

Let's let the devs come up with some cool software, and let Microsoft work on v2, before it's trashed and the Apple/Android fanbois are provided with ammo to pooh-pooh this.

That's because "holographic lightbeam lenses" is never actually explained; no one knows what it really means, and it may be a marketing term for "does what every other see-through HMD does," whereas the other specs are things people are familiar with.

I would be surprised if it hits consumers in the next two years. I think they want at least a year at the enterprise/education/scientific level, and then some refinement and cost control for a consumer device. But then, if the Space Station trip works out well, maybe it will be moved forward and we will see it sooner.

I'm really excited by this device. However, it looks to me like these holographic lenses are going to need more than a 32-bit CPU and 2 GB of RAM; Oculus Rift requirements are much higher, something like a desktop-class Nvidia GTX 970, 16 GB of RAM, and an Intel Broadwell or similar CPU.

From my understanding, holograms are better than virtual reality, since they mix reality with pixels while VR is all pixels; so I think it is more difficult to build holographic apps like Minecraft.

Not really. Large-scale "pure" VR has been done where the user can walk around a room (my lab in grad school had a tracking area of ~15 ft per side, and we were limited by the room size, not the tracking technology), and people did not lose their balance.

One "problem" with Oculus is that the tracking area is only a few feet on a side (roughly 0.4 to 2.5 meters away from the sensor, with a restrictive frustum in which the headset can be tracked) -- https://developer.oculus.com/documentation/pcsdk/latest/concepts/dg-sens... -- that makes it a "sitting" or "standing still" device with looking around, but no physical locomotion. HoloLens is designed to be worn and walked around in.

Oculus specs are high because, in aiming for a large FOV and high frame rate (90 fps), they do a lot of heavy lifting on the software (GPU) side: predicting motion, distorting the image to compensate for the optical distortion inherent in large-FOV head-mounted displays, and even warping the image at the last fraction of a second to compensate for the system lag between the sensor reading and the display time, avoiding simulator sickness. Oculus is also higher than 1080p resolution per eye, whereas this is less. Since HoloLens is optical see-through, any optical distortion is a deal-killer, so instead HoloLens has a smaller FOV with zero-distortion optics, which are expensive to make and cannot be improved with new CPU tech along Moore's Law. Also, because of the optical see-through, the "real world" will not lag behind quick head motions, and the "holograms" can probably tolerate a bit of swimming, so all the software for lag compensation isn't needed either.
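To get a rough feel for why that last-moment warp matters, here is a back-of-the-envelope sketch (illustrative numbers only, not official specs for either device): at 90 fps the whole sensor-to-display pipeline has about 11 ms per frame, and an uncorrected head turn during that window translates directly into angular error of the rendered scene.

```python
# Rough motion-to-photon budget sketch (illustrative numbers only).

def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

def head_turn_error_deg(head_speed_deg_per_s: float, latency_ms: float) -> float:
    """Angular error if the display lags the head by `latency_ms`."""
    return head_speed_deg_per_s * latency_ms / 1000.0

budget = frame_budget_ms(90)               # ~11.1 ms per frame at 90 fps
error = head_turn_error_deg(200, budget)   # a brisk 200 deg/s head turn
print(f"{budget:.1f} ms budget -> ~{error:.1f} deg of error if uncorrected")
```

A couple of degrees of swim is very noticeable in full VR, which is why the prediction and time-warp machinery exists; with optical see-through, the real world never lags, so the tolerance is looser.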

A limited-run HPU has got to be expensive, and everything else is a custom build. Plus there's the dev support bundled in, and no one knows if that is a factor in the price either. Prototypes are expensive no matter who makes them.

I'm really curious about the custom HPU. Can't wait for the nitty-gritty details that will no doubt start floating around, especially since this is pretty much the first untethered AR headset to hit the market.

I'm also excited to see what devs do with this; it has a lot of potential.

Hopefully this puts the kick into getting battery tech up to scratch with modern tech.

More likely, modern tech will have to come down to batteries. High-energy-density batteries are plagued with problems, and I don't think that will change anytime soon. And yes, it takes years and years to really test new solutions that you might strap against someone's head. 7 nm and below may be the fix for the current problems (ooh, a pun!)

And as several naysayers (including Paul Thurrott) have said, the 'good' FoV in the first demo is actually the same FoV as in the later demos, which many people (including Paul) claimed was smaller.

No, it's not 'efficient'. A 64-bit processor can address more memory. That's all. And that only matters if there is more memory.
2 GB of RAM can be addressed by a 32-bit processor.
And why 2 GB? Maybe the device doesn't need more.
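The addressing half of that claim is simple arithmetic, nothing HoloLens-specific: a 32-bit address space covers 2^32 bytes, i.e. 4 GiB, so 2 GB of RAM fits well within it.

```python
# 32-bit vs 64-bit addressable memory (plain arithmetic).
GIB = 2**30                             # one gibibyte in bytes

addressable_32bit = 2**32 // GIB        # 4 GiB total 32-bit address space
addressable_64bit = 2**64 // GIB        # astronomically larger than any device

print(f"32-bit can address {addressable_32bit} GiB")
print(f"2 GB of RAM fits in 32-bit space: {2 * 10**9 <= 2**32}")
```

(In practice part of the 4 GiB is reserved for memory-mapped devices, but 2 GB of RAM is still comfortably addressable by a 32-bit CPU.)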

You really should consider doing some research. Just like quad-core ARM processors, 64-bit is a more power-efficient way of enhancing performance: more processing performance for every Hz of CPU clock frequency, and thus potentially longer battery life, which is a significant factor in this product.

Again, processing speed has nothing to do with processor register size (32-bit or 64-bit). It does depend on the clock speed, can depend on the number of cores, and depends mostly on the architecture.
And keeping clocks, cores, and architecture the same, 64-bit is definitely gonna use more battery.

Plus Microsoft is making this product. Of course they'll know what they're doing.

Who said that its camera would be used to record videos and its processor to do multitasking?
And since when has Microsoft released this product to consumers who have complained about its performance?
If more RAM and a bigger processor made a better holographic experience, then we'd be better off strapping our smartphones to our eyes.
I can't say if its specs are good or bad. No one can. It's a new category. There's no standard right now for how much RAM a holographic device should have.
Again: it's a new category. Many things matter here besides just RAM and processor.

What exactly do you think the RAM is meant for? What makes you think a system running apps wouldn't need to do multitasking? It'll probably need to do even more than smartphones, 'cause it has to keep track of not only the status of the apps in the background, but also their geolocation and position.

You really think they put a camera in there just to record videos? In devices like this it is used as an 'eye' to see the surroundings. Video would be a secondary use, because it's a camera after all.

As for RAM, you can argue about it once you've used the device as a consumer and it starts hanging under heavy load.

Right now, interactivity with the real environment matters the most, not the video quality or how many apps we can use at the same time.
Maybe at some point the RAM and camera will matter. But not now.

When it's consumer ready and in the markets, maybe we'll see a better version.

Yes, new does not always equal better. However, making judgments about new technology when you have absolutely no idea how it works is kind of pointless. Unless you have some inside knowledge about the HPU, you have no idea what it can do, how it does it, or what resources it needs. Since you don't know any of those things, how can you know, or even intelligently guess, that it might be a "gimped experience" or have limited components?

To me, a better thought process is to see whether new technology can do something new or better than what is out there, not to compare specs and second-guess what might be, based on the current state of the art.

Yeah you're right about that when it comes to the CPU architecture and RAM, +1. However, regarding cameras, I think they could have done it better. We'll just have to wait and see what the end product is specced like.

Hmm, I'm sure they'll bring out more of it as development continues. First they wanted to try it out with developers; in the near future, they might be able to provide better specs at a lower price.

That's exactly why we should care. If we're gonna walk around wearing PCs on our heads, we need to be able to run software for them, and a lot of software developers have moved on to 64-bit.

Needless to say you won't be installing iTunes on this. Not that anyone is going to be buying one to install iTunes...

Just to be clear, I'm not horribly worried about how capable/incapable the device is, as it will supposedly work alongside your PC or Xbox for any really heavy lifting. The original Oculus Rift devkits looked pretty cruddy, but the effect was still amazing.

I was curious about the same thing. I feel like it would actually benefit, given all the advanced spatial calculations it has to perform every second to map out the environment around the user and then display holograms appropriately. Definitely sounds like something that could benefit from extra bits to work with.

Obviously Microsoft knows about this more than I do, I'm just curious what their reasoning is.

Seriously? For a first-gen DEVELOPER device it seems pretty awesome. Since I'm guessing it's going to be at least two years before a consumer version is released, I'm pretty sure your concerns will be addressed.

I know it's not directly comparable, but can we get a feel for the resolution? In one review I read, there was disappointment that the resolution was low, like 720p low, even with the 40-degree FOV. I've not yet been able to demo a unit, but I was curious whether the anecdotal experience is shared and whether we can make a resolution comparison.

It's a first-gen, developers-only model. For most applications within that gen, 720p is still overkill an inch from your eye. Give them time to get the FOV better, and I'm sure the resolution will increase as well.

It's almost impossible to comment. Do note that you watch your 4K TV screen from a few feet away, while this lens is about an inch from your pupil. So I don't think it can be compared with the specs of anything we use in everyday life, except maybe the camera. But then again, this is more like a prototype of the final device. By the time developers build some useful applications for normal usage, the hardware will have undergone tremendous changes, and who knows, even the dimensions might change drastically.
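One way to put a near-eye display and a TV on the same footing is pixels per degree of visual angle, since viewing distance then drops out of the comparison. A minimal sketch, where the TV size/distance and the headset's 1280-pixel width and 30-degree FOV are purely illustrative assumptions, not confirmed specs:

```python
import math

def pixels_per_degree(h_pixels: float, h_fov_deg: float) -> float:
    """Angular pixel density: horizontal pixels spread over the horizontal FOV."""
    return h_pixels / h_fov_deg

def fov_of_screen(width_m: float, distance_m: float) -> float:
    """Horizontal visual angle (degrees) subtended by a flat screen."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

# A 55" 16:9 TV is ~1.21 m wide; from 2 m it subtends roughly 34 degrees.
tv_fov = fov_of_screen(1.21, 2.0)
tv_ppd = pixels_per_degree(3840, tv_fov)      # 4K panel

# Hypothetical near-eye panel: 1280 horizontal pixels over an assumed 30-deg FOV.
hmd_ppd = pixels_per_degree(1280, 30.0)

print(f"TV: ~{tv_ppd:.0f} px/deg, headset: ~{hmd_ppd:.0f} px/deg")
```

The point of the sketch is the unit, not the exact numbers: a modest panel concentrated into a narrow FOV can still look reasonably sharp per degree, which is why raw resolution comparisons with TVs or phones are misleading.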