HoloLens is still, for my money, one of the more interesting headset devices available at the moment.

While the likes of Oculus Rift and even Samsung’s Gear VR are exploring the possibilities of full immersive (read: “VR”) experiences, Microsoft is pioneering what I’d argue is the more immediately useful proposition of AR (augmented reality), or “mixed reality” as people are now becoming fond of calling it.

At last year’s Build, I took part in a cut-down ‘coding’ session with HoloLens – actually little more than a case of ticking boxes to activate pre-built routines in Unity 3D – and it proved a real eye-opener. But one thing was missing there: the experience was entirely solitary. A group of us roamed round a space enjoying our own personal AR scenarios, but short of accidentally stepping on each other’s feet, there was nothing in the way of the kind of shared experiences Microsoft is so fond of showing in demos and keynotes.

And that was the purpose of my return to the Holographic Academy. Another box-ticking Unity session, but a year on, a great indicator of Microsoft’s growing confidence in the experiences possible with this piece of $3000 kit.

Anchors

Things started in a similar fashion to last year – having replaced the pre-prepared “Energy Hub” object’s camera with a small piece of code to make HoloLens itself the camera, the viewer’s “gaze” controlling its position in space, we quickly exported it to the device via Visual Studio.
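For the curious, the "gaze as camera" idea is simpler than it sounds: in a HoloLens Unity project the main camera simply tracks the headset, and "gaze" is a raycast along the direction the user's head is facing. A minimal sketch of that placement logic might look like this (the class and object names are my own illustration, not Microsoft's sample code):

```csharp
using UnityEngine;

// Hypothetical sketch: move a hologram to wherever the user is looking.
// In a HoloLens project, Camera.main follows the headset automatically,
// so the camera's forward vector IS the user's gaze.
public class GazePlacement : MonoBehaviour
{
    void Update()
    {
        var head = Camera.main.transform;
        RaycastHit hit;

        // Cast a ray from the head along the gaze, up to 10 metres,
        // against the spatially-mapped room mesh.
        if (Physics.Raycast(head.position, head.forward, out hit, 10.0f))
        {
            // Snap this object (e.g. the "Energy Hub") to the surface hit.
            transform.position = hit.point;
        }
    }
}
```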

In a slight improvement from last year, our supervisory coders were no longer jealously guarding the USB cable that connected the dormant HoloLens to the Windows 10 development PC – HoloLens was now quickly and seamlessly receiving project information wirelessly. Though we were encouraged to plug it in between active sessions to keep the battery charging. I even forgot once, and my helper discreetly plugged it in for me. That supposed two-and-a-half-hour battery life is presumably a constant threat.

It also seemed to me that the viewing area of the AR graphics themselves was still worryingly narrow. (Any plans to make it bigger? “Well, what do you think?” replied my demonstrator. Fair enough.)

After deploying the “Energy Hub” – and its charming canned animation of an unfolding, robotic terminal of some kind – on coffee tables, floors and cupboards around the room with the traditional one-finger ‘tap’ motion, it was time to kick up a gear.

A little more box-ticking in Unity, and eight of us were about to be networked together into a shared experience.

The method for achieving this made a lot of technical sense. Working via, in this case, a Surface Book as a server-style machine to create a network, one HoloLens user would export the project and execute the program first, being the “Anchor” by which the other seven HoloLenses orientated and organised themselves.
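The underlying mechanism here is Unity's world-anchor sharing: the first device serialises a spatial anchor and the others import it, so every headset locks the hologram to the same real-world spot. A heavily simplified sketch, using Unity's `WorldAnchorTransferBatch` API from that era (the anchor ID, class name, and the network-transport step are my own placeholders):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VR.WSA;
using UnityEngine.VR.WSA.Sharing;

// Illustrative sketch only: how one HoloLens can serve as the "Anchor"
// for the rest of the group. Sending the bytes over the network (via
// the Surface Book acting as server) is left out.
public class AnchorSharing : MonoBehaviour
{
    private readonly List<byte> exportedData = new List<byte>();

    // On the first device: lock the hologram to the room and export it.
    public void ExportAnchor(GameObject hologram)
    {
        var anchor = hologram.AddComponent<WorldAnchor>();
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("energy-hub", anchor); // ID is illustrative
        WorldAnchorTransferBatch.ExportAsync(batch, OnDataAvailable, OnExportDone);
    }

    private void OnDataAvailable(byte[] data)
    {
        exportedData.AddRange(data); // anchor data arrives in chunks
    }

    private void OnExportDone(SerializationCompletionReason reason)
    {
        if (reason == SerializationCompletionReason.Succeeded)
        {
            // ...send exportedData to the server machine here...
        }
    }

    // On the other devices: import the bytes and lock the same hologram
    // to the identical real-world position.
    public void ImportAnchor(byte[] data, GameObject hologram)
    {
        WorldAnchorTransferBatch.ImportAsync(data, (reason, batch) =>
        {
            if (reason == SerializationCompletionReason.Succeeded)
            {
                batch.LockObject("energy-hub", hologram);
            }
        });
    }
}
```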

Jacking in, we were now met with a menu of cutesy little ‘avatar’ robots in four different colours. Tapping to select blue, red, orange or pink, the floating droid would then appear as a sort of sidekick to the other players in the group, hovering next to its owner’s head and – stylishly – mimicking that HoloLens user’s head motions. A cute touch, and a nice way to punctuate multi-user immersion through digital representation of real-world activity.

A couple more passes through Unity, and the Energy Hub was now tethered to our spatial locations in the room.

It was also now redeployable with the spoken phrase “Reset target”. Result? The whole group had to work together to reorientate the Energy Hub using our bodies as reference points. Positioning it on the table became an exercise in physical and vocal teamwork – using both the real world and the virtual world. Another good visualisation of possibility.
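Wiring up a voice command like “Reset target” is one of the simpler tricks in Unity's HoloLens toolbox, via the `KeywordRecognizer` in `UnityEngine.Windows.Speech`. A minimal sketch (the class name and the reset callback are my own illustration):

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Illustrative sketch: listen for a spoken phrase and trigger
// a redeployment routine when it's recognised.
public class ResetCommand : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    void Start()
    {
        // Register the phrase we want HoloLens to listen for.
        recognizer = new KeywordRecognizer(new[] { "Reset target" });
        recognizer.OnPhraseRecognized += args =>
        {
            // ...re-run the hologram placement routine here...
            Debug.Log("Heard: " + args.text);
        };
        recognizer.Start();
    }

    void OnDestroy()
    {
        recognizer.Dispose(); // release the speech resources
    }
}
```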

Afterwards, we were equipped with projectile shooters, activated by pinch gestures, and could start taking potshots at floors, walls, couches and even each other’s avatars before – much as in last year’s session – an arcane rift opened below the Energy Hub and we got to see other little robots whizzing around in a bizarre subterranean factory. Neat, but again – nothing I’ve not seen before.

A Whole New World

For me, the real beauty of Build 2016’s updated HoloLens demo was the promise of possibility. To reiterate my message from last year – this stuff just works. It integrates easily with coding environments even children can – and do – use. It takes a good deal of tricks easily in its stride, not trying to achieve the utterly impossible, but still instantly revealing easy and obvious use cases for collaborative productivity, for which it’s not difficult to posit huge future advantages.

After the coding workshop, I had a quick wander round the surface of Mars, on virtual terrain created by spatial and photographic data spat out from the Mars Rover. Stunning in practice, but not difficult to put together in theory.

In many ways, HoloLens is a sensible culmination of a load of stuff Microsoft’s been working on for years, and now has a real handle on. From programming environments it either built (Visual Studio) or has close professional ties with (Unity), to hardware projects that have hit the skids, but that valuable lessons have been learned from (spatial mapping gaming peripheral Kinect), HoloLens feels like the practical result of a lot of hard work and hard knocks.

Now it’s gone out to developers, it’ll be exciting to see what the rest of the world makes of a device that Microsoft has already done a great job of convincing me has a huge amount of possibility for the enterprise.