The chip giant’s CEO discusses its headset, plans to digitize sports arenas, and other news from IDF.

There was a time when you attended the keynote at the Intel Developer Forum conference in San Francisco to learn about what was new in the world of PCs.


These days, now that the technology industry no longer revolves around PCs, IDF keynotes also serve a different purpose: They tell us how Intel is reengineering itself to succeed in an era when making most of the world’s PC processors is a huge business, but no longer an engine of growth.

At this year’s IDF San Francisco keynote, which Intel CEO Brian Krzanich hosted on Tuesday morning, one of the big topics was easy to predict: virtual reality. The company announced a headset platform called Project Alloy, and explained “merged reality,” a term it has coined to describe the experience that Alloy is capable of creating.

Despite coming from Intel, Alloy is distinguished in part by the fact that, unlike the Oculus Rift and HTC Vive, it doesn’t rely on a PC for processing power. Instead, it’s self-contained, with everything required to create the experience built into the goggles themselves.

“For 90% of what people do with VR, it’s perfect, and it’s untethered,” Krzanich told me when I got a sneak peek at Intel’s keynote plans from him. (High-end gamers might want something a bit more computationally powerful, he says, and there may be a PC-tethered variant of Alloy at some point.)

Alloy is an open-source platform that any company can use as the foundation for a VR device; Intel will also work with some specific manufacturers to build headsets based upon it. It will run Windows Holographic, the AR/VR-enabled version of Windows that Microsoft recently said would be available for hardware beyond its own HoloLens. Intel expects Alloy-based products to be available next summer.

Besides not requiring a PC, Alloy is designed to do away with the sensors that some VR platforms use to figure out where you’re standing in the virtual world, and the handheld controllers they offer to let you use your hands to manipulate objects. Instead, Alloy leverages Intel technologies known as RealSense to let it figure out where you are and what you’re doing with your hands, no additional hardware required.


RealSense started out as a suite of products designed to give PCs new forms of input, such as the Minority Report-esque ability to turn the pages of an on-screen book by swiping your finger through the air. “The keyboard and mouse have been around for decades now and hadn’t really progressed,” Krzanich says. “We started to talk about how else you could interact with your computer.”

It turned out that PC users are pretty happy with their keyboards and mice. As a PC-based technology, “it’s not clear RealSense will ever take off,” Krzanich admits. But the technologies that Intel had been working on also had applications in VR, where positioning people, their hands, and even individual fingers in 3D space is a big part of making VR feel real.

Along with letting Alloy figure out where you are and what you’re doing, RealSense allows the headset to identify objects in a room, such as a table—either to bring them into the VR experience or help you avoid crashing into them. You’ll also be able to manipulate physical objects such as your own real-world tennis racket inside a virtual world. (Alloy could even depict your particular racket inside VR.) It’s this experience–VR, with a sophisticated understanding of your real-world surroundings and the objects in it–that Intel calls merged reality.

(Merged reality is not exactly the same thing as mixed reality, the phrase Microsoft uses to describe the experience of using HoloLens–and I’m curious whether actual consumers will pick up on either of these terms.)

At IDF, Intel also shared an update on its efforts to help organizations such as sports arenas capture real-world action in real time so it can be turned into a 3D replica and explored from any angle. The process, developed by an Israeli company named Replay Technologies that Intel acquired in March, records a scene with ultra-high-definition cameras, then turns the raw video over to a really, really powerful Intel-based computing system to crunch the footage and re-create it as a 3D space that can be rendered like a video game. “It takes a massive amount of compute power to do that–200GB per minute,” Krzanich says. “It’s a big compute program, and perfectly suited for Intel.”

The NBA used the technology during the playoffs for 360-degree replays, but sports are just the beginning. “We’re finding the applications for this are quite large when you fully digitize the real world and allow yourself to manipulate and work within it,” Krzanich says. A mystery movie shot for VR, for instance, might “position the murder scene from different angles depending on who you are–including the victim.”


In order to work more closely on the Replay technology with the entertainment industry, Intel will be opening a studio and lab in Los Angeles.

Beyond VR, Intel announced a gizmo called Euclid at IDF. It packs the computing power and sensors required for machine vision into a package that Krzanich says is about the size of a Snickers bar. Engineers can use it to easily add computer-based sight to their projects. “If you wanted to build a go-kart and make it self-driving, you could do that,” he says. “It’s a progression of the belief that when you add computer vision to devices, it really enhances their capabilities.”

I ended my chat with Krzanich by asking him whether the fact that Intel hasn’t dominated the smartphone era like it did the PC one has had any influence on its thinking about how to tackle VR and other technologies that make up the next wave. (Almost all current phones use processors based on technology created by ARM, a British firm.) He told me that one big lesson was that Intel needs to work with everyone else in the business on cross-industry standards rather than aiming to establish its own independent vision.

“We’re trying to make sure we’re part of the bigger ecosystem, with things like 5G [wireless networking],” he says. “In areas that are emerging, we’re making sure we’re there at the beginning, and being innovative, too.”

About the author

Harry McCracken is the technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.