Jim Marggraff isn’t your typical virtual reality CEO. The middle-aged Silicon Valley entrepreneur is best known as the creator of Livescribe, the "smart pen" that quickly built a cult following on college campuses, and of the LeapPad, the popular educational tablet. Now he’s turning his attention to a new invention: hardware that lets users control virtual reality environments with their eyes.

Eyefluence is a VR/AR startup that doesn’t want to make its own headsets or glasses. Instead, Marggraff says his company’s hardware and software can understand the intent behind a user’s gaze, let users control an environment with only their eye motions, and feed them relevant information automatically. He imagines it won’t just be for gamers: it’s an invention that could make the jobs of EMTs, firefighters, factory floor workers, and others much easier.

The company’s hope is to have virtual reality companies license its technology and software for use in their own products. Several investors have made considerable bets on the firm.

Intel Capital invested in the firm in 2013, and Motorola Solutions Venture Capital led a $14 million Series B in 2015. Motorola’s investment is a bit unusual: the fund, a product of Motorola’s 2011 corporate split, primarily invests in technology for law enforcement and public safety.

I was curious why a police-tech venture capital fund is investing in virtual reality. The answer, it turns out, relates to the demands of answering 911 calls and the on-the-job challenges faced by EMTs and police officers.

Imagine if police officers could get information on an unfolding crime scene without visibly moving a muscle. Instead, they would use only the motion of their eyes behind sunglasses, leaving their heads up and their hands free to manage the scene and take quick action.

Eyefluence’s technology tracks eye motions and uses them to control mobile applications. For now, Marggraff believes the emphasis will be on workplace uses rather than gaming. As he put it during our conversation: "If you're carrying a hose or wearing thick gloves, it will be difficult to access information on building layout, for instance, with anything but your eyes."

Eyefluence claims that interaction times for its eye controls can be measured in the tens of milliseconds. David Stiehr, the company’s cofounder, says the team is working with consumer electronics companies and head-mounted display manufacturers to integrate eye tracking into their products. Eyefluence’s product is the underlying technology and its accompanying algorithms, which it licenses out; the company is also creating APIs (application programming interfaces) and SDKs (software development kits) so external developers can build new products. A proof-of-concept demo is shown in the clip above.

The technology itself is based on research by Eye-Com, a company Marggraff purchased in 2013. Eye-Com was initially funded by the Defense Department and other government agencies; its product portfolio focused mainly on eye-controlled wheelchairs and on fatigue detection via eye tracking. Eyefluence is now applying much of that underlying technology to a field experiencing an economic boom: virtual reality and augmented reality.

In demonstrations, Eyefluence currently shows its technology in a modified Oculus headset and in smart glasses; Marggraff says the company plans for its technology to be integrated into the next generation of virtual reality and augmented reality products.