Google has filed a patent application for its controversial Project Glass eyeglasses, suggesting the search giant wants to use the glasses not just as a heads-up display for life, but as a universal remote as well. And I do mean universal.

Any device that you can look at, Google wants to use Glass to control. Or, if not control directly, then at least provide information on how the wearer can control it him- or herself.

Google Glass “head-mounted displays” (HMDs) are expected to go on sale sometime this year, for $US1500 a pair. The glasses contain a tiny projector that superimposes information onto the wearer’s view of the world. Linked back to a smartphone, the glasses could provide navigation information, Wikipedia information, dating information - anything that can be beamed over the internet, these glasses can beam into their wearer’s field of vision.

In its patent application, dated March 21, 2013, Google says the HMDs will use any number of methods to recognise what their wearer is looking at: they’ll use image recognition; they’ll read any bar codes that happen to be on the device; they’ll ping the device for RFID identification; they’ll listen for a “beacon”, such as a WiFi transmission or an NFC tag (though you’d have to get pretty close for NFC to work).
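If you squint, that list of methods is a fall-back chain: try one identification technique, and if it comes up empty, try the next. Here is a minimal sketch of that idea; every function name and the `view` dictionary are invented for illustration, since the application doesn’t specify an API.

```python
# Hypothetical fall-back chain for identifying the device the wearer is
# looking at. Each recogniser returns a device ID, or None if it fails.

def recognise_image(view):    # match the device's appearance against a catalogue
    return view.get("image_match")

def read_barcode(view):       # decode any visible bar code
    return view.get("barcode")

def ping_rfid(view):          # query for an RFID tag in range
    return view.get("rfid")

def listen_for_beacon(view):  # a WiFi transmission or close-range NFC tag
    return view.get("beacon")

def identify_target(view):
    """Try each method in turn until one names the device."""
    for recognise in (recognise_image, read_barcode, ping_rfid, listen_for_beacon):
        device_id = recognise(view)
        if device_id is not None:
            return device_id
    return None
```

So `identify_target({"barcode": "fridge-001"})` would name the fridge even if image recognition had nothing to say about it.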

Once an HMD has figured out what the wearer is looking at, it will then figure out what can be done with the device. If it’s controllable remotely, the HMD will present the wearer with a “virtual control interface” that can be operated with voice commands, or even with fingers poking about in empty space.

If it’s not controllable, the HMD will present the wearer with status information, or perhaps an instruction manual, pertaining to the device.
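That two-way branch - a control interface for devices that take commands, status or a manual for everything else - could be as simple as a lookup. The registry, field names and return values below are all assumptions made up for this sketch.

```python
# Hypothetical dispatch: controllable devices get a virtual control
# interface; anything else gets status information or a manual.

DEVICE_REGISTRY = {
    "fridge-001":  {"controllable": True,  "controls": ["temp_up", "temp_down"]},
    "smoke-alarm": {"controllable": False, "manual": "Replace the battery yearly."},
}

def present(device_id):
    """Decide what the HMD should overlay for a recognised device."""
    info = DEVICE_REGISTRY.get(device_id)
    if info is None:
        return ("unknown", None)
    if info["controllable"]:
        return ("virtual_control_interface", info["controls"])
    return ("status_or_manual", info["manual"])
```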

From the patent application:

The wearable computing device may allow for control of the target device via interactive gestures with the virtual control interface. For example, the virtual control interface displayed by the HMD may include one or more user interface elements, such as virtual buttons, that allow the wearer to control the target device. The virtual buttons could appear to be on the surface of the target device, or they could appear in a way that is not physically connected to the target device (e.g., in a “head-fixed” virtual control interface).

The wearable computing device may recognize movement of the wearer’s fingers towards a virtual button as a control instruction for the target device. As one example, a virtual control interface for controlling a refrigerator (such as adjusting a temperature set-point) may be superimposed upon the refrigerator surface. In order to control the target device, the wearer may attempt to touch the virtual control interface at the apparent distance of the virtual control interface. For example, the wearer may touch a location on the refrigerator where a virtual button in the virtual control interface appears. The wearable computing device may recognize this touching motion as a control instruction and transmit the control instruction to the target device.
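In other words: hit-test the fingertip against the virtual buttons painted on the fridge, and if it lands on one, send the corresponding command. A toy version of the refrigerator example might look like this - the coordinates, button layout and message format are all invented, not anything from the application.

```python
# Hypothetical hit-test of a fingertip against virtual buttons laid out
# in the fridge-surface coordinate frame, producing a control instruction.

BUTTONS = {
    # name: (x, y, width, height)
    "temp_up":   (10, 10, 40, 40),
    "temp_down": (10, 60, 40, 40),
}

def hit_test(x, y):
    """Return the virtual button (if any) under the wearer's fingertip."""
    for name, (bx, by, bw, bh) in BUTTONS.items():
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return name
    return None

def control_instruction(x, y, target="refrigerator"):
    """Turn a touch at (x, y) into an instruction to transmit, or None."""
    button = hit_test(x, y)
    if button is None:
        return None
    return {"target": target, "command": button}
```

A touch at (20, 20) lands inside the “temp_up” button, so the device would transmit that command; a touch in empty space produces nothing.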

Pretty neat, what? Now you just need to find where you put your glasses.