According to Wired, a newly published patent shows a way to use hand gestures with Google Glasses. The patent covers a “wearable marker for passive interaction.” In plain English, that means a small infrared identifier, invisible to the human eye, that users could wear on a ring or fingernail.

Google Glasses can see the IR identifier and track it as the user performs gestures. Those hand gestures would then control the UI in the head-mounted display; certain gestures could launch applications or open documents, for example.

Each IR identifier would also be unique to an individual user. A user could put on any head-mounted display, look at the hand wearing the identifier, and the glasses would recognize the user and adjust everything to that person’s settings.
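Conceptually, the patent describes two simple lookups: a recognized gesture maps to a UI action, and a unique marker ID maps to a user’s saved settings. Here is a rough sketch in Python; all gesture names, marker IDs, and settings below are invented for illustration and are not from the patent.

```python
# Hypothetical sketch of the two lookups implied by the patent:
# gesture -> UI action, and IR marker ID -> user settings.

GESTURE_ACTIONS = {
    "swipe_left": "previous card",
    "swipe_right": "next card",
    "tap": "open document",
    "circle": "launch application",
}

USER_PROFILES = {
    "ir-marker-0042": {"owner": "Alice", "text_size": "large"},
}

def handle_gesture(gesture: str) -> str:
    """Return the UI action bound to a recognized gesture, or ignore it."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

def load_profile(marker_id: str) -> dict:
    """Look up the settings tied to a unique IR marker."""
    return USER_PROFILES.get(marker_id, {"owner": "guest", "text_size": "default"})

print(handle_gesture("tap"))                    # open document
print(load_profile("ir-marker-0042")["owner"])  # Alice
```

The hard part, of course, is the computer-vision step that turns camera frames into a gesture label in the first place; the patent’s contribution is making that tracking easier by giving the camera a distinctive IR target to follow.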

Hand gestures make sense for Google Glasses and could work well paired with voice recognition. The biggest downside is that users would look at least a bit silly walking around waving their arms in front of them and talking to their glasses. Gestures could also be tiring, depending on how many tasks the user performs one after another.

While the Google Glasses demo video did look pretty cool and futuristic, we’re not sure we’d want to walk around gesturing in the air if this does end up being a control method.