How you hold things can reveal a lot about how you will use them. This stylus can detect it.

“If you see a stylus, they blew it,” Steve Jobs once famously said about tablets. Yet despite Jobs’s best efforts, the stylus is not dead. Designers are obsessed with them. Apple’s started embracing them too: not only does the average Apple Store sell dozens of third-party styluses, but iOS 8 has added important functionality that makes using a stylus on an iPad better than ever before.


This is just the beginning, says Ken Hinckley of Microsoft Research Labs. We’re on the edge of a major stylus renaissance, in which pen computing will be just as common as touch computing. But the styluses of the future won’t be dumb sticks with capacitive foam glued to the tip. They will be intelligent pens that can read your mind and anticipate what you want to do with them. And he even built one to prove it.

In Hinckley’s concept, the barrel of the stylus is encased in a multi-touch sensor that can detect how it is being gripped, and it is paired with a tablet that has similar grip-sensing abilities. Together, this stylus-and-tablet combo can precisely measure not only how a user is touching each device, but also where the stylus is in relation to the screen.

If you want to tell what card a poker player has, don’t look in his eyes. Look at his hands.

Why is that important? As it turns out, the easiest way to read a human being’s mind is to see what they’re doing with their hands. “Humans encode a lot of information in how they grip things,” Hinckley tells me. “For example, if you want to tell what card a poker player has, don’t look in his eyes. Look at his hands. Look at how he grips his chips, how he holds his cards, how he places a bet. If you know what you’re looking for, the hands tell all.”

Pencil, by 53

This is why Hinckley is so interested in teaching our computers how to read the language of our hands. Just as a person picks up a pitcher differently depending on whether they intend to pour themselves some water or throw it in someone’s face, a stylus that can detect how it’s being held can predict how it will be used. For example, if you held your tablet on one side like a notebook while gripping your stylus like a pen, the tablet of the future might know you want to write or draw. Hold the stylus like a brush while holding the tablet away from you, and it would realize you wanted to paint. Holding your stylus like an X-Acto knife might automatically enter an editing mode, allowing you to “slice out” content you don’t want.
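Hinckley’s paper doesn’t describe a public API, but the idea behind this grip-to-mode switching can be sketched as a simple lookup from sensed grip labels to tool modes. Everything below — the grip names, the `select_mode` function, and the fallback behavior — is hypothetical, for illustration only:

```python
# Illustrative sketch (not Microsoft's implementation): once a grip
# classifier has labeled how the stylus and tablet are being held,
# choosing a tool mode can reduce to a dictionary lookup.

def select_mode(stylus_grip: str, tablet_hold: str) -> str:
    """Map hypothetical grip labels to a tool mode."""
    modes = {
        ("pen", "notebook"): "write",   # stylus held like a pen -> writing/drawing
        ("brush", "at-arms-length"): "paint",  # held like a brush -> painting
        ("knife", "notebook"): "edit",  # held like an X-Acto knife -> slice out content
    }
    # Unrecognized grips fall back to ordinary touch navigation.
    return modes.get((stylus_grip, tablet_hold), "navigate")

print(select_mode("pen", "notebook"))  # write
print(select_mode("baseball", "flat"))  # navigate
```

The real challenge, as the article goes on to explain, is the classification step that produces those labels in the first place, not the dispatch.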

There’s surprisingly little research on how people hold things.

But making all that happen is a bigger challenge than it seems. “There’s been surprisingly little research on how people hold things,” says Hinckley, who has spent the last three and a half years trying to understand the biomechanical language of grip. It’s a complicated problem. First, you need to understand what all 27 bones, 30-plus muscles, and 50-odd nerves in the hand are doing when they pick up a stylus. Then, you need to know what that data means. And then, you need a system intelligent enough to adapt all of it to the body language of an individual user, who might have their own unique quirks.

Livescribe 3 Smartpen

So far, Hinckley and his team at Microsoft Research have identified around 30 different ways people grip things, but he says that most of those are outliers: you’ll never grab your tablet like a baseball, for example. A stylus that could understand the context of just a handful of grip patterns would be enough to revolutionize tablet computing, effortlessly detecting whether someone was trying to draw on a screen, paint on it, or edit a document, just by how they were holding their stylus.


“At the end of the day, we might only need to have styluses that can understand three or four grips,” says Hinckley. “Practically, that might be enough to strip out a huge amount of complexity from our interfaces, as well as give a more natural and intuitive pen computing experience.”

The stylus isn’t dead. It’s just about to get a lot smarter.

Whatever the case, Hinckley thinks it’s absurd to assume that the stylus will ever go away. From the lead pencil to the dry erase marker, from the quill to the Smart Pen, styluses have been an integral part of the way we work for thousands of years. We’re just hardwired to use them.

To read more about Hinckley’s stylus research, check out his most recent paper here.