Posted by Soulskill on Saturday November 15, 2008 @01:24PM
from the pants-are-input-devices-now-too dept.

An anonymous reader writes "Researchers at Carnegie Mellon University's HCI Institute have developed a new input technology that allows mobile devices to use surfaces they rest on, like tables, for gestural finger input. This is achieved with some clever acoustic tricks — basically taking advantage of high frequency sound propagation through dense materials. Their video highlights some neat applications, such as controlling an MP3 player by scratching on a wall and muting a cell phone by scratching on a table. Further details are available in the academic paper (PDF)."

Whoa, mods - glad to see it was modded interesting, but it was a joke.

Have any of you heard fingernails scratching a chalkboard? It's one of the most famously unpleasant sounds imaginable, and a source of much humor when compared to your wives' voices. Granted, almost all chalkboards have been replaced with whiteboards and dry-erase markers, but I'm still too young to tell you to get off my lawn!

Nevertheless, it would still be interesting to see what sort of accuracy you could achieve with an increased number of sensors (how many you can add before extra ones stop helping probably depends on the particular situation, but more than three would likely still be useful) and perhaps a more advanced calibration process than the standard touchscreen "touch the dots in the corners of the screen" routine.

Also, there are plenty of surfaces made out of other materials than wood for

I'm sure 3 kHz is slow enough for interferometry to work. If we can do it with visible light, sound is probably not an issue.
Also, most scratchy, hissy sounds are around the 16 kHz mark, and I'm sure there are at least a few overtones in the ultrasonic range (still a wavelength of 5-80 cm in solids).
I'd imagine echoes from the edges of the table would be involved, along with other non-trivial bits, so I'd say a drawing surface is plausible (maybe not for detailed sketches, but for something like a whiteboard).
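To put rough numbers on those wavelength claims, here's a back-of-the-envelope sketch. The 3800 m/s figure for sound in wood is an assumption (real values vary widely with species and grain direction); 343 m/s for air is the usual room-temperature value.

```python
# Back-of-the-envelope wavelength check for sound in a tabletop.
# Assumed speed of sound in wood (~3800 m/s along the grain) is only
# a ballpark figure; it varies a lot with species and direction.
def wavelength_m(freq_hz, speed_m_s=3800.0):
    """Wavelength = propagation speed / frequency."""
    return speed_m_s / freq_hz

# A 16 kHz scratch component travels as roughly quarter-metre waves in
# wood, far longer than in air, which is why timing resolution matters.
for f in (3_000, 16_000, 40_000):
    print(f"{f / 1000:g} kHz -> {wavelength_m(f) * 100:.1f} cm in wood, "
          f"{wavelength_m(f, 343.0) * 100:.1f} cm in air")
```

At 16 kHz that gives about 24 cm in wood versus about 2 cm in air, consistent with the 5-80 cm range quoted above for solids.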

If we come to expect device interfaces to be hidden and embedded in desktops and surrounding walls, we are going to spend half our lives scratching and poking at inanimate things.

I am all for integration of technology, but things like this and hidden tabletop interfaces will just make us look stupid.

Nah, it'll just be like Bluetooth.

Ten years ago, someone walking down the street talking loudly to themselves was assumed to be crazy. Now they're assumed to be talking on the cellphone.

So if this catches on, there will be a brief period where people using these devices look like they have some sort of nervous tic, followed by a long period in which people with certain kinds of nervous tics blend in better, because everyone will just assume they're writing a note to themselves.

... can come with a full keyboard. Just set the thing on the table, and a laser diode outlines your keyboard for you on the table, and you type.
Yeah, you'll get fatigue from too much typing like this, but it'll be tons faster than point and click.

This has potential for a door lock. A cute application would be something that opens the door when the dog wants to go out.
But scratch recognition without location information is going to be very limited in application.
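Even without location, a door lock could key off the rhythm of scratches alone. A minimal sketch of that idea, assuming an onset detector has already turned the mic signal into burst timestamps; the timestamps, secret pattern, and tolerance below are all invented illustration values.

```python
# Toy scratch-pattern lock: match the rhythm of detected sound bursts
# against a stored pattern, ignoring overall tempo. Burst timestamps
# (in seconds) would come from an onset detector on the mic signal.
def normalize_intervals(timestamps):
    """Convert burst times to gaps scaled to sum to 1 (tempo-invariant)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    total = sum(gaps)
    return [g / total for g in gaps] if total else []

def pattern_matches(timestamps, stored, tolerance=0.05):
    """True if each normalized gap is within `tolerance` of the stored one."""
    gaps = normalize_intervals(timestamps)
    if len(gaps) != len(stored):
        return False
    return all(abs(g - s) <= tolerance for g, s in zip(gaps, stored))

# A secret scratch rhythm, then the same rhythm played at half speed.
secret = normalize_intervals([0.0, 0.2, 0.3, 0.5, 0.6, 1.0])
print(pattern_matches([0.0, 0.4, 0.6, 1.0, 1.2, 2.0], secret))  # True
print(pattern_matches([0.0, 0.2, 0.4, 0.6, 0.8, 1.0], secret))  # False
```

Tempo invariance is what would let the dog (or you, in a hurry) scratch the pattern faster or slower and still get in.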

With some positional information, it could be more useful.
The speed of sound in wood is high, so you're going to have to correlate waveforms, not just time discrete events. But that's not hard to do. I have no idea how much accuracy you could get, but it's not an expensive experiment to find out. Try four microphones at the corners of a table and correlate to line up the waveforms. (Three are enough for position, but with four you get redundancy and can eliminate totally bogus position results.) Multi-touch is going to be hard, though.
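The correlation step described above can be sketched like this: slide one channel against the other and take the lag with the best overlap. The short synthetic "scratch" burst and sample counts are made up for illustration, and the multilateration step that turns delays into a position is omitted.

```python
# Recover the time offset between two microphone channels by
# brute-force cross-correlation over a window of candidate lags.
def cross_correlate_delay(ref, sig, max_lag):
    """Return the lag (in samples) that best aligns `sig` with `ref`."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(ref[i] * sig[i + lag]
                    for i in range(len(ref))
                    if 0 <= i + lag < len(sig))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A short synthetic "scratch" arriving 3 samples later at the second mic.
burst = [0.0, 0.2, 1.0, -0.8, 0.5, -0.1, 0.0]
mic_a = burst + [0.0] * 5
mic_b = [0.0] * 3 + burst + [0.0] * 2
print(cross_correlate_delay(mic_a, mic_b, max_lag=5))  # 3
```

With per-pair delays like this from four corner mics, the redundant fourth measurement is what lets you reject the bogus position fixes mentioned above.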

It might be fun to set up a DJ mixing rig this way. No turntables, just a flat surface, maybe with outlines of turntables and faders to guide navigation.

Today we have prior art with a specific publication date. Any huge company which did NOT buy this from the guys who uploaded this video has zero chance in court, no matter how many lawyers they have... Publishing your idea to public media (like YouTube is today) is a rock-solid demonstration of prior art.

It seems hard to find a specific mention of gestures on their site, but I had an interview there and specifically asked whether they could track "drags" and not only "clicks", and they said they were able to follow a finger on the surface.

The article says this research has accomplished "gestural finger input". Although that would be very cool, it is not the case. The only thing this researcher has done is listen for sound and, if there's sound, do stuff. Compare it to the microphone input of the DS: if there's noise, you can do stuff; if there's no noise, do nothing. What the researcher added is a bit of complexity: a short noise changes mode, and a long noise activates the mode. That's nowhere near gestural input.
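For concreteness, the short-noise/long-noise scheme described above might amount to something like this; the mode names, burst durations, and 0.25 s cutoff are all made-up illustration values, not anything from the paper.

```python
# Toy version of the short/long-noise scheme: classify each detected
# burst by duration; a "short" burst cycles the mode and a "long"
# burst activates the current mode.
MODES = ["play", "pause", "next"]

def run_bursts(durations, cutoff=0.25):
    """Return the sequence of activated modes for a stream of bursts."""
    mode, events = 0, []
    for d in durations:
        if d < cutoff:                 # short scratch: cycle the mode
            mode = (mode + 1) % len(MODES)
        else:                          # long scratch: fire current mode
            events.append(MODES[mode])
    return events

print(run_bursts([0.1, 0.5, 0.1, 0.1, 0.6]))  # ['pause', 'play']
```

Note that nothing here knows *where* the finger is, which is exactly the complaint: it's a duration classifier, not a trajectory tracker.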

The sound varies with acceleration, so it can determine the distinct pattern of drawing a circle versus drawing a V, for example. If you watch the video where he controls the song, he uses different gestures for different commands. The only thing it doesn't use is how far you are from the mic.

This is, however, not gestural input. "If there's a lot of noise three times in a row, a triangle has been drawn" is different from "noise moves from (0,0) to (0,1) to (1,1) back to (0,0)". So maybe a set of figures can be distinguished, but once you move beyond that (limited) set, forget about it. Sure, what these guys have done is pretty complex too, but it's not "gestural".

I see so many new applications, improvements, and developments out there, and nine-nines of them never make it to consumer use, or even to specialized uses in specific labs and industries.

I realize (from above) that this isn't new, but it's new to me, and it looks like a very, very interesting interface. Virtual keyboards? Too esoteric, even for a geek like me. Gestural control? No. I've tried those mouse-gesture control thingies, and they're okay, but too intrusive. But if I could merely tap, click or trac