Posted
by
CmdrTaco
on Monday March 29, 2010 @09:41AM
from the control-the-spread-of-cooties dept.

forgot_my_username writes "The MIT Media Lab is developing a motion-sensing screen computer. It looks back at you. It measures light and gestures, and uses those to control the interface. 'Imagine every pixel on your LCD screen emitting light could also be receiving light,' said Ramesh Raskar, an Associate Professor at the Media Lab. They even mention the health benefits of not touching displays."

Sitting infuriatingly still becomes a requirement for your computer to continue doing what it was doing.

Well, it should be possible to ignore extraneous gestures in the same way as you can tell your interface to ignore accidental trackpad input on a laptop.
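One crude way that filtering might work is to require a gesture to persist for a short dwell time before acting on it, just as trackpad palm rejection ignores brief accidental contact. This is a hypothetical sketch, not any real gesture API: the `GestureFilter` name, the `dwell_s` parameter, and the label strings are all assumptions for illustration.

```python
import time

class GestureFilter:
    """Ignore gestures that don't persist long enough to be intentional.

    Hypothetical sketch: a real system would get per-frame gesture labels
    from a recognizer; here we just feed in strings.
    """

    def __init__(self, dwell_s=0.3):
        self.dwell_s = dwell_s
        self._current = None  # (label, timestamp when first seen)

    def update(self, label, now=None):
        """Feed the latest detected gesture label.

        Returns the label once it has been held for dwell_s seconds,
        otherwise None (gesture too brief or absent).
        """
        now = time.monotonic() if now is None else now
        if label is None:
            self._current = None        # hand left the frame: reset
            return None
        if self._current is None or self._current[0] != label:
            self._current = (label, now)  # new candidate gesture starts its clock
            return None
        if now - self._current[1] >= self.dwell_s:
            return label                # held long enough: accept it
        return None
```

A stray wave across the screen never survives the dwell window, so it never triggers anything, while a deliberate held gesture still gets through after a fraction of a second.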

This aside, I would welcome a no-touch screen as an alternative to having to clean greasy pizza smears and other unnecessary paw-prints off my screens, if I were required to use such an interface. Personally, I don't believe touch/non-touch screens o

The bandwidth of 10 fingers is a lot higher than a mouse with just one pointer and a few buttons. You could potentially transmit a lot more instructions in a lot less time using your hands, if only we figured out a proper way to make it work.

You mean like a keyboard? Yeah, they perfected that technology back in the 70s with TRS-80s (#1 computer at the time), Apple IIs, and Atari 400/800s, which replaced previous toggle-switched computers with a 10-finger interface where you could type words directly on a CRT (or TV)! It was a great advancement in personal computers.

The problem with that 10-finger interface was the high learning curve which made people have to memorize all kind

I would not have both "perfected that technology" and "Atari 400/800" in the same sentence when referring to keyboards unless there was some kind of negation involved. That plastic membrane keyboard was, ummm, bad.

Yes, and the hardware needs to be reasonably widespread before the hot-shot UI designers work out how best to use it.

I thought pinch zoom/rotate was inspired when I first saw it, but I think we've only scratched the surface (heh) of what intuitive interfaces can be achieved with multitouch. Add pressure sensitivity and the palette becomes so much richer.

I think it would be great if you could somehow interact directly with the 1 cm of space in front of your screen, giving you 'hover' semantics on

How much more energy does it take to keep something hovering over a surface compared to resting it on the surface? I would imagine that fatigue (of the fingers, hands, forearms, etc.) would be a much bigger problem should non-touch, gesture-based navigation become widespread. Right now it's our wrists; imagine waving your arms in the air for 6 to 8 hours a day.

It's definitely an issue, but not an insurmountable one.

Firstly, it's clearly not practical to use a touch screen in the eye level position my monitor's currently at. My shoulder would be in agony after 10 minutes. You'd want the screen to be in a similar position to where you'd put a notepad or sketchbook when writing/drawing.

Secondly, you don't want to be hovering your entire hand for long periods. If you're writing or drawing, you usually rest the wrist or the heel of your hand on the paper. Hence, a goo

I wish websites would understand this. I can actually input text that is not my name or address. I don't want to click click click. I want to click then type, or better yet, tab then type, for input. For instance, the idea of navigation is so ingrained that typing the word "pass" to get to the password features has been replaced with "search screen for place to click, now click, now click again, once more, now type".

Sure, except many websites don't handle tabs in a sane manner. Some end up jumping to different input fields seemingly at random, some move from an input field to the little "What's this?" link next to that input field, some move to some completely unrelated link, or to the submit button even though you're only halfway through the form, or any number of zany things. If websites were designed properly, keyboard shortcuts like tab would work as intended. Too bad so few websites are designed with anyone but an
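For what it's worth, the sane behaviour being asked for here is actually specified: HTML's tabindex rules say positive values are visited first (ascending, ties broken by document order), then tabindex=0/unset elements in document order, and negative values are skipped entirely. A toy sketch of that ordering, with (name, tabindex) pairs standing in for real DOM nodes — the `tab_order` function name and the pair representation are just illustration, not any browser API:

```python
def tab_order(elements):
    """Return element names in the order Tab should visit them.

    Follows the HTML sequential-focus rules:
      1. positive tabindex values first, ascending, document order for ties
      2. then tabindex=0 (or unset) elements in document order
      3. negative tabindex is excluded from tabbing entirely

    `elements` is a list of (name, tabindex) pairs in document order,
    a stand-in for real DOM nodes.
    """
    positive = [(ti, i, name)
                for i, (name, ti) in enumerate(elements) if ti > 0]
    neutral = [name for name, ti in elements if ti == 0]
    # Sorting (tabindex, document_position, name) tuples gives rule 1.
    return [name for _, _, name in sorted(positive)] + neutral
```

So `tab_order([("help", -1), ("user", 2), ("search", 0), ("pass", 3)])` visits user, then pass, then search, and skips the "What's this?" help link — exactly the behaviour broken forms fail to deliver.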

Yes, I understand that. What I really meant was the focus on visual cues, using icons and links in place of an input text area. I'd rather type "cinema" into a text box than drag down a drop-down menu with 100 different categories. Just one example of where the keyboard is overlooked in favor of the mouse. I'm not a UI expert though, so maybe I'm completely off. They try to water everything down to pictures and clicks, it seems.

Some already do. Almost any ecommerce site, and many 'support/help' sections, will redirect based on specific searches. What you want is to know which ones they are, and for them to support account-function queries more often.

I prefer the keyboard. It's still the most effective input method and the fastest way to manage your computer and smartphone (provided you've learned the hotkeys and commands).

That entirely depends what you're doing. If you're drawing a picture, the keyboard is usually a terrible interface

Even when there are exceptional cases -- for example, I've not found a better way to produce sequence diagrams than the text-driven http://www.websequencediagrams.com/ [websequencediagrams.com] -- you can hypothesise a nicer interface based on touching/clicking and dragging.

Personally, I prefer to use the best tool for the job. I launch programs with the run prompt, navigate websites with the mouse, enter data into forms with the keyboard, manage small numbers of files spatially, and manage large numbers of files using commands.

You could use a hammer for everything, and save yourself the trouble of learning how to use multiple tools efficiently. It's certainly less work in the short term. Plus you provide everyone else with schadenfreude when you try to select a large su

John was pretty thorough about trying to get input from others to make sure his stuff did not read like others too.

When we find stuff, like the main story here, we try to get it posted to the blog right away to give proper credit to the people who think about these things and make them reality. Especially if they were working on it before Jan. 2009.

That comment pushes me to the off topic thought that it is a travesty that more schools don't offer sign language as a foreign language. Learning a second spoken language is basically only useful for talking to people who don't speak the same language as you. On a day to day basis, this is only useful for a small percentage of the population. Sign language on the other hand, would (just like a spoken second language) let you talk to other people who only spoke that language, but it would also be useful f

Boy, the level of discourse on /. has certainly deteriorated in the past couple of years.

Apple was clearly investigating just exactly the tech that TFA is talking about. But when the GP mentions that they have a patent application (I don't think it has been granted yet) on that, the best TWO posters can do is to make childish snipes at "The Cult of Apple".

I mention Apple's superior display technology, and then you accuse me of only being able to snipe at Macs? You're a myopic fanboy toolbag. Go buy some more shiny electronics, and leave me alone. I hear Britney Spears has sold a lot of CDs, does that mean she's the best, like your beloved iTampon?

No more Cheetos-fingers when a friend asks for computer help and you don't insist on using your own mouse/keyboard... *shudders* Though orange cheddar powder is always one of the better "mystery coatings".

I wouldn't mind it being a camera, just for the simple fact that then I could video conference and have it look like I'm actually looking at the person, instead of the screen BELOW (or above) the camera.

I've seen a concept of this once where the screen was what they called a 'surface camera', I think. The idea was that you could use it as a webcam and an input device, but also as a scanner: you just put a piece of paper against your screen and you have an instant copy you can edit. And I can imagine they could also extend this with an infrared pen or something like that to create a touchscreen that can also be used as a high-resolution drawing tablet. Just wait until Wacom builds a screen with tech like this and people will go crazy for it.

And we'll finally have to let go of a classic tech support phone call. No more mocking the secretary with the piece of paper held to the screen.

Oh well, we'll always have cupholders.

Now the idea of extending it to include a light stylus... THAT I like. I don't think that was mentioned in the Apple patent. Make it a diode laser in the stylus and you could get the focus down tight enough to be per-pixel accurate. That sounds very useful.

Tactile feedback does not sell new technology. What's more, waving your hands about in front of the screen is absolutely certain to be less confusing to your average computer user than a keyboard and mouse, which are more or less clearly labeled. We already have trouble with locating the 'any' key. There is no telling what kind of issues this technology might bring.

Tech support: OK, ma'am, slow down, just tell me what happened and we'll get this problem sorted out. Customer: Well, I was reading email when my

Claiming buttons are good does not publish any papers; everyone knows this. If you claim some kind of button substitute is good, you can publish two papers. One making the claim and then another comparing it with buttons in a user study and showing that, actually, buttons are better. If you're really clever, you can then publish a third paper on the methodology for evaluating button substitutes, and a fourth paper on potential problems with future button replacements. Guess which route academics prefer.

Forget this "touchless touchscreen" nonsense. Combine this with the various clever stuff being done in lensless digital imaging, and we will finally achieve the dream... A Telescreen in every house.

Look for it in the next dubiously compatible revision of HDMI: "Secure audience reporting protocol", an HDMI spec extension allowing your TV to report the number and approximate demographics of viewers to your Blu-ray device or cable STB. Pay-per-view programs can now control the number of viewers, V-chip 2 can now detect child-size viewers and automatically halt display of R-rated content (sorry midgets, it's for the good of the children)! Nielsen will be completely obsolete!

Somebody else suggested advertising that watches its watchers. I said kill it with fire.

I should have the same reaction to this idea, and I do, but I'm not even going to say it. Because this is inevitable. This isn't funny at all. It'll happen, as surely as the RIAA will sue somebody. It will be delayed while Apple behaves as Apple usually does and radically overvalues their own patent, to the point that Taiwanese and Korean manufacturers refuse to license it, but it will still happen. If Samsung pays

As for the advertising that watches its watchers, The enemy is already here. [quividi.com] There may be others. At least, for the moment, their technology is only economic for emplaced signage, not just any device with a screen(yet).

As for the second, they are working on it [google.com]. Current techniques, for movie theaters, all seem to be based on using IR cameras to detect the distinctive reflectivity of camcorder CCDs, along with IR or visible light lasers to flood the sensor(totally safe for the audience's vision, they haste

That seems like the obvious way to interact verbally with the computer. At least for parts of applications where you are selecting or entering text.
Touch and gesture has its niches for visual information, where pointing is more succinct than talking or typing about the action.

Have you ever worked in a cubicle next to a sales person? When they "rightsized" our department they relocated us to in-between a call center and a sales floor. It was pure hell: people yammering on the telephone alllll day. Now add to that everyone also yelling shit at their computers, and see how much productivity you gain.

If it won't work for business, it will never catch on.

Now how about voice control for the home? Will it work while I watch a movie? Because I never game without a movie or at lea

I think I would slit my wrists if I had to work in an office where everyone spent all day talking to their computers rather than typing and/or mousing to communicate with them. There's a reason people don't work that way, and it's not the technology (which is pretty much there now). It's that people don't want that much yammering going on all the time.

I tried out the voice activation on my new netbook just this past Saturday, and the first thing I discovered was it won't work when the TV is on. So much for using it at Felber's with all the noisy drunks, except maybe outside in the beer garden.

A loud clatter of gunk music flooded through the Heart of Gold cabin as Zaphod searched the sub-etha radio wavebands for news of himself. The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive — you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same programme.

....

Another voice broke in, presumably Halfrunt. He said: "Vell, Zaphod's jist zis guy you know?" but got no further because an electric pencil flew across the cabin and through the radio's on/off sensitive airspace. Zaphod turned and glared at Trillian — she had thrown the pencil.

Hypothetically, if I knew someone who may have at one point or another visited a site with risque adult material....
well, I see potential interface issues...
right, I am off to register touchlessporntube.com

Get the public used to the idea of computer vision input for everyday tasks, and suddenly adware starts including drivers for the cameras (just in case you forgot to install the drivers yourself). "People seem to keep looking at the upper right hand corner of the ad window. Move most of the content there, and the subliminal content to the bottom. Horizontal mirror image for Mac OS X and Ubuntu 10.04"

I wouldn't mind a solution that delivers the sanitary benefits of getting rid of hand scanners. I'm sorry, but I work in a colo, and there's a hand scanner next to the badge reader (you scan your badge, then your hand, and the first door from security to the datacenter opens up). I carry a tube of hand sanitizer because I have no interest in following 500 people with whatever bacteria/viruses they are carrying since entering the facility, ret

It would be better to invent people who are not total idiots and assholes regarding proper hygiene. Day to day, there are no diseases you can pick up through touch that your immune system cannot handle with ease. Unless, of course, you are a total idiot that never exercises his immune system.

I don't think this will die from security concerns, though it does have tremendous potential to cause them. I think this will die because it's unnecessary, impractical, and redundant: it requires gestures much larger than a mouse or traditional touchscreen would, and it doesn't really provide much in the way of increased functionality for the cost of the experimental technology. Lastly, we can already make webcams small enough to fit right above laptop LCD screens

I work in the avionics industry, and some of our products use touchscreens. I find touchscreens are much slower/less productive in use compared to a keyboard/mouse setup. Also they get very icky very quickly. Sometimes it's even hard to read the screen after a few people have used the system; apparently some people like to rub their hands in dirty sump oil before using touchscreens. Personally, I choose just about any other form of input over touchscreen when possible.

A webcam (or maybe two, for 3D gesture recognition) is the only hardware needed: no special sensing hardware that resolves your fingers' positions only coarsely, and usually not how hard they're pressing. Think of Microsoft Surface, or better yet, Sixth Sense technology. Moving the game mostly into software gives good potential for features, at least if the CPU can keep up.
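The software-only approach described above can start from something as simple as differencing consecutive webcam frames and tracking where the motion is. A toy sketch of that idea — the `motion_centroid` name and the threshold are illustrative only; a real system would use a vision library and track the centroid over time to classify swipes and pinches:

```python
def motion_centroid(prev, curr, thresh=30):
    """Find the center of motion between two grayscale frames.

    `prev` and `curr` are frames of equal size: lists of rows of
    0-255 intensity values. Returns the (row, col) centroid of the
    pixels that changed by more than `thresh`, or None if nothing
    moved. This is the kernel of frame-differencing gesture tracking.
    """
    moved = [(r, c)
             for r, row in enumerate(curr)
             for c, v in enumerate(row)
             if abs(v - prev[r][c]) > thresh]
    if not moved:
        return None
    n = len(moved)
    return (sum(r for r, _ in moved) / n,
            sum(c for _, c in moved) / n)
```

Feed it each pair of consecutive frames and watch the centroid: a left-to-right drift over several frames is a swipe, no centroid means no hand in view. Everything past that (lighting robustness, multiple hands, depth) is exactly the hard part the parent is hand-waving about, but the entry cost really is just a webcam and software.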

I guess I should say I like the concept of a touchscreen. I have a BlackBerry Storm and an Archos 9 because I'd like to be able to do without a keyboard and mouse. The problem is fidelity. Even after a LOT of practice it is impossible to get the speed or accuracy of a plain old KB and mouse/touchpad.

So now they are talking about stabbing at the air with your fingers and it sounds so cool in a Minority Report kind of way. I like to think that they can get this right but how can stabbing at the air ever be better than stabbing at a keyboard?