
I think the reason touchscreen technology never caught on until recently is the same reason motion controllers like the Wiimote never caught on until the Wii: previous iterations of both concepts were crap and didn't offer anything over a mouse, keyboard, and joystick. What it really comes down to is that if a form of control doesn't offer anything over the de facto form, then it's pointless. How many stupid microphones and control devices have been released in the past 30 years that were no better than just pressing a button on a joystick? If it doesn't really understand that I'm saying "Fire!" instead of just blowing into it, then it's pointless. It also has a lot to do with the interface being standard to the system. When it's an add-on, it just doesn't get as much attention.

This is for the same reason that command pipes/stdin/stdout will always be more useful in Unix-like OSes than they will in Windows: they essentially come with the system, and 90% of the programs are set up to use them. Same as why REXX was so much more successful on the Amiga than it ever was on any other OS. If the Wiimote had been an optional item, then the software wouldn't have been there and the Wii would probably have been a flop.

Those are my two cents, but I'm also considering these two factors: large touch screens were costly as hell, while the smaller ones, even if affordable, were targeted at devices without the actual power to sustain a well-designed GUI. Even now the iPhone sometimes stutters during image operations.
It's only now that the cost of touch screens and the relative power of embedded devices allow building a useful GUI.
Tablet PCs, while they had touch screens, also had a keyboard and mouse available, and most of the interaction was via those devices, since those PCs were targeted at the PC market, where programs are somewhat tailored to having a keyboard; so tablets didn't have enough added value to justify the increased cost. Also, those PCs sported a P3 800MHz, not exactly a power monster, even for the age.

A finger is a rather clumsy interface device compared to the pinpoint precision offered by a mouse. And when the OS and all of the software on that platform is designed for a keyboard and a mouse, then change becomes hard.

Multitouch trackpads, on the other hand, simply overlay gestures on top of existing mechanisms. A two-finger tap is a "right click". A two-finger scrolling gesture translates easily into "scroll wheel" input. All events which existing systems and software understand.

A "pinch", however, is a new type of input that has no translation. As such, software has to be reprogrammed to understand that type of event, and then perform the appropriate behavior.
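The distinction above can be sketched in a few lines. All the gesture and event names here are hypothetical, just to illustrate the idea: gestures with a legacy equivalent get rewritten into events existing software already understands, while a pinch has to be delivered as a new event type that only updated software will handle.

```python
# Sketch: overlaying trackpad gestures onto legacy input events.
# Gesture and event names are invented for illustration.

LEGACY_TRANSLATIONS = {
    "two_finger_tap": ("mouse_button", {"button": "right"}),
    "two_finger_drag_vertical": ("scroll_wheel", {"axis": "y"}),
    "two_finger_drag_horizontal": ("scroll_wheel", {"axis": "x"}),
}

def dispatch(gesture, payload):
    """Translate a gesture into a legacy event if possible;
    otherwise emit it as a new-style event that only
    gesture-aware applications will understand."""
    if gesture in LEGACY_TRANSLATIONS:
        event_type, extra = LEGACY_TRANSLATIONS[gesture]
        return ("legacy", event_type, {**payload, **extra})
    # e.g. "pinch" has no mouse/keyboard equivalent
    return ("native_gesture", gesture, payload)

print(dispatch("two_finger_tap", {"x": 10, "y": 20}))
print(dispatch("pinch", {"scale": 1.3}))
```

The point of the sketch is the asymmetry: the first call degrades gracefully for old software, the second is simply invisible to anything that hasn't been updated.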

THAT CHICKEN MAY BE FINGER-LICKING GOOD, BUT KEEP YOUR GREASY MITTS OFF MY SCREEN!

Now you know why.
These lowercase characters are here merely to get around /.'s lame lameness filter, which doesn't understand that I used all caps above to look like yelling, because that is what anybody would be doing with a high-priced bleeding-edge touch screen and an umfriend with greasy KFC. What fools these admin mortals be.

Actually, that's why I'd like to try a "Minority Report" arm-waving system for a while. Add a couple of wrist weights and you could get a nice workout while you work... (grin) Seriously though, I think for a good touch-screen system to work it would almost need to be something on the order of an inclined draftsman's desk.

And I just read somewhere about some new coatings for materials that wouldn't allow oils to stick to them. Maybe they could add them to the iPhone and help you keep your t-shirt clean... (gr

I think you missed his point: in previous iterations, fingers were too inexact on the existing hardware. Also, they couldn't distinguish all that well if two fingers hit the screen, and the originals did seem to have a lot of "aaargh, I didn't mean to do that!" in them. They were also much more expensive to make in the past, and more prone to wear out. So since mice were cheaper to make and easier to pinpoint, they won for the first decades. Or are you suggesting that Picasso should have finger-painted and n

I'm also thinking that some of our manufacturing has gotten *much* better in the last 30 years. The touch interface probably was mushy, and maybe even had visible cross-wires on the screen, as well as flat screens back then being black-and-white LCD as opposed to the full color, fast response available today. It became popular and available when the time was right, nothing more.

It became popular and available when the time was right, nothing more.

That's partly true.

I have a Compaq Concerto [findarticles.com], one of the first touch-screen notebooks. I bought mine in 1994, but they were available for a couple of years before that.

The touch-sensing hardware is good enough, but the CPU (486/25) struggles under the load and the computer feels unresponsive.

The big problem though is software. MS introduced Windows for Pen Computing for this computer, and it sucks badly. It was never really updated either. Unfortunately, that was also when the Windows monopoly started to bite, so there was no other player to pick up the touch computing slack, and the concept withered until now.

My first PDA was a Palm III, and I still wonder to this day why Windows tablets were never as elegant or as good at handwriting recognition as Graffiti on the Palm.

Palms didn't do handwriting recognition, they did custom glyph recognition. You had to adapt to the device, rather than the other way round, which makes recognition a far simpler problem. So Windows was in fact infinitely better at handwriting recognition, because Palms didn't even bother to try.
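The claim that fixed glyphs make recognition a far simpler problem can be illustrated with a toy sketch. A Graffiti-style recognizer only has to match a single stroke against a small template alphabet, rather than segment and interpret free handwriting. The templates and glyph shapes below are invented for illustration; real systems used more robust matching.

```python
# Toy unistroke recognizer in the spirit of Graffiti-style input.
# Templates and glyph shapes are invented for illustration only.

TEMPLATES = {
    "i": "S",    # a single downward stroke
    "L": "SE",   # down, then right
}

def directions(points):
    """Reduce a stroke (list of (x, y) points, y grows downward)
    to a compressed string of 4-way compass directions."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) > abs(dy):
            d = "E" if dx > 0 else "W"
        else:
            d = "S" if dy > 0 else "N"
        if not dirs or dirs[-1] != d:  # collapse repeated directions
            dirs.append(d)
    return "".join(dirs)

def recognize(points):
    """Return the glyph whose template exactly matches the stroke,
    or None -- the user adapts to the alphabet, not vice versa."""
    pattern = directions(points)
    for glyph, template in TEMPLATES.items():
        if pattern == template:
            return glyph
    return None

print(recognize([(0, 0), (0, 5), (0, 10)]))  # a straight down-stroke: "i"
print(recognize([(0, 0), (0, 5), (5, 5)]))   # down then right: "L"
```

Because every glyph is one stroke with a fixed shape, there is no segmentation, no ambiguity between letters, and no per-user training, which is exactly the trade-off the comment describes.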

Touchscreens are visual interfaces; keyboards and keypads are haptic interfaces. For most devices, keys make more sense because they're always in the same place, and the touch feedback makes it possible to use them without looking. I do not want a touchscreen remote control, for example. Touchscreens only make sense for complicated or multi-function devices, and those haven't been portable very long.

For one thing, that's only evidence that your remote is bad for the task, not that remotes are bad in general. For another, you're assuming that changing the interface technology would improve the interface design -- that seems unlikely to me. If your DVD player had a voice interface with the same three options, it would be just as hard to use, and somewhat slower.

The womenfolk in my family get annoyed having to have separate remote controls for each device (TV, satellite box, DVD player, VCR player, cable box). Having a universal remote is one solution to this problem, but there is still the problem of remembering which buttons control the sound volume, change channels, and knowing which channel has which number. A programmable touch screen LCD remote control seems to be a solution to this problem (if it could

I think most people have encountered this phenomenon. I don't really think the remote is the problem though, it is the multiple devices. If you (not you, them) think of everything as a channel instead of a channel on a specific device, you aren't going to do a very good job of remembering how to switch to a given channel (because it then becomes "how do I get to that channel" instead of "how do I get to that channel on that device", which is easy). The solution is a tuner that presents a list of channels tha

The main problem is remembering the locations of all four remote controls. Not too easy when there are newspapers, cats, notepads, and books in the living room area as well. There is also the added complexity of navigating the customized menu of the DVD player itself, particularly those DVDs with multiple menu pages (for complete collections).

Doing something as simple as switching from watching Sky News to watching a DVD will involve:

To do that with my remote involves pushing the "Watch Movie" button, and when the DVD ends, pressing the "Watch TV" button.

It's not a touch screen, just a Logitech Harmony 670.

The thing that this line of remotes does differently is that you don't control devices, you control actions. My housemate doesn't have to remember to switch to "Video-2" to watch cable tv, or "Component-1" to watch a DVD, the remote knows all that and hides it all behind the simple activity options.

I'll second that. At first I thought spending $100 for a remote control was just absurd, but I realized I hadn't bought a new toy in a while, so I went for it, fully prepared to be underwhelmed.

It's incredible how much easier my life has become as a result. Even though I'm intimately aware of the ins and outs of my entertainment system, it's still tedious to have to set the right input on the TV (receiver outputs HDMI, component, and S-video for different functions [S-video is required for the on-screen display of my CD changer and receiver]), then on the receiver [even though my H/K receiver supports input naming, so I don't actually need to know what Video3 is], and then dicking around with Comcast's remote.

Now, after about an hour doing the initial software install, setup, and then some personal customizations, I just press "watch DVD" and it turns on any components that aren't already on (leaving the others alone), sets all the right inputs (and the proper video mode and aspect ratio on my TV). I replaced 6 remotes in my entertainment room.

Then I realized that I could control just about anything with an RF receiver: my iMac, my MacBook Pro, my MythTV box in the spare room, the ceiling fan in my bedroom, and even, with a little modification, a special switch I built to control the lights in my saltwater aquarium (so that the blue glow doesn't distract during a nighttime movie session).

The learning function ensures that anything using Z-Wave or IR can be programmed, with logical operation names that even my parents can use when they visit. Guests don't have to ask how to get to the DVR or how to get the sound to work when watching TV.

With this puff-piece ad coming to a close, I will say that I think the higher-end Harmony remotes are overrated, and that the lower models are generally more than sufficient for almost anyone, including geeks (just make sure to check the remote's limit as to number of devices before purchase).

Ditto with the above post. A touch-screen contextual interface makes a lot of sense, as you can't really use the majority of the buttons on your remote without looking at it anyway. Besides, I'm pretty sure I could design a system such that the most common operations were gesture-based. Take an iPhone, for example. Keep the edge volume control, and implement a menuing system that drilled down into context buttons for each known device. But on any screen, support a two-finger scroll "channel change" and a two

OK,
I never flew F-16s, but I did fly C-5s... lots of buttons, switches, and MFDs... waaaayyyy too many. As much money as the USAF pays for avionics, my Alpine IVA-W205 [alpine-usa.com] has tactile feedback on the touchscreen that is way more advanced than what I worked with in FRED [wikipedia.org]... The feedback system is kinda weird and creepy at times... but its basic idea is innovative. Why is this in a car stereo and not on some cool computing devices or an LCD-based FingerWorks TouchStream keyboard?
Perhaps some Braille-based feed

The one piece of tech that really makes touch possible is flatscreen LCD technology with scratch-proof surfaces and rapid response. That's important. But what's even more important is designing products for touch, not just slapping it on.

Take the iPhone. When you use it, you're not just using your fingers - you're also using the hand holding the item, keeping it in place and even moving it a little to assist in accuracy. Physically it is better suited to touch than laptops, which until recently were thick and heavy. Also, laptops generally have a mandatory keyboard getting in the way. Worse, the keyboard/mouse combo is more convenient for the GUI OS in place. The iPhone, on the other hand, completely reinvented the GUI to support touch. Other new technologies like the touch table are doing much the same thing, albeit in different ways.

linux is "almost mainstream" ?
honestly, as much as I'd like that to be true, you gotta be deluding yourself

He is not deluding himself. He is viewing things from the majority frame of reference (Linux is a server operating system), while you are viewing things from the minority frame of reference (Linux is a server, a personal, and an embedded operating system). If you were to sit in on business school classes today you would find nearly everyone has heard of Linux, and it is synonymous with servers. So

I don't think it was just a matter of patents expiring, it was most likely because the technology was finally ready for it. In the past most touchscreen-equipped systems I've seen seemed to be pretty weak in every area except the touchscreen, these days the machines equipped with touchscreens are powerful enough to actually take advantage of the touchscreen capabilities.

That said, I'm still waiting for a tablet mac with multitouch tech and a built-in wacom tablet (like the Cintiq) so that I can use my hands to drag stuff around on my desktop and the stylus for actually drawing stuff.

While Jeff didn't invent multitouch, he certainly brought it to the attention of a lot of people with a good demo and a few teaser apps (maps) to show what could be done. MS, Apple, and chums have a lot to thank him for as far as raising public awareness of different UI and OS possibilities; using a mouse/QWERTY keyboard should not be a fundamental of interacting with computers.

Touch is junk, and nothing out there that people buy uses it. MULTI-TOUCH caught on. Multi-touch was invented by two professors at the University of Delaware, who founded a company that made the greatest keyboard of all time, the TouchStream LP. Jobs saw the inherent promise of multi-touch and bought the company and all its IP, in the process making everyone sign nondisclosure agreements and burying the company. The price of the greatest keyboard ever made, no longer available due to Jobs's actions, has rocketed to over $1000 on eBay and keeps going up.

A lot of you are reading this and thinking of non-multi-touch products that are getting some sales; however, they use the fundamental tech that makes multi-touch work. Multi-touch was about figuring out the shape and pressure of the fingers being applied, in addition to distinguishing multiple fingers. This eliminates the "palm brush" problem that plagued early touch pads.

It took a long time for people to figure out what happened; in the end, one of the Delaware professors listed his profession as 'Apple engineer' on a public political contribution, and the mystery of the Jobs TouchStream "nuclear option" was solved.

The reason it's caught on "just now" is that it's actually brand-new technology. Hopefully someone someday will undo the damage that Apple has done to the multitouch industry by burying it under NDAs and patents. In the meantime, they have usurped Microsoft for the title of tech company most damaging to progress. Let's see how long they can hold the crown.

They have usurped Microsoft for the title of tech company most damaging to progress.

No, they would have to repeatedly dish out lies, FUD, and cripple other companies financially to the point that they have to sell to MS or someone else, and do so for 20+ years, before they could EVER hope to approach the damage Microsoft has done.

Touch did so catch on. Remember the PalmPilot? All the rage like 10 years ago. Had a touch screen. The difference is -- more than multitouch, because similar things can be done with gestures -- the iPhone is 25 times faster, in color, and internet-capable, and a phone, and a camera, plays videos, and has over 16000 times more storage space. It's as fast as a desktop computer from the PalmPilot era.

All kinds of bank machines and kiosks have had touch screens for years. It's not the touch screens that caught on. It's everything else that caught up -- and got cheap enough for consumer goods.

I remember how exciting the touch screens seemed in the late 80s, but when using them reality set in - you quickly fatigue holding your arm up to touch a screen.

Plus, with any user interface people need a certain confidence in the correspondence between what they do and what happens. When you push a button, you KNOW it got pressed. If you push a joystick left, you KNOW you're going left. That 'payoff' is like a contract between you and the machine that goes favorably. But if pressing the screen where you believe you need to press may or may not do what you want, that contract gets shaky. Especially since there's no click or motion to reinforce what you're doing. This, by the way, is why I think 'free space' VR controllers never caught on... at least until the Wii.

Still, software can create cues to take the place of physicality and have 'grease' to avoid common miscues. Plus, having the screen be horizontal reduces the fatigue.

But in the end, as archaic as the keyboard seems compared to touch and speech, it really is an incredibly expressive and low-energy-requirement device.

I built the first touchscreen system for mainline railroad control in 1977. We knew then that you had to give immediate feedback, by blinking, that the operator had succeeded in activating a change. The reason that it didn't catch on is that the keyboard was something you could pound on in frustration when the trains didn't do what you wanted. Nothing as satisfying as keycaps flying all over the room. We sold a lot of replacement keyboards.

Dispatchers aren't supposed to provide the safety, that's built into the signalling system. I remember a dispatcher throwing his headset down on the floor in disgust saying "they'll stop". Sometimes the communications are bad or misunderstood and a train passes the siding it should be on.

Pretty much the worst thing that could happen is a "cornfield meet". But in signal territory that means both trains entered a block with red signals.

I agree. The iPhone interface is just so amazing. The other day I was in a big box store and we looked at the GPS units they had. The only thought I had about any of them was "these touch screens are so hard to use."

I realized that was because none of them supported multi-touch. To zoom in you had to go press a little software button, and it would zoom in one level. The levels are all arbitrary. Dragging the map was often relatively unresponsive, if you were even allowed to do it. Compared to the small amount of time I've messed with iPhones (I don't own one) it was just annoying. The interface on the iPhone is just so much better for the map.

It's the same thing at my local Borders. They've always had customer terminals around the store to look up books and such, as long as I've lived here. But a few years ago they replaced some with touch screen devices. Now I think they all are.

Before, they just had a mouse and a keyboard. I could find what I wanted fast, and browse easily using the mouse.

Now they are touchscreen devices. Half the time they don't even seem to respond to my finger touch. I've never been able to decide if I'm touching too fast or too slow, too hard or too soft. Sometimes it works, sometimes it doesn't. The keyboard buttons (which are at least 1-1.5" on each side) are hard to hit with any accuracy. Sure, my fingertip is smaller than the button, but I can't seem to press them accurately. Note that this isn't a calibration problem; once I've figured out how far off the individual machine is, it's still hard to hit the right button. And there's no tactile feedback at all, while the auditory and visual feedback, where present, is often 100-200ms late and thus useless.

Basically, it's a pain to use. They took an easy interface everyone knew how to use, dumbed it down, made it far less useful, and spent a bunch of money in the process.

Yet I could type on the little tiny iPhone keyboard pretty well within seconds of trying. Clearly it was well written, with touch screens in mind. Compare that to the Borders system which, from what I can tell, is just a fancy website with the touchscreen operating as a mouse, distilling whatever you do into a standard mouse click. This removes all subtle differences that could be used to help figure out what you're trying to do.

This is with relatively powerful computers (1GHz plus). Imagine how well touch interfaces could have been done 15 years ago with a 25 to 100MHz processor. Think how useful touch interfaces would have been 20+ years ago when most people only had character-based displays and were using DOS.

It's only now that we are getting the necessary precision, processing power, and experience to start making good (multi)touch interfaces.

I realized that was because none of them supported multi-touch. To zoom in you had to go press a little software button, and it would zoom in one level. The levels are all arbitrary. Dragging the map was often relatively unresponsive, if you were even allowed to do it. Compared to the small amount of time I've messed with iPhones (I don't own one) it was just annoying. The interface on the iPhone is just so much better for the map.

This is an important point. Touch screen interfaces are much less abstract than non-touch interfaces. You're actually physically manipulating real little "objects", rather than issuing commands. The problem with this is, the first time you try to drag something, or scroll or zoom, and the interface element you're working with doesn't follow your finger, you're sunk. The whole illusion is shattered, and the UI feels extremely awkward.

This requires a fair bit of graphics processing capability, certainly by the standards of portable devices.

Even the iPhone's hardware isn't quick enough to scroll, e.g., complex web pages like this -- so what Apple did, rather cleverly, is that rather than slowing down scrolling (failing to track the finger) until the device catches up, the device simply keeps on smoothly scrolling, filling spaces it hasn't had a chance to draw yet with a checkerboard pattern, which provides a spatial reference.

There are other little things like this that make the device feel more responsive as well. For instance, if you try to scroll off the top of a web page (or other vertically scrollable view), the phone will let you -- the scroll will keep right on following your finger. Then, once you let go, the view will bounce back.

These kinds of tricks were not particularly obvious. Natural-feeling touch UI requires an entirely new vocabulary of UI behaviors, and that's just starting to emerge now.
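The bounce-back trick above can be sketched with a simple resistance curve: while the finger drags past the content edge, the view moves by a diminishing fraction of the overscroll, and on release it springs back to the boundary. The formula below is an illustrative choice with a made-up `limit` parameter, not Apple's actual implementation.

```python
# Sketch of "rubber-band" overscroll: illustrative math only,
# not the actual iPhone implementation.

def rubber_band(offset, content_height, view_height, limit=100.0):
    """Map a raw scroll offset (how far the finger has dragged)
    to a displayed offset. Within bounds the view tracks the
    finger 1:1; past an edge, movement is compressed toward an
    asymptotic limit, so the view always follows the finger but
    never runs away."""
    max_offset = max(0.0, content_height - view_height)
    if 0.0 <= offset <= max_offset:
        return offset
    # Distance dragged past the nearest edge
    edge = 0.0 if offset < 0.0 else max_offset
    over = abs(offset - edge)
    # Diminishing-returns curve: approaches `limit`, never exceeds it
    damped = limit * (1.0 - limit / (over + limit))
    return edge - damped if offset < 0.0 else edge + damped

print(rubber_band(50, 1000, 400))    # in bounds: tracks the finger, 50
print(rubber_band(-100, 1000, 400))  # overscroll compressed to -50.0
```

On release, an animation would simply interpolate the displayed offset back to the nearest edge, which is the "bounce" the comment describes; the key property is that the view never stops following the finger, so the direct-manipulation illusion survives.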

I remember how exciting the touch screens seemed in the late 80s, but when using them reality set in - you quickly fatigue holding your arm up to touch a screen.

The term in the programming community for that was, IIRC, 'gorilla arm', from the way your arm felt when you tried to move it after a while. I suspect, though, that a significant part of this was due to technology -- a display was a large CRT, that had a depth roughly equivalent to the width of the screen, and there was a mindset that the display was something that sat out in front of you for you to look at. Aside from a number of 'table' video games, it took LCD displays to make displays something that y

To this day, I don't fully trust touchscreens, to the point where I will not even consider an iPhone or any other phone without actual physical buttons. To this day I still find myself using ATMs where I have to repeatedly jab at spots on a screen that either will not respond to touch or that are slightly misaligned.

The nice thing about a "normal" phone is that it's possible and even easy to hold it and press its buttons with just one hand. So if you're carrying something you don't have to put it down - or if you're dangling from the end of a rope, you can call the emergency services to come and rescue you.

If you're dangling from said rope with only an iPhone in your hand, you're pretty much screwed - unless you have learned the trick of operating its touch-screen with your nose.

I had the lightpen controller for the Atari 800/800XL. It was really neat just being able to point the pen at a point on the screen, press the silver contact, and have something happen - fun applications were a scientific calculator, virtual musical keyboard, and games like chess, reversi, join-the-dots, tic-tac-toe. It was much easier to play than using a joystick, especially for isometric views.

Touch isn't very useful when you have room for a mouse and/or keyboard. Big and bulky desktops don't have much use for touch (except when used in place of a mouse, but that has been going on for a long time).

The reason touch has become so popular lately is that it has only been recently that powerful chips have become small enough, and power sources (batteries) light enough, that we can find use for this stuff right in our pockets--where a mouse/keyboard just isn't practical. (Unless you believe in thumb keyboards, but those are very cumbersome IMO.)

There are some appalling grotty screens around work - and they're not touch screens! Some people feel the urge to not just point at the screen, but tap it with their finger for emphasis. Plastic LCD screens aren't as abrasion-resistant as the CRT monitors that replaced them, so when they do clean the thing with whatever dust-laden rag was handy, they often leave a permanent scuff mark.

The greasy 70's and sweaty 80's rendered touch screens intolerable after any sort of use. Now, people are much less greasy and sweaty.
I swear no one had AC in the 80's.

If you are on to something here then touch is doomed. The green movement wants to reduce the usage of air conditioning. Furthermore, last night I saw a green public service announcement advising less bathing and shaving. It advised not washing your hair and having women "put their hair up" in some sort of stylish fashion to hide this, a

I personally don't think it is as easy to use. You don't get the feedback you do when you use an input device like a keyboard. You can't "feel" where the keys are. You need to stare at the screen to use the technology. On simple and small devices like phones it makes more sense, because you have to cram so much into so little. For everyday workstations it does not make sense. Look at the touch screen / motion-sensitive input media-wall type devices. Would you really want to stand around waving your hand

You have to wave your arms around - which is very tiring (much more so than a couple of finger movements for a mouse). That means you can't keep it up for more than a couple of minutes. If you don't believe me, just try holding your arm outstretched for any length of time.

Second, it takes up an enormous amount of space. Your fingers don't have the dots-per-inch resolution of a mouse, so the interface area has to be bigger and therefore more expensive.

On a purely practical point, you also cover up the object you're addressing. Unless you have transparent fingers, you can't see all the detail of whatever's underneath. A basic and unresolvable design flaw.

Finally, there's the goo factor. Imagine all the smears, stains and gunge that will accumulate on the touch surface - both from your hands and everyone else who uses it. Apart from the obvious hygiene issues, the surface will get dirty. We know how annoying the occasional fingerprint is on a screen - now think what it'll be like when the screen is covered in grease and other smudges.

In summary, it never caught on. The only people who advocate it are those who've watched Minority Report a few too many times. It's not cool, it's not futuristic and hopefully is doomed to the junkheap of techno-history along with punch-cards and robo-vacuum cleaners.

In summary, it never caught on.

Millions of iPhones/iTouches sold beg to differ about it never catching on. Not to mention that the same technology will be making its way to consumer laptops and business conference rooms. People like this technology. Yes, you will have smears on it, but as with every technology, that will get better with every revision. The fact that you are so against a technology is a character flaw. Be open to it, try it, and decide then. All the examples you gave were nothing but hy

A tablet PC is basically a laptop with a touchscreen. Yes, you tend to use a stylus rather than your finger, but the technology and interface are basically the same. Tablet PCs have caught on in certain markets, and I think that as the software/interfaces improve, their use will grow more widespread.

millions of iPhones/iTouches sold begs to differ about it never catching on.

One product (or, arguably, two) doesn't mean it has "caught on".

In any case, I've only seen a few iPhones/iTouches since they were introduced - it might be common in the US, but I wouldn't say that is necessarily true anywhere else in particular. Not that 'catching on' only in the US is anything to sneeze at, especially from a $ standpoint, but still. From what I've read, some people like it and some people hate it.

They have (indeed, I own one, and love it), but there aren't exactly any DS games I can think of which make any good use of the touch screen. The only game I've played that has a decent concept is SimCity DS, and their implementation is flawed. The haters are right on this one, the DS touch screen is a gimmick... or if it isn't, I have yet to see any evidence to the contrary.

Zelda: Phantom Hourglass is played entirely with the stylus. I had my doubts too before I purchased it, but it really is amazing. Give yourself some time to get used to it, after that it seems extremely intuitive.

Also, be aware that most third party titles usually aren't up to par with Nintendo in terms of quality (not only gameplay, but interface as well). For the Wii, I'd recommend Metroid Prime 3 in "expert mode" (or whatever it's called).

You use the kiosk at the bank, grocery store, or public library, and I don't hear any complaints about smudges or hygiene there. In those scenarios you don't want a full keyboard lying around that someone can vandalize (i.e., jam the keys or rip the thing out) or that can get rained on.

On a purely practical point, you also cover up the object you're addressing. Unless you have transparent fingers, you can't see all the detail of whatever's underneath. A basic and unresolvable design flaw.

That's interesting. I used to draw blueprints for hours at a time. I fail to see why I couldn't do the same thing on a computer screen instead of a table. Sure, the screen can't be 90 degrees to your desktop anymore, but then, if you have a full-size multitouch display... why would you need a desktop anymore? You don't have to wave your hands around in thin air to make this interface hands down (erm, ok, pun intended) the best thing since sliced bread. You can interact with all the items in your comput

Honestly, you have probably already done so many times. Assuming, of course, that you are not suffering from some compulsive disorder and constantly popping out the hand sanitizer after touching every doorknob, countertop, chair... The MythBusters did an episode; apparently contamination from the bathroom spreads surprisingly far in a building just through the air, let alone direct contact.

Touch didn't "just catch on." It's been around forever, has been evolving steadily, and is being used in more and more places. You're postulating that because the iPhone uses touch and Bill Gates did a demo that now, May 2008, it has "arrived"? Touch isn't just now "catching on," it's simply becoming more and more common as technology improves. The regular iPod has had a touch-sensitive wheel ever since the 2nd generation. Laptops have had trackpads for ages. PDAs have had touch-sensitive screens since, well, as long as they've been around. I've seen touchscreen kiosks and ordering screens (Arby's used to have them). The only thing I can say is that as touch technology improves in the same way that all technology improves--becoming cheaper and smaller, in addition to better--it's being offered in more devices where small and cheap matters--i.e., portables.

I had a touchscreen 17" CRT at home almost ten years ago, and while it was really neat--there's something really satisfying about actually pressing a link with your finger to 'click' on it--it was a pain (literally) to use for any extended amount of time. Touch works best when your arms can be at rest, which means your hands won't move much, which means a small device. Now, who wants to poke on a tiny screen on their desk, when they could instead use a mouse and keyboard to manipulate objects on a 20" screen? No one. So, where does that leave us? Where is touch useful? Ding ding ding! In tiny devices that are already in your hand. Or, to put it another way, it's not so much that touch is just now "catching on," it's that we're finally finding things that it's really good for. Like I said, a touchscreen is not a good replacement for a regular old mouse.

Multitouch is a nice new addition to touch technology, but you know what? I hardly ever use it on my iPhone. I rarely zoom in or out. I click and drag a lot, and double-tap to zoom in and out, but this is nothing that couldn't have been done on a mid-90s Palm.

Mod parent up. Touch-screens have been around for many years, in fast food, industrial control, kiosks, and similar casual-use push-big-buttons applications. Touch screens are a huge pain for any session long enough that you want to sit down. So they're useful for palm-sized devices.

But for text editing, or graphical input? No way. It's too blunt a tool.


Multitouch works really nicely on laptops; scrolling by using two fingers together rather than one just feels so natural. In fact, it reminds me of how the scrollwheel felt like a big advance in practical HCI before it; "You mean I don't have to search for the scrollbar or the paging keys in order to move up and down? Lovely!"

For me, the probable next step forward is when we get better haptics integrated with touch. For example, though the iPhone is decidedly neat, it's tricky (by comparison with a normal

You could ask the same question about CD technology, plasma TVs, the computer. A lot of the technology we have today was developed 20/30/40 years ago; the problem is that it wasn't profitable or easy to manufacture, develop, and sell these technologies.

This applies to the touch screen table tops and such. We've had this technology for a while, but it's only now that the price to produce and sell the product is within reach of both huge corporations and smaller companies.

For the past thirty years most fast food stores were using the standard hierarchical register: green display, a keypad that added an item, a list running top to bottom, and it was very difficult to go back to the top of the list to modify a mistake. Now you go to McDonald's and they have touch screen displays: they show an image of the food (Big Mac), you press the items the customer wants or doesn't want (lettuce, ketchup, mustard), and there's the order; if you need to correct a mistake you can easily tap an item and fix it.

Could they have had this type of register earlier? Yeah, but it wasn't cost effective until about 2000, when I believe they started slowly replacing the older registers.

The point is, the technology is there; it's just a matter of making it cheap enough for companies and people to develop it and buy it.

The day I have a table the size of my kitchen table that can support six people playing an RTS, all through touch screens (none of that voice crap I've seen on YouTube), while we're yelling commands and tactics at each other against six other people in another room, will be the day I crap my pants.

I think the reason you haven't seen touch catch on before now is because of how horrendous it is. Tactile feedback isn't just a side effect of current interface methods, it's an important aspect to input. Even ignoring problems with touch that may be solved as the technology matures (dirt/grease, unintentional gestures, dirtying up a display that doubles as the input device, losing finger position), touch simply doesn't feel like interaction.

As far as actual devices go, having sold both touch and classical variants on appliances, I can say that the more often someone uses a touch interface, the less inclined they are to continue using it. When someone's favorite model transitions to a 'touch' type interface, they can't return it for what they had been using fast enough. It's the hot new thing that nobody likes to use, but everyone thinks is real pretty.

For touch, and multi-touch, to be successful requires a large amount of UI bandwidth for feedback and interaction; it needs to be nearly seamless to work well.

Until recently, the hardware did the work of making interactions feel good. A keyboard or a mouse does a lot of complicated things to feel right to the user, yet outputs a simple, well-defined input to the computer system.

Today all of that complexity, and even more, is being placed into the UI software at the expense of other activities, which until relatively recently was mostly CPU bound.

The last piece was the elegant realization of the idea that fires up everyone else; in this case, the iPhone.

But just like the advancements in keyboards, mice, trackpads, and game controllers, we have only seen the beginning.

My hope is that this will also catch on with the tablet form factor, where somebody will wake up and realize the best place for the menu on a tablet is probably not the upper right-hand corner, where a righty will obscure the screen. And that it probably deserves to live on the right-hand side for most items, and even look a lot more like the Office ribbon than the standard menu bar.

This is cool though; we are on the cusp of the next wave of UI, the one that comes after the current mouse-oriented menu and panel methods. It will be cool!

Duh, PCs and screens were large and bulky in the '70s and '80s. Are you going to carry a 10 kg PC around to use its touch screen? A keyboard and mouse are still a way better interface for most things.

Because it was (re?)introduced to the masses with the DS in a way that showed them how it could be useful and entertaining. Before, all the average user could do was buy a really expensive and hard to find computer monitor with limited applications that were geared toward the device itself (as opposed to just letting you control mouse cursor clicks or menu selections). I'm sure a lot of people wanted the benefits of the device before now, but they were concerned with smearing or scratching their screens,

In addition to a lot of the reasons mentioned above, it's only recently that it's become reasonably durable and reliable. I remember the early touch screens, and the damn things couldn't hold up in public. Something that's always on the fritz, no matter how cool or easy to use, is inferior to something that's functional and reliable.
In some regards, the technology probably arrived too early, and enough people got burned on the early generations that it hurt the development.

My PDA phone has a slide-out QWERTY keyboard and a touch-screen with a rather nice finger keyboard. But the touch screen still blows. I can type a message on the phone with the keyboard and not take my eyes off the road, but not so on a touch screen. There's no tactile feedback!

All the ingredients have been around for a while, but it's only now that they have come together in a big enough way to be noticed: battery power, compute power, UI design, display quality, and applications that people will actually buy.

I have an iPod Touch and love it as a portable media/information gadget. But I also despise the way Apple is handling the SDK rollout. I'm not going to invest a dime in development for it if I have no guarantee that I will ever be able to run what I write on a real device.

I developed for touchscreen technology back in the early 80s (prototype calculator/point-of-sale application). Touchscreen technology just plain sucked back then. It required frequent recalibration, and its resolution was piss-poor.

Let the conspiracy theorists postulate about patents all they want, the fact of the matter is, it wasn't ready back then (and neither were the platforms to use it - who'd want a touchscreen on their 4.77 MHz 8088 PC?).
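For what it's worth, the recalibration routine those old panels needed is simple math: you tap three known screen points and solve for an affine transform from raw controller readings to pixel coordinates. A minimal sketch in Python (function names are mine, not from any real driver):

```python
# Three-point touch calibration sketch (illustrative, not any real driver's API).
# We solve for an affine map  sx = a*rx + b*ry + c,  sy = d*rx + e*ry + f
# from three calibration taps: raw readings paired with known screen positions.

def calibrate(raw_pts, screen_pts):
    """raw_pts, screen_pts: three (x, y) pairs each. Returns (a, b, c, d, e, f)."""
    (x0, y0), (x1, y1), (x2, y2) = raw_pts
    (u0, v0), (u1, v1), (u2, v2) = screen_pts
    # 2x2 determinant of the system after subtracting the third point
    det = (x0 - x2) * (y1 - y2) - (x1 - x2) * (y0 - y2)
    a = ((u0 - u2) * (y1 - y2) - (u1 - u2) * (y0 - y2)) / det
    b = ((u1 - u2) * (x0 - x2) - (u0 - u2) * (x1 - x2)) / det
    c = u0 - a * x0 - b * y0
    d = ((v0 - v2) * (y1 - y2) - (v1 - v2) * (y0 - y2)) / det
    e = ((v1 - v2) * (x0 - x2) - (v0 - v2) * (x1 - x2)) / det
    f = v0 - d * x0 - e * y0
    return a, b, c, d, e, f

def to_screen(coeffs, raw):
    """Apply the affine transform to one raw (x, y) reading."""
    a, b, c, d, e, f = coeffs
    rx, ry = raw
    return a * rx + b * ry + c, d * rx + e * ry + f
```

Three points are the minimum that corrects for offset, scale, and rotation/skew at once; the frequent-recalibration problem came from the raw readings themselves being noisy and drifting, which no affine fit can fix.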

It depends on what you mean by "caught on". Touch screens have been used in kiosk systems and ATMs for decades. They've been in PDAs for many years. The only place they haven't caught on is in the PC and I see no indication that this is changing. "Surface" is not being sold.

Touch didn't catch on for personal monitors because it is inferior as an input device to a mouse. It works for kiosks or ATMs because people don't use them for long periods and are in a better posture for touch screens and because they are obviously much sturdier than mice. They've been used for PDAs for decades, so the "iPod Touch" is hardly an instance of "catching on". The original Palm had a touch screen as did the Newton. (Though ones designed for a stylus.)

I recall seeing touch screens as the "next big thing" all over the place at the '82 Knoxville World's Fair, full 12x10 screens and all, yet my first touch device was a Palm in 2002.

Now, the one place they made it HUGE was in restaurants, where hardly a place lacks one (or a half dozen) now. Context is key: a place to enter orders without looking for a keyboard, a way to manage table occupancy, integration with the credit card system to avoid an extra piece of hardware that could break. The new systems had it all, and have had it for over ten years now.

I worked at Carroll Touch [elotouch.com] for a while on touch screen drivers for their IR and guided-wave products. Before that, in 1990-1992, I worked at Laser Plot on adding touch screens to their Ship Navigation Systems.

The problem that many applications ran into is that people have fat fingers. A mouse is much more precise than a finger. Many people who looked at touch technology just treated it like a mouse, which makes for a bad interface. When people get exposed to a mouse/keyboard interface converted into touch, they're repulsed by touch and never look back.

If you design an interface from the start to be touch based, you can get a very nice interface.
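The parent's fat-finger point can be made concrete: a touch-first hit test expands every target to a minimum finger-sized area and, when expanded targets overlap the touch point, picks the one whose center is closest, instead of the pixel-exact test a mouse UI uses. A rough sketch (names and sizes are illustrative, not from any particular toolkit):

```python
# Touch-first hit testing sketch (illustrative names, no real toolkit assumed).
# Every target rectangle is grown to at least MIN_TARGET pixels per axis,
# and overlapping candidates are disambiguated by distance to center.

MIN_TARGET = 44  # common rule of thumb for a roughly finger-sized target

def hit_test(targets, tx, ty):
    """targets: list of (name, x, y, w, h) rects. Returns best name or None."""
    best, best_d2 = None, None
    for name, x, y, w, h in targets:
        # grow the rect symmetrically to at least MIN_TARGET on each axis
        grow_x = max(0, MIN_TARGET - w) / 2
        grow_y = max(0, MIN_TARGET - h) / 2
        if (x - grow_x) <= tx <= (x + w + grow_x) and \
           (y - grow_y) <= ty <= (y + h + grow_y):
            cx, cy = x + w / 2, y + h / 2
            d2 = (tx - cx) ** 2 + (ty - cy) ** 2
            if best is None or d2 < best_d2:
                best, best_d2 = name, d2
    return best
```

The 44-pixel figure is just a convention for finger-sized targets; the important design point is that the visual size and the touchable size of a control no longer have to be the same thing.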

Touch screens are nothing new; I was working with this technology back in the 70's. What's changed is the processor power and UI technology that has made touch screens compellingly useful.

There are some applications where they provide the most functional user interface; Apple uses them to great advantage on their iPhone and iPod Touch. It allows rich user interaction on a pocket sized device; no room there for a keyboard or fancy set of buttons. They're not so useful on something like a laptop; there's a keyboard that's much more useful - and the software to make any kind of use of a laptop touch screen is yet to be developed.

Something tells me that history will repeat itself again. Someone will create a workable touch screen interface for general purpose computers, then a major software company will "borrow" the idea and popularize it. The innovators won't get a dime - or any recognition - but the technology will finally break through to the general public.

That's exactly the impression I used to have of touchscreens as well: all the public terminals that seemed to react so slowly, and if it wasn't the actual touchscreen, then the rest of the system seemed horribly slow instead, taking several seconds to do things that should happen instantly.

And then there are the UI issues: every touchscreen-equipped public terminal I saw prior to 2005 had a UI that looked like it had been designed by someone who just didn't understand the technology they had.