Underwhelming Technology: Touchscreens

When I bought my most recent laptop and desktop, I went ahead and got versions with touchscreens built in, because, hey, why not, it’s the wave of the future and in any event within a year it would be standard-issue tech, etc. I wasn’t entirely sure if I would use them, but I thought there might be a good reason to use them that I hadn’t thought of. Well, it’s been several months now and I think I can safely say that, generally speaking, touchscreens on laptops and desktops are mostly pointless, at least to me.

Some reasons, in no particular order:

1. I spend most of my time in my desktop environment, on applications that don’t really have much use for the touch interface — or when the touch interface is there, it’s usually markedly inferior to the mouse/trackpad interface.

2. Trackpads in particular have gotten rather better recently (Apple has a lock on this, but even Windows 8 machines show a dramatic improvement), so the additional utility of a touchscreen is lessened relative to the greater utility of the trackpad.

3. Stopping to touch the screen breaks my workflow, which is not a great thing.

4. The times that I have used the touchscreen on my desktop have made me aware that having a screen that big generally defeats the purpose of having a touchscreen as it is most often used today. I tried playing Fruit Ninja on it and quickly learned that there’s just too much real estate to get across (and in a vertical position, no less).

5. I’m old(er) and possibly set in my ways so the touchscreen interface requires me to have to learn something new, which I don’t always want to do.

The last of these, I’ll note, is the least convincing to me, since the Scalzi Compound has several tablets and phones scattered around it and I really have no problem using the touch interface on those; I’m not wandering about angry that I can’t use a mouse on my iPad. A touch interface makes sense on a phone or tablet; it really makes less sense on a laptop or desktop.

The thing that brought that home to me, actually, was the Chromebook I have (and on which I am writing this right now). It doesn’t have a touchscreen, unlike my Dell XPS 12, and I find that this matters almost exactly zero in terms of the relative utility I get out of the two computers on a day-to-day basis. I have no need or interest in touching the Chromebook screen, especially now that the trackpad is light-years ahead of the one on the CR-48 I got a couple of years back. Likewise, except for the relatively few times I use the XPS 12 as a tablet (it has a flippable screen), the touchscreen capability goes almost entirely unused on it.

Bear in mind this is a reflection of a) how I use desktop/laptops and b) the current state of touchscreens in these formats. I fully expect in five years I may have several really excellent reasons to use a touchscreen on the desktop or laptop. But then again, it wouldn’t surprise me if I don’t, either. Maybe at the end of the day it’s just not the right format for touch, and the touchscreen — which will nevertheless become a standard feature, because it’s getting there already — will be just another one of those things you have on your computer but almost never use more than 1% of the time.

I guess what I’m really saying is that if I had to do it all over again, the last time I bought a laptop I probably would have gone for another MacBook Air instead of the XPS 12. The XPS 12 is a fine little computer, to be clear, and I’ve been pretty happy with it overall. But I went with it on the expectation that I would be doing more with the touchscreen than I do. The irony is that by the next time I am in the market for a laptop, it’s almost certain Airs will have touchscreens. Which I probably won’t use much.

Hubby’s computer is a 24″ desktop. We discovered, quite by accident, that its default is a touchscreen. We have three cats, and anytime one of them would walk in front of the monitor, suddenly things went wonky. Hubby finally figured out how to disable the touchscreen functionality. He agrees with you that he’s perfectly fine using the touchscreen interface on his LG Intuition smartphone, but he has no use for it on his desktop. I would tend to agree–the more you touch the desktop screen, the more fingerprints you get on it, the more often you have to clean it. Ugh.

The Jargon File (a dictionary of hacker slang originally dating back to the 1970s and still being updated) has an entry for “gorilla arm”, the current revision of which reads:

The side-effect that destroyed touch-screens as a mainstream input technology despite a promising start in the early 1980s. It seems the designers of all those spiffy touch-menu systems failed to notice that humans aren’t designed to hold their arms in front of their faces making small motions. After more than a very few selections, the arm begins to feel sore, cramped, and oversized — the operator looks like a gorilla while using the touch screen and feels like one afterwards. This is now considered a classic cautionary tale to human-factors designers; “Remember the gorilla arm!” is shorthand for “How is this going to fly in real use?”.

It seems, 25-30 years later, people have forgotten the gorilla arm.

Touchscreens work on tablets and phones because people don’t tend to hold them up vertically when using them, so they tap on mostly horizontal surfaces. Laptops and desktops have vertical screens. And real keyboards.

My 2 1/2 year old, who uses the iPad regularly, is always confused when we get out the computer and it doesn’t have a touch screen. This is, obviously, quite the edge case. I don’t expect her to grow up demanding a touchscreen, for the reasons you mentioned above.

My husband keeps trying to convince me to upgrade my monitor to a touchscreen and I keep refusing, with #1 & #3 being my biggest reasons. My main use for my computer is for photo editing and photobook design, both of which require precision best done with a mouse.

Then add to that three children from ages 3-9 that often use my machine. I don’t want their grimy fingers on the screen or I’ll constantly have to clean it. Yes, they can and should be taught to have clean hands first but the lag in the learning curve would drive me crazy.

Interesting, I just bought a Samsung Galaxy 3 8″ tablet this week to replace my Sony e-reader (don’t like Kindle or Nook), and so am now using touchscreens for a lot more things than my ereader did. My response right now is “meh”. The touch screen is fine when I am using the tablet as an ereader, which is what I bought it for, basically – but it’s a pain in the patoot when I am using it as a tablet. The 8″ screen is tiny and the keyboard is really tiny. I’m a fast touch typist and having to basically hunt and peck to do stuff is SLOW and annoying. On the whole I like the tablet as an ereader. But if I need to do something seriously computery I go for my laptop or desktop. Possibly this will change as I get more accustomed to the Samsung – but I think that if I ever wanted to use this for serious data entry (excuse me while I giggle helplessly), I would buy a keyboard. So, like you, I just don’t see touchscreens being very useful for laptops or desktops.

Really, the only thing I find a laptop with a touchscreen to be good for is if you do a lot of drawing digitally and can’t afford one of those fancy graphic display tablets like a Wacom Cintiq (though there are now a number of non-Wacom graphics displays which are good and won’t set you back $3,000). I have an HP Tablet PC from several years back, and the only thing I use the touchscreen for is digital artwork.

That said, it’s heavy, bulky, and runs very hot. Tablets have come far enough that you can get better pressure sensitivity and some surprisingly good drawing/painting apps. I suspect that as my tablet PC gets older I will be moving onto something similar to the Galaxy Note for drawing.

Transfer of research results from the laboratory to practice has traditionally been a challenge in human-computer interaction. The nice observations you have about gorilla arms were noticed even earlier by Doug Engelbart, in the early 1960s (for light pens, as used in Ivan Sutherland’s remarkable Sketchpad system, before touch screens were feasible). Engelbart described part of his motivation for inventing the mouse in a 1973 conference paper:

… We included measurement of the “transfer time” involved when a user transferred his mode of action from screen selection with one hand to keyboard typing with both hands; surprisingly, this proved to be one of the more important aspects in choosing one device over another. The nature of the working environment diminished the relative attractiveness of a light pen, for instance, because of fatigue factors and the frustrating difficulty in constantly picking up and putting down the pen as the user intermixed display selections with other operations. The mouse is a screen-selection device that we developed in 1964 to fill a gap in the range of devices that we were testing.

Implications for touch displays on desktop/laptop machines should have been obvious. Not that touch isn’t useful in many situations, but as an ad hoc addition, bolted on just because the technology is there, it doesn’t necessarily work well.

I feel the same way about touchscreens for myself. However, for anyone with dysgraphia (writing disability), a laptop with a flip-able screen that can be laid flat to go into tablet mode can be very useful. My dysgraphic mathematician son uses his laptop this way for writing math homework – he doesn’t need as much pressure as with a pen/pencil so his hand doesn’t cramp up and the result is more readable than the very light pencil marks he’d end up with otherwise. Mentioning this in case anyone else might find it helpful as assistive technology for themselves or their kids.

One of the things I’ve found with PC touch screens is that I tend to be too close to them. For me, being an arm’s length away from a 23″ monitor is too close and I develop eye strain. If I back away to a comfortable distance, I can no longer reach the screen and reach for the mouse instead. So what’s the point?

Funny, I remember your argument #3 back when Windows and Mac introduced the mouse to consumers: diehards (such as myself) used keyboard shortcuts rather than removing hands from the keyboard.

I think the primary problem with touchscreens is how software is written for them. If the finger is considered a surrogate mouse pointer, you almost guarantee a poor experience. Touch as input is NOT simply substituting a finger for a mouse; a similar failure of adoption occurred with speech recognition, because it was simply dropped in place as a substitute for the keyboard. Software must be written to a particular interface metaphor; Microsoft’s problems with its tablet products (and I used the Tablet PC ten years ago) are based in its almost religious commitment to the WIMP metaphor. IMHO.

Interestingly, I put down my tablet and opened my Macbook Air to type this. But had I been without the Mac, I would have pulled out the Bluetooth keyboard that lives in the shoulder bag with my tablet. Having said that, in my opinion an onscreen keyboard is still more usable than a pseudo-mouse-pointer.

My finger is huge, my mouse arrow thingy is a point. Why would I want to decrease the “resolution” of my pointing device? Makes no sense. And it sounds very *not* ergonomic. I like touchscreens for some things — Fruit Ninja is fun, slicing with my finger. But on a huge screen? I think that would be a workout. :D And might cause me to damage the screen or fall down or something. I hate trackpads, I do like trackballs. I don’t really like mice (mouses?). I find a computer mouse makes you move your arm around too much, and it’s a pain to switch back to the keyboard, and your arm gets sore. So I suppose what I really prefer is keyboard commands. ;) (Back when I was a girl, we had the DOS prompt, and we liked it! I remember when browsers were invented, so we could stop using Archie and Veronica. *shakes cane*)

For art, I would *love* one of those wacom-tablets that are also screens. But then, your pointing “resolution” is a stylus not a giant blobby finger.

I like my tablet, but I don’t use it for the same things I use a computer for.

“Blaise Pascal” and gorilla arm: Oh my gosh! How could I forget light pens! Oh yes, they were *horrible*. It fixes my finger-blob resolution problem, but it was still awful. So awful I forgot about them for all these years…

I have a touchscreen desktop and kept finding that things had changed on my monitor overnight – different programs were open or closed, etc. I suspected a virus, but nothing was detected. I discovered in a moment of insomnia that there were tiny flying insects that were attracted to the screen when it was the only light on, and when they hit the screen they were acting as “touches” and selecting things… Turned off the touch function (which I never used) and no issues since.
Also, the touchscreen is part of an all-in-one computer, and that puts the fan and its associated noise at ear-level, which is not ideal for me when audio editing sans headphones…

What I find on my Chromebook Pixel (which has a touchscreen) is that every time I use the touchscreen I immediately regret it because I then have to clean the faint finger-oil smears off the very very glossy screen.

And, of course, cleaning the screen without “touching” things is rather a trick.

I generally agree with you–the touchscreen on my Windows 8 laptop is pretty worthless. But the multi-touch gestures that some of the more innovative thinkers in desktop HCI, like Bumptop, are experimenting with are pretty cool.

Hopefully some of these changes will become mainstream in the next few years. Bumptop has been working on their desktop for years.

Yeah, maybe just my generation and the hardware I grew up with (Get orf my lawn!) but I even find myself irritated if I have to switch input device at all, i.e. if I’m using the mouse to navigate around the OS I get miffed (to be polite) if the designers insist that I use the keyboard to trigger functionality (Windows 8) or have to type a Program name to get it to launch rather than use an application menu (Gnome3). If I’m typing into an application I hate having to go onto the mouse to do something. Touch is entirely appropriate for the tablet/smart phone form factor, IMHO, but on a laptop/desktop being expected to hop from keyboard to mouse/trackpad and now to touchscreen … No.

Having said that I plan to get a Win8 laptop with a touchscreen as I occasionally help folks with their computers so it’d be good if I know what I’m talking about if/when they have issues with Win8, also curious to see if my 6 and 9 year old children like it better than Win7 on all the other computers in the house.

Dave Bauer – I always eat my Cheetos with chopsticks when working – doesn’t everyone?
I live in fear of the touchscreen becoming standard, or worse – voice control. Every now and then I want to be able to work outside my house, in a cafe or library, and bellowing commands at my computer would be embarrassing to say the least. Not to mention the fact that I use Windows and therefore the computer would not do what I was shouting at it to do….
I sincerely hope that you will continue to find suitable functionality to allow further production of excellent stories. I finally got hold of “The Android’s Dream” the other day and enjoyed it immensely.

Could an additional reason be that your computer software/hardware ergonomics and logic were designed and optimized for keyboard/mouse/touchpad input, and adding a touchscreen is a really suboptimal option? If the touchscreen sat directly in front of you, not at an arm’s length, and was canted at a 45-degree angle or so, and the software menus/logic were designed for touch input rather than point-and-click GUIs (like an iPad…), then you’d have a valid comparison. Instead, the comments seem to be saying that a chisel makes a poor screwdriver–an unfair comparison when it was designed to be something different.

Back in the dim, dark ages of shifting to a world of ones and zeros the USAF had a series of tests (AIMVAL/ACEVAL) where they compared digital displays in tactical aircraft cockpits to the then-common analog ones. The tests concluded that the analog displays were superior–but the digital displays had never been human engineered/optimized to best exploit their capability; they crudely displayed the same information in roughly the same way as the analog displays. Over the next few years the digital displays improved through engineering and today I can’t imagine any pilot wanting to go back to a collection of steam gauges in the cockpit instead of the current MFDs (with touch screens, by the way). I think the current state of the art in using touchscreens is comparable to the original digital displays–we’re doing the same thing in the same way with a new technology and saying it doesn’t work very well.

I have an Acer w700 which functions as a tablet and as a laptop running Windows 8. I love the shit out of the combination. Use it as an e-reader, some quick browsing, etc in tablet mode with touch screen. Switch to the bluetooth keyboard for writing, chatting, etc.
On a desktop, I would agree that a touch screen is completely useless…

I find that even mice and trackballs can slow you down. While I make extensive use of a trackball, learning just a few keyboard shortcuts for a frequently-used application gives me a significant uptick in productivity.

My husband’s boss gave my husband an old computer a few months ago that had a touchscreen on it (desktop). Dmitry played with it a couple times and then just turned it off. Every time a fly landed on the screen it messed things up; he just didn’t like it at all.

I can’t imagine using a touch screen to type on; DH has an iPad, and in frustration with the on-screen keyboard, bought the stand-alone keyboard to go with it. Most touch typists like the feedback you get from pressing keys, and tapping a glossy screen isn’t the same. OTOH, a touch screen is great for digitizing drawings, but I would still prefer using a stylus to my fat fingertip.

We will be adding touch screens to the ADA public terminals where I work, and I wait with interest to see what the feedback from the public will be.

Katie: I remember entering “win :” at the DOS prompt, so that I could skip the flying Windows flag before getting to the desktop. I also remember playing the original Warcraft. I had to exit Windows, initialize the mouse, then start the game (from the DOS prompt). And if I forgot to initialize the mouse, I had to play with no mouse (or exit and start over). Ahhh, the good old days!

Personally, I think brain/machine interfaces are coming along fast enough that they’ll catch up to touchscreens before smudgy screens become standard. You’ll just look at the icon on the screen and think the command to open the program. Maybe in 20 years, there won’t even be a screen, just signals provided directly to the optic nerve.

Desktops – which in the future will mean mini-ITX or smaller with a separate, probably very thin, flatscreen until all screens in a home will be linked by a wireless server the size of a small kitchen appliance – will move away from touchscreen and toward gesture-control as a secondary interface. Asimov predicted this with robots that did what you wanted at the slightest flick of your wrist. Laptops will continue down the touchscreen road. The most interesting development in human-interface design this decade will be the marriage of HUDs (like Google Glass) with gesture control including air keyboards and icons.

The biggest threat to privacy and user-autonomy is that everything will be fully web-enabled (almost already there), software (“apps” for the kids) will be zero-footprint, and operating systems will devolve to rent-a-browsers with glorified firmware that remains firmly under its vendor’s control unless hacked, which will be a federal crime thanks to the cockeyed abuses by computer-illiterate courts and feudalistic corporations of the DMCA and its spin-offs. Data will be squirreled away in the “cloud”, an appropriately nebulous euphemism for servers that can capriciously lock users out without recourse at the drop of a pin on the user-compliance rap sheet your ethically-bankrupt “customer-oriented” distributors, devoid of a shred of accountability, keep on you and probably sell under the table. Pro-tip: you may find customer-oriented doesn’t mean what you think it means either.

You’ll still be able to download your content to a separate solid-state drive, but good luck finding a machine that won’t brick on you when you try to use it to strip the identifying metadata hidden in tags or encoded in the digital signal itself. Everyone knows hex editors are tools of cyber-terrorists bent on destroying [insert you country here] and kicking your puppy.

Of course as long as you’re a good little serf who doesn’t stand up for your fellow citizens’ liberties and does what the multinationals tell you, you’ll be richly rewarded with colorful trinkets and entertaining baubles. And, statistically, the chances of you being mistakenly classified as a threat to the unholy alliance of bureaucratic corporatocracy and the opaque surveillance of lazy policing is really very small, Mr. Tuttle.

Could you please, some time, talk about how you use Google Docs as opposed to Word or LibreOffice. I write separate chapters and then use a master document which embeds changes as I make them. Docs doesn’t have this capability afaik, and I find that really large documents are slow (over 50k words). Thank you. I would really like to have everything in the cloud so that if I suffer a hard drive crash, I know there is a backup of my latest level from Fort Meade.

There are just some physical realities about using a computer that will never change, no matter how technology changes, mostly because the physical humans in front of the computer only change at evolutionary pace, not iPhone pace. With a mouse, trackpad or trackball, I can navigate all over a 24″ screen while only moving my hand about an inch, and I can move my hand from keyboard to input device in about ten inches. No touchscreen technology can ever match that economy of movement and precision, because a 24″ screen will always be larger than a 3″ trackpad, a finger is always thicker than a cursor, and the screen has to be further away from the keyboard than a mousepad.

Touchscreens are excellent for devices where adding a mouse or trackball would double the size, or is simply impossible; you can’t use a mouse with a smartphone while walking, after all. But when other input devices are available, touchscreens are always the inferior choice. “Gorilla arm” perfectly demonstrates the problems of not taking the human factor into consideration, and it doesn’t even have to be a literal case of that. Imagine a horizontal or angled touchscreen embedded in the desk. You’ll still be waving your arms around something like 300 square inches, whereas a mouse (or ball or pad) gives more precision while only needing about ten.

Touchscreen typing can also be improved by technology that simulates haptic feedback, but even then, you’re giving up part of the screen to work as a keyboard, which stays pointless if you could just use a physical keyboard instead. Sometimes the best tool for a certain job has already been found and there’s just not much room for improvement. Hammers haven’t undergone any revolutionary redesigns for quite some time either.
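The economy-of-movement argument above can be put in rough numbers. The sketch below is a back-of-the-envelope estimate, not a measurement: the arm-travel distance, the pointer gain, and the selection rate are all illustrative assumptions I’ve chosen, not figures from the comment.

```python
# Rough comparison of hand/arm travel for target selection on a desktop.
# All figures below are illustrative assumptions.
ARM_TRAVEL_PER_TARGET_MM = 300   # assumed arm sweep across a 24" vertical screen
MOUSE_GAIN = 8                   # assumed gain: pointer moves ~8x the hand's distance
TARGETS_PER_HOUR = 600           # assumed selection rate for ordinary desktop work

# With a touchscreen the arm covers the full on-screen distance;
# with a mouse the hand only covers that distance divided by the gain.
mouse_travel_per_target_mm = ARM_TRAVEL_PER_TARGET_MM / MOUSE_GAIN

touch_total_m = ARM_TRAVEL_PER_TARGET_MM * TARGETS_PER_HOUR / 1000
mouse_total_m = mouse_travel_per_target_mm * TARGETS_PER_HOUR / 1000

print(f"touch: {touch_total_m:.0f} m of arm travel per hour")
print(f"mouse: {mouse_total_m:.1f} m of hand travel per hour")
```

Even under these generous assumptions, the touchscreen arm moves the gain factor (here 8x) farther than the mouse hand for the same amount of work, which is the gorilla-arm effect expressed as arithmetic.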

It’s fascinating to watch people talk about their experiences with touch screens. I personally have no love for touch screens for general computing, but like them on my phone and eReader. I love the touchpad on my MacBook Air far better for general use more than I like using a mouse, games excepted. Similarly, I prefer keyboard-based commands over pointing devices, including keyboard shortcuts for editing, and VIM for serious text editing.

At the same time, my wife has an iPad, and the interface is very good, but the iPad is properly designed for it. I wouldn’t want to write a novel on the touchpad, but for browsing, reading email, and most of the native apps, the interface is excellent. I’ve also watched a half dozen children learn to use an iPad by just touching and playing with it, including a memorable event with two four year olds teaching a two year old how to use the interface. You cannot watch that and think that touch interfaces aren’t the future for the majority of general interface design for ‘personal’ or short-term, non-time-intensive devices.

The failure of Windows 8 is that the touchscreen isn’t used in the ways it would be most effective: either as an oversized tablet or as a Wacom-style ‘drawing canvas’ that sits at a nice close-in tilt, like a drafting desk.

I don’t see Apple making the same mistake. (Their mistakes seem to be primarily related to iCloud these days.) I would like an iMac/Mac Pro to have the option for an Apple keyboard with a built in wrist rest and a touchpad, mimicking a MacBook Air. That way I wouldn’t need to use the mouse for anything other than gaming.

I find that whenever I’m sitting in front of a laptop I will frequently reach out and touch the screen without even thinking about it, and my laptop is not actually a touchscreen. My desktop, on the other hand, is one of the 27″ iMac monsters, and it would be worse than useless for it to have touchscreen functionality. That said, Apple’s Magic Trackpad has replaced my mouse altogether; if I never touch another one I’ll be perfectly happy.

I can see why a touch screen desktop would not work – the screen is just too far from your arm.

But I have to admit to wanting a touch screen laptop. We have an Asus Transformer tablet that we use as a laptop with the keyboard, and we got so used to using the keyboard, mouse *and* touch screen (it doesn’t replace the mouse, it works alongside it) that when we go back to our actual laptop computers we end up poking the screen for some things which are just easier on a touch screen than with a mouse.

@Stacy Cashmore – I’ve been using an Asus Transformer Pad Infinity for over a year now as my main mobile platform and I love it so much I’ve basically stopped using my older laptop. In fact, I’ve come to dislike laptops. Most of the things that I need a fully capable computer and OS for I vastly prefer to do on my desktop environment in my home office. Everything else I can do on the Transformer or my phone.

I would be very surprised if Apple went into touch screens for OS X devices. They’re definitely bringing aspects of iOS to OS X’s UI, but I think they’re going to leave touch screens to the handheld iOS devices.

There are free-standing video arcade versions of Fruit Ninja that dispense redemption tickets. I played one in a Dave & Busters. In a place like that, the large arm movements are to be expected, but games like that quickly demonstrate why the user interface in the film Minority Report is impractical for long-term use.

Touch screens have been around for a while. They are handy for things like public kiosks. My oscilloscope at work has one, and touch works well in that context. I can’t think of a case where touch screens really caught on in a big way until the Newton and Palm Pilot demonstrated that they worked well for small devices.

The ticket-spitting arcade version of Fruit Ninja is most fun if two people play it at the same time. That mitigates the big-screen trouble a bit.

Light pens actually go back to the very dawn of interactive computer graphics: they were used in the SAGE air-defense system from the late Fifties. I think the gorilla-arm problem was well-known from the start.

Touchscreens have been around forever, and there are obvious reasons that they’re inferior to how we already work.

Even in their niche domain (high-mobility devices), touchscreens actually have substantial drawbacks when compared to a stylus interface — specifically, the fat-fingered lack of precision and smearing your body oils all over what you’re trying to look at — and no real advantages.

They’ve always been the Next Big Thing in computing, in much the same way that “3D” is always the Next Big Thing in cinema. With all that that implies.

I don’t think there’s any reason to believe that, 30 years from now, we won’t still be using the mouse/keyboard interface for all of our serious computing. It’s not like alternative technologies are new. They’re just expensive and impractical, and that’s not likely to change any time soon.

Just a bit of trivia: The earliest publication I’m aware of on touch screens is by E. A. Johnson in 1965, describing an overlay of wires on a CRT.

@jedibear: Even in their niche domain (high-mobility devices,) touchscreens actually have substantial drawbacks when compared to a stylus interface… and no real advantages.

Actually, there are a few advantages in comparison with stylus-based systems: one-handed use; accessibility for people with fine motor or vision impairment; and the practical issue of not misplacing the stylus.

Funny… I totally agree with you. I have my PC hooked up to a 32-inch TV/monitor. Touch is ridiculous where that is concerned. As for my Nook HD and Nokia 521, though… I love touch on those. Touch–I believe–is one of those things dependent on size. It really makes no sense on a large monitor, except maybe to launch a program. However, on a 7-inch tablet, it works great for playing pinball*… this I know from a lot of experience. Touch seems to be great for quick things, like the check out stations we have in the canteen at work… not so much for playing STO.

*Pinball: For the way too young, a contraption that requires physical (muscle use) interaction, timing, skill, and a bit of luck to flip a large metal ball at targets. Love me some video games… but pinball is KINETIC!

I use a MacBook Pro and an iPad. Both have the Retina display. I find that the keyboard and touch pad are so responsive and easy to use on the laptop that I really have no need of a touch-screen interface, especially for the times when I have a lot of writing to do. Like most people who have been typing for a long time (decades), I don’t look at the keyboard because I am looking at the screen. Likewise, the touch pad has moved “out of sight” as I use it.

With my iPad, I am holding it quite closely and what I do to move the text usually does not obscure it. For those applications where I do need to enter text, I find that small amounts are OK but a lot of these applications have Mac OSX counterparts (OmniFocus for instance or Pages) and what I really use the iPad versions for is display, organization, and manipulation of something rather than the heavy duty text processing that are naturally done on the laptop.

Touchscreen is awesome for tablets and a lot of art and non-linear editing programs. Things that were traditionally hands-on (like how airbrushing is all but extinct, replaced by photoshopping) can feel somewhat disconnected through a computer interface. Touchscreen gives back that hands-on experience.

I agree, though. For most text-based program workflows, keyboard and mouse/trackpad are not only better, but easier and faster.

I don’t think it’s an either/or situation. I think it’s a best of both worlds thing.

I got the ASUS Taichi, and it has a normal laptop mode, but when I shut the lid it turns into a Windows 8 tablet. I adore it for the different capabilities. I can easily switch back and forth for what fits my needs. The laptop screen isn’t a touchscreen, which is great. Having a one-year-old makes it hard to keep a laptop open, but using it in touchscreen tablet mode keeps him from pulling it off my lap. I was debating between a new laptop and a tablet, and this saved me from needing two different devices.

You cannot watch that and think that touch interfaces aren’t the future for the majority of general interface design for ‘personal’ or short-term, non-time-intensive devices.

I think the key difference is how input-intensive the application is. Touch screens are great for handheld devices because you don’t need separate input and output hardware; you can use part of your screen as your input device and save space. This is fine for something like YouTube or e-readers where you do a tiny bit of input and then watch the output for a while. But if you’re going to be doing a lot of input (like entering a few paragraphs of text), and/or precision is important, dedicated input hardware that doesn’t compete for screen space starts looking a lot better.

I could easily imagine reading this blog, or other blogs, on an iPad — but if I wanted to comment, I would walk over to the desktop to do it, even if it’s in another room. Because any comment long enough to be worth making in the first place (i.e. not “Me too”) is long enough that trying to enter it on a touchscreen “keyboard” would be a major pain.

When the screen isn’t even handheld but at arm’s length, it’s even less useful as an input device, and there’s even more value to having a keyboard and/or mouse and/or trackpad. (I don’t really like trackpads either, but they may be the least bad option for using WIMP interfaces when there may not be room for a mouse.)

I just wish they had morphic tactile surfaces so that you can get the sensation of keys or buttons on the screen. I haven’t been able to text and drive since the 2000s (although that’s probably a good thing…)

I’ve got RSI, bad enough that I’ve had Dragon installed on my computer since 1999, and nevertheless will use it if I have to. (Dragon has come a *long* way since 1999, for those of you who never had to use it way back when the hardware didn’t have the muscle the software needed.) I can manage the touchscreen on my iPad, as long as I’m not using it for anything like the number of hours a day I spend in front of the Wintel machines at home and work. And that’s a device that’s small and used in a reasonable position. Expecting me to use the desktop monitor as the “mouse” is just… no. I can’t afford the physiotherapy bills that would result.