Touch in context: what a great feature!

The touch interface is clearly gaining momentum: we’ve got touch-enabled Tablet PCs, UMPCs, PDAs, smartphones and other small handhelds. Palm rejection is a staple in the latest Fujitsu P1610 devices, and true multi-touch devices are on the near horizon with Apple’s iPhone. Interestingly, my MacBook Pro has multi-touch to a degree as well. Most Mac users know that you can use two fingers on the touchpad simultaneously for scrolling, but have you tried moving those two fingers in different directions? Hop over to Google Maps, find your favorite location and try moving your two fingers apart and then together. See what happens; I won’t spoil the surprise. ;)

The Nokia N800 is another touchscreen device and there’s a feature that I’m calling "touch in context" for lack of a better phrase. It’s a simple function, but I find it so useful that I’d love to see it on all touchscreen computing devices going forward.

First, you need to make sure you have this feature enabled. In the Tools menu, you’ll find the Control Panel; there you’ll see an option to configure the Text Input Settings. There are a few tabs here, so scroll to the last one, called Thumb board (it’s nearly hidden, so you might not know it’s there), and select the "Launch via touch screen" option. Trust me, you’ll be glad you did.

With this setting in place, the N800 display will distinguish between a stylus tap and the pressure of a finger when you request the keyboard, hence my "touch in context" phrase. The device is now smart enough to know when you’re using a stylus and when you’re not. Using those smarts, the handheld shows you an on-screen keyboard suited to that context. When you use the stylus to tap on a text input field, the device brings up the smaller virtual keyboard like this, which is perfect for a stylus:

Try tapping the same input field with a chunky finger and the whole display becomes a virtual keyboard that’s a breeze to use with your thumbs:
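To make the idea concrete, here’s a minimal sketch of how that kind of context dispatch could work. This is purely illustrative: the function names and the contact-area threshold are my own assumptions, not the N800’s actual firmware logic.

```python
# Hypothetical "touch in context" dispatch. A fine stylus tip produces a
# small contact patch on the screen; a fingertip produces a much larger
# one, so contact area is a simple proxy for how you're touching.

def keyboard_for_touch(contact_area_mm2, threshold_mm2=25.0):
    """Pick a virtual keyboard layout based on the size of the touch.

    The 25 mm^2 threshold is an assumed value for illustration only.
    """
    if contact_area_mm2 < threshold_mm2:
        return "stylus_keyboard"  # small on-screen keyboard for a stylus tap
    return "thumb_board"          # full-screen keyboard for a finger press

# A stylus tap gets the small keyboard; a finger press gets the thumb board.
print(keyboard_for_touch(3.0))   # stylus_keyboard
print(keyboard_for_touch(80.0))  # thumb_board
```

The design choice here is that the device decides, per touch, rather than making you toggle a mode yourself.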

Luckily, you don’t need the thumb board setting for menus; the device automagically applies the same context awareness to menus and shortcuts.

Here I’ve tapped the main menu with a stylus:

Here’s the same menu when I touch it with my finger; notice the larger areas to press, along with a larger scroll arrow?

Same goes for the web menu with the stylus:

A finger press of the web menu shows fewer items, but makes it easier to navigate:

As I said at the start, touchscreens are becoming more accepted as input devices. As the devices get smart enough to recognize the context of how we’re touching them and then display data within that context, we’ll have a much more enjoyable experience.