Apple's Proposed Multi-Touch UI System

A few years ago, I took stock of several Apple patents that opened up new interaction possibilities by rethinking the ways people could provide input through multi-touch, virtual interface controls, new physical controls, sensors, and more. Several of these, including the "multi-touch mouse" (now released as the Magic Mouse), have made their way into shipping Apple products.

Yesterday, in anticipation of Apple's "latest creation," Patently Apple compiled a similar list of Apple patents that may see the light of day soon. Looking through their article and at several additional patents from Apple, I compiled a list of the new interaction design capabilities these patents cover. In aggregate, these interactions began to look like an integrated system for managing applications and content.

The overarching UI model is a set of contextual virtual interface elements with audible and haptic (perhaps) feedback that are accessed and manipulated through multiple input formats. That's a mouthful—let's break it down.

Virtual interface elements

Virtual scroll wheels, slider bars, keyboards, dials, menus, and more are used to edit, manage, and input information on the screen. These controls are mostly shown overlaid or "floating" on top of content and applications. Some controls require specific touch gestures to be used and/or provide audible or tactile feedback when a user interacts with them. For example, a rotation gesture for virtual dials can be used to set volume and may include feedback when the limits of the dial are reached.
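The patent doesn't spell out an implementation, but the dial behavior it describes can be sketched in a few lines: a two-finger rotation gesture drives a bounded value, and the control reports when its limit is hit so feedback can be played. The class name, units, and thresholds below are invented for illustration.

```python
import math

class VirtualDial:
    """Illustrative virtual dial: maps a two-finger rotation gesture to a
    bounded value (e.g. volume) and reports when a limit is reached.
    All names and constants here are hypothetical, not from the patents."""

    def __init__(self, value=50, lo=0, hi=100, degrees_per_unit=3.0):
        self.value, self.lo, self.hi = value, lo, hi
        self.degrees_per_unit = degrees_per_unit

    @staticmethod
    def rotation_angle(p1, p2, q1, q2):
        """Change in angle (degrees) of the line between two touch points,
        comparing their previous positions (p1, p2) to current ones (q1, q2)."""
        before = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
        after = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
        return math.degrees(after - before)

    def rotate(self, degrees):
        """Apply a rotation; return "limit" when the dial pegs at an end,
        so the caller can trigger audible or tactile feedback."""
        target = self.value + degrees / self.degrees_per_unit
        self.value = max(self.lo, min(self.hi, target))
        return "limit" if target != self.value else "ok"
```

A caller would feed successive touch positions into `rotation_angle` and pass the result to `rotate`, playing a click or vibration whenever `"limit"` comes back.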

Included as part of the virtual controls are several forms of virtual keyboards and specifically a two-handed virtual keyboard that uses multipoint touch for input (deliberately called out as different from the iPod/iPhone texting keyboard).

Contextual interface elements

These virtual interface controls can be associated with specific user modes like navigation, scrolling, data entry, display, etc. So a virtual scroll wheel or slider bar may be associated with a scroll mode. A keyboard or keypad may be associated with data entry mode, and so on.

Controls can also be specific to the application a user currently has running. So a floating virtual panel for iTunes could include the controls you'll use most often in the application like volume, playlist access, next song, etc.

Virtual controls can also be position sensitive. For example, selecting a song in iTunes could bring up specific controls for audio files with data associated with that file (e.g., title, artist, genre, etc.), or a page-turning gesture that allows you to move between pages of content could only be available at the bottom of the screen.
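One way to picture the mode-, application-, and position-sensitivity described above is a registry that matches the current context against control patterns, where a blank field acts as a wildcard. The control names and contexts below are illustrative, not taken from the patents.

```python
# Hypothetical registry mapping (mode, app, screen region) patterns to the
# virtual controls that should appear; None in a pattern means "any".
CONTROLS = {
    ("scroll", None, None): ["scroll_wheel", "slider_bar"],
    ("data_entry", None, None): ["keyboard"],
    (None, "iTunes", None): ["volume", "playlists", "next_song"],
    (None, None, "bottom_edge"): ["page_turn"],
}

def controls_for(mode=None, app=None, region=None):
    """Collect every control whose context pattern matches the current state."""
    shown = []
    for (m, a, r), ctrls in CONTROLS.items():
        if ((m is None or m == mode)
                and (a is None or a == app)
                and (r is None or r == region)):
            shown.extend(ctrls)
    return shown
```

With this shape, entering scroll mode surfaces the scroll wheel and slider, while running iTunes near the bottom edge of the screen surfaces both the iTunes panel and the page-turn control.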

Accessed through multiple input formats

These virtual interface controls can be accessed through specific touch gestures or multi-touch inputs. For example, a virtual scroll wheel in iTunes could only appear when two fingers are placed on the touch screen as opposed to one finger. Additional fingers could be placed on the screen to modify or enhance the visible controls, bringing up new interactions or information.

In fact, Apple has outlined a complete hand-based input system with "unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting." The system can individually detect all ten fingers and the separate palms of a person's hands, which allows it to detect resting hands, measure when a hand or finger touches and leaves the surface, interpret a tap from one finger as a mouse button click while disregarding a tap from two fingers, and more.
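The tap-handling rules quoted above reduce to a small classifier over the set of simultaneous contacts. The sketch below uses invented labels ('finger', 'palm') to illustrate the distinctions the patent describes; it is not Apple's algorithm.

```python
def interpret_tap(contacts):
    """Classify a set of simultaneous surface contacts in the spirit of the
    patent's description: a single fingertip tap acts as a mouse click, a
    two-finger tap is disregarded, and palm-only contact counts as resting.
    The contact labels ('finger', 'palm') are illustrative, not from Apple."""
    fingers = [c for c in contacts if c == "finger"]
    palms = [c for c in contacts if c == "palm"]
    if palms and not fingers:
        return "resting"   # hands resting on the surface: take no action
    if len(fingers) == 1:
        return "click"     # one-finger tap -> mouse button click
    return "ignored"       # multi-finger taps are disregarded
```

Note that a lone fingertip still clicks even when a palm is also down, which is the behavior that lets hands rest on the surface while typing or pointing.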

The touch-sensitive areas that can accept this kind of input are not confined to the front or screen of a device. The back of a hardware device can also contain touch-sensitive areas that may be tapped, pressed, or slid to generate inputs.

Different hardware inputs can also bring up specific controls. Technologies that can recognize your thumb or fingerprints can be used as inputs for accessing virtual controls. Specifically, fingerprint patterns can be used to actually identify distinct fingers. This could then be used to display different functions depending on which finger is being used. Similarly, proximity sensors can detect when a hand is near a device and display the appropriate controls.

Haptic tactile feedback (perhaps)

Finally, haptic responses can be used to provide feedback to users when they interact with a series of virtual controls. Haptic display technologies allow a user to "feel" different surfaces as their finger moves across a touchscreen. For example, a display could include a virtual click wheel which vibrates at a different frequency at the center. Users could easily sense the difference and use the click wheel without having to look at it.
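The click-wheel example amounts to a position-to-frequency map: sample the fingertip location, return the vibration frequency for that region. A minimal sketch, with all radii and frequencies invented for illustration:

```python
import math

def haptic_frequency(x, y, center=(0.0, 0.0), wheel_radius=1.0,
                     center_radius=0.25, edge_hz=150.0, center_hz=250.0):
    """Illustrative haptic map for a virtual click wheel: return the vibration
    frequency (Hz) to play under a fingertip at (x, y). The center button
    vibrates at a distinct frequency so it can be found by feel alone.
    All radii and frequencies are invented example values."""
    d = math.hypot(x - center[0], y - center[1])
    if d <= center_radius:
        return center_hz   # distinct feel over the center button
    if d <= wheel_radius:
        return edge_hz     # the surrounding wheel surface
    return 0.0             # off the control: no haptic output
```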

In Summary…

Together, these proposals outline an integrated interaction model of virtual "floating" controls that are specific to the mode or application the system is in. The controls are accessed and manipulated through touch-based gestures, combinations of multi-touch inputs, and/or inputs detected through sensors. Users get haptic, audible, and visual feedback when using these input methods to interact with the system's set of virtual controls.

Needless to say, it will be interesting to see which of these proposals (if any) make their way into Apple's "latest creation" (tablet?) this month!

ABOUT THE AUTHOR(S)

Luke is an internationally recognized product design leader who has designed or contributed to software used by more than 700 million people worldwide. He is currently Chief Design Architect at Yahoo! Inc. where he works on forward-looking integrated customer experiences on the Web, mobile, TV, and beyond. Luke is the author of two popular Web design books: Web Form Design (2008) and Site-Seeing: A Visual Approach to Web Usability (2002). He also publishes Functioning Form, a leading online publication for interaction designers.


Comments

February 22, 2010

Multi-touch is an enhancement to touchscreen technology, which provides the user with the ability to apply multiple finger gestures simultaneously onto the electronic visual display to send complex commands to the device. The term multi-touch is a trademark of Apple Inc. Multi-touch has been implemented in several different ways, depending on the size and type of interface. Both touchtables and touch walls project an image through acrylic or glass, and then backlight the image with LEDs. When a finger or an object touches the surface, causing the light to scatter, the reflection is caught with sensors or cameras that send the data to software that dictates the response to the touch, depending on the type of reflection measured. Touch surfaces can also be made pressure-sensitive by the addition of a pressure-sensitive coating that flexes differently depending on how firmly it is pressed, altering the reflection.

If I were an accessibility champion, I would be worrying hard about touchscreen ubiquity.

I'm not disabled (just a little ham fisted and left-handed) and I already find that the increasing symbiosis between application and ergonomics is causing me problems.

We've only just started to resolve alternative device compatibility on the standard browsers and OSes. I anticipate a rush to serve touchscreen users will leave the physically disabled community further behind than ever before.

Touch screens cannot be manipulated by alternative tools (like a pen or a stick); it has to be a heat-emitting digit, right? And one with quite fine motion at that.

Does anyone have a discussion going around the accessibility of touchscreens? I don't know much about it at all, but I really hope someone is tackling these issues.