SAN FRANCISCO—When most of us think of computer hardware, we think only of the finished product: how the audio sounds, whether the display has good viewing angles, whether the touch screen responds the way you expect when you swipe it with your finger. It's easy to take these mundane things for granted, but if you're a technology company you don't have that luxury: if anything doesn't work exactly the way your customers anticipate, your business could suffer. Getting this stuff right is serious business.

Intel demonstrated as much during a tour of the company's User Experience Lab (UEL), which I took while in the area for the annual Intel Developer Forum (IDF). Located on Intel's campus in Santa Clara, the UEL is filled with techs who use a variety of methods to better understand, in the words of robotics engineer Eddie Raleigh, "where the user is coming from." They do this primarily by focusing on the senses of sight, touch, and hearing.

It's with sight and touch that Raleigh began, explaining up front that the tests they run on qualities ranging from smoothness (how a device's screen updates as you're using it) to form factor (is the device the right size and weight for its intended use?) can sometimes yield surprising results. For example, you would assume that under most (if not all) situations, a higher frame rate on a display would result in a smoother viewing experience. But that's not always true, says Raleigh: An average frame rate number can be deceptive if dropped frames are involved, or if no frames are dropped but the screen is "redrawn in an area that was inconsistent with the user expectations."
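Intel didn't share its measurement tooling, but a toy calculation shows how an average can hide the stutter users actually notice. In this minimal sketch (the 1.5x jank threshold and both frame traces are illustrative assumptions, not Intel's criteria), two traces average the same 60 frames per second, yet one contains six visibly long frames:

```python
# A toy illustration (not Intel's tooling): same average frame rate,
# very different smoothness.

TARGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def summarize(intervals_ms):
    """Return (average fps, dropped-frame count) for a list of frame times."""
    avg_fps = 1000 * len(intervals_ms) / sum(intervals_ms)
    # Assumption for illustration: call any frame over 1.5x the target
    # interval "dropped"; real jank metrics vary.
    dropped = sum(1 for ms in intervals_ms if ms > 1.5 * TARGET_MS)
    return avg_fps, dropped

smooth = [TARGET_MS] * 60                # steady 16.7 ms frames
janky = [15.0] * 54 + [190.0 / 6] * 6    # same total time, six long hitches

for name, trace in (("smooth", smooth), ("janky", janky)):
    fps, dropped = summarize(trace)
    print(f"{name}: {fps:.1f} fps average, {dropped} dropped frames")
```

Both traces report the same 60.0 fps average; only the dropped-frame count tells them apart, which is exactly the kind of deception Raleigh describes.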

Once these tests are completed, a team of "human factors engineers" conducts additional scientific studies (with participants pre-screened based on their experience with devices) to obtain even more specific feedback. That feedback populates a database that helps Raleigh and his colleagues build a perceptual model, one that tells them which parameters matter for a given usage and how those parameters correlate with the experience users report. Intel uses this information to improve both its own devices and those of its OEM partners.
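The article doesn't detail the model itself, but the basic idea (correlate measured parameters with panel ratings so you can predict the experience score for a given usage) can be sketched as a simple regression. Everything below, from the features to the data to the linear form, is an invented illustration, not Intel's method:

```python
# A hedged sketch of the "perceptual model" idea: fit measured device
# parameters to panel ratings, then predict the score for a new device.
import numpy as np

# Each row: [dropped frames per second, touch latency in ms], one device each.
# All values are made up for illustration.
measurements = np.array([
    [0.0,  40.0],
    [2.0,  60.0],
    [5.0, 120.0],
    [1.0,  80.0],
])
# Matching mean ratings (1-10) from pre-screened participants for one usage.
ratings = np.array([9.1, 7.4, 3.2, 6.8])

# Fit a linear model: rating ~ w0 + w1*drops + w2*latency.
X = np.column_stack([np.ones(len(measurements)), measurements])
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)

# Predict the rating a new device would earn for this usage.
new_device = np.array([1.0, 1.5, 70.0])  # [bias, drops/s, latency ms]
print(f"predicted rating: {new_device @ weights:.1f}")
```

A real perceptual model would likely be richer than a linear fit, but the pipeline shape is the same: measurements in, predicted experience out, one model per usage.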

Raleigh then demonstrated one of the methods by which they extract the parameters that help the UEL better understand user experience: a robot. Shaped something like a human arm, complete with wrist and fingers, and enclosed within a safety cage, it's connected to a camera that tracks a series of configurable on-screen tokens to direct the arm's motion. The arm can replicate human movement with considerable precision, which makes it ideal for assessing touch devices, and once Raleigh had powered it on, the arm executed a series of familiar gestures (swipes, pinches, and zooms) on an actual tablet.
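The arm's control stack isn't public, so the RobotArm interface below is a hypothetical stand-in, but it suggests how a gesture like a swipe can be scripted as a timed series of touch points for a rig to replay:

```python
# An illustrative sketch of gesture scripting for a touch-testing robot.
# The RobotArm class is a hypothetical stand-in, not a real controller API.
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float      # screen position in mm
    y: float
    t_ms: float   # time offset from the start of the gesture

def swipe(x0, y0, x1, y1, duration_ms=300, steps=10):
    """Build a straight-line swipe as evenly spaced, timed touch points."""
    return [
        TouchPoint(
            x0 + (x1 - x0) * i / steps,
            y0 + (y1 - y0) * i / steps,
            duration_ms * i / steps,
        )
        for i in range(steps + 1)
    ]

class RobotArm:
    """Stand-in for a real arm controller; simply logs the planned motion."""
    def play(self, path):
        for p in path:
            print(f"t={p.t_ms:5.1f} ms -> finger at ({p.x:.1f}, {p.y:.1f})")

RobotArm().play(swipe(10, 80, 110, 80))  # left-to-right swipe across a tablet
```

Scripting gestures this way is what makes a robot attractive for the job: the same swipe can be replayed thousands of times, at exactly the same speed and pressure, across many devices.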

Matthew Murray got his humble start leading a technology-sensitive life in elementary school, where he struggled to satisfy his ravenous hunger for computers, computer games, and writing book reports in Integer BASIC. He earned his B.A. in Dramatic Writing at Western Washington University, where he also minored in Web design and German. He has been building computers for himself and others for more than 20 years, and he spent several years working in IT and helpdesk capacities before escaping into the far more exciting world...