Raleigh then demonstrated one of the methods by which the UEL extracts the parameters that help it better understand user experience: a robot. Shaped something like a human arm, complete with wrist and fingers, and enclosed within a safety cage, it is connected to a camera that tracks a series of configurable tokens on a screen, which direct the arm's motion. The arm can replicate human movement with considerable precision, making it ideal for assessing touch devices. Once Raleigh had powered it on, the arm executed a series of familiar gestures (swipes, pinches and zooms) on an actual tablet.