Yet to date it's been difficult or impossible for most robotic and prosthetic hands to accurately sense the vibrations and shear forces that occur, for example, when a finger is sliding along a tabletop or when an object begins to fall.

Now, engineers from the University of Washington and UCLA have developed a flexible sensor "skin" that can be stretched over any part of a robot's body or prosthetic to accurately convey information about the shear forces and vibrations that are critical to successfully grasping and manipulating objects.

"Robotic and prosthetic hands are really based on visual cues right now -- such as, 'Can I see my hand wrapped around this object?' or 'Is it touching this wire?' But that's obviously incomplete information," said senior author Jonathan Posner, a UW professor of mechanical engineering and of chemical engineering.

"The fact that our latest skin prototype incorporates all three modalities creates many new possibilities for machine learning-based approaches for advancing robot capabilities," said co-author and robotics collaborator Veronica Santos, a UCLA associate professor of mechanical and aerospace engineering.
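To make Santos's point concrete, here is a toy sketch of how readings from the three modalities might feed a learning-based grasp monitor. Everything below is hypothetical: the feature values and labels are invented, and the team's actual methods are not described here. The sketch simply shows a nearest-centroid classifier over a [normal force, shear force, vibration] feature vector:

```python
import numpy as np

# Invented training examples: [normal force, shear force, vibration energy]
# per grasp snapshot, labelled 0 = secure grasp, 1 = object slipping.
features = np.array([
    [2.0, 0.2, 0.01],   # secure
    [1.8, 0.3, 0.02],   # secure
    [1.0, 1.1, 0.40],   # slipping
    [0.9, 1.3, 0.55],   # slipping
])
labels = np.array([0, 0, 1, 1])

# Nearest-centroid classifier: one mean feature vector per class.
centroids = np.array([features[labels == c].mean(axis=0) for c in (0, 1)])

def classify(sample):
    """Return 0 (secure) or 1 (slipping) for a three-modality reading."""
    dists = np.linalg.norm(centroids - np.asarray(sample), axis=1)
    return int(dists.argmin())

print(classify([1.9, 0.25, 0.015]), classify([0.95, 1.2, 0.5]))  # 0 1
```

A real system would use far richer models, but the point stands: with all three modalities in one feature vector, grasp state becomes something a classifier can learn.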

The silicone rubber skin is embedded with tiny serpentine channels -- roughly half the width of a human hair -- filled with electrically conductive liquid metal that won't crack or fatigue when the skin is stretched, as solid wires would do.
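Such a channel behaves like a stretchable resistor: its resistance follows R = ρL/A, so lengthening the channel while thinning its cross-section raises the resistance. A minimal sketch of that geometric model, using illustrative numbers (an eGaIn-like liquid-metal resistivity and channel dimensions assumed for this example, not the device's actual specifications):

```python
def channel_resistance(resistivity, length, area):
    """Resistance of a conductive channel: R = rho * L / A."""
    return resistivity * length / area

RHO = 29.4e-8          # approx. resistivity of eGaIn liquid metal, ohm*m
LENGTH = 0.05          # assumed 5 cm serpentine channel, metres
AREA = (40e-6) ** 2    # assumed 40 um x 40 um cross-section, m^2

r0 = channel_resistance(RHO, LENGTH, AREA)

# Stretch by 10%: the liquid metal's volume is conserved, so the
# cross-section shrinks by the same factor the length grows.
stretch = 1.10
r_stretched = channel_resistance(RHO, LENGTH * stretch, AREA / stretch)

print(round(r_stretched / r0, 2))  # 1.21 -- resistance rises by stretch^2
```

Because the liquid metal flows with the rubber instead of fracturing, this resistance change is repeatable, which is what makes it usable as a signal.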

The research team from the UW College of Engineering and the UCLA Henry Samueli School of Engineering and Applied Science has demonstrated that the physically robust and chemically resistant sensor skin has a high level of precision and sensitivity for light touch applications -- opening a door, interacting with a phone, shaking hands, picking up packages, handling objects, among others.

The advance could eventually yield robots capable of disabling roadside bombs with the same dexterity as humans, thanks to artificial skin that approaches the sensitivity of a human hand.


By measuring changes in electrical resistance, the research team was able to correlate those changes with the shear forces and vibrations the robot finger is feeling. "Traditionally, tactile sensor designs have focused on sensing individual modalities: normal forces, shear forces or vibration exclusively," said co-author Veronica Santos.
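In practice, that correlation amounts to a calibration: apply known loads, record the fractional resistance change, fit a model, then invert it during operation. A minimal sketch with invented calibration data (the linear model and every number here are assumptions for illustration, not the team's actual calibration):

```python
import numpy as np

# Hypothetical calibration data: applied shear force (N) versus the
# fractional resistance change (dR/R0) measured in one channel.
forces = np.array([0.0, 0.5, 1.0, 1.5, 2.0])             # newtons
delta_r = np.array([0.000, 0.012, 0.025, 0.036, 0.049])  # dR/R0

# Least-squares fit through the origin: dR/R0 ~ k * F.
k = float(forces @ delta_r) / float(forces @ forces)

def shear_force(dr_over_r0):
    """Invert the calibration to estimate shear force from dR/R0."""
    return dr_over_r0 / k

print(round(shear_force(0.030), 2))  # 1.23 N for a 3% resistance change
```

Vibration sensing works on the same trace: instead of the slow resistance trend, it is the rapid fluctuations that carry the signal.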

In designing the skin, the engineers followed "cues of human biology" to create sensors suited to human tasks, such as opening a door, playing jazz instruments, interacting with a phone and shaking hands.

Flexible Sensor “Skin” Gives Robots a Sense of Touch


The skin—made from silicone rubber—can accurately transmit information about shear forces and vibration to a robot or prosthetic device, allowing it in a sense to “feel” objects so it can properly grasp and manipulate them, researchers said.

“Most robotic hands and prosthetics don’t have any tactile sensors that can measure the forces,” Posner told Design News. “There are some research-level sensors available, but they are typically whole finger sensors that replace the existing robotic or prosthetic finger. Our technology is a skin that wraps around existing fingers or surfaces.” Current sensors, for example, cannot tell a robot or device when a finger is sliding along a tabletop or when an object begins to fall, cues that are critical when disabling a bomb or using a surgical instrument.

“Bringing the sense of touch to robotic and prosthetic hands can greatly improve the manipulation of objects for a wide range of applications, as simple as holding an egg or pressing buttons on electronics and as complex as defusing an incendiary device in a military application,” Posner said.

When the skin is stretched or pressed, the channel geometry changes, and so does the amount of electricity that can flow through the channels. The team demonstrated that the skin, when used on a robot, can perform tasks such as opening a door, interacting with a phone and shaking hands with sensitivity and precision similar to those of human hands.
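Sensing when an object begins to fall comes down to catching the high-frequency vibration that a slipping surface induces in such a resistance trace. A sketch of one simple approach, with an assumed sample rate, cutoff and threshold (none of these reflect the actual device):

```python
import numpy as np

def slip_detected(signal, fs, cutoff_hz=50.0, threshold=1e-4):
    """Flag incipient slip from high-frequency energy in a resistance trace.

    signal: fractional resistance samples (dR/R0); fs: sample rate in Hz.
    The cutoff and threshold are hypothetical, for illustration only.
    """
    spectrum = np.fft.rfft(signal - signal.mean())
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    hf_energy = np.sum(np.abs(spectrum[freqs > cutoff_hz]) ** 2) / len(signal)
    return hf_energy > threshold

fs = 1000.0
t = np.arange(0, 0.5, 1.0 / fs)

steady = 0.02 * np.ones_like(t)                          # firm grasp: flat trace
slipping = steady + 0.005 * np.sin(2 * np.pi * 120 * t)  # slip: 120 Hz buzz

print(slip_detected(steady, fs), slip_detected(slipping, fs))  # False True
```

A flat trace passes quietly, while the buzzing trace trips the detector, which is exactly the cue a controller would use to tighten its grip before the object falls.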