Fast forward thirteen years to a recent article entitled Exoskin: A Programmable Hybrid Shape-Changing Material, by Evan Ackerman, posted on IEEE Spectrum on June 3, 2016. This is about an all-new and entirely different development, quite separate from quantum dots, but nonetheless a current variation on the concept that matter can be programmed for new applications. While we always think of programming as involving systems and software, this new story literally stretches this long-established process in entirely new directions.

I highly recommend reading this most interesting report in its entirety and viewing the two short video demos embedded within it. I will summarize and annotate it, and then pose several questions of my own on this, well, matter. I also think it fits in well with these 10 Subway Fold posts on other recent developments in material science including, among others, such way cool stuff as Q-Carbon, self-healing concrete and metamaterials.

Matter of Fact

The science of programmable matter is still in its formative stages. The Tangible Media Group at MIT Media Lab is currently working on this challenge as one of its scores of imaginative projects. A student pursuing his Master’s Degree in this group is Basheer Tome. Among his current research projects, he is working on a type of programmable material he calls “Exoskin”, which he describes as “membrane-backed rigid material”. It is composed of “tessellated triangles of firm silicone mounted on top of a stack of flexible silicone bladders”. By inflating these bladders in specific ways, Exoskin can change its shape in reaction to the user’s touch. This activity can, in turn, be used to relay information and “change functionality”.
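To make the idea more concrete, the basic control concept described above can be sketched in a few lines of code. This is purely an illustration of my own; the article does not publish any of Exoskin's actual control software, and the class, method names, and values below are all hypothetical:

```python
# Illustrative sketch only (not Exoskin's real control code): an array of
# silicone bladders whose inflation levels raise or lower the rigid surface
# triangles mounted above them, modeled as plain data.
class ExoskinSketch:
    def __init__(self, n_bladders):
        # One inflation level per bladder: 0.0 = flat, 1.0 = fully raised.
        self.levels = [0.0] * n_bladders

    def inflate(self, index, level):
        """Set one bladder's inflation, clamped to the valid range."""
        self.levels[index] = max(0.0, min(1.0, level))

    def surface_profile(self):
        """Return the current height pattern the user would feel."""
        return list(self.levels)

# Example: raise one edge of a 4-bladder strip, as a steering-wheel
# turn cue might under GPS navigation.
wheel = ExoskinSketch(4)
wheel.inflate(0, 1.0)
wheel.inflate(1, 0.5)
print(wheel.surface_profile())  # [1.0, 0.5, 0.0, 0.0]
```

The point of the sketch is simply that "programming" here means setting physical inflation states rather than pixels, which is what makes the tactile feedback described below possible.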

Although this might sound a bit abstract, the two accompanying videos make the Exoskin’s operations quite clear. For example, it can be applied to a steering wheel which, through “tactile feedback”, can inform the driver about direction-finding using GPS navigation and other relevant driving data. This is intended to lower driver distractions and “simplify previously complex multitasking” behind the wheel.

The Exoskin, in part, by its very nature makes use of haptics (using touch as a form of interface). One of the advantages of this approach is that it enables “fast reflexive motor responses to stimuli”. Moreover, the Exoskin actually involves inputs that “are both highly tactily perceptible and visually interpretable”.

Fabrication Issues

A gap still exists between the current prototype and a commercially viable product in the future in terms of the user’s degree of “granular control” over the Exoskin. The number of “bladders” underneath the rigid top materials will play a key role in this. Under existing fabrication methods, multiple bladders in certain configurations are “not practical” at this time.

However, this restriction might be changing. Soon it may be possible to produce bladders for each “individual Exoskin element” rather than a single bladder for all of them. (Again, the videos present this.) This would involve a system of “reversible electrolysis” that alternately separates water into hydrogen and oxygen and then recombines them back into water. Other options to solve this fabrication issue are also under consideration.

Mr. Tome hopes this line of research disrupts the distinction between what is “rigid and soft” as well as “animate and inanimate” to inspire Human-Computer Interaction researchers at MIT to create “more interfaces using physical materials”.

My Questions

In what other fields might this technology find viable applications? What about medicine, architecture, education and online gaming just to begin?

Might Exoskin present new opportunities to enhance users’ experience with current and future releases of virtual reality and augmented reality systems? (These 15 Subway Fold posts cover a sampling of trends and developments in VR and AR.)

How might such an Exoskin-embedded steering wheel possibly improve drivers’ and riders’ experiences with Uber and other ride-sharing services?

Among the recent advancements in the replication of various human senses, particularly for prosthetics and robotics, scientists have just made another interesting achievement in creating, of all things, artificial fingerprints. They can actually sense certain real-world stimuli. This development could have some potentially very productive (and conductive) applications.

This latest digital and all-digit story was reported in a fascinating article posted on Sciencemag.org on October 30, 2015 entitled New Artificial Fingerprints Feel Texture, Hear Sound by Hanae Armitage. I will summarize and annotate it, and then add some of my own non-artificial questions.

Design and Materials

An electronic material has been created at the University of Illinois, Urbana-Champaign, that, while still under development in the lab, “mimics the swirling design” of fingerprints. It can detect pressure, temperature and sound. The researchers who devised this believe it could be helpful in artificial limbs and perhaps even in enhancing our own organic senses.

Dr. John Rogers, a member of the development team, finds that this new material is an addition to the “sensor types that can be integrated with the skin”.

In the team’s work, Dr. Ko and his colleagues began with “a thin, flexible material” textured with features much like human fingerprints. Next, they used this to create a “microstructured ferroelectric skin”. This contains small embedded structures called “microdomes” (as shown in an illustration accompanying the AAAS.org article) that enable the following sensory perceptions of the e-skin*:

Pressure: When outside pressure moves two layers of this material together, it generates a small electric current that is monitored through embedded electrodes. In effect, the greater the pressure, the greater the current.

Temperature: The e-skin relaxes in warmer temperatures and stiffens in colder temperatures, likewise generating changes in the electrical current and thus enabling it to sense temperature changes.

Sound: While not originally expected, the e-skin was also found to be sensitive to sound. This emerged in testing by Dr. Ko and his team. They electronically measured the vibrations from pronouncing the letters in the word “skin” right near the e-skin. The results showed that the vibrations affected the microdomes and, in turn, caused the electric current to register changes.
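The common thread in all three perceptions above is that a physical stimulus modulates a single measurable quantity: the current flowing through the embedded electrodes. A minimal toy model can illustrate that qualitative behavior. To be clear, this is my own back-of-the-envelope sketch, not the team's actual response curve; every constant in it is hypothetical:

```python
# Toy model (illustration only): qualitative current response of a
# pressure- and temperature-sensitive e-skin. All gains are hypothetical,
# chosen only to reproduce the two qualitative behaviors reported:
# more pressure -> more current, and warmth -> a measurable current shift.
def eskin_current(pressure_kpa, temp_c, baseline_temp_c=25.0):
    k_pressure = 0.8   # hypothetical gain: microamps per kPa of pressure
    k_temp = 0.02      # hypothetical sensitivity per degree C of warming
    # Warming relaxes the material; model that as a linear scaling factor.
    temp_factor = 1.0 + k_temp * (temp_c - baseline_temp_c)
    return k_pressure * pressure_kpa * temp_factor

# Pressing harder at room temperature raises the current...
light_touch = eskin_current(10, 25)
firm_touch = eskin_current(20, 25)
# ...and the same touch at a warmer temperature reads differently.
warm_touch = eskin_current(10, 35)
```

Sound sensing fits the same picture: acoustic vibrations are just rapid, tiny pressure fluctuations, so in this model they would show up as fast oscillations in the same current signal.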

Dr. Ko said his next challenge is how to transmit all of these sensations to the human brain. This has been done elsewhere in e-skins using optogenetics (the use of light to control neurons that have been genetically modified), but he plans to research other technologies for this. Specifically, given the increasing scientific interest in and development of skin-mounted sensors (such as those described in the October 18, 2015 Subway Fold post linked above), this will involve a smart group of “ideas and materials” to engineer them.

My Questions

Might e-skins have applications in virtual reality and augmented reality systems for medicine, engineering, manufacturing, design, robotics, architecture, and gaming? (These 11 Subway Fold posts cover a range of new developments and applications of these technologies.)

What other fields and marketplaces might also benefit from integrating e-skin technology? What entrepreneurial opportunities might emerge here?

For an absolutely magnificent literary exploration of the human senses, I recommend A Natural History of the Senses by Diane Ackerman (Vintage, 1991) in the highest possible terms. It is a gem in both its sparkling prose and engaging subject.

* See this Wikipedia page for detailed information and online resources about the field known as haptic technology.