Fast forward thirteen years to a recent article entitled Exoskin: A Programmable Hybrid Shape-Changing Material, by Evan Ackerman, posted on IEEE Spectrum on June 3, 2016. This is about an all-new and entirely different development, quite separate from quantum dots, but nonetheless a current variation on the concept that matter can be programmed for new applications. While we always think of programming as involving systems and software, this new story takes this long-established process and literally stretches it in some entirely new directions.

I highly recommend reading this most interesting report in its entirety and viewing the two short video demos embedded within it. I will summarize and annotate it, and then pose several questions of my own on this, well, matter. I also think it fits in well with these 10 Subway Fold posts on other recent developments in material science including, among others, such way cool stuff as Q-Carbon, self-healing concrete and metamaterials.

Matter of Fact

The science of programmable matter is still in its formative stages. The Tangible Media Group at MIT Media Lab is currently working on this challenge as one of its scores of imaginative projects. Basheer Tome, a student pursuing his Master’s Degree in this group, is developing a type of programmable material he calls “Exoskin”, which he describes as “membrane-backed rigid material”. It is composed of “tessellated triangles of firm silicone mounted on top of a stack of flexible silicone bladders”. By inflating these bladders in specific ways, Exoskin can change its shape in reaction to the user’s touch. This activity can, in turn, be used to relay information and “change functionality”.
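As a software analogy only, the touch-to-inflation loop described above might be sketched as follows. Every class, method, and pressure value here is invented for illustration and does not come from the MIT Tangible Media Group’s actual work:

```python
# Hypothetical sketch of a touch-driven actuation loop for a
# bladder-backed surface. All names and values are invented for
# illustration; they are not from the MIT Tangible Media Group.

class Bladder:
    """One flexible silicone bladder beneath a rigid triangle."""
    def __init__(self):
        self.pressure = 0.0  # arbitrary units; 0.0 = fully deflated

    def inflate(self, amount):
        self.pressure = min(1.0, self.pressure + amount)

    def deflate(self, amount):
        self.pressure = max(0.0, self.pressure - amount)

class Exoskin:
    """A row of tessellated rigid elements, each on its own bladder."""
    def __init__(self, n_elements):
        self.bladders = [Bladder() for _ in range(n_elements)]

    def on_touch(self, element_index):
        # Raise the touched element and relax the others slightly,
        # so the surface shape itself relays feedback to the user.
        self.bladders[element_index].inflate(0.5)
        for i, b in enumerate(self.bladders):
            if i != element_index:
                b.deflate(0.1)

    def shape(self):
        return [round(b.pressure, 2) for b in self.bladders]

skin = Exoskin(4)
skin.on_touch(2)
print(skin.shape())  # → [0.0, 0.0, 0.5, 0.0]
```

The point of the sketch is simply that the “programming” here targets physical state (per-bladder pressure) rather than pixels on a screen.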

Although this might sound a bit abstract, the two accompanying videos make the Exoskin’s operations quite clear. For example, it can be applied to a steering wheel which, through “tactile feedback”, can inform the driver about direction-finding using GPS navigation and other relevant driving data. This is intended to lower driver distractions and “simplify previously complex multitasking” behind the wheel.

The Exoskin, by its very nature, makes use of haptics (using touch as a form of interface). One of the advantages of this approach is that it enables “fast reflexive motor responses to stimuli”. Moreover, the Exoskin involves inputs that “are both highly tactilely perceptible and visually interpretable”.

Fabrication Issues

A gap still exists between the current prototype and a commercially viable product in the future in terms of the user’s degree of “granular control” over the Exoskin. The number of “bladders” underneath the rigid top materials will play a key role in this. Under existing fabrication methods, multiple bladders in certain configurations are “not practical” at this time.

However, this restriction might be changing. It may soon be possible to produce a bladder for each “individual Exoskin element” rather than a single bladder for all of them. (Again, the videos present this.) This would involve a system of “reversible electrolysis” that alternately separates water into hydrogen and oxygen and then recombines them back into water. Other options to solve this fabrication issue are also under consideration.

Mr. Tome hopes this line of research disrupts the distinction between what is “rigid and soft” as well as “animate and inanimate” to inspire Human-Computer Interaction researchers at MIT to create “more interfaces using physical materials”.

My Questions

In what other fields might this technology find viable applications? What about medicine, architecture, education and online gaming just to begin?

Might Exoskin present new opportunities to enhance users’ experience with current and future releases of virtual reality and augmented reality systems? (These 15 Subway Fold posts cover a sampling of trends and developments in VR and AR.)

How might such an Exoskin-embedded steering wheel possibly improve drivers’ and riders’ experiences with Uber and other ride-sharing services?