New UI/UX after Apple’s iPhone/iPad

Sometimes I get the impression that the industry believes the iPhone and iPad represent the pinnacle of human technology. Even though the majority of the market attention is on these form factors, several new UI technologies are already out of the labs. These technologies have the potential to disrupt the traditional smartphone/tablet market and might pave the way for new types of products.

Here are a few examples that point toward a world after candybar multitouch. Exactly how they can be used and integrated in the UI/UX remains to be seen.

Demo of Microsoft Surface with PixelSense from Samsung

I have written about Microsoft Surface before, a large horizontal multitouch screen built as a table. For the new, slimmer version of Surface, Microsoft and Samsung have developed the PixelSense touch sensing technology. In PixelSense, every pixel in the screen is also an infrared sensor that detects warm fingers on the surface. Just imagine what a future development of this technology could do if Samsung manages to fit three RGB color sensors into every pixel: the surface could double as a copying machine. You put a paper, coupon, or picture face down on the surface, and when you lift it up, the copied object is displayed on the screen.
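To get a feel for what "every pixel is a sensor" means in practice, here is a toy sketch of how finger contacts might be extracted from such an infrared sensor grid. This is purely my own illustration, not Microsoft's or Samsung's actual processing pipeline; the function name and threshold are invented for the example.

```python
# Toy illustration: find finger touches on a PixelSense-style IR sensor grid.
# Each grid cell holds an infrared intensity reading; connected clusters of
# bright cells above a threshold are treated as finger contacts.

def find_touches(ir_frame, threshold=200):
    """Return one (row, col) centroid per connected bright region."""
    rows, cols = len(ir_frame), len(ir_frame[0])
    seen = set()
    touches = []
    for r in range(rows):
        for c in range(cols):
            if ir_frame[r][c] >= threshold and (r, c) not in seen:
                # Flood-fill the connected region of bright cells.
                stack, region = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and ir_frame[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                # The centroid of the bright region is the touch point.
                cy = sum(p[0] for p in region) / len(region)
                cx = sum(p[1] for p in region) / len(region)
                touches.append((cy, cx))
    return touches
```

The same blob-detection idea would let the hypothetical RGB-per-pixel version segment a document lying on the table from the background before "copying" it to the screen.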

A technology for high-performance multitouch screens has been developed by the Swedish startup FlatFrog. Their multitouch is based on an optical in-glass solution (Planar Scatter Detection) that can also be used to create multitouch on curved glass surfaces.

Another Swedish startup is Tobii, which has developed a technology for tracking eye movements. Using cameras that track the position of the pupil, it is possible to calculate exactly what the user is focusing on. The company's initial markets have been expensive high-end systems for paralyzed people, market researchers, and academic researchers in cognitive psychology. Tobii has now begun to target the mainstream market together with Lenovo, which is integrating eye tracking into a prototype laptop.
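The step from "position of the pupil" to "what the user is focusing on" is a calibration problem: the system learns a mapping from pupil coordinates in the camera image to points on the screen. Real gaze estimation involves corneal reflections and 3D eye geometry, but a per-axis linear fit gives the flavor. All names here are my own; this is not Tobii's API.

```python
# Toy calibration sketch: fit screen = a * pupil + b per axis from a few
# calibration samples, then map a live pupil position to a gaze point.

def fit_axis(pupil, screen):
    """Least-squares line screen = a * pupil + b for one axis."""
    n = len(pupil)
    mp = sum(pupil) / n
    ms = sum(screen) / n
    a = sum((p - mp) * (s - ms) for p, s in zip(pupil, screen)) / \
        sum((p - mp) ** 2 for p in pupil)
    return a, ms - a * mp

def gaze_point(pupil_xy, cal_x, cal_y):
    """Map a (pupil_x, pupil_y) camera reading to screen coordinates."""
    ax, bx = cal_x
    ay, by = cal_y
    return ax * pupil_xy[0] + bx, ay * pupil_xy[1] + by
```

During calibration the user looks at a few known on-screen targets while the camera records the pupil position; after that, `gaze_point` runs on every camera frame.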

Kinect is a technology that Microsoft developed for its Xbox gaming console. It is an add-on gadget for your gaming console or flatscreen with facial recognition, voice recognition, and the ability to track gestures such as arm and hand movements. With Kinect you can control a game or a PC by talking and waving your arms. It can be used to control an action figure or to move between windows, browse your music collection, zoom in and out of a photo, and so on. Up to six users can be tracked at the same time.
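Under the hood, gesture control of this kind boils down to classifying trajectories of tracked body joints over time. A deliberately simplified sketch (not the actual Kinect SDK; the function and threshold are hypothetical) of recognizing a horizontal swipe from a series of hand positions:

```python
# Toy gesture classifier: given the tracked hand's x position (in meters)
# over a short time window, decide whether the user swiped left or right.

def detect_swipe(hand_xs, min_distance=0.4):
    """Return 'swipe-left', 'swipe-right', or None for a window of x positions."""
    dx = hand_xs[-1] - hand_xs[0]  # net horizontal displacement
    if dx > min_distance:
        return "swipe-right"
    if dx < -min_distance:
        return "swipe-left"
    return None
```

A real recognizer would also check timing, smoothness, and which of the up-to-six tracked users produced the motion, but the core idea is the same.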

Even more futuristic UI/UX modalities are BCI (Brain Computer Interface) technologies, where brain waves directly control a UI or some machinery. BCI has been used in research labs for a long time with electrodes implanted inside the skull. Newer products based on less invasive methods, with the electrodes attached to the scalp, are now hitting the market, often in the form of a headset. The precision and bandwidth of these methods are still very primitive. One of the few things that can be reliably measured with BCI is emotive state, such as relaxation versus concentration.
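Distinguishing relaxation from concentration is commonly done by comparing power in the EEG frequency bands: alpha activity (roughly 8–12 Hz) is associated with a relaxed state, beta (roughly 13–30 Hz) with active concentration. A minimal sketch of that idea, assuming raw EEG samples and using a naive DFT (no particular headset vendor's API):

```python
import math
import cmath

def band_power(samples, fs, lo, hi):
    """Total power in the frequency band [lo, hi] Hz via a naive DFT."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

def mental_state(eeg, fs=128):
    """Crude relaxed/concentrating label from the alpha vs. beta band power."""
    alpha = band_power(eeg, fs, 8, 12)    # alpha waves: relaxation
    beta = band_power(eeg, fs, 13, 30)    # beta waves: active concentration
    return "relaxed" if alpha > beta else "concentrating"
```

This coarse, slowly varying signal is exactly why today's consumer BCI headsets are pitched at mood and focus measurement rather than precise cursor control.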

Most of these innovations are early in their life cycle, and it is still too early to tell if any of them has strong disruptive potential. New technologies drive the development of new form factors; it remains to be seen if and how this will create future killer hardware. There is also a shortage of apps that can take advantage of the new features and turn them into compelling user experiences.

There are several hurdles to overcome. Products such as Kinect, Tobii, and Surface put significant demands on processing capacity, there is a learning curve for any new UI technology, and prices have to come down before the large mainstream market will accept them.

I am slightly skeptical about a technology that requires you to wave your arms. What is fine when gaming in your own living room becomes tiresome when you have to lift and wave your arms for an extended period of time. Users' resistance to large vertical PC touchscreens has already shown this.

It is possible that these new technologies will find their way into the candybar smartphone/tablet. But I think it is more likely that the future smartphone will integrate these new UI technologies without their residing in the handset. If most tables, office desks, and bars were made of hard glass with Surface technology, perhaps the user could just place their smartphone on the glass and have all their apps, contacts, and pictures displayed. The surface might even have built-in eye tracking. Or maybe Corning's vision of a world of glass will come true, and the nearest wall will be able to display your smartphone home screen, with built-in eye tracking for navigating on the wall. Just make sure to control your eyeballs – you never know who might be looking over your shoulder.