Sensor-based apps offer lots of potential but are hindered by fragmentation

Sensors are ubiquitous in mobile devices, and their ability to detect motion, touch, light and environmental conditions should inspire a world of innovative applications. But there is a serious gap between the capabilities sensors can enable in today's devices and the actual use of these components in mobile applications.

Sony's Xperia sola features "floating touch" navigation.

In general, the lack of standard tools and the fragmentation in how sensors are accessed and configured across devices make working with these features difficult for developers. "I bleed for all the application developers that are trying to develop a cool UI," Carlos Puig, principal engineer at Qualcomm (NASDAQ:QCOM), said at an event on developing sensor-based apps last week.

The event, sponsored by the Wireless Communications Alliance and held at Qualcomm's offices in Silicon Valley, drew hundreds of engineers and developers looking to incorporate these components into mobile applications. They got an earful.

Sensors in smartphones are increasing

Today, roughly 30 percent to 40 percent of smartphones include a combination of the most popular sensors: an accelerometer, compass, gyroscope, optical sensor, touch sensor and camera with better than 3-megapixel resolution. These are used in addition to the routine sensors, such as the radio, microphone and others that provide basic functions. By 2015, more than 50 percent of smartphones will have these popular components, according to Tristan Joo, co-chair of the mobile special interest group and a board member for the WCA.

Despite the availability of these sensors in devices today, applications hardly ever use them. Of the hundreds of thousands of apps available in app stores today, fewer than 0.5 percent employ sensors, Joo said. That's right: less than 0.5 percent. And the most common sensor used is the compass.

Qualcomm's Puig said sensors are rarely incorporated into applications because it is difficult for developers to navigate the dozens of sensor vendors, sensor product lines and application development tools. In addition, when OEMs install sensors in devices, they make their own decisions about how the sensors should respond to variables that can influence an app's design, such as the quality of ambient light, or the proximity values that trigger a phone to prepare for a voice call when it is held close to the face.
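The OEM divergence Puig describes can be sketched in a few lines of Python. The OEM names and threshold values below are entirely hypothetical, invented only to show why an app cannot assume one proximity cutoff across devices:

```python
# Illustrative sketch (hypothetical values): each OEM picks its own
# near-distance threshold for blanking the screen during a call, so
# identical proximity readings behave differently across devices.

NEAR_THRESHOLD_CM = {"oem_a": 5.0, "oem_b": 8.0}  # made-up per-OEM cutoffs

def screen_should_blank(oem, distance_cm):
    """True when the phone is close enough to the face to blank the screen."""
    return distance_cm <= NEAR_THRESHOLD_CM[oem]

# The same 6 cm reading produces different behavior on different devices:
print(screen_should_blank("oem_a", 6.0))  # False on OEM A
print(screen_should_blank("oem_b", 6.0))  # True on OEM B
```

An app that hard-codes one threshold works on one device and misfires on the next, which is exactly the configuration fragmentation being lamented here.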

Another problem is that the sensor APIs available with high-level operating systems are hard to understand: the documentation is limited, and their behavior depends on each manufacturer's implementation.
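The kind of per-vendor adapter code this fragmentation forces on developers can be illustrated with a minimal Python sketch. The vendor names, units and axis conventions here are hypothetical, not drawn from any real sensor product line:

```python
# Hypothetical illustration: two accelerometer vendors report samples in
# different units and axis conventions, so an app needs a per-vendor
# adapter just to obtain one common (x, y, z) reading in m/s^2.

G = 9.80665  # standard gravity, m/s^2

def from_vendor_a(raw):
    """Vendor A (hypothetical): reports milli-g per axis."""
    return tuple(v / 1000.0 * G for v in raw)

def from_vendor_b(raw):
    """Vendor B (hypothetical): reports m/s^2 with an inverted Z axis."""
    x, y, z = raw
    return (x, y, -z)

ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def read_acceleration(vendor, raw):
    """Normalize a raw sample to (x, y, z) in m/s^2, Z pointing up."""
    return ADAPTERS[vendor](raw)

# A phone lying flat should report (0, 0, +9.80665) from either vendor:
print(read_acceleration("vendor_a", (0, 0, 1000)))
print(read_acceleration("vendor_b", (0.0, 0.0, -9.80665)))
```

A common API of the kind Puig advocates would move this normalization below the application, so developers write against one contract instead of one adapter per vendor.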

Standardization is necessary

Some type of standardization is needed, Puig said, to allow creativity in this business to go forward. He said that Qualcomm wants the industry to come up with common APIs that will support basic sensor functions while allowing companies to introduce proprietary enhancements that can add special features to an application. Qualcomm is willing to take the lead to make this happen. It has not decided what type of forum would be best to achieve a standard approach for sensor APIs, but would prefer an informal organization that can proceed quickly.

"I inspire you to take this on for the benefit of the industry and for the good of all the application developers," Puig said.

In addition to the current challenges associated with sensor development, it is hard for many to imagine how sensors can actually be used to create new user experiences with mobile devices. Sensor vendors are trying to generate imaginative use of their components.

Examples of sensor-based apps

Sensirion, a vendor of temperature and humidity sensors, showed off an application, called AirTouch, that lets a consumer use their breath to interact with their smartphone. The user lightly blows on the phone when a call comes in, and the temperature and humidity in their breath trigger the sensor to answer the call. The sensor can also be used to take pictures with a phone: just hold up the phone, frame the shot, blow on the device and the sensor snaps the shutter. For consumers out snapping photos with their friends, the application would add a new and fun dimension to the common social experience of smartphone picture-taking. The application drew a very enthusiastic response from the audience.
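The detection idea behind AirTouch can be sketched simply: a puff of breath produces a brief, simultaneous jump in temperature and relative humidity at the sensor. The thresholds below are illustrative guesses, not Sensirion's actual values:

```python
# Hedged sketch of AirTouch-style breath detection: exhaled air is warmer
# and far more humid than ambient air, so a simultaneous jump in both
# readings suggests a deliberate puff. Thresholds are illustrative only.

TEMP_JUMP_C = 2.0          # rise in degrees Celsius suggesting a breath
HUMIDITY_JUMP_PCT = 10.0   # rise in relative humidity suggesting a breath

def is_breath(prev, curr):
    """prev/curr are (temperature_c, relative_humidity_pct) samples."""
    d_temp = curr[0] - prev[0]
    d_hum = curr[1] - prev[1]
    return d_temp >= TEMP_JUMP_C and d_hum >= HUMIDITY_JUMP_PCT

def on_sample(prev, curr, answer_call):
    """Trigger the call-answer action when a breath is detected."""
    if is_breath(prev, curr):
        answer_call()

# Ambient room air vs. a sample taken mid-exhale:
print(is_breath((22.0, 45.0), (22.1, 46.0)))  # False: no breath
print(is_breath((22.0, 45.0), (25.5, 70.0)))  # True: answer the call
```

Requiring both readings to jump at once is what keeps ordinary temperature drift, or a humid room, from answering calls by accident.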

Gesture-based applications are stimulating innovative ideas for application development. One such idea, called "hover browsing," makes it possible for a device to respond to the user's hand or finger without the user touching the device. This is becoming available now on some devices. One example is the "floating touch" feature offered on the Sony Xperia sola, built with TrueTouch sensors from Cypress Semiconductor. Samsung also showed how sensors enable the gesture-based "Sensor" music player on the Galaxy Note.

Motion could be the killer app, according to one vendor, Kevin Shaw, chief technology officer at Sensor Platforms. He believes motion is "the hidden interface" that will make it possible for a device to interpret how it is being used and react accordingly. By being aware of the context in which it is used, for example, a tablet could automatically shift into photo mode when a consumer picks it up and holds it in front of their face.

"It allows a whole computing device to feel natural and intuitive and interface in a way that makes sense," he said.
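The tablet-to-photo-mode scenario Shaw describes can be sketched from a single accelerometer reading: when the device lies flat, gravity points along its screen normal; when it is held up in front of the face, gravity points along the screen plane. The thresholds and mode names below are illustrative assumptions, not Sensor Platforms' algorithm:

```python
# Hedged sketch of motion-as-context: infer whether a tablet is lying flat
# or held upright from the direction of gravity in one accelerometer
# sample. Thresholds and mode names are illustrative only.

import math

G = 9.80665  # standard gravity, m/s^2

def tilt_from_vertical(accel):
    """Angle in degrees between the device Z axis (screen normal) and 'up'."""
    x, y, z = accel
    mag = math.sqrt(x * x + y * y + z * z)
    # Clamp guards against floating-point values a hair outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, z / mag))))

def suggest_mode(accel):
    """Flat on a table -> reading mode; held upright -> photo mode."""
    return "photo" if tilt_from_vertical(accel) > 60.0 else "reading"

print(suggest_mode((0.0, 0.0, G)))   # flat on a table -> "reading"
print(suggest_mode((0.0, -G, 0.5)))  # held up facing the user -> "photo"
```

A production implementation would smooth over many samples and fuse gyroscope data to reject momentary jolts, but the principle is the same: the device's pose, not a button press, selects the mode.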

These examples just scratch the surface of the innumerable applications that sensors will enable in consumer and business markets ranging from gaming to healthcare to weather monitoring, not to mention location-based services. While it is clearly too soon to see how the industry responds to Qualcomm's call for common tools, developers can reach out to developer groups if they want to get started. Samsung and Microsoft (NASDAQ:MSFT), for example, are eager for developers to work with their firms to create sensor-based apps. They invited developers to contact their organizations to learn more.