Peter Simmons, Self-Aware Apps, and Smartphone Healthcare

BOULDER – While a senior engineer at Hughes Aircraft in the mid-eighties, Peter Simmons worked in the field of automatic target recognition, developing autonomous computer systems that could recognize tanks and destroy them with missiles.

As the current head of Self-Aware Apps, a Boulder-based digital health startup, Simmons draws on his background in artificial intelligence and image processing to build smartphone applications that might one day deliver basic medical care.

“I’ve always been focused on how we can take computers and use them to understand what people are doing,” Simmons said during a recent interview.

After serving as a Hughes Fellow, Simmons worked at a series of semiconductor and digital advertising startups. He became interested in digital health technology following the sudden death of a close friend.

“I lived in Austin for a little while and had a friend who was diabetic. He was young and very healthy looking. One day we found out that he had died in a Walgreens parking lot. He had forgotten to take his insulin and went into diabetic shock, less than a hundred feet away from help.”

The event caused Simmons to ask himself, “How can I prevent this from happening in the future?” At first he considered creating a journaling app, which would enable diabetics to keep track of their insulin use. But he quickly discarded this idea, deciding that diabetics who forgot to take their insulin might also forget to record when they had last administered it.

“Then I began to wonder how I could take what I knew about smartphones and develop something that’s intelligent enough to say, ‘Hey, did you take your insulin today?’”

Building a Doctor in Your Smartphone

Simmons has an engineer’s understanding of the smartphone, seeing it less as a mobile phone and more as a handheld computer capable of a rudimentary comprehension of its surroundings.

“Your smartphone has a camera, a microphone, and an accelerometer. It’s basically an intelligent sensor package.”

This perspective has allowed Simmons to develop a range of novel applications that use smartphone sensors to generate healthcare data. One of these applications employs a smartphone’s accelerometer to calculate body weight.

“The accelerometer in your smartphone is very accurate when your body moves from a positive to a negative acceleration,” Simmons noted. When a smartphone user jumps, Simmons’ app analyzes the changes in acceleration produced by the movement to calculate the user’s weight.
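
Simmons does not describe his algorithm, and it is worth noting that an accelerometer alone reads specific force, which is independent of mass, so a weight estimate would need an additional calibration or force model on top of the motion data. What accelerometer data can directly yield is the jump itself: during the airborne phase the sensor reads near zero, so flight time and jump height fall out of the trace. The sketch below (all function names are hypothetical, with a synthetic trace standing in for real sensor samples) shows that kinematic first step:

```python
import numpy as np

def jump_flight_time(accel_mag, fs, freefall_thresh=2.0):
    """Estimate a jump's flight time from accelerometer magnitude (m/s^2).

    In free fall the phone's accelerometer reads near 0 m/s^2 (it measures
    specific force, not coordinate acceleration), so the airborne phase
    appears as a contiguous run of samples below a small threshold.
    """
    in_air = accel_mag < freefall_thresh
    best = run = 0
    for flag in in_air:          # longest contiguous run of in-air samples
        run = run + 1 if flag else 0
        best = max(best, run)
    return best / fs

def jump_height(flight_time, g=9.81):
    """Ballistic jump height from flight time: h = g * t^2 / 8."""
    return g * flight_time ** 2 / 8.0

# Synthetic trace: 1 s standing (~9.81), 0.4 s airborne (~0), 1 s standing.
fs = 100  # samples per second
trace = np.concatenate([np.full(100, 9.81), np.full(40, 0.1), np.full(100, 9.81)])
t_flight = jump_flight_time(trace, fs)   # 0.4 s
h = jump_height(t_flight)                # ~0.196 m
```

Real traces are noisier than this, so a production version would smooth the signal and reject spurious short runs before trusting the flight-time estimate.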

The CEO of Self-Aware Apps has also developed an application that employs the sensor in a smartphone camera to record blood flow in the human face. He believes that this app will enable its users to measure their body temperature and detect the presence of fever, though he has yet to test it in this capacity.
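
The article does not describe Simmons’ implementation, but camera-based blood-flow sensing is commonly done via remote photoplethysmography: the heartbeat slightly modulates skin color, so the mean green-channel brightness of the face oscillates at the pulse rate. A minimal sketch under that assumption (the function name is hypothetical, and the per-frame green means are synthetic here rather than extracted from real video):

```python
import numpy as np

def pulse_rate_bpm(green_means, fps):
    """Estimate pulse rate from per-frame mean green-channel intensity.

    The dominant frequency of the detrended signal within a plausible
    heart-rate band (0.7-3.0 Hz, i.e. 42-180 bpm) is taken as the pulse.
    """
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)    # restrict to heart-rate band
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Synthetic 10 s clip at 30 fps with a 1.2 Hz (72 bpm) pulse component.
fps = 30
t = np.arange(10 * fps) / fps
signal = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
bpm = pulse_rate_bpm(signal, fps)   # ~72
```

Fever detection, as the article notes, is a further and still-untested step; this sketch only covers the pulse-signal extraction that such an app would build on.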

Perhaps most interesting is the application Simmons has built to understand sounds. This app listens to its surroundings through a smartphone and tries to match the sounds it hears to one of several audio files stored in an online catalogue. Simmons hopes to build out this catalogue so that smartphones can listen to coughs and distinguish between respiratory ailments like bronchitis and pneumonia.
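
How the app matches sounds to the catalogue is not specified; one simple approach is to reduce each recording to a coarse spectral fingerprint and pick the catalogue entry with the highest cosine similarity. A toy sketch under that assumption (all names are illustrative, and the two pure tones stand in for real catalogue audio):

```python
import numpy as np

def spectrum_fingerprint(samples, n_bins=64):
    """Coarse magnitude-spectrum fingerprint, L2-normalized."""
    mag = np.abs(np.fft.rfft(samples))
    bands = np.array_split(mag, n_bins)          # pool into coarse bands
    fp = np.array([b.sum() for b in bands])
    norm = np.linalg.norm(fp)
    return fp / norm if norm else fp

def best_match(query, catalogue):
    """Return the catalogue key whose fingerprint is most similar (cosine)."""
    qfp = spectrum_fingerprint(query)
    scores = {name: float(spectrum_fingerprint(ref) @ qfp)
              for name, ref in catalogue.items()}
    return max(scores, key=scores.get)

# Toy catalogue: two 'sounds' represented as pure tones at different pitches.
sr = 8000
t = np.arange(sr) / sr
catalogue = {
    "low_tone": np.sin(2 * np.pi * 200 * t),
    "high_tone": np.sin(2 * np.pi * 1500 * t),
}
query = np.sin(2 * np.pi * 210 * t)        # close in pitch to low_tone
match = best_match(query, catalogue)       # "low_tone"
```

Distinguishing bronchitis from pneumonia by cough sound would demand far richer features and a trained classifier; this sketch only illustrates the catalogue-lookup structure the article describes.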

The Dawn of Smartphone Healthcare

[Image: Peter Simmons has created an app that uses a smartphone’s camera to detect blood flow in the face.]

Simmons sees these applications converging to provide their users with a very simple healthcare interaction.

“You’ll bring up an app and say, ‘I’m not feeling too good.’ The camera will allow the app to verify your identity through facial recognition software. Then the app will ask how you’re feeling. It will ask you to cough for it and identify whether or not you’re congested. It will also be able to tell in the infrared if you have a fever. And it will have access to your medical record in the cloud.”

Simmons concedes that this technology would only be appropriate for minor conditions, though he notes that such minor conditions currently underlie the majority of patient-provider interactions.

“Unfortunately, there isn’t a reimbursement model for keeping people out of the clinic, which is what this technology would do,” Simmons admits. “That’s why we’re moving into the elder care field first.”

Simmons plans to combine the different apps that his company has already developed to create a smartphone-based AI caretaker, which will be able to check in on elderly users much like a friendly neighbor would.

“Your smartphone is not going to be able to make eggs for you in the morning. But it will be able to tell if you’ve left the gas on, which will allow it to intervene at some point and ask, ‘Hey, did you remember to shut off the stove?’”

This AI caretaker will permit what Simmons refers to as “digital aging-in-place.” He plans to design it so that it can interact with users through facial recognition technology, fostering a positive relationship with the app that will reduce feelings of isolation and loneliness.

The Technology Exists Now

Once the technology in this caretaker app has been validated in the elder care market, Simmons sees it making its way into the healthcare system. When asked how long he thinks it will take for this technology to reach healthcare consumers, he states that it’s only a matter of how quickly the demand for it appears.

“It’s not a matter of developing the technology. The technology exists now.”

Like the coverage that CyberMed News provides? Follow us on Twitter, LinkedIn, and Facebook to make sure you keep up to date on the most recent developments in Colorado’s digital health community.