mHealth: Can smartphone front-facing cameras go beyond selfies?

This week, we saw the new iPhone X’s TrueDepth front-facing camera being used for face authentication. It uses a dot projector that projects over 30,000 invisible dots to map your facial structure. Front-facing cameras started showing up in smartphones back in 2003, and selfies have been their biggest use case ever since. Snapchat (and now Facebook & Instagram too) thrives on its users’ front-facing cameras with hundreds of fun filters that map the user’s face using computer vision. Apple also showcased an improved AR face-filter experience with Snapchat on the iPhone X. Beyond selfies and face filters, what else can front-facing cameras do? Can we use them to detect a person’s mood or state of mental health? Can they serve as mobile healthcare diagnostic tools?

Front-facing cameras play an important role in eye-tracking, which is one of the bigger trends in tech. Advertisers and designers were early adopters, using it to learn how to better capture user attention and to improve their products and services. Research has since shown that the data captured by eye-tracking can also help diagnose neurological disorders: by detecting abnormal eye movements, researchers have been able to identify Parkinson’s patients with 90% accuracy and ADHD in children with 77% accuracy.

Umoove, one of Israel’s leading startups, has developed eye-tracking tech that runs on any smartphone. Backed by medical research, Umoove believes that a number of neurological diseases can be diagnosed based on eye movement. With this approach, millions of mobile phones could be transformed into diagnostic devices.

In a discussion about smartphone cameras, Ed Diender, a VP at Huawei’s Enterprise Business Group, said,

“Every phone has front cameras. Huawei phones have dual. It gives better image quality. The dual Leica cameras top that even more. In healthcare, there are Apps that use front cameras to check and monitor and control certain (facial) features that this App can get from the front camera to predict and give an estimate on one’s health.

With dual front cameras, such estimates can be more accurate. With the best quality of such dual front camera like Leica’s, such estimate would become closer to precise and actual?”

Earlier this year, Apple was granted a patent for using the camera, light sensor & proximity sensor for health measurements. This could be significant given that Apple is moving from wellness features towards full healthcare solutions.

“The health data may include one or more of a variety of different wellness, fitness, and/or other parameters relating to the health of a user such as: a blood pressure index, a blood hydration, a body fat content, an oxygen saturation, a pulse rate, a perfusion index, an electrocardiogram, a photoplethysmogram, and/or any other such health data,” reads the patent.

Another startup, Cardiio, detects your heart rate with the front-facing camera based on the principle of light absorption by blood: with each heartbeat, the amount of blood in the face changes slightly, and so does the amount of light the skin reflects. Even though it won’t be as accurate as the heart-rate sensor in the Apple Watch, it can be an inexpensive, seamless solution for smartphone users.
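To make the principle concrete, here is a minimal sketch of camera-based pulse estimation (photoplethysmography). It is not Cardiio’s actual method; it assumes you have already extracted a time series of average skin brightness from camera frames, and it simply finds the dominant frequency in the plausible human pulse range. The synthetic signal below is an illustrative assumption, not real camera data.

```python
import numpy as np

def estimate_heart_rate(brightness, fs):
    """Estimate pulse rate in BPM from a skin-brightness time series.

    brightness: sequence of per-frame average brightness values
    fs: sampling rate in frames per second
    """
    x = np.asarray(brightness, dtype=float)
    x = x - x.mean()                       # remove the DC (constant) component
    spectrum = np.abs(np.fft.rfft(x))      # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # Restrict to a plausible human pulse range: 40-180 BPM
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                # convert Hz to beats per minute

# Synthetic 10-second recording at 30 fps with a 72 BPM pulse plus noise
np.random.seed(0)
fs = 30.0
t = np.arange(0, 10, 1 / fs)
signal = 0.02 * np.sin(2 * np.pi * (72 / 60) * t) + 0.005 * np.random.randn(t.size)
print(estimate_heart_rate(signal, fs))  # ≈ 72
```

In practice, an app would average the green channel over a face or fingertip region in each frame, filter out motion artifacts, and track the estimate over a sliding window; the FFT-peak idea above is only the core of such a pipeline.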

Thanks to AI & neural networks, facial & sentiment analysis is more accessible than ever. I’m looking forward to seeing more innovative use cases around mental health & mood tracking as smartphone cameras keep improving.