The new big-budget movie “Her” is about a lonely man named Theodore who develops a romantic relationship with his smartphone. Its operating system, known as Samantha, is so powerful it can carry on conversations that seem to tap into everything Theodore sees, thinks and feels.

Samantha isn’t a Siri knockoff; her voice is sexy, charming, and most of all, human. Theodore ends up telling Samantha, “I wish I could touch you.” The OS purrs, “How would you touch me?”

The AndroSensor app gives a glimpse at some of the information that the phone's sensors track.

The movie is set in Los Angeles, in an undefined “near future,” which may in fact be relatively near. Scientists at such places as the University of California San Diego are making rapid progress in using sensors, apps and wearable devices to monitor and interact with people. That raises a question: Will smartphones soon become sentient?

For insight, U-T San Diego turned to Rajesh Gupta, chair of UC San Diego’s Department of Computer Science and Engineering, a leader in machine learning and artificial intelligence.

Q: What is the scope of the information that smartphones, apps and wearable devices can collect on us now?

A: It’s quite significant. Your smartphone knows who you are, where you are, where you’ve been, and a lot about what you’re doing. Its sensors can tell if you’re walking, running or sitting. Some phones also can sense air pressure, temperature, light, humidity and your proximity to objects. What the smartphone can’t do is sense your mood, like your pets can do.

Computers are beginning to elicit emotional responses from users. But they can’t sense mood, which you need to know to have a conversation. That will change as sensors are developed to give computers “people skills,” like being able to detect when you’re annoyed or frustrated. There are already applications in automobiles that look for signs of distraction in drivers.

Most of the advances will be tied to wellness, an area where there are already a lot of sensors and apps. Your smartphone can determine your heart rate by using its camera to see changes in the color of your skin. That information can help reveal whether you’re excited or anxious. Advances are being made in learning algorithms that navigate this data and make sense of it. Javier Movellan of UC San Diego’s Machine Perception Lab, along with others, is beginning to fuse information from voice (things like vocal tone), images (the frequency and length of frowns and smiles), skin (conductance, heart rate, temperature, breathing rate) and device usage patterns.
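To make the camera-based heart-rate idea concrete: the pulse slightly changes how much light the skin reflects, so averaging a color channel over each frame produces a periodic signal whose dominant frequency is the heart rate. The sketch below (not from the article, and far simpler than a production photoplethysmography pipeline, which would also filter noise and track the fingertip or face) estimates beats per minute from a series of per-frame brightness values; the function name and the synthetic input are illustrative assumptions.

```python
# Illustrative sketch: recover heart rate from per-frame average brightness,
# the signal a phone camera pressed against a fingertip would produce.
import numpy as np

def estimate_heart_rate_bpm(brightness, fps):
    """Estimate heart rate from per-frame mean brightness values.

    brightness: 1-D sequence of average color intensity, one value per frame.
    fps: camera frame rate in frames per second.
    """
    signal = np.asarray(brightness, dtype=float)
    signal = signal - signal.mean()            # remove the constant baseline
    spectrum = np.abs(np.fft.rfft(signal))     # frequency content of the pulse
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Only consider plausible human heart rates (40-240 beats per minute).
    band = (freqs >= 40 / 60) & (freqs <= 240 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                      # convert Hz to beats per minute

# Synthetic example: a 72-bpm pulse sampled at 30 frames per second.
fps = 30
t = np.arange(10 * fps) / fps
frames = 100 + 0.5 * np.sin(2 * np.pi * (72 / 60) * t)
print(round(estimate_heart_rate_bpm(frames, fps)))  # prints 72
```

On a real phone the same idea runs against the camera preview stream, which is where the processing and battery constraints mentioned below come into play.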

Meet Samantha

The problem is, smartphones don’t have the processing power or the energy to pull all of this together to determine your mood in real time. Without that, your computer wouldn’t be able to engage you in a realistic conversation, like Samantha does.

Q: I dislike Siri’s computer-generated voice on the iPhone. I know others who feel the same. Will scientists need to give these operating systems natural voices, like Samantha’s? The OS is able to convey surprise, empathy and curiosity.