Coming to an airport near you someday: a machine that can tell if you’re lying

Aaron Elkins, an assistant professor in the Fowler College of Business Administration at San Diego State University, has designed an Automated Virtual Agent for Truth Assessments in Real Time, or AVATAR. The machine is designed to detect whether a person is lying. Aaron Elkins (left) demonstrates AVATAR to Alan D. Bersin, former acting commissioner of U.S. Customs and Border Protection. To Bersin’s right is Luc Portelance, former president of the Canada Border Services Agency. (Photo courtesy of Elkins/San Diego State University)

A professor in California has developed a screener for airport or border control checkpoints to determine whether a traveler is lying, smuggling contraband or otherwise up to no good.

And unlike the Transportation Security Administration, this robotic agent wouldn’t even have to touch you.

Aaron Elkins, an assistant professor in the Fowler College of Business Administration at San Diego State University, has designed an Automated Virtual Agent for Truth Assessments in Real Time, or AVATAR.

Using technology that scans and sifts a range of data, from facial movements to eye movement to pupil dilation to curling toes, the machine interrogates travelers to figure out if a person is lying or trying to enter the country with ulterior motives. Sensors pick up a variety of information from the person’s body, much of it unconscious, that provides telltale evidence of whether a person is lying.

All a visitor has to do is step up to the kiosk and the machine – after scanning a passport or other ID and offering a quick introduction – swiftly begins grilling the person with questions.

“Hello, I am AVATAR,” it says. “What do you intend to do after speaking with me today? Have you ever used any other names? In what year were you born? Did you pack your own bag today? Please describe the contents of your bag. What was the name of your high school? Who was the principal at your high school? I would like you to describe everything you did today...”

And so on. And what a person might try to conceal – even with coaching – the body gives away in subtle clues that the technology detects.

Lying is part of being human, as National Geographic said in a cover story about the origins of human deceit and its adaptations. The article, which appears in its June edition, notes that researchers showed two decades ago that people lie, on average, once or twice a day – usually white lies meant to protect another person’s feelings or conceal one’s inadequacies. Lying also is essential when it comes to manipulating other people without using physical force.

But machines, especially those with artificial intelligence, may be just as sharp and flexible at figuring you out. Part of the reason is that lying takes a lot of work, and your body is the first to show it.

“It’s a lot harder to lie because you’re using strategies,” Elkins said. “You’re managing your story, you’re managing your impression, you’re evaluating the perception of yourself in a kind of meta way, and then you’re changing your story if you think it’s successful or people think you’re suspicious.”

AVATAR is a kiosk that resembles those used for airport check-ins or grocery store self-checkouts. (Photo courtesy of Elkins/San Diego State University)

AVATAR sifts an impressive variety of data:

Voice – AVATAR listens carefully to speech, looking for linguistic shifts and changes in tone to determine whether a person’s being truthful. People who are being evasive, for example, tend to switch pronouns – “I” becomes “we” – as they hedge. The machine also looks for “disfluencies,” Elkins said – those breaks and placeholders in speech that suggest a person is buying time while thinking of the right thing to say. The machine also compares a person’s tone with other physical data to tell whether a person really means what he’s saying. When a person says, “Today was the best day of my life,” the machine has ways of determining whether the speaker is being earnest or ironic.
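To make the linguistic signals concrete, here is a minimal sketch of how a transcript might be scanned for disfluencies and a shift from “I” to “we.” The word list, the function name and the scoring are illustrative assumptions for this article, not AVATAR’s actual algorithm.

```python
import re

# Illustrative single-word disfluency markers; not AVATAR's real list.
DISFLUENCIES = {"um", "uh", "er", "ah"}

def speech_signals(transcript: str) -> dict:
    """Count hedging cues in a transcript: filler words and pronoun use."""
    words = re.findall(r"[a-z']+", transcript.lower())
    disfluency_count = sum(1 for w in words if w in DISFLUENCIES)
    i_count = words.count("i")
    we_count = words.count("we")
    return {
        "disfluencies": disfluency_count,
        "i_count": i_count,
        "we_count": we_count,
        # Per the article, evasive speakers tend to drift from "I" to "we".
        "pronoun_shift": we_count > i_count,
    }
```

A real system would of course weigh these cues against baseline speech rates and the other sensor channels rather than treating any one count as proof of deception.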

Eyes – The poets were right: the eyes truly are windows on the soul. If a person’s pupils are dilated, it suggests arousal: evolution has primed us to gather all the light we can for a fight-or-flight response. AVATAR can note that and the way the eyeballs are moving. The eyes are tireless hunters, scanning and tracking their target in ways a person isn’t generally aware of. Of particular interest is when eyes are saccading – making twitchy little excursions back and forth over an area of interest as the mind builds a mental image of an object. In research, Elkins said, scientists have studied the way a person’s gaze tracking can give away what interests them most, even when the person is trying to hide it.
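The two eye signals described above can be sketched in a few lines: a pupil-dilation flag relative to a baseline measurement, and a saccade counter that looks for rapid jumps in gaze position. The thresholds and sample rate here are illustrative assumptions, not AVATAR’s actual parameters.

```python
def pupil_dilated(diameter_mm: float, baseline_mm: float,
                  ratio: float = 1.15) -> bool:
    """Flag arousal when the pupil is noticeably wider than baseline.
    The 15% ratio is an assumed illustrative threshold."""
    return diameter_mm >= baseline_mm * ratio

def count_saccades(gaze_deg, sample_hz: float = 60.0,
                   velocity_thresh: float = 300.0) -> int:
    """Count saccades in a 1-D gaze trace (degrees) by flagging
    frame-to-frame velocities above a degrees-per-second threshold."""
    dt = 1.0 / sample_hz
    saccades = 0
    in_saccade = False
    for a, b in zip(gaze_deg, gaze_deg[1:]):
        fast = abs(b - a) / dt > velocity_thresh
        if fast and not in_saccade:  # count each burst once
            saccades += 1
        in_saccade = fast
    return saccades
```

Velocity-threshold saccade detection of this kind is a standard eye-tracking technique; production systems track two-dimensional gaze and smooth out noise before thresholding.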

Facial expression – Cameras – both 2D and 3D – are able to analyze a person’s facial expression to see whether the muscles are moving in a way that matches what a person is trying to convey. When a person is genuinely smiling, he or she is using muscles at the corners of the eyes and the mouth, known as a Duchenne smile after the scientist who discovered it. Just pulling up the corners of the mouth in a smile without any change to the eyes – which has been referred to as the “Botox” smile – suggests phoniness.
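The Duchenne test above maps neatly onto facial action units from the Facial Action Coding System (FACS): AU12 (lip corner puller, the mouth) and AU6 (cheek raiser, the eye corners). The sketch below assumes those two intensities come from some upstream face tracker; the threshold value is an illustrative assumption, not AVATAR’s.

```python
def classify_smile(au6_intensity: float, au12_intensity: float,
                   thresh: float = 0.5) -> str:
    """Label a smile from FACS intensities in [0, 1]:
    AU6 = cheek raiser (eyes), AU12 = lip corner puller (mouth)."""
    mouth = au12_intensity >= thresh
    eyes = au6_intensity >= thresh
    if mouth and eyes:
        return "duchenne"       # genuine: eyes and mouth both engaged
    if mouth:
        return "non-duchenne"   # "Botox" smile: mouth only
    return "no smile"
```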

Posture – AVATAR’s cameras can figure out whether a person is holding his head rigidly – a possible sign that he’s making stuff up. It can also tell if you’re curling your toes – another suggestion that you’re uptight about something.

All of this analyzing is possible without touching the person. And yet it’s still sort of creepy in other ways, as Elkins acknowledged.

“I do think any of these technologies have the potential for unintended consequences,” Elkins said.