By setting up multiple sensors, researchers have learned to “see” what’s out of sight.

Think back a moment to the Bin Laden raid and the dangerous business of invading an enemy headquarters, with adversaries lurking around every corner. In the future, that might be a solved problem. Researchers from the University of Central Florida, with funding from the U.S. military, have developed a method for seeing around corners by analyzing how light waves emanating from a source reflect off objects like people and then off walls.

“We have shown that information about a non-line-of-sight object can be obtained completely passively without using mirrors and without any access to the source of natural light,” they write in their paper, published this week in the journal Nature Communications.

“So far, we have demonstrated that we can identify the shape and assess the size and location of a spatially incoherent source,” meaning an object, like a person, or anything else that reflects natural, ambient light, said UCF researcher Aristide Dogariu.


Unlike earlier efforts to see around corners using sound vibrations and shadows, the new technique uses light-wave effects directly, potentially providing more details in even less light.

When light waves encounter an object like a wall, they scatter and become less “coherent.” But not all the information they carry disappears.

If you can see an object, your eye is picking up enough information to recover an image; if a wall separates you from the object, the light reaching you is incoherent. But incoherent doesn’t mean non-existent. What Dogariu and his colleagues discovered is that it’s possible to piece together information about the wave, and the object that reflected it, much as you can reassemble an image from the shards of a broken mirror. All you really need are points of reference. It’s similar to the way multiple radars can pinpoint the location of an aircraft, or two eyewitnesses’ descriptions yield a better picture of a person’s appearance.
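The radar analogy can be made concrete. The sketch below is purely illustrative and is not the researchers’ interferometric technique: it shows a toy 2-D multilateration, where ranges measured from three sensors at known positions are combined to pinpoint a single source. Subtracting one range equation from the others cancels the squared unknowns, leaving a small linear system.

```python
import math

def locate(sensors, distances):
    """Estimate a 2-D position from three sensor positions and ranges.

    Each range gives |p - s_i|^2 = d_i^2. Subtracting the first
    equation from the other two cancels |p|^2, leaving a 2x2
    linear system in the unknown position p = (x, y).
    """
    (x0, y0), (x1, y1), (x2, y2) = sensors
    d0, d1, d2 = distances

    # Coefficients of the linearized system A @ p = b
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 + y1**2 - x0**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 + y2**2 - x0**2 - y0**2

    # Solve the 2x2 system by Cramer's rule; det == 0 means the
    # sensors are collinear and cannot resolve the position.
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# A source at (1, 2) seen by sensors at three known points:
sensors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
ranges = [math.dist(s, (1.0, 2.0)) for s in sensors]
print(locate(sensors, ranges))  # recovers approximately (1.0, 2.0)
```

One sensor alone only constrains the source to a circle; it is the second and third reference points that collapse the ambiguity, which is the same intuition behind using multiple measurement points on the scattered light field.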

The method uses wavefront shearing interferometry at multiple points to collect information about the scattered, less coherent light waves, then pieces together a statistical picture of the hidden object’s size, shape, and distance.

The work was funded, in part, by the Defense Advanced Research Projects Agency.

Patrick Tucker is technology editor for Defense One. He’s also the author of The Naked Future: What Happens in a World That Anticipates Your Every Move? (Current, 2014). Previously, Tucker was deputy editor for The Futurist for nine years. Tucker has written about emerging technology in Slate, ...
