WASHINGTON--(BUSINESS WIRE)--The Naval Research Laboratory (NRL) and the Space Dynamics Laboratory
(SDL), with the support of the Office of Naval Research (ONR), have
demonstrated an autonomous multi-sensor motion-tracking and interrogation
system that reduces the workload for analysts by automatically finding
moving objects, then presenting high-resolution images of those objects
without human input.

“Not only did the network sensing demonstration
achieve simultaneous real-time tracking, sensor cross-cueing and
inspection of multiple vehicle-sized objects, but we also showed an
ability to follow smaller human-sized objects under specialized
conditions.”

Intelligence, surveillance and reconnaissance (ISR) assets in the field
generate vast amounts of data that can overwhelm human operators and can
severely limit the ability of an analyst to generate intelligence
reports in operationally relevant timeframes. This multi-target tracking
capability enables the system to manage the collection of imagery without
continuous monitoring by a ground or airborne operator, thus requiring
fewer personnel and freeing up operational assets.

“These tests display how a single imaging sensor can be used to provide
imagery of multiple tracked objects,” said Dr. Brian Daniel, research
physicist, NRL ISR Systems and Processing Section, “a job typically
requiring multiple sensors.”

The network sensing demonstration used sensors built under other
ONR-sponsored programs. The interrogation sensor was the precision,
jitter-stabilized EyePod developed under the Fusion, Exploitation,
Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program.
EyePod is a dual-band visible-near infrared and long-wave infrared
sensor mounted inside a nine-inch gimbal pod assembly designed for small
UAV platforms. The mid-wave infrared nighttime WAPSS (N-WAPSS) was
chosen as the wide-area sensor; its 16-megapixel, large-format camera
captures single frames at 4 hertz (cycles per second) and provides a
step-stare capability with a 1 hertz refresh rate.

Using precision geo-projection of the N-WAPSS imagery, all moving
vehicle-sized objects in the field of view were tracked in real time.
The tracks were converted to geodetic coordinates and sent over an
air-based network to a cue manager system. The cue manager autonomously
tasked EyePod to interrogate all selected tracks for target
classification and identification.
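
The tracking-to-tasking flow described above can be sketched in outline: geodetic track reports arrive over the network, and a cue manager queues them and tasks the interrogation sensor at each one in turn. This is a minimal illustrative sketch only; the names (`Track`, `CueManager`, `task_sensor`) and the first-in, first-out tasking policy are assumptions, not NRL's actual software or interfaces.

```python
from dataclasses import dataclass


@dataclass
class Track:
    """A track report in geodetic coordinates (hypothetical format)."""
    track_id: int
    lat_deg: float
    lon_deg: float
    alt_m: float


class CueManager:
    """Queues incoming tracks and tasks an interrogation sensor at each one."""

    def __init__(self, task_sensor):
        # task_sensor is a callback that points the interrogation
        # sensor (e.g., EyePod) at a track's geodetic position.
        self.task_sensor = task_sensor
        self.pending = []

    def receive(self, track):
        # A track report arrives from the wide-area sensor's network feed.
        self.pending.append(track)

    def dispatch_all(self):
        # Autonomously task the sensor at every pending track, oldest first.
        tasked_ids = []
        while self.pending:
            track = self.pending.pop(0)
            self.task_sensor(track)
            tasked_ids.append(track.track_id)
        return tasked_ids


# Example: two vehicle tracks are reported, then dispatched for interrogation.
tasking_log = []
manager = CueManager(
    task_sensor=lambda t: tasking_log.append((t.track_id, t.lat_deg, t.lon_deg))
)
manager.receive(Track(1, 38.82, -77.02, 0.0))
manager.receive(Track(2, 38.83, -77.03, 0.0))
tasked = manager.dispatch_all()
```

In practice such a manager would also prioritize cues and deconflict sensor pointing; the sketch shows only the autonomous hand-off from track reports to interrogation tasking.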