NRL Flight-Tests Autonomous Multi-Target, Multi-User Tracking Capability

Dispatch transmitted on 17 August 2011 by Business Wire

WASHINGTON--(BUSINESS WIRE)--The Naval Research Laboratory (NRL) and the Space Dynamics Laboratory (SDL), with the support of the Office of Naval Research (ONR), have demonstrated an autonomous multi-sensor motion-tracking and interrogation system that reduces analysts' workload by automatically finding moving objects and presenting high-resolution images of them with no human input.

Intelligence, surveillance and reconnaissance (ISR) assets in the field generate vast amounts of data that can overwhelm human operators and severely limit an analyst's ability to produce intelligence reports in operationally relevant timeframes. This multi-user tracking capability lets the system manage imagery collection without continuous monitoring by a ground or airborne operator, requiring fewer personnel and freeing up operational assets.

“These tests display how a single imaging sensor can be used to provide imagery of multiple tracked objects,” said Dr. Brian Daniel, research physicist, NRL ISR Systems and Processing Section, “a job typically requiring multiple sensors.”

During flight tests in March 2011, multiple real-time tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution, narrow field-of-view (NFOV) interrogation sensor via an airborne network. Both sensors were networked by the high-speed Tactical Reachback Extended Communications (TREC) data link provided by the Satellite and Wireless Technology Branch of the NRL Information Technology Division.
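Sending tracks from one sensor to another over a bandwidth-limited airborne data link implies a compact, well-defined track message. The sketch below is purely illustrative: the field layout and byte format are assumptions, not the actual TREC protocol, which the release does not describe.

```python
import struct

# Hypothetical fixed-size track report: a 32-bit track ID, geodetic
# latitude/longitude in degrees, and a timestamp in seconds, packed in
# network byte order. None of this reflects the real TREC wire format.
_TRACK_FMT = "!Iddd"  # uint32 + 3 x float64 = 28 bytes

def encode_track(track_id: int, lat_deg: float, lon_deg: float, t_s: float) -> bytes:
    """Serialize one track report for transmission over the data link."""
    return struct.pack(_TRACK_FMT, track_id, lat_deg, lon_deg, t_s)

def decode_track(msg: bytes) -> dict:
    """Recover a track report on the receiving (interrogation) side."""
    track_id, lat_deg, lon_deg, t_s = struct.unpack(_TRACK_FMT, msg)
    return {"id": track_id, "lat": lat_deg, "lon": lon_deg, "t": t_s}
```

A fixed binary record like this keeps per-track overhead constant, which matters when a wide-area sensor is reporting many tracks per second over a shared link.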

“The demonstration was a complete success,” noted Dr. Michael Duncan, ONR program manager. “Not only did the network sensing demonstration achieve simultaneous real-time tracking, sensor cross cueing and inspection of multiple vehicle-sized objects, but we also showed an ability to follow smaller human-sized objects under specialized conditions.”

The network sensing demonstration utilized sensors built under other ONR-sponsored programs. The interrogation sensor was the precision, jitter-stabilized EyePod, developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program. EyePod is a dual-band visible/near-infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms. The mid-wave infrared nighttime WAPSS (N-WAPSS) served as the wide-area sensor; it has a 16-megapixel, large-format camera that captures single frames at four hertz (cycles per second) and a step-stare capability with a one-hertz refresh rate.

Using precision geo-projection of the N-WAPSS imagery, all moving vehicle-size objects in the FOV were tracked in real-time. The tracks were converted to geodetic coordinates and sent via an air-based network to a cue manager system. The cue manager autonomously tasked EyePod to interrogate all selected tracks for target classification and identification.
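The pipeline described above, wide-area tracks converted to geodetic coordinates and handed to a cue manager that autonomously tasks the interrogation sensor, can be sketched as follows. All class and method names here are illustrative stand-ins, not the actual NRL software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Track:
    """A moving-object track in geodetic coordinates (illustrative fields)."""
    track_id: int
    lat_deg: float
    lon_deg: float

@dataclass
class InterrogationSensor:
    """Stand-in for a narrow field-of-view sensor such as EyePod."""
    name: str
    tasked: List[int] = field(default_factory=list)

    def point_at(self, track: Track) -> str:
        # Slew the gimbal to the track's geodetic position and image it.
        self.tasked.append(track.track_id)
        return (f"{self.name} imaging track {track.track_id} "
                f"at ({track.lat_deg:.5f}, {track.lon_deg:.5f})")

@dataclass
class CueManager:
    """Autonomously cross-cues wide-area tracks to one interrogation sensor."""
    sensor: InterrogationSensor

    def handle_tracks(self, tracks: List[Track]) -> List[str]:
        # Task the NFOV sensor against every incoming track; no operator
        # input is needed at any point in the loop.
        return [self.sensor.point_at(t) for t in tracks]

# Usage: two tracks arrive from the wide-area sensor over the network.
eyepod = InterrogationSensor("EyePod")
manager = CueManager(eyepod)
reports = manager.handle_tracks([
    Track(1, 38.89770, -77.03650),
    Track(2, 38.90070, -77.03530),
])
```

Because one NFOV sensor is shared across all tracks, the cue manager is what lets a single imaging sensor do a job that would typically require multiple sensors.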
