Army considers multispectral and radar sensor fusion for helicopter pilots

FORT BELVOIR, Va., 20 July 2015. U.S. Army electro-optics experts are reaching out to industry to find companies able to develop multi-spectral sensor fusion with a distributed aperture system to help helicopter pilots fly in fog, dust, smoke, darkness, and other degraded visual environment (DVE) conditions.

Officials of the Fort Belvoir, Va., branch of the Army Contracting Command at Aberdeen Proving Ground issued a request for information (W909MY-15-R-C020) this week for a market investigation into sensor fusion development.

The integrated system would consist of a forward-looking sensor suite for pilotage in degraded visuals, fused with a distributed aperture system with spherical coverage for situational awareness.

The concept involves fusing outputs from a long-wave infrared camera, a light direction and ranging (lidar) sensor, and a radar with existing terrain and image databases to produce a head-tracked image to help the pilot fly in brownout, fog, smoke, rain, and other bad visual conditions.
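As a rough illustration only (the RFI does not specify an algorithm), one common way to combine co-registered frames from sensors such as a long-wave infrared camera, lidar, and radar is per-pixel confidence-weighted averaging. The sensor names, weights, and values below are hypothetical; a fielded system would use far more sophisticated registration and fusion.

```python
import numpy as np

def fuse_frames(frames, weights):
    """Confidence-weighted fusion of co-registered sensor frames.

    frames  -- dict of sensor name -> 2-D float array (same shape),
               e.g. LWIR intensity, lidar range, radar return
    weights -- dict of sensor name -> per-pixel confidence array
    Returns a single fused 2-D array.
    """
    first = next(iter(frames.values()))
    num = np.zeros_like(first, dtype=float)
    den = np.zeros_like(num)
    for name, frame in frames.items():
        w = weights[name]
        num += w * frame       # accumulate weighted sensor values
        den += w               # accumulate total confidence
    return num / np.maximum(den, 1e-9)  # avoid divide-by-zero

# Hypothetical example: LWIR trusted more than radar at every pixel
lwir  = np.full((4, 4), 0.8)
radar = np.full((4, 4), 0.2)
fused = fuse_frames({"lwir": lwir, "radar": radar},
                    {"lwir": np.full((4, 4), 0.9),
                     "radar": np.full((4, 4), 0.1)})
print(round(float(fused[0, 0]), 2))  # -> 0.74
```

In a real DVE system the weights would vary per pixel with conditions (for example, radar weighted up in brownout where the infrared image washes out), which is what lets the fused image remain usable when any single sensor degrades.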

The system must be interoperable with state of the art Army aviation display technology, balancing performance with small size, weight, and power consumption.

Related: Air Force kicks off sensor-fusion program to help UAVs sense and avoid other aircraft in shared airspace

This program will develop a sensor fusion engine and the distributed aperture system. Teams proposing an integrated solution that addresses both efforts will have an advantage.

The distributed aperture system effort involves providing a head-tracked view of any region of interest from several sensors covering a spherical field of view. The ultimate goal is to develop a system for Army helicopters.


The visualization should be compatible with a head-tracked helmet-mounted display with a 1x magnification fused synthetic image that aligns with the real world scene on a see-through display. Ultimately the Army wants the system to work with the Joint Common Architecture (JCA) and Future Aviation Capability Environment (FACE).

The system should help Army helicopter pilots derive navigation information from the fused sensor data, both by estimating velocity from changes in imagery over time and by matching measured 3D lidar and radar data to digital terrain elevation data to estimate position.
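The terrain-matching step described above can be sketched very simply: slide a lidar/radar-derived elevation patch over a digital terrain elevation database (DTED) tile and take the offset with the smallest sum-of-squared-differences. This is an illustrative toy, not the Army's actual method; the function name and synthetic data are invented for the example, and real systems use far more robust correlation techniques.

```python
import numpy as np

def estimate_position(measured_patch, dted):
    """Match a measured elevation patch against a DTED tile by
    exhaustive sliding-window sum-of-squared-differences (SSD).
    Returns the (row, col) offset in the tile with the best match.
    """
    h, w = measured_patch.shape
    best_ssd, best_rc = np.inf, (0, 0)
    for r in range(dted.shape[0] - h + 1):
        for c in range(dted.shape[1] - w + 1):
            ssd = np.sum((dted[r:r+h, c:c+w] - measured_patch) ** 2)
            if ssd < best_ssd:
                best_ssd, best_rc = ssd, (r, c)
    return best_rc

# Hypothetical check: carve a patch out of a synthetic terrain tile
# and confirm the matcher recovers where it came from.
rng = np.random.default_rng(0)
tile = rng.normal(size=(20, 20))
patch = tile[7:12, 3:8]
print(estimate_position(patch, tile))  # -> (7, 3)
```

The companion velocity estimate works on the same principle in the time dimension: registering successive image frames against each other gives a pixel displacement that, combined with range data, yields ground-relative velocity without GPS.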

Related: DARPA pushes forward with navigation sensor fusion initiative to reduce dependence on GPS

The distributed aperture system, meanwhile, should support situational awareness and architectural growth to multifunction threat warning capability. The system will integrate several sensors covering a spherical field of regard with a processor and algorithms that provide a seamless, head-tracked view of any portion of the imagery.

At a minimum, the distributed aperture system must include a processor and algorithms to create an image sphere from the sensor outputs, and to provide a head-tracked view to a helmet-mounted display.

Interested companies should submit white papers no later than 10 Aug. 2015 online to https://safe.amrdec.army.mil/SAFE2/. Email questions or concerns to the Army's Brian Thomas at brian.w.thomas30.civ@mail.mil, and copy Sabin Joseph at sabin.a.joseph.civ@mail.mil.

More information is online at https://www.fbo.gov/notices/7cc2286c5dbd06dcface2a00a2771863.
