Laser radar blends with computer simulation to help rotorcraft pilots land safely in dust
OTTAWA, 30 April 2009. Rotorcraft avionics specialists at two Canadian aerospace companies are blending lidar with simulation and training technology to create a virtual environment for helicopter cockpits that enables pilots to land in zero-visibility obscurants -- especially at night.
Lidar stands for light detection and ranging, and uses infrared laser beams as primary sensors for image capture of the surrounding area. This technology also is referred to as ladar, which is short for laser radar. Neptec's military avionics technology is called OPAL, which is short for obscurant-penetrating autosynchronous lidar.
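The ranging principle the article describes can be sketched in a few lines: a lidar measures the round-trip time of a laser pulse and converts it to distance. The constant and function name below are illustrative, not Neptec's code or API.

```python
# Minimal sketch of the time-of-flight ranging principle behind lidar.
# A laser pulse travels to the target and back, so distance is half the
# round-trip path length.

C = 299_792_458.0  # speed of light in meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return target distance in meters for a measured laser round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after roughly 1.33 microseconds indicates a target
# about 200 meters away -- on the order of the obstacle-detection distance
# reported in the Yuma demonstration.
print(range_from_time_of_flight(1.334e-6))
```

Because infrared laser pulses carry their own illumination, this measurement works identically in darkness, which is why lidar "sees as well at night as it does during the day."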
Engineers at the Neptec Design Group in Ottawa and CAE in Montreal are using military laser technology to create imagery of visually obscured areas, which they overlay with terrain database information designed for military training and simulation to create an enhanced, computer-generated out-the-window view that is updated in real time to help helicopter pilots land where dust, smoke, fog, or other obscurants blot out the pilot's view of the outside.
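The fusion idea described above -- fresh lidar returns correcting a stored terrain database so the computer-generated view stays current -- can be sketched as a simple grid update. The data structures, function names, and blend weight here are assumptions for illustration only, not the CAE or Neptec implementation.

```python
# Hypothetical sketch: lidar elevation samples refine a stored terrain grid,
# so the synthetic out-the-window view reflects what the landing zone
# actually looks like, not just what the database thinks it looks like.

from typing import Dict, Tuple

Cell = Tuple[int, int]  # grid coordinates of one terrain patch

def update_terrain(
    database: Dict[Cell, float],
    lidar_returns: Dict[Cell, float],
    blend: float = 0.5,
) -> Dict[Cell, float]:
    """Blend fresh lidar elevation samples (meters) into the stored grid."""
    updated = dict(database)
    for cell, measured in lidar_returns.items():
        # Cells the database has never seen take the measured value directly.
        stored = updated.get(cell, measured)
        updated[cell] = (1.0 - blend) * stored + blend * measured
    return updated

# Usage: the lidar sees a mound the database lacks at cell (0, 2)
# and a changed elevation at cell (0, 1).
db = {(0, 0): 10.0, (0, 1): 10.0}
scan = {(0, 1): 12.0, (0, 2): 11.0}
print(update_terrain(db, scan))
```

In the real system the "database" is the much richer simulation terrain database and the rendering is a full synthetic scene, but the core loop is the same: compare the sensed landing zone against the stored model and reconcile the two in real time.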
"Landing helicopters in brown-out conditions is the major application for this technology," explains Neptec President Dr. Ian Christie. "This could help alleviate operational constraints where pilots can't land in dusty areas at night. Lidar sees as well at night as it does during the day."
CAE and Neptec demonstrated the OPAL sensor integrated into CAE's augmented visionics system (AVS) at Yuma Proving Ground, Ariz., where it penetrated dust clouds generated by a UH-1 test helicopter. The system helped pilots see through brownout conditions opaque to the human eye, differentiating between rocks, bushes, sloping terrain, utility poles, ground vehicles, and wires at distances greater than 200 meters.
"Our job is to have a very efficient sensor system for the CAE AVS with updates to the database that it uses to create the simulated environment," Neptec's Christie explains. "Pilots need to find out how different the landing zone is from what the system thinks it is. Combine the sensor and simulation and you have a very accurate synthetic environment."
The CAE AVS combines OPAL with other sensors such as forward-looking infrared (FLIR). Sensor information blends with the CAE-developed common database (CDB), originally developed for U.S. Special Operations Command.