NASA uses simulation technology to evaluate fighter pilot performance

MOFFETT FIELD, Calif. 17 Dec. 2008. Engineers at NASA Ames Research Center used current simulation technology to create scientific methods for evaluating fighter pilot performance.


By John McHale


The U.S. Air Force Office of the Surgeon General tasked NASA two years ago with a feasibility study – the Operational Based Vision Assessment (OBVA) program – to determine whether current technology could produce a flight simulator with an eye-limiting-resolution out-the-window (OTW) visual system, says Dr. Barbara Sweet, aerospace engineer in the Human Systems Integration division at NASA Ames. In other words, one that could replicate what a pilot sees in flight accurately enough that his operational vision could be evaluated, she explains.

Eye-limiting means the display's resolution matches the resolving limit of the human eye, Sweet says.

"We determined it was feasible to do this with commercial-off-the-shelf (COTS) technology" and successfully demonstrated it to the Air Force this summer, Sweet says. Her team used a Sony SRX projection system with an Independence 4000 Image Generator from Quantum3D in San Jose, Calif. NASA also worked with the U.S. Air Force School of Aerospace Medicine 7th Air Force on this program.

"The average vision of a fighter pilot is 20/13, and we wanted to create a system with visual detail" at the 20/15 or 20/10 level, Sweet says. Doing so with conventional projectors offering 1600 x 1200 pixels, however, would take about 80 units, a system that would be too expensive to maintain, let alone operate, she says.

The Sony technology only required 15 projectors for a 20/15 system and 20 for a 20/10 system, she adds.
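As a rough illustration of the sizing arithmetic behind these projector counts, the sketch below converts a Snellen acuity figure into the angular pixel density an eye-limiting display needs, then divides an assumed field of view by projector resolution. The field-of-view numbers are hypothetical placeholders, not figures from the OBVA study.

```python
import math

def pixels_per_degree(snellen_denominator):
    """Angular pixel density needed to match a Snellen acuity level.

    20/20 vision resolves detail of about 1 arcminute, so an eye-limiting
    display needs at least 60 pixels per degree; 20/10 resolves 0.5
    arcminute, doubling the requirement to 120 pixels per degree.
    """
    resolvable_arcmin = snellen_denominator / 20.0
    return 60.0 / resolvable_arcmin

def projectors_needed(fov_h_deg, fov_v_deg, ppd, proj_w_px, proj_h_px):
    """Minimum projector count to tile a field of view at a given pixel
    density (ignores edge blending and overlap, which raise the count)."""
    total_pixels = (fov_h_deg * ppd) * (fov_v_deg * ppd)
    return math.ceil(total_pixels / (proj_w_px * proj_h_px))

# Hypothetical 150 x 60 degree out-the-window field of view at 20/10 acuity.
ppd = pixels_per_degree(10)                           # 120 pixels per degree
conventional = projectors_needed(150, 60, ppd, 1600, 1200)
high_res_4k = projectors_needed(150, 60, ppd, 4096, 2160)  # SRX-class panel
```

With these placeholder numbers the counts land in the same ballpark as the figures Sweet cites; the real design depends on the actual field of view and on overlap between adjacent projectors.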

The Quantum image generator rendered a scene from the Innsbruck airport in Austria during the demonstration, says John Archdeacon, vice president of product marketing at Quantum3D. The importance of an image generator lies in its ability to compute the image that the pilot would see in an out-the-window scene.

NASA Ames recommends a system that has the ability to sustain a constant, high update rate (60 Hz or greater) without missing refresh cycles or producing objectionable artifacts, Quantum3D officials say.

Quantum's image generator technology is quite mature and capable of rendering the images at high resolution levels, Sweet says.

Sweet and her team looked at ways to collect find, fix, target, track, and engage (F2T2E) performance data in simulation environments, and at how that data could be used to create standards for operational vision evaluation.

Current pilot operational standards were set before World War II, Sweet says. They are also not scientific in nature, but are based on observation and self-reporting by the pilots. "Each branch of the service also has different standards with different histories."

The Air Force standards cover visual acuity, color, depth perception, eye alignment and coordination, and peripheral vision, Sweet says. The Air Force would also like to explore the effects of other visual characteristics such as contrast sensitivity, attention, dynamic visual acuity, and low-light performance, she says.

The goal is to create a system "good enough to show the relationship between pilot vision and operational performance," Sweet says.

To accomplish that, the system needed to overcome a key hurdle: motion-induced blur in the projection system, she continues.

The blur is a negative artifact caused by response time (the rise and decay of pixel illuminance) and hold time (the amount of time each pixel stays illuminated per refresh). Response time has largely been addressed in modern systems, Sweet says, but the hold time of liquid crystal displays (LCDs) and liquid-crystal-on-silicon (LCoS) devices causes blurring with moving images; in other words, when the eye tracks a moving image, the light smears across the retina. With a cathode ray tube (CRT) there is no blur but less light, she says.

Sweet referenced a paper she wrote with Dean P. Giovannetti, assistant branch chief of the Aerospace Simulation Operations Branch at NASA Ames Research Center, titled "Design of an eye limiting resolution visual system using commercial-off-the-shelf equipment."

In it they state that hold time is "due to an interaction between the display and pursuit eye movements. When the image is placed in motion, and the eye tracks the motion, the long illumination time of the pixel causes the light from a particular pixel to smear on the retina, causing blurring. The magnitude of the blur is proportional to the amount of image motion. Technologies with brief illumination times, such as cathode-ray tube (CRT) and lasers typically do not exhibit this artifact. Limiting illumination time through shuttering has been shown to reduce blur; the amount of blur reduction is proportional to the duration of the shuttering. Unfortunately, blur reduction through this method also reduces the overall brightness of the display. It is expected that an increase in refresh rate (e.g., from 60 Hz to 90 Hz or 120 Hz) will reduce blur while retaining brightness."
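The blur mechanism the paper describes can be sketched numerically: the smear on the retina is the distance the tracked image moves while a pixel stays lit. The velocity, hold-fraction, and pixel-density values below are illustrative assumptions, not measurements from the program.

```python
def blur_extent_px(velocity_deg_per_s, hold_fraction, refresh_hz, ppd):
    """Retinal smear, in pixels, for an eye tracking a moving image.

    hold_fraction is the portion of each refresh interval the pixel stays
    lit: ~1.0 for LCD/LCoS (sample-and-hold), near 0 for a CRT or laser.
    """
    hold_time_s = hold_fraction / refresh_hz
    return velocity_deg_per_s * hold_time_s * ppd

# Assumed values: 30 deg/s image motion on a 120 pixel-per-degree display.
lcd_60hz = blur_extent_px(30.0, 1.0, 60.0, 120.0)    # sample-and-hold, 60 Hz
lcd_120hz = blur_extent_px(30.0, 1.0, 120.0, 120.0)  # same panel at 120 Hz
crt_like = blur_extent_px(30.0, 0.05, 60.0, 120.0)   # brief-illumination display
```

Doubling the refresh rate halves the smear without discarding light, matching the paper's expectation that moving from 60 Hz to 120 Hz reduces blur while retaining brightness, whereas shuttering (a smaller hold fraction) trades brightness for the same reduction.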

The successful demonstration led Air Force officials to approve Sweet and her team moving on to the third phase, technology development, she says. That should be about a two-year phase as well, she adds.

In their paper Sweet and Giovannetti concluded by stating "until the holodeck design concept popularized on Star Trek is realized, no simulator visual system will replace the experience of viewing an out-the-window scene from the actual cockpit. Any visual system design will involve trade-offs between design variables" such as refresh rate vs. scene complexity or brightness vs. visual motion depiction.
