As PC gamers know, computers can generate scenery that is real enough to draw you into the action. Aircraft software also can create "synthetic" terrain by rendering stored data into a three-dimensional (3D) picture. Aided by sensors, synthetic vision (SV) can improve safety and situational awareness, not just in high-level cruise, but in high-speed, low-level maneuvers. It can improve the effectiveness of unmanned air vehicle (UAV) operations, as well. The synthetic view, for all its limitations, remains clearly visible at all times.
In some concepts, terrain depictions are sketchy because they complement a sensor image that must be seen through. In others the terrain, shown on a head-down display, looks strikingly realistic. Both approaches superimpose synthetic terrain on primary flight displays (PFDs) with head-up display (HUD)-style flight path markers and flight directors. The former approach, taken by Rockwell Collins in a HUD and head-down display demonstrated recently in Albuquerque, N.M., and Atlantic City, N.J., is designed to aid navigation. It includes synthetic terrain following and approach symbology, and integrates synthetic and infrared (IR) imagery. The latter approach, taken by Honeywell in an internal research program, is not yet intended for navigation.
Aimed at SOF
Collins recently completed evaluation flights of a prototype synthetic and enhanced vision (SE-Vision) system developed with Air Force, NASA and internal funding. The company has largely proved the technology, much of which is promising for commercial as well as military applications, says Tim Etherington, technical director for SE-Vision programs. Collins’ demonstrations capped a dual-use science and technology (DUST) program led by the Air Force Research Lab (AFRL).
Collins showcased terrain following guidance aligned with infrared imagery and other symbology for customers like the USAF Special Operations Forces (SOF). Talon II SOF pilots regularly fly MC-130Hs through narrow valleys in mountainous terrain to mask the aircraft from enemy sensors. In visual flight rules (VFR) conditions the aircraft fly down to 100 to 150 feet above terrain, and in instrument flight rules (IFR) conditions, down to 250 feet. The prototype PFD depicts aircraft bank angle on a horseshoe-shaped ground pointer at the bottom of the display, a presentation Talon II pilots are used to seeing.
Collins hopes to see its technology adopted for MC-130Hs and C-17s. The company could mature a system by 2007, says Robert Koelling, principal marketing manager for Air Force programs. The baseline of the C-130 Avionics Modernization Program (AMP) would have to be adjusted, however, to accommodate additional sensor, processing and human interface requirements. "It’s going to come down to what the best overall sensor solution is and then the augmentation of database-driven information, which is a real strength of Rockwell," he says.

Collins’ "rails," visible on the HUD and PFD, help pilots stay above the terrain. Arching upward and downward and bending left or right, the two white lines indicate the minimum safe altitude, explains Guy French, AFRL research psychologist and DUST program manager. The rails’ lateral component is programmed into the flight management system (FMS), while the vertical component is calculated in real time from terrain data and aircraft performance parameters. In the demo the rails were set at 500 feet above ground level, and pilots were to fly 150 feet above them.
The rails "emphasize that there’s a floor thou shalt not go below or you will be killed," Etherington stresses. They are not bounded at the top, so the pilot can fly as high above them as he wants. A pilot flying at 5,000 feet above a valley floor next to a 20,000-foot mountain, for example, could easily fly higher. An absolute floor is essential in terrain following, but an exact altitude to fly is not. And in terrain following, rails are better than a "tunnel" because they produce much less clutter.
When the pilot flies off the path, as often occurs in combat, another set of rails appears on the HUD (yellow on the PFD) to indicate the safe floor for the current vector. But neither the HUD nor PFD provides lateral guidance off the path. Nor, in the flight demonstration, did either provide symbology to guide pilots from the off-path rails back to the commanded rails.
Flight Demos
In Atlantic City, with custom displays installed on an FAA Boeing 727-100, the evaluation pilot used the head-down nav display for off-path lateral guidance and to return to the commanded path. As the aircraft approached the commanded path, the HUD and PFD showed the original rails, and the pilot was able to turn back onto the planned path. Collins earlier demonstrated a push-button control that activated another set of rails to lead the pilot from off-path back to the commanded path, Etherington says.
Preliminary results–announced before the Atlantic City flight–show that after one simulator session, one ground briefing and 30 minutes of in-flight practice, pilots were within 10 seconds of their planned time on target, never below 500 feet above ground level–the minimum safe floor–and were always in a position to land on their self-contained approaches (SCAs). Military pilots fly curving, precision SCAs, rather than long, straight-in approaches, to minimize exposure to enemy sensors and firepower.
The demo pilot took off from Atlantic City airport (KACY) and flew a proxy terrain following flight path, borrowed from earlier New Mexico flight tests, in the Cape May, N.J., VOR/tactical air navigation (VORTAC) area along the eastern seaboard. The pilot then flew off the path and eventually onto a 3D pathway, or tunnel, executing an SCA at Cape May. Since the runway there was a bit short for a B727, the pilot came back and executed an approach and landing at KACY. Collins’ synthetic flight tunnel, a set of four lines top and bottom, presents the whole lateral and vertical containment of the path. Using the pathway, a pilot can discern very small changes in roll angle and position, errors on the order of 10 to 15 feet (3 to 4.5 m), Etherington says.
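What the pathway buys the pilot is an immediate geometric read on error. As a rough sketch, with tunnel dimensions and names assumed rather than taken from Collins’ design, a display can compute the aircraft’s lateral and vertical offsets from the tunnel centerline and compare them with the containment box:

```python
# Hypothetical sketch of tunnel-deviation bookkeeping; geometry and names are
# assumptions for illustration, not the demonstrated system.

from dataclasses import dataclass

@dataclass
class TunnelSegment:
    center_cross_track_ft: float   # lateral center relative to the planned path
    center_altitude_ft: float      # vertical center of the tunnel
    half_width_ft: float           # half the lateral containment
    half_height_ft: float          # half the vertical containment

def deviation(seg, aircraft_cross_track_ft, aircraft_altitude_ft):
    """Return (lateral, vertical) offsets from the tunnel center and whether
    the aircraft is still inside the containment box."""
    dy = aircraft_cross_track_ft - seg.center_cross_track_ft
    dz = aircraft_altitude_ft - seg.center_altitude_ft
    inside = abs(dy) <= seg.half_width_ft and abs(dz) <= seg.half_height_ft
    return dy, dz, inside

# A 12-ft drift off center is a small fraction of the box, but it shows up
# immediately as asymmetry in the four rendered tunnel lines.
seg = TunnelSegment(0.0, 1500.0, 150.0, 100.0)
print(deviation(seg, 12.0, 1495.0))   # -> (12.0, -5.0, True)
```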
Talon II pilots routinely fly off the planned path. They program a set of waypoints into the FMS but then fly to the side of them to avoid threats and exploit features of the terrain that they detect with a high-power, narrow-beam, terrain following radar. A detailed terrain following flight plan would require thousands of waypoints and be difficult to store, Etherington explains. But Collins is considering a flight plan optimizer tool as part of its research.
Using the HUD
HUDs were an important part of the DUST program, too, as military pilots probably will use them as the primary flight reference. Collins’ HUD, like the PFD, integrates fused, dual-band IR imagery with synthetic data, overlaying but not fusing the SV and enhanced vision (EV) layers. A knob can be used to dial down the content of either information source, and it’s unlikely that both sources would be used with equal intensity at the same time. The "wireframe" terrain allows the pilot to see through the synthetic image and tell at a glance whether it lines up with actual terrain, thus gauging the accuracy of the virtual information.
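The overlay-without-fusion arrangement can be pictured as two independent layers with separate gain controls. The sketch below is only illustrative; the additive blend, the gain values and the function name are assumptions, not Collins’ HUD processing.

```python
# Illustrative sketch: keep EV (infrared) and SV (wireframe) as separate
# layers, each with its own intensity "knob," instead of fusing them.

import numpy as np

def compose_hud(ir_image, sv_wireframe, ir_gain=1.0, sv_gain=0.4):
    """Blend two grayscale layers (values 0..1) additively, clipped to 1.0.

    ir_image     -- enhanced-vision (infrared) frame
    sv_wireframe -- rendered synthetic-terrain wireframe
    ir_gain, sv_gain -- the knob settings for each source
    """
    return np.clip(ir_gain * ir_image + sv_gain * sv_wireframe, 0.0, 1.0)

# Dialing sv_gain toward zero leaves pure sensor imagery; dialing ir_gain
# down leaves the synthetic scene. Both are rarely shown at full intensity.
ir = np.random.rand(480, 640)
wireframe = np.zeros((480, 640))
wireframe[::20, :] = 1.0
wireframe[:, ::20] = 1.0
frame = compose_hud(ir, wireframe, ir_gain=0.8, sv_gain=0.3)
```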
Because the HUD needs to match up exactly with the real world, its roughly 30-degree field of view (FOV) won’t always match what’s on the PFD, which can zoom in and out. But the two devices complement each other. Since an EV sensor, such as forward looking IR, has a limited FOV and sees ahead only so far when the aircraft is flying a curving path, SV can widen the field of view. Virtual data also can be used to display an outline of the runway, which would help the pilot find the runway quickly after making the final turn before touchdown.
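A back-of-the-envelope geometric check shows why that matters on a curved path. In the hypothetical numbers below, a 20-degree sensor FOV, a 60-degree synthetic view, a 3-kilometer turn radius and a 2-kilometer look-ahead, none of them from the demonstration, a point ahead on the path falls outside the sensor’s view but well inside the synthetic one.

```python
# Assumed numbers, illustration only: how far off the nose is a point 2 km
# ahead along a curving flight path?

import math

def bearing_offset_deg(lookahead_m, turn_radius_m):
    """Bearing from the nose to a point `lookahead_m` ahead along a circular
    turn of radius `turn_radius_m` (tangent-chord geometry: half the arc angle)."""
    return math.degrees(lookahead_m / (2.0 * turn_radius_m))

offset = bearing_offset_deg(2000.0, 3000.0)     # about 19 degrees off boresight
print(offset > 20.0 / 2.0)   # True:  outside a hypothetical 20-deg IR sensor FOV
print(offset > 60.0 / 2.0)   # False: inside a wider 60-deg synthetic view
```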
AFRL’s concept is to show synthetic and enhanced vision on both head-up and head-down displays, French says. The Air Force also wants to fuse millimeter-wave radar and IR on the HUD. The New Jersey demo did not use millimeter-wave or weather radar data.
Honeywell’s Path
Honeywell’s SV concept, quite different from Collins’, is likely to reach business aviation first. The realistic mountain scene near the Reno, Nev., airport, shown on the prototype PFD, is drawn from Honeywell’s enhanced ground proximity warning system (EGPWS) database, which is backed by 250 million flight hours of service experience.
FAA has responded favorably so far. "They kind of breathed a sigh of relief when they learned that this isn’t a new database, that we’re not introducing any new errors," says Thea Feyereisen, a Honeywell research scientist for human centered systems. Nevertheless, "it’s not for navigation at this point–[the imagery] is for reference only," she says.
Still, it’s "going to do a lot for pilot vigilance," she smiles, and "you don’t have to turn it off on 2 miles and final." The first step is en-route terrain awareness, but the busy terminal environment is "where the thing is going to pay for itself," she comments.
Honeywell does not use rails, tunnels or highways in the sky on the current prototype, so the scene is relatively uncluttered. A high-performance navigation package will be required to show the pilot where he is in the scene. As the technology is still in the research phase, the company has not indicated a target date for fielding a system.
When Honeywell flew the system from Arizona to Seattle, it was "eye-opening," Feyereisen says. Data pulled from the EGPWS database is rendered into a 3D perspective view in real time. Color-coded by elevation, like a VFR sectional chart, the terrain is lifelike enough to cue the pilot intuitively to terrain hazards, yet distinct enough to avoid complacency.
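The color coding itself is straightforward in principle. The band boundaries and RGB values below are invented to suggest a sectional-style palette; they are not Honeywell’s actual rendering parameters.

```python
# Illustration only: map terrain elevation to a sectional-style color band.

SECTIONAL_BANDS = [            # (upper elevation in ft, RGB color)
    (1000,  (152, 214, 152)),  # low terrain: greens
    (3000,  (198, 226, 155)),
    (5000,  (242, 226, 162)),  # mid elevations: tans and yellows
    (8000,  (222, 188, 140)),
    (12000, (196, 154, 118)),  # high terrain: browns
]
DEFAULT_HIGH = (166, 120, 90)  # anything above the last band

def elevation_color(elev_ft):
    """Return a sectional-style RGB color for a terrain elevation."""
    for upper, color in SECTIONAL_BANDS:
        if elev_ft < upper:
            return color
    return DEFAULT_HIGH

print(elevation_color(650))    # green lowlands
print(elevation_color(9500))   # brown high terrain
```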
Basically, the PFD terrain display is "just wallpaper," on which key flight parameter data is displayed, says Feyereisen. Any symbology has to buy its way on. Honeywell sums up the technology’s usefulness in three adjectives: ambient, natural and continuous. It’s ambient because "you’re in the data–it’s all around you," says Aaron Gannon, a Honeywell human factors engineer. It’s natural because it’s presented in an "egocentric," pilot-centered view. And it’s continuous because the pilot sees it on the PFD.
It’s also intuitive. The pilot would know to pull up if the terrain is above his zero-pitch reference line or his flight path marker is down in the terrain.
Honeywell approaches synthetic vision elements from the perspective of their intended functions and associated crew actions, Gannon says. For example, what data will the crew use in coordination? What actions will they take based on SV data? Can the pilot easily tell whether the terrain to the left is above or below him? Can he tell whether accepting a certain radar vector is going to head him into the rocks? Answering such questions requires more rigor, discipline and analysis than simply claiming situational awareness, he contends.
Honeywell’s SV technology would complement EGPWS. The display lets the pilot see a hazard from farther away, Feyereisen says. How far ahead and with what field of view are crucial questions still under investigation. Whether the software can run on existing Primus Epic hardware or would require a new graphics processor is another question. So far the software has been prototyped on four displays, including the roughly 13-by-10-inch display used on PlaneView-equipped Gulfstreams and the 8-by-10-inch display used in the Cessna Sovereign.
Honeywell has not yet integrated sensor data for real-time database validation, but that’s in the business plan, Feyereisen says. The company is taking an incremental "stepping stone" approach, she adds. The ultimate goal is to take controlled flight into terrain (CFIT) accidents out of the equation.
Like Collins, Honeywell is able to change the synthetic field of view. In cruise, for example, a 60-degree FOV would be nice, but when the pilot is coming in to land, a narrower, more conformal view would be better, Feyereisen says. The FOV would be tailored to the aircraft manufacturers’ specifications. Other issues include alerts and monitors of aircraft position, architectural design and database updates.
Managing UAV Sensors
Synthetic vision (SV) has obvious applications to unmanned air vehicles (UAVs). The Air Force Research Lab (AFRL) is developing and evaluating synthetic vision concepts to improve UAV sensor operators’ situational awareness and decision making. These officers don’t control the vehicles, but point the cameras, search for targets, react to events, and carry out assignments from above, making many decisions in real time. Synthetic vision clearly will benefit pilots, too, and AFRL is working on concepts such as highway in the sky.
There’s no doubt operator interfaces can be improved, says Mark Draper, a senior engineering research psychologist with the lab’s Human Effectiveness Directorate. A "good percent" of UAV mishaps are associated in some fashion with human interface and operator errors.
The directorate has a collaborative agreement with Rapid Imaging Software, Albuquerque, N.M., under which AFRL uses the company’s SmartCam 3D SV software to evaluate the impact of new display concepts on mission effectiveness. The evaluations run on a simulator "loosely patterned" after the Predator’s ground control station. SV concepts also have been flight tested on UAVs in restricted U.S. airspace.
The SmartCam 3D software uses "spatially relevant" information from databases, such as terrain models, to generate synthetic markers of features or points of interest like buildings and overlay them conformally on dynamic video imagery. SV technology can highlight an emergency runway, for example. "It can increase the signal-to-noise ratio of the imagery the [sensor] operator is viewing," Draper says.
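Conformal overlay amounts to projecting a geo-referenced database point into the camera’s image. The sketch below assumes the point has already been rotated into a camera-centered frame and uses a simple pinhole model with made-up field-of-view and image-size values; it is not Rapid Imaging’s code.

```python
# Illustrative pinhole projection of a database point of interest onto the
# sensor video frame. Assumes the world-to-camera rotation is already applied.

import math

def project_point(p_cam, image_w=640, image_h=480, fov_deg=30.0):
    """p_cam = (x right, y down, z forward) in meters, camera frame.
    Return pixel (u, v), or None if the point is behind the camera."""
    x, y, z = p_cam
    if z <= 0.0:
        return None                        # behind the sensor: don't draw it
    f = (image_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)  # focal length, px
    u = image_w / 2.0 + f * x / z
    v = image_h / 2.0 + f * y / z
    return u, v

# A building 800 m ahead and 60 m right of the camera boresight:
print(project_point((60.0, 5.0, 800.0)))
```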
He also envisions symbology with no physical counterpart. Though AFRL has yet to perform empirical research in this area, it would be possible for the sensor operator to "see" through the ground into a tunnel complex if an accurate template is loaded into the vehicle’s database. The entrance to the tunnel, located on the other side of a mountain from the UAV, could be presented as a "grayed out" area, so the operator would know it’s on the other side. A sensor camera would not help in that situation.
The SV software can update the screen icons in real time, typically 30 times a second. But things get complicated if an information source, such as a positional beacon from a friendly ground vehicle, transmits at a slower rate, say once every 1 to 2 seconds. The synthetic image may not be registered properly and may lag behind the real-time video imagery, disorienting the operator. Or misleading information could be displayed. The friendly vehicle’s icon would be correct the moment it is updated, but until the next update it would stay at the same location on the screen while the ground vehicle continues to move. When the vehicle sends its next update, the icon would suddenly jump to the correct location. Researchers are varying the rate at which symbology is overlaid on the sensor imagery to determine the best rate.
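A toy simulation makes the lag concrete. With assumed numbers, a 30 Hz display, a 2-second beacon period and a 15 m/s ground vehicle, none drawn from the AFRL testbed, the icon error grows toward 30 meters between reports and snaps back to zero at each one:

```python
# Toy illustration of the registration lag described above; parameters are
# assumptions, not the AFRL simulator's values.

RENDER_HZ = 30
BEACON_PERIOD_S = 2.0
VEHICLE_SPEED_MPS = 15.0                      # friendly vehicle's ground speed
frames_per_report = int(BEACON_PERIOD_S * RENDER_HZ)

last_report_pos = 0.0                         # position in last beacon message (m)
for frame in range(121):                      # four seconds of display frames
    t = frame / RENDER_HZ
    true_pos = VEHICLE_SPEED_MPS * t
    if frame % frames_per_report == 0:        # a new beacon report arrives
        last_report_pos = true_pos
    icon_error = true_pos - last_report_pos   # grows until the next report
    if frame % 15 == 0:                       # print twice per second
        print(f"t={t:4.1f}s  icon lags truth by {icon_error:5.1f} m")
```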
Other issues include:
Over-reliance on computer-generated symbology, to the extent that the operator might miss something important on the live video;
Information view management–how to display information and symbology without overly cluttering the display; and
Decluttering–how to take away information in a simple and intuitive manner.
"You want to keep [SV renderings] as minimalist as possible," Draper declares.