What must a 21st-century tactical aircraft incorporate to satisfy the needs of the U.S. Air Force, Navy and Marine Corps and international customers seeking a multimission air vehicle? The short answer is plenty of onboard and offboard data collection, processing and fusion. The long answer emerges from a close look at the Joint Strike Fighter’s (JSF’s) design.
The stealthy, supersonic fighter, designated the F-35, is expected to replace U.S. F-16s, A-10s, F/A-18A/B/C/Ds, F-14s, and AV-8Bs, as well as UK GR7s and Sea Harriers. The U.S. Air Force wants to buy 1,763 Joint Strike Fighters; the U.S. Navy and Marines, 680; the Royal Air Force, 90; and the Royal Navy, 60. First flight of the conventional takeoff/landing (CTOL) version is expected in 2005. CTOL, short takeoff/vertical landing (STOVL), and carrier-capable versions will feature "high 90 percent" avionics commonality.
The affordability, size and mission goals for an aircraft developed with funding from eight countries, as well as the United States, have dictated unprecedented sensor overlap and processing centralization. The electronically scanned radar array, under the control of mission systems software, will be able to perform electronic warfare (EW) functions, and the EW system will share some com/nav/identification (CNI) apertures. The JSF’s infrared (IR) sensors will use detector/cooling assemblies of a common design. Integration also means the use of common modules wherever possible, both in the integrated core processor (ICP) and in other key systems, as well as the use of a 2-gigabit/sec Fibre Channel backbone for high-speed communications between the ICP and the sensors, CNI system and displays.
Designers intend integration and cooperation to drive breakthrough situational awareness. Data from radar, electro-optical, EW and CNI sensors–not to mention offboard systems–will be fused by mission systems software and presented to the pilot as an intuitive tactical picture on a panel-wide head-down display. A helmet-mounted display system (HMDS) will project the IR picture and urgent tactical, flight and safety symbology onto the pilot’s visor and provide high-angle, off-boresight targeting.
Inputs from six common distributed aperture system (DAS) sensors are designed to create a 360-degree protective IR sphere around the airplane, providing the pilot approximately 20/40 visual acuity and allowing airplanes to fly in closely spaced nighttime combat spreads. The pilot will be able to look down to "see" the scene below the aircraft, through darkness, smoke and dust, projected on the helmet visor. DAS, the latest in IR-based missile warning and situational awareness tools, is complemented by EOTS, the internally mounted electro-optical targeting system. EOTS provides a smaller field of view but longer-range targeting. Under the command of the mission software, EOTS could provide range to a target without turning on the radar.
Fourth-Generation Radar
The F-35’s fourth-generation active electronically scanned array (AESA) radar is designed to reduce by half the cost and weight of third-generation technology, deployed in emerging platforms such as the F/A-22. The JSF radar, for example, uses "twinpack" T/R modules, consolidating two into one package. The AESA system’s lifespan is projected to be "well over" 8,000 hours, the typical life of a fighter aircraft, says Robert Thompson, director of JSF combat avionics for radar developer Northrop Grumman Electronic Systems.
In air-to-surface operations the radar will support functions such as synthetic aperture radar (SAR) ground mapping and inverse SAR for ship classification. In air-to-air operations, the sensor will support features such as cued search, passive search and multitarget, beyond-visual-range tracking and targeting. Because the beam can move from point to point in millionths of a second, a single target can be viewed as many as 15 times a second.
JSF’s powerful sensor suite will allow the aircraft to assume an active role in the tactical "infosphere," company officials assert. "The tactical fighter used to be at the end of the food chain," receiving information from special-purpose sensor aircraft, Thompson says. But it became obvious, from the quality of JSF sensor data and the number of planes to be fielded, that they will be "a major feed of tactical information."
The sensors have gone through preliminary design review (PDR) and are heading toward critical design review (CDR) over the next six months. Critical design work, on the hardware side, emphasizes areas such as component reliability, cost and ruggedness, and final board layouts.
Wrapping Sensors Up
Mission systems software, still in early development, will be key to the F-35’s success, sifting, fusing and presenting sensor data "so that it is inherently obvious to the pilot what the course of action should be," asserts Jon Waldrop, director of international programs for prime contractor Lockheed Martin. The software "wraps [the sensors] up into a functional architecture that allows them to smartly work together, cross-cue and take advantage of fused information to help the pilot," Thompson explains.
The crucial data fusion function has been identified as a program-level risk, which means that senior officials will track its progress, says Air Force Lt. Col. Jim Baker, F-35 mission systems lead. A risk-reduction effort is under way. Flight testing was scheduled to commence in August or September, using current versions of the radar and EOTS system on Northrop Grumman’s BAC-111 test aircraft, according to Steve Foley, tactical information systems lead with the JSF program office.
"The government pushed on Lockheed to start fusion flying early," Thompson says. The idea is to look at baseline algorithms, prove out algorithm development and simulation tools, and confirm basic architectural concepts, explains John Harrell, Lockheed Martin’s tactical information systems lead. The risk reduction flight program is expected to run about six months, with analysis of the results feeding into on-going fusion algorithm studies.
The approximately 4.5 million lines of mission systems code will be developed in block upgrades. Early versions of data fusion algorithms will be examined in the risk reduction program. "Fusion really starts hitting in 2007, when we start doing fusion of all onboard sensors," Harrell says. Fusion capabilities will continue to increase with the Block 3 mission software release to flight test in mid-2010, adding information from offboard sources.
Mission systems functions are organized around the concept of a continuous "OODA loop," which stands for observe, orient, decide and act. Sensors and data links will acquire data, which will be fused in the ICP, activating tactical decision aids–or "planners." Search, attack, avoidance and denial planner modules would work simultaneously on the fused data, producing action plans for the pilot.
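In software terms, the cycle amounts to fusing observations into a single track picture and letting several planner modules work that picture simultaneously. The minimal Python sketch below illustrates only that idea; the class and function names are notional assumptions, not the actual JSF mission software.

```python
# Illustrative sketch of the OODA-style planner dispatch described above.
# All names (FusedTrack, SearchPlanner, etc.) are notional, not actual JSF software.
from dataclasses import dataclass

@dataclass
class FusedTrack:
    track_id: int
    kind: str          # e.g. "air", "surface", "emitter"
    is_threat: bool

class SearchPlanner:
    name = "search"
    def plan(self, tracks):
        # Propose where to look for targets not yet flagged as threats.
        return [f"search track {t.track_id}" for t in tracks if not t.is_threat]

class AvoidPlanner:
    name = "avoid"
    def plan(self, tracks):
        # Propose evasion cues for tracks flagged as threats.
        return [f"avoid track {t.track_id}" for t in tracks if t.is_threat]

def ooda_cycle(fused_tracks, planners):
    """Observe/orient: take the fused picture; decide: run every planner on it;
    act: return the recommendations for presentation to the pilot."""
    return {p.name: p.plan(fused_tracks) for p in planners}

picture = [FusedTrack(1, "surface", False), FusedTrack(2, "emitter", True)]
print(ooda_cycle(picture, [SearchPlanner(), AvoidPlanner()]))
```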
The search planner is intended to help pilots locate targets. This software application would look, for example, at all the possible places where a column of tanks could be, based on factors such as the last sighting, the road network, terrain and the speed of the vehicles.
Although the details of pilot/software interaction are far from mature this early in the program, Baker describes one search planner scenario. The software module would ask the group leader–digitally or audibly–how many F-35s are on the mission. If the lead says, or indicates, "four," a grid would pop up to show where each wingman should be for optimal searching. Similarly, the search planner would overlay the possible locations of the tank column on a map for the pilots in the JSF formation.
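The grid hand-out itself reduces to a simple partitioning problem. Below is a hedged sketch that assumes a precomputed list of candidate search cells; the hard part, deriving those cells from the last sighting, road network, terrain and vehicle speed, is not shown.

```python
# Notional example only: splitting candidate search cells among the ships in a flight.
def assign_search_cells(candidate_cells, n_aircraft):
    """Round-robin assignment of candidate cells to each aircraft in the formation."""
    assignments = {ship: [] for ship in range(n_aircraft)}
    for idx, cell in enumerate(candidate_cells):
        assignments[idx % n_aircraft].append(cell)
    return assignments

# A four-ship flight splitting twelve candidate grid cells.
cells = [f"{row}{col}" for row in "ABC" for col in "1234"]
print(assign_search_cells(cells, 4))
```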
After the tanks have been located, the attack planner could plan the ingress route, assess the vulnerability of the tanks, and indicate where the wingmen should be. While these tasks are proceeding, a "fast track" process would send any high-priority threat information directly to the pilot, who would determine, with the help of an "avoid planner," the evasion route. Although still a long way from realization, these processes would execute in fractions of a second, permitting pilots in a multiship formation to counter or avoid multiple threats and at the same time attack multiple targets.
Lockheed plans to hold several "pilot simulation events" to evaluate the mechanization and utility of these functions, i.e., what the pilot can do well and what is best handled by onboard computers.
A portable memory device from Smiths Aerospace–designed to provide hundreds of gigabytes of nonvolatile storage–will help the pilot load mission plan data and record video and other information in flight. Smiths also will provide a second, permanently installed mass memory device and an airborne file server.
Core Processor
Hosting the mission systems software is the JSF’s electronic brain, the ICP. Packaged in two racks, with 23 and eight slots, respectively, this computer consolidates functions previously managed by separate mission and weapons computers, and dedicated signal processors. At initial operational capability, the ICP data processors will crunch data at 40.8 billion operations/sec (giga operations, or GOPS); the signal processors, at 75.6 billion floating-point operations/sec (gigaflops, or GFLOPS); and the image processors, at 225.6 billion multiply/accumulate operations/sec (GMACS), a specialized signal processing measure, reports Chuck Wilcox, Lockheed’s ICP team lead. The design includes 22 modules of seven types (tallied in the sketch following this list):
Four general-purpose (GP) processing modules,
Two GPIO (input/output) modules,
Two signal processing (SP) modules,
Five SPIO modules,
Two image processor modules,
Two switch modules, and
Five power supply modules.
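For reference, the module complement and throughput figures quoted above can be restated as a simple inventory; the numbers below come only from this article, and the Python form is purely illustrative.

```python
# ICP module complement as listed above: 22 modules of seven types.
modules = {
    "general-purpose processor": 4,
    "general-purpose I/O": 2,
    "signal processor": 2,
    "signal processor I/O": 5,
    "image processor": 2,
    "switch": 2,
    "power supply": 5,
}
assert sum(modules.values()) == 22 and len(modules) == 7

# Initial operational capability throughput figures quoted in the text.
throughput = {"data processing (GOPS)": 40.8,
              "signal processing (GFLOPS)": 75.6,
              "image processing (GMACS)": 225.6}

# "Pluggable growth" headroom noted below: eight more digital modules plus a power supply.
growth = {"digital processing modules": 8, "power supply": 1}
```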
The ICP also will have "pluggable growth" for eight more digital processing modules and an additional power supply, Wilcox adds. It uses commercial off-the-shelf (COTS) components, standardizing at this stage on Motorola G4 PowerPC microprocessors, which incorporate 128-bit AltiVec technology. The image processor uses commercial field programmable gate arrays (FPGAs) and the VHDL hardware description language to form a very specialized processing engine. The ICP employs the Green Hills Software Integrity commercial real-time operating system (RTOS) for data processing and Mercury Computer Systems’ commercial Multi-computing OS (MCOS) for signal processing. Depending on processing trades still to be made in the program, the JSF also could use commercial RTOSs in sensor front ends to perform digital preprocessing, according to Baker. The display management computer and the CNI system also use the Integrity RTOS. COTS reduces development risk and ensures an upgrade path, according to Ralph Lachenmaier, the program office’s ICP and common components lead.
Tying the ICP modules together like a backplane bus and connecting the sensors, CNI and the displays to the ICP is the optical Fibre Channel network. Key to this interconnect are the two 32-port ICP switch modules. The 400-megabit/sec IEEE 1394B (Firewire) interconnect is used externally to link the ICP, display management computer and the CNI system to the vehicle management system.
Low-level processing will occur in the sensor systems, but most digital processing will occur in the ICP. The radar, for example, will have the smarts to generate waveforms and do analog-to-digital conversion. But the radar will send target range and bearing data to the ICP signal processor, which will generate a report for the data processor, responsible for data fusion. Radar data, fused with data from other onboard and offboard systems, then will be sent from the ICP to the display processor for presentation on the head-down and helmet-mounted displays.
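Sketched as a pipeline, the division of labor reads roughly as follows. The function and field names are entirely notional; this illustrates the flow described above, not the actual software interfaces.

```python
# Notional sketch of the processing chain: low-level work in the radar front end,
# signal processing and fusion in the ICP, then hand-off to the display processor.
def radar_front_end(digitized_returns):
    """Radar generates waveforms and does A/D conversion, then ships range/bearing data."""
    return [{"sensor": "radar", "range_nm": r, "bearing_deg": b} for r, b in digitized_returns]

def icp_signal_processor(radar_data):
    """Turns digitized radar data into track reports for the data processor."""
    return [{"report": "track", **d} for d in radar_data]

def icp_data_processor(onboard_reports, offboard_reports):
    """Fuses onboard and offboard reports into a single tactical picture."""
    return onboard_reports + offboard_reports

def display_processor(fused_picture):
    """Formats the fused picture for the head-down and helmet-mounted displays."""
    return [f"{t['sensor']}: {t.get('range_nm', '?')} nm / {t.get('bearing_deg', '?')} deg"
            for t in fused_picture]

reports = icp_signal_processor(radar_front_end([(42.0, 310.0)]))
offboard = [{"sensor": "link-16", "range_nm": 60.0, "bearing_deg": 295.0}]
print(display_processor(icp_data_processor(reports, offboard)))
```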
EW System
The electronic warfare suite, integrated by BAE Systems, includes:
All-aspect radar warning capability, supporting analysis, identification, tracking, mode determination and angle of arrival (AOA) of mainbeam emissions, plus automatic direction finding for correlation with other sensors, threat avoidance and targeting information;
Defensive threat awareness and offensive targeting support–acquisition and tracking of main beam and side lobe emissions, beyond-visual-range emitter location and ranging, emitter ID and signal parameter measurement;
A multispectral countermeasures suite with countermeasures response manager function, standard chaff and flare rounds; and
Passive EW apertures.
The EW suite complements the field-of-view and frequency coverage of the radar by providing complete coverage around the aircraft at a wider frequency range. Passive radar warning system apertures–at three different frequency ranges–are embedded in the skin of leading and trailing wing edges and horizontal tail surfaces. The EW system also can use the radar antenna for electronic support measures (ESM). Expected mean time between failure (MTBF) is 440 hours.
The radar warning system is active all of the time, providing both air and surface coverage. Packaged in two electronics racks, it includes cards for radar warning, direction finding and ESM. The system uses DAS inputs directly, as well as fused inputs from the ICP. Digital processing allows reprogramming and increases reliability.
Vehicle Management System
One of the most important non-ICP processing functions is the vehicle management system, which handles flight control and utility systems such as fuel management and electrical and hydraulic system controls. BAE Systems designed the vehicle management computer (VMC); each aircraft carries three, connected via an IEEE 1394B bus. About the size of a shoe box, each computer contains a processor card, an I/O card and a power supply card.
All three VMCs process data simultaneously, constantly comparing results across channels to assure data integrity. In the case of divergent data, two processors can "vote" one processor or signal out, explains Bill Dawson, JSF program manager for BAE Systems Aerospace Controls.
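A minimal sketch of that two-out-of-three voting is shown below, assuming a single numeric channel output and a fixed tolerance; actual flight-control redundancy management is considerably more involved than this illustration.

```python
# Minimal sketch of two-out-of-three voting across redundant channels.
def vote(channel_outputs, tolerance=1e-3):
    """Return the agreed value and the index of any out-voted channel (or None)."""
    a, b, c = channel_outputs
    if abs(a - b) <= tolerance:
        outvoted = None if abs(a - c) <= tolerance else 2
        return (a + b) / 2, outvoted
    if abs(a - c) <= tolerance:
        return (a + c) / 2, 1
    if abs(b - c) <= tolerance:
        return (b + c) / 2, 0
    return None, None  # no two channels agree; escalate to a higher-level response

print(vote([10.000, 10.001, 13.7]))  # -> (10.0005, 2): channel 2 is voted out
```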
Interfacing to the VMCs are remote I/O units provided by Smiths. These devices–10 per aircraft–are an integral part of the vehicle management network, receiving flight control and other inputs from hundreds of digital, analog and discrete sources, processing the data and outputting the results to the VMCs over the 1394 bus.
Head-Down and Helmet Displays
The Joint Strike Fighter’s flight deck display moves beyond the F/A-22’s multifunction display-type layout to a single, panoramic, 8-by-20-inch viewing area, the largest ever in a fighter aircraft. Developed by Rockwell Collins (Kaiser Electronics), the multifunction display system (MFDS) comprises two adjacent 8-by-10-inch projection displays, each with a resolution of 1280-by-1024 pixels. Each half is fully functional, so the system can continue to operate if one half fails.
The MFDS will present sensor, weapons and aircraft status data, plus tactical and safety information. The viewing area can be presented as a large tactical horizontal situation display or be divided into multiple windows.
Functions are accessed and activated by touch–the first touch screen on a large-format display–or by hands-on-throttle-and-stick (HOTAS) commands. Each 8-by-10-inch area has an integrated display processor for low-level data crunching and a "projection engine" to cast the image onto the screen. The MFDS uses micro-active matrix liquid crystal display (LCD) image sources–three per projection engine–illuminated by arc lamps. Collins will provide the display drivers and the first layer of services software, which Lockheed Martin will use to implement display applications.
Collins will procure glass commercially, tempering it with proprietary chemical processes. The Collins display processor circuit card assembly design also is used for the display management computer-helmet (DMC-H). The assembly uses Collins application-specific integrated circuits (ASICs), as well as commercial processors, memory and graphics chips. Flight qualification and safety testing will begin once initial display systems are delivered in the second quarter of 2004. Standby 3-by-3-inch active matrix LCD flight displays are provided by Smiths Aerospace.
The F-35’s helmet-mounted display system (HMDS) will replace the traditional head-up display (HUD), "allowing for a tremendous cost savings and, more importantly, weight reduction," asserts Louis Taddeo, director of business development with HMDS designer, Vision Systems International (VSI). VSI is a joint venture partnership between Collins and EFW Inc., an Elbit Systems Ltd. subsidiary.
The HMDS uses a combination of electro-optics and head position and orientation tracking software algorithms to present critical flight, mission, threat and safety symbology on the pilot’s visor. The system allows the pilot to direct aircraft weapons and sensors to an area of interest or issues visual cues to direct the pilot’s attention, Taddeo explains. The HMDS comprises the helmet-mounted display, DMC-H, and helmet tracking system. VSI also supplies the joint helmet-mounted cueing systems used on the F-15 and F/A-18E/F aircraft.
Fundamental requirements for the HMDS include visor-projected, binocular, wide field-of-view, high-resolution, near real-time imagery and symbology; equivalent accuracy to head-up display systems; 24-hour usability; and fit, comfort and safety during ejection. Proper weight and balance are crucial in minimizing pilot fatigue resulting from high-g maneuvers and reducing head and neck loads in ejections, Taddeo stresses. The F-35 helmet is expected to weigh 4.2 pounds (1.9 kg).
The F-35’s HMDS employs a flat panel, active matrix LCD, coupled with a high-intensity back light, as its image source. The partially overlapped display provides a binocular image 50 degrees wide by 30 degrees high.
The digital image source provides both symbol writing and video capability. The system includes a clear, optically coated visor for night operations and a shaded visor for daylight operations. Imagery is provided via the distributed aperture system (DAS) or a helmet-mounted day/night camera. F-35 pilots can select imagery and symbology via HOTAS commands.
F-35’s CNI System
The two-rack communications, navigation and identification (CNI) system processes waveforms internally, sending high-level status data to the core processor. The CNI system is designed to provide functions such as beyond-visual-range identification friend-or-foe (IFF); secure, multichannel, multiband voice communications; and intraflight data link (IFDL) exchanges, synchronizing the displays of multiple aircraft. The CNI suite will support 35 different com, nav and identification waveforms–about 5 pounds (2.26 kg) per waveform function, compared with the legacy black box approach of 10 to 30 pounds (4.54 to 13.6 kg), or more, per waveform, according to Frank Flores, JSF program director for Northrop Grumman Radio Systems.
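The weight claim is easy to bound from the figures quoted: roughly 175 pounds for 35 integrated waveform functions, against 350 to 1,050 pounds if each waveform needed its own legacy box. The quick check below assumes, purely for illustration, one legacy box per waveform.

```python
# Back-of-the-envelope check of the weight figures quoted above.
# Assumption for illustration only: the legacy approach would need one box per waveform.
n_waveforms = 35
integrated_lb = n_waveforms * 5                    # ~5 lb per integrated waveform function
legacy_lb = (n_waveforms * 10, n_waveforms * 30)   # 10 to 30 lb per legacy box
print(integrated_lb, legacy_lb)                    # 175 (350, 1050)
```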
Software-defined radio technology means that the suite can provide numerous radio functions–ranging in frequency from VHF to K band–from a set of more general-purpose module types (a notional waveform-to-module mapping is sketched after this list), including:
Wideband RF module, performing analog-to-digital conversion, waveform processing and digital signal processing.
Dual-channel transceiver module, which can receive and digitize waveforms over a wide frequency band and generate waveforms for transmission, driving power amplifiers. This module supports most of the 35 waveforms.
Frequency-dependent power amplifiers, including L-band, VHF/UHF, and higher-frequency units.
Power supply module.
CNI processor module, which performs signal processing, data processing and comsec processing.
Interface module.
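To make the "general-purpose modules, many waveforms" idea concrete, the sketch below maps two of the suite's baseline waveforms onto chains drawn from these module types. The chains shown are assumptions for illustration; the actual channel assignments are not described here.

```python
# Notional mapping of waveforms onto the general-purpose CNI module types listed above.
MODULE_TYPES = {"wideband RF", "dual-channel transceiver", "L-band power amplifier",
                "VHF/UHF power amplifier", "CNI processor", "interface", "power supply"}

# Illustrative chains only; waveform names come from the article.
WAVEFORM_CHAINS = {
    "Link 16": ["dual-channel transceiver", "L-band power amplifier", "CNI processor"],
    "VHF/UHF voice": ["dual-channel transceiver", "VHF/UHF power amplifier", "CNI processor"],
}

def build_radio(waveform):
    """Compose a waveform function from general-purpose module types."""
    chain = WAVEFORM_CHAINS[waveform]
    assert all(module in MODULE_TYPES for module in chain)
    return f"{waveform}: " + " -> ".join(chain)

for wf in WAVEFORM_CHAINS:
    print(build_radio(wf))
```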
Northrop Grumman developed middleware, located between the operating system and the applications. This layer of software is designed to allow smooth system growth and compatibility with Joint Tactical Radio System (JTRS) waveforms.
The CNI suite uses Green Hills Software’s Integrity commercial real-time operating system, PowerPC processors, field programmable gate arrays and digital signal processors. Radio Systems is streamlining the design to minimize footprint.
Some of the suite’s baseline functions include: VHF/UHF voice, HaveQuick I/II, Saturn (HQ IIA), satcom T/R, IFF/SIF (selective ID feature) transponder, IFF Mode IV interrogator, ILS/MLS/Tacan, IFDL, Link 16 T/R, Link 4A, tactical data information link (TADIL-K), 3-D audio, TACFIRE/Air Force applications program development (AFAPD), and ADS-B.