EOTS
While the AESA radar provides the F-35 pilot with an all-weather, active targeting sensor, the electro-optical targeting system incorporates day/night passive sensors that cannot be detected by enemy warning systems. Providing high-resolution infrared imagery, software-enhanced through signal processing, EOTS can give the F-35 pilot a closer look at the target area initially detected by the radar. "If you see [with the radar] a ground object of interest, you can pass the coordinates to the EOTS, to zoom in for long-range confirmation," says Porter.
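The handoff Porter describes amounts to passing radar-derived coordinates to the electro-optical sensor as a cue. A minimal sketch, with hypothetical type and function names standing in for the actual mission software interface:

```cpp
// Hypothetical sketch of the radar-to-EOTS cueing handoff; all names
// here are illustrative, not the actual F-35 mission software interface.
struct GeoCoord { double lat_deg, lon_deg, alt_m; };  // radar-derived target position

struct EotsCue {
    GeoCoord aimpoint;   // where EOTS should stare
    double   fov_deg;    // narrow field of view for long-range confirmation
};

// Hand the radar track's coordinates to EOTS and request a narrow field
// of view, "zooming in" on the object of interest.
EotsCue cueFromRadarTrack(const GeoCoord& radarTrack) {
    return EotsCue{radarTrack, /*fov_deg=*/1.0};      // 1-degree FOV is illustrative
}
```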
"The two sensors complement each other," adds Mike Skaff, F-35 pilot/vehicle interface lead with Lockheed Martin. "Radar works well in weather, while EOTS is good for targeting, especially air-to-surface targeting, because of its high definition."
EOTS is part of the F-35's electro-optical sensor system (EOSS), developed by Orlando, Fla.-based Lockheed Martin Missiles and Fire Control, which also serves as lead in the EOTS's development, with support from BAE Systems. EOTS incorporates a targeting laser, a TV camera and a third-generation infrared sensor.
What makes the internally mounted EOTS unique is that it is not turret-mounted. Instead, its automatically boresighted sensor is positioned behind a glass-like sapphire window that blends into the F-35's nose, just under the radar antenna, where it is easily accessible to maintenance personnel. The system's capabilities are comparable to those of the low altitude navigation and targeting infrared for night (LANTIRN) system, but without the aerodynamic drag of a pod. (EOTS technology derives from Lockheed's Sniper targeting pod.)
The EOTS incorporates an air-to-surface forward-looking infrared (FLIR) tracker and an air-to-air infrared search and track (IRST) system. It also features a single-aperture design and an advanced, third-generation focal plane array, as well as a "spot tracker" capable of following a laser beam directed by a remote source.
EOTS has been sent to Lockheed Martin's F-35 mission systems integration lab in Fort Worth for testing in simulated scenarios.
The DAS
Like EOTS, the F-35's distributed aperture system is incorporated in the fuselage design and does not require a pod. Six IR cameras--Porter calls them situation awareness "eyeballs" that create a flying "IMAX"--are embedded in the aircraft, positioned to provide full spherical imagery around the aircraft.
The IR sensors form an integrated detector assembly inside each camera. Together they build a total, passive picture around the aircraft, with "all of the information all the time," according to Porter. They give the F-35 pilot missile approach warning, countermeasures deployment, passive air-to-air radar, off-axis targeting for air-to-air missiles, and wide field-of-view day/night pilot vision. With off-axis targeting the pilot can assign a display of interest to the helmet-mounted display (HMD), point his head at the intended target, designate and shoot. Providing through-the-cockpit-floor viewing, the DAS even assists the pilot in landing the aircraft. It can be used for after-dark bomb damage indication (BDI), too, offering the pilot an alternative to night vision goggles for damage assessment.
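Off-axis targeting, in essence, turns the pilot's head orientation into a line of sight the weapon system can follow. A simplified geometric sketch, assuming head azimuth and elevation in the aircraft body frame (the names are illustrative, not the actual HMD interface):

```cpp
// Minimal off-axis designation geometry; hypothetical, not the real HMD API.
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Convert the pilot's head azimuth/elevation (radians) into a unit
// line-of-sight vector that a missile seeker can be slaved to.
Vec3 headLineOfSight(double az_rad, double el_rad) {
    return { std::cos(el_rad) * std::cos(az_rad),   // x: forward
             std::cos(el_rad) * std::sin(az_rad),   // y: right
            -std::sin(el_rad) };                    // z: down (body frame)
}
```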
Also in the system development and demonstration (SDD) phase, the DAS is being flight-tested on an Air Force F-16 at Edwards AFB, Calif.
Each of the F-35's sensors provides powerful situation awareness and targeting information, but their integration into a single, fused image makes them even more powerful, without burdening the pilot with information overload. Each sensor has its own processor to automatically determine the appropriate modes, acquire targeting data, and deliver the imaging data over a Fibre Channel backbone bus to the integrated core processor (ICP), where it is fused to present a clear, comprehensive picture of the target and its setting.
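One common way to fuse overlapping sensor estimates, shown here purely as an illustration rather than as the ICP's actual algorithm, is to weight each sensor's contribution by its confidence:

```cpp
// Illustrative fusion step: each sensor delivers a position estimate
// with a variance, and the fused estimate is the inverse-variance
// weighted average of the contributions.
#include <vector>

struct SensorReport { double pos; double var; };  // 1-D position + variance

double fuse(const std::vector<SensorReport>& reports) {
    double num = 0.0, den = 0.0;
    for (const auto& r : reports) {     // weight each report by 1/variance
        num += r.pos / r.var;
        den += 1.0 / r.var;
    }
    return num / den;                   // minimum-variance combined estimate
}
```

A more accurate sensor (smaller variance) thus pulls the fused answer toward its own measurement, which is why adding EOTS imagery to a radar track sharpens, rather than clutters, the picture.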
To analyze and prioritize the incoming targeting data, the ICP uses algorithms dedicated to the various tasks: air-to-air, air-to-ground and the target identification received from the communications, navigation and identification (CNI) suite. These algorithms are distinct from those used to fuse other onboard data--for example, the GPS and inertial nav data for a comprehensive navigation picture.
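As an illustration of that navigation-side fusion, a one-axis complementary-filter sketch blends high-rate inertial data with GPS fixes; the structure and gain are assumptions for illustration, not the F-35's algorithm:

```cpp
// Toy GPS/inertial blend: propagate with inertial data at high rate,
// then nudge toward the GPS fix with a small gain so GPS bounds the
// long-term inertial drift.
struct NavState { double pos, vel; };

NavState blend(NavState s, double accel, double gpsPos, double dt) {
    s.vel += accel * dt;             // integrate inertial acceleration
    s.pos += s.vel * dt;             // dead-reckoned position
    const double k = 0.05;           // correction gain (illustrative value)
    s.pos += k * (gpsPos - s.pos);   // GPS correction term
    return s;
}
```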
The fused targeting data can be overlaid onto a battlefield situation display that the F-35 pilot has received via uplink from a ground base or another aircraft. The intent of these features is to produce battle scene awareness that supports an observe, orient, decide and act (OODA) sequence for F-35 pilots.
In a typical scenario the pilot would first detect a target beyond visual range in a predominantly radar image on the multifunction display (MFD). As the target gets closer, the EOTS imagery automatically sharpens the picture of the target on the MFD. At this point the pilot assesses an operational picture of the battle space, evaluates the threat responses, rapidly plans a route for minimum exposure and maximum weapon effectiveness, and determines the best choice of weapon.
Once he has made his decision to attack the target, the pilot would switch from the head-down display to the head-up display in his helmet-mounted visor. "You look at the two displays like wearing bifocals," says a Northrop official. In addition to presenting a center cross that locks onto the target for a point-and-shoot capability, the HMD presents the status of available weapons, a symbol for IFF and indications of the target's range, closure and velocity. With most of the target detection and presentation achieved automatically, the OODA process, from acquisition to destruction, can be completed within the few minutes that Gen. Jumper set as a goal for engagement.
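The range and closure figures on the HMD follow from simple relative kinematics: closure is the negative rate of change of range, i.e., the relative velocity projected onto the line of sight. A sketch, with invented state types:

```cpp
// Illustrative range/closure computation; not the actual F-35 symbology code.
#include <cmath>

struct State { double px, py, pz, vx, vy, vz; };   // position and velocity

double range(const State& own, const State& tgt) {
    double dx = tgt.px - own.px, dy = tgt.py - own.py, dz = tgt.pz - own.pz;
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

// Closure is the negative range rate: positive when the aircraft are closing.
double closure(const State& own, const State& tgt) {
    double dx  = tgt.px - own.px, dy  = tgt.py - own.py, dz  = tgt.pz - own.pz;
    double dvx = tgt.vx - own.vx, dvy = tgt.vy - own.vy, dvz = tgt.vz - own.vz;
    return -(dx*dvx + dy*dvy + dz*dvz) / range(own, tgt);
}
```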
All told, the targeting sensors and processors make the F-35 not just a combat aircraft firing weapons, but a first-day-of-the-war, multimission aircraft able to operate autonomously, cooperatively or remotely, using information from offboard sources. In a cooperative mission, for example, the F-35's ICP would package and format targeting data into a waveform for delivery by the CNI suite to a ground base or other aircraft via Link 16 or an internal data link.
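Packaging targeting data for transmission reduces to serializing a track report into a byte stream the CNI suite can modulate onto a waveform. The field layout below is invented purely for illustration and is not the Link 16 J-series message format:

```cpp
// Toy serialization of a target report for data-link transmission.
// NOTE: this layout is invented for illustration; it is NOT Link 16.
#include <cstdint>
#include <vector>

struct TargetReport {
    int32_t  lat_e7, lon_e7;   // latitude/longitude in 1e-7 degree units
    uint16_t alt_m;            // altitude, meters
    uint8_t  type;             // track classification code
};

std::vector<uint8_t> pack(const TargetReport& t) {
    std::vector<uint8_t> buf;
    auto put = [&buf](uint64_t v, int bytes) {    // little-endian append
        for (int i = 0; i < bytes; ++i) buf.push_back(uint8_t(v >> (8 * i)));
    };
    put(uint32_t(t.lat_e7), 4);
    put(uint32_t(t.lon_e7), 4);
    put(t.alt_m, 2);
    put(t.type, 1);
    return buf;                                   // handed off for transmission
}
```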
The suppliers of the radar, EOTS, DAS and other systems are performing much of the F-35's software development and integration work--as much as 40 percent, a Lockheed Martin official estimates. The mission software (ultimately, an estimated 4.5 million lines of code written in C/C++) is still under development and will be completed in increments. The integrated core processor is in development, and integration testing will begin in the first quarter of 2006.