Several fighter jet models will soon use artificial intelligence to control nearby UAVs that will be able to carry weapons, test enemy air defenses or perform intelligence, surveillance and reconnaissance missions in high-risk areas, senior US Air Force officials said recently.
US Air Force Chief Scientist Gregory Zacharias said that much higher degrees of autonomy and manned-unmanned teaming are expected to emerge in the near future from work at the Air Force Research Lab. “This involves an attempt to have another platform fly alongside a human, perhaps serving as a weapons truck,” Zacharias told DefenseSystems.com.
He added that F-35 pilots will be able to control a small group of nearby drones from the cockpit while in flight, directing them to perform sensing, reconnaissance and targeting functions.
The existing F-35 computer system uses early applications of artificial intelligence that help computers assess situations and make some decisions on their own, without human intervention. “We’re working on making platforms more autonomous with multi-sensor fusion systems and data from across different intel streams,” Zacharias explained.
Computers can run through checklists and procedures faster than a human, but in many respects human perception still processes rapidly changing information more quickly.
“A computer might have to go through a big long checklist, whereas a pilot might immediately know that the engines are out without going through a checklist. He is able to make a quicker decision about where to land,” Zacharias said.
The F-35’s “sensor fusion” technology uses computer algorithms to acquire, distill, organize and present otherwise disparate pieces of intelligence as a single picture for the pilot. The technology, Zacharias said, also exhibits some early implementations of artificial intelligence.
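As a rough illustration of what such fusion means in practice (a toy sketch, not the F-35's actual algorithm; the sensor names and figures here are invented), disparate position reports on the same contact can be merged by confidence weighting into one track:

```python
# Toy "sensor fusion" sketch: merge reports on one contact from different
# sensors into a single picture. Hypothetical data, not a real system.

def fuse_reports(reports):
    """Confidence-weighted average of position estimates for one contact."""
    total = sum(r["confidence"] for r in reports)
    lat = sum(r["lat"] * r["confidence"] for r in reports) / total
    lon = sum(r["lon"] * r["confidence"] for r in reports) / total
    return {"lat": lat, "lon": lon, "confidence": min(1.0, total)}

# Two sensors see the same object with slightly different estimates.
reports = [
    {"sensor": "radar", "lat": 34.10, "lon": 44.20, "confidence": 0.6},
    {"sensor": "eo_ir", "lat": 34.12, "lon": 44.18, "confidence": 0.8},
]
fused = fuse_reports(reports)  # one consolidated track for the pilot
```

Real fusion engines track many contacts over time and weigh sensor error models, but the core idea, collapsing several noisy views into one display, is the same.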
Wargames, exercises and simulations are ways the Air Force is working to advance autonomous technologies. “Right now we are using lots of bandwidth to send our real-time video. One of the things that we have is a smarter on-board processor. These systems can learn over time and be a force multiplier. There’s plenty of opportunity to go beyond the code base of an original designer and work on a greater ability to sense your environment or sense what your teammate might be telling you as a human,” he said.
For example, with advances in computer technology, autonomy and artificial intelligence, drones will be able to loiter over a given area and identify particular objects or targets of interest at certain times, without needing a human operator, Zacharias explained.
This is particularly relevant because the large amount of ISR video demands organizing algorithms and technology to help process and sift through the vast volumes of gathered footage – in order to pinpoint and communicate what is tactically relevant. “With image processing and pattern recognition, you could just send a signal instead of using up all this bandwidth,” he explained. This development could greatly enhance mission scope, flexibility and effectiveness by enabling a fighter jet to conduct a mission with more weapons, sensors, targeting technology and cargo.
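The bandwidth point can be sketched in a few lines (a purely hypothetical illustration; the detector and message format are invented here): rather than streaming every frame, an on-board processor transmits a short message only when pattern recognition flags something relevant.

```python
# Hypothetical sketch of "send a signal instead of video": an on-board
# detector filters frames and downlinks only compact detection messages.
import json

def detect(frame):
    """Stand-in for on-board pattern recognition (invented threshold)."""
    return [obj for obj in frame["objects"] if obj["score"] >= 0.9]

def downlink(frames):
    """Return short messages to transmit in place of raw video frames."""
    messages = []
    for frame in frames:
        hits = detect(frame)
        if hits:  # only signal when something tactically relevant is found
            messages.append(json.dumps({"t": frame["t"], "detections": hits}))
    return messages

frames = [
    {"t": 0, "objects": [{"label": "truck", "score": 0.95}]},
    {"t": 1, "objects": [{"label": "tree", "score": 0.40}]},
]
msgs = downlink(frames)  # one small message instead of two video frames
```

Each message is a few hundred bytes where a raw video frame is megabytes, which is the bandwidth saving Zacharias describes.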
For instance, real-time video feeds from the electro-optical/infrared sensors on board an Air Force Predator, Reaper or Global Hawk drone could go directly into an F-35 cockpit, without needing to go to a ground control station. This could speed up targeting and tactical input from drones on reconnaissance missions in the vicinity of where a fighter pilot might want to attack.
In fast-moving combat circumstances involving both air-to-air and air-to-ground threats, increased speed could make a large difference.
In addition, drones could be programmed to fly into heavily defended or high-risk areas ahead of manned-fighter jets in order to assess enemy air defenses and reduce risk to pilots.
Unlike ground robotics, where autonomy algorithms must react quickly to unanticipated developments and other moving objects, simple autonomous flight guidance is much easier to accomplish in the air.
Since there are often fewer obstacles in the air compared with the ground, drones above the ground can be programmed more easily to fly toward certain pre-determined locations.
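Why pre-determined routes are the easy case can be seen in a minimal sketch (a toy two-dimensional model, not real flight guidance): with no obstacles to avoid, the vehicle simply steps toward each fixed waypoint in turn.

```python
# Minimal waypoint-following sketch: in an obstacle-free sky, autonomous
# flight toward pre-set locations reduces to stepping along straight legs.
import math

def fly_route(start, waypoints, step=1.0, tolerance=0.5):
    """Step toward each pre-determined waypoint in order; return the path."""
    path = [start]
    for wx, wy in waypoints:
        x, y = path[-1]
        while True:
            dx, dy = wx - x, wy - y
            d = math.hypot(dx, dy)
            if d <= tolerance:
                break  # close enough; proceed to the next waypoint
            move = min(step, d)  # never overshoot the waypoint
            x += move * dx / d
            y += move * dy / d
            path.append((x, y))
    return path

# A toy two-leg route: fly east, then north.
path = fly_route((0.0, 0.0), [(3.0, 0.0), (3.0, 4.0)])
```

Ground robots need far more than this, since obstacle detection and avoidance must run at every step; that contrast is exactly the point made above.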
At the same time, unanticipated movements, objects or combat circumstances can easily occur in the skies as well. “The question is what happens when you have to react more to your environment and a threat is coming after you,” he said.
As a result, scientists are now working on advancing autonomy to the point where a drone can, for example, be programmed to spoof a radar system, see where threats are and more quickly identify targets independently.