
AI Pilot Defeats Human Pilot in Dog Fights

August 25, 2020


One of the great fears many people have is that artificial intelligence (AI) systems, like Skynet in the Terminator movies, will wage war and eventually turn weapons on all of humankind, deeming humanity a threat to its own survival. Although a true artificial general intelligence (AGI) system — that is, a sentient AI system — has yet to be created, AI is being pursued for military purposes, and that cat isn’t going back into the bag. Demonstrating how far AI systems have progressed, journalist Andrew Eversden (@andreweversden1) reports, “An artificial intelligence algorithm has defeated a human F-16 fighter pilot in a virtual dogfight simulation. The Aug. 20 event was the finale of the Pentagon research agency’s AI air combat competition. The algorithm, developed by Heron Systems, easily defeated the fighter pilot in all five rounds that capped off a yearlong competition hosted by the Defense Advanced Research Projects Agency.”[1] According to Eversden, “The human pilot, only known to the audience by his call sign ‘Banger’ for operational security reasons, was a graduate of the Air Force’s weapons instructor course, a highly selective training course reserved for top fighter pilots.”


It’s worth remembering that the competition was conducted in simulators, not actual aircraft. Eversden notes, “While the victory for the AI system is a big step forward for the young DARPA program, the work is far from over. The conditions in the simulation weren’t realistic for aerial combat. To start, the artificial intelligence system had perfect information, which experts commentating on the event noted never happens in the field. The human pilot was also flying a fake stick in a virtual seat.” Nevertheless, the results of the DARPA program are likely to reverberate throughout the military and will undoubtedly stir much debate. Colonel Dan Javorsek, USAF, program manager in DARPA’s Strategic Technology Office, was quick to stress that the program was aimed at improving human-machine cooperation rather than at developing autonomous fighter pilots. He stated, “The AlphaDogfight Trials is all about increasing trust in AI. If the champion AI earns the respect of an F-16 pilot, we’ll have come one step closer to achieving effective human-machine teaming in air combat.”[2]


AI and the future of warfare


Like any modern organization, the military can benefit from augmented decision-making, process optimization, and other efficiencies cognitive technologies can provide. However, when you start talking about AI in warfare, caution flags are often raised. Paul Maxwell, the Cyber Fellow of Computer Engineering at the Army Cyber Institute at the United States Military Academy, writes, “Artificial intelligence is among the many hot technologies that promise to change the face of warfare for years to come. Articles abound that describe its possibilities and warn those who fall behind in the AI race. The Department of Defense has duly created the Joint Artificial Intelligence Center in the hopes of winning the AI battle. Visions exist of AI enabling autonomous systems to conduct missions, achieving sensor fusion, automating tasks, and making better, quicker decisions than humans. AI is improving rapidly and some day in the future those goals may be achieved. In the meantime, AI’s impact will be in the more mundane, dull, and monotonous tasks performed by our military in uncontested environments.”[3] Over half a century ago, USAF Colonel John Boyd, a military strategist, introduced the concept of the OODA loop. The TechTarget staff notes, “The OODA loop (Observe, Orient, Decide, Act) is a four-step approach to decision-making that focuses on filtering available information, putting it in context and quickly making the most appropriate decision while also understanding that changes can be made as more data becomes available.”[4]
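
To make the four steps a little more concrete, here is a minimal sketch of an OODA-style decision loop in Python. The function names, the `Assessment` structure, and the simple rule-based `decide` step are all illustrative assumptions of mine, not drawn from Boyd’s work or any military system; the point is only that each pass filters raw observations into context before a decision is acted on, and the loop repeats as new data arrives.

```python
from dataclasses import dataclass
import random
import time

@dataclass
class Assessment:
    """Context produced by the Orient step (illustrative only)."""
    threat_level: float   # 0.0 (benign) to 1.0 (severe)
    confidence: float     # how complete the available information is

def observe() -> dict:
    # Observe: gather raw, possibly noisy data. Here a single fake reading;
    # a real system would fuse many sensor sources.
    return {"contact_range_km": random.uniform(5, 100),
            "contact_closing": random.choice([True, False])}

def orient(raw: dict) -> Assessment:
    # Orient: put the observation in context -- closer, closing contacts score higher.
    level = max(0.0, 1.0 - raw["contact_range_km"] / 100)
    if raw["contact_closing"]:
        level = min(1.0, level + 0.3)
    return Assessment(threat_level=level, confidence=0.6)  # information is imperfect

def decide(ctx: Assessment) -> str:
    # Decide: a simple rule-based choice; this is the step an AI system would augment.
    if ctx.threat_level > 0.7:
        return "evade"
    if ctx.threat_level > 0.4:
        return "monitor"
    return "continue_mission"

def act(decision: str) -> None:
    # Act: carry out the decision (here, just report it).
    print(f"Acting on decision: {decision}")

if __name__ == "__main__":
    # The loop repeats as new data becomes available, so earlier decisions
    # can be revised -- the essence of the OODA cycle.
    for _ in range(3):
        act(decide(orient(observe())))
        time.sleep(0.1)
```

In this toy version the bottleneck is obvious: the quality and speed of the Orient and Decide steps depend entirely on the data fed into them, which is exactly where proponents expect AI to make the biggest difference.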


In the digital age, where data is available in real time, humans are simply not capable of beating computers in an OODA loop. That’s why there is so much interest in AI within the military. The TechTarget staff notes, “While the OODA loop is a popular decision-making model, there are criticisms of its effectiveness. The main downfall is that the OODA loop might be too obvious, thus potentially wasting time. The process itself is sometimes instinctual, and therefore, does not need to be explicitly spelled out. Additionally, the underlying goal of making decisions faster than the opponent to increase the odds of winning should be a universal goal regardless of which decision-making method is employed.” Insert AI into the mix, and wasted time is no longer an overriding issue. The biggest issues with AI in warfare involve data. You have to have the right data, the system has to be able to analyze the data, and it must be able to explain how it reached its decision to act. Those can all be challenges. Maxwell concludes, “Artificial intelligence will certainly have a role in future military applications. It has many application areas where it will enhance productivity, reduce user workload, and operate more quickly than humans. Ongoing research will continue to improve its capability, explainability, and resilience.” Nevertheless, caution should be the word of the day.


National security consultant Jonathan Clifford writes, “AI is sure to permeate every aspect of warfighting — from movement to communication, logistics, intelligence, weapons, and people. Delivering these warfare-changing technologies to the frontlines and into American hands will depend on less glamorous activities — namely, expediting the procurement process to more quickly field AI, and securing supply chains by collaborating with U.S. companies.”[5] Tech journalist Simon Chandler (@_simonchandler_) agrees AI is likely to permeate all sectors of military operations — and it worries him. He writes, “Besides being the future of everything else, AI is likely to be the future of warfare. It will increasingly process defense-related information, filter such data for the greatest threats, make defense decisions based on its programmed algorithms, and perhaps even direct combat robots. This will most likely make national militaries ‘stronger’ and more ‘capable,’ but it could come at the cost of innocent lives, and perhaps even the cost of escalation into open warfare.”[6]


Concluding thoughts


My guess is that most people remain uncomfortable with autonomous weapons systems that eliminate humans from the decision process. On the other hand, they are likely to have fewer objections to AI helping military personnel make better decisions, especially under the stress of battle. In non-combat areas, such as military supply chains, I suspect there is widespread support for any technology that can make them more efficient and effective. Clifford thinks it’s high time for the military to embrace cognitive technologies. He writes, “Throw the old way of doing business overboard and embrace the tech sector driving AI R&D. The status quo should be upset and a few dangerous thinkers, in government and business, empowered to be revolutionary.” Although most people might hesitate to empower truly “dangerous” thinkers in the military, they probably wouldn’t object to empowering a few “radical” thinkers with innovative ideas about how AI can improve military operations and reduce costs.


Footnotes
[1] Andrew Eversden, “AI algorithm defeats human fighter pilot in simulated dogfight,” C4ISRNET, 22 August 2020.
[2] Peter Aitken, “AI pilot beats human in clean sweep of virtual F-16 dogfights, human fails to register a single hit,” Fox News, 23 August 2020.
[3] Paul Maxwell, “Artificial Intelligence is the Future of Warfare (Just Not the Way You Think),” Modern War Institute, 20 April 2020.
[4] Staff, “OODA loop,” TechTarget, June 2019.
[5] Jonathan Clifford, “AI Will Change War, But Not in the Way You Think,” War on the Rocks, 2 September 2019.
[6] Simon Chandler, “How Artificial Intelligence Will Make Decisions In Tomorrow’s Wars,” Forbes, 20 January 2020.
