Is Air Force Doctrine Stuck on Artificial Intelligence?

Approximate Reading Time: 8 Minutes

By: Darin Gregg

As we mark the 75th anniversary of the 14 October 1943 Allied air raid on Schweinfurt, I find myself wondering if the Air Force is stuck in the same type of doctrinal rut that cost the Allied forces 60 bombers that day and marked the end of unescorted bombing missions during WWII. Ultimately, it was strict adherence to doctrine in the face of rapidly advancing technology and capabilities that resulted in the egregious US losses that day. This occurred despite some tacticians of the time beseeching leaders to shift toward escorted bombing campaigns. Today’s Air Force is postured and employed in significantly different ways than the Army Air Corps during the Schweinfurt raid. While it is unlikely we will ever again see the Air Force employ 200+ bombers in group formation, we are in a phase similar to WWII, where technology and capabilities have significantly outpaced our doctrine. Some WWII-era visionaries were able to identify the problem and clearly articulate and implement a viable solution to their doctrinal challenges. However, today’s doctrinal challenge involving artificial intelligence (AI) may require a more iterative approach to formulating a solution. The first step toward formulating a solution is identifying the main problem, which I believe is that today’s doctrine has ostensibly ignored advancements in AI and machine learning.

All of us interface with AI during our daily routines without even realizing it. For example, Facebook algorithms determine which advertisements and news feeds we see. Smart speakers such as Google Home and Amazon’s Alexa use AI to recommend new music tailored to our preferences. I’m not saying we are in danger of Skynet becoming self-aware, but we in the Air Force, at least, seem to be behind the power curve in understanding the ramifications of advanced AI on warfare, and if we are not careful we could be left behind as others prove more willing to accept AI into the fold. Much like our Air Force forefathers, we know the technology is there. We know there is potential for both good and bad. We need to work out where this continually evolving technology best fits our needs and how to get the most out of it.

Some of you are probably ready to stop reading because you, like everyone else who deals with national security issues, can’t go a day in our business without hearing about cyber. Obviously, the DoD is paying attention and working AI-related issues. Initiatives like Cyberworks, Project Maven, and Kessel Run are a good start at grasping the problem and incorporating the capabilities that AI advancements bring us. However, I argue these initiatives are analogous to the ones that brought WWII weapon systems such as the Spitfire and Hurricane. These were impressive advances built to meet a specific requirement, and they brought great success in the Battle of Britain, but they fell short of realizing the full potential that only doctrinal change could bring. Only after strategists of the time broke through their doctrinal chains and allowed the significant advances in aeronautical technology to drive doctrine did we see tools and weapons such as the P-51 Mustang, which enabled fully escorted bombing missions and ushered in a new era in strategic air warfare.

If you search through published Air Force doctrine today, you will be hard pressed to find a reference to AI, yet almost every senior leader discussion on future capabilities references advancements in AI. On 2 Aug 2018, Lt Gen “Dash” Jamieson (HAF/A2) outlined an ISR Flight Plan that envisions a future in which, as she put it, “PED is dead,” referring to intelligence processing, exploitation, and dissemination. This future is based on the implementation of a machine intelligence supported strategic framework. In this framework, AI conducts the mundane, basic tasks that currently take up the preponderance of analysts’ time, freeing humans to conduct true analysis and critical thinking. To bring this framework to fruition, the Air Force must fully embrace and incorporate cyber operations, including AI, across the full spectrum of operations. General Jamieson envisions a future where AI facilitates human-machine teaming by enabling PED to occur at the moment of exploitation, leading to faster and more thorough analysis of the available data. How quickly we get to that future will be driven largely by an Air Force culture willing to accept a vastly new way of thinking and able to fully incorporate these ideas into doctrine.

Why doctrine, though? Does anyone really read it? I have to admit it wasn’t until I was a new Major working on my first Program Objective Memorandum (POM) inputs that I really even looked at doctrine, let alone dug deep enough to understand its importance. Whether we like it or not, we live in an Air Force defined by cumbersome and bloated bureaucratic processes. The POM is no different. Having worked through this process, though, one can see how it is a prime example of doctrine still driving the Air Force today. The POM begins the budget process and can be thought of as the set of individual programs that eventually get approved or disapproved for the five-year spending plan. This is a very simplistic explanation, but it serves the purpose of this narrative. Everything the Air Force does is tied to the POM. If you wish to get your portion of it approved, that will likely happen only if you can tie your program to the Air Force distinctive capabilities as outlined in strategy and doctrinal documents. My point: the POM isn’t very receptive to new initiatives because approving officials prefer to spend money on proven capabilities, which is why successful POM proposals show linkage to doctrine. This in itself necessitates the inclusion of AI in Air Force doctrinal documents.

Doctrine is a compilation of tried and true practices that inform how we conduct operations in defense of our national interests. It outlines lessons observed to create an understanding of how best to conduct operations. When discussing the topic of adding AI references to current doctrine, many argue that without lessons observed we cannot write doctrine, and since AI is still in its infancy, we don’t know enough to effect doctrinal change. I wholeheartedly disagree, for three reasons:

First, AI may be extremely limited in its military utilization thus far, but it is well into its toddler stage in the private sector. Amazon, Google, Apple, and others already have machine learning programs and advanced research into quantum computing. The Washington Post has even employed an AI reporting capability, Heliograf, built on its Arc publishing framework. USA Today and Reuters are using similar technologies. This is just the tip of the iceberg of what is going on in the world of AI. The bottom line: there is enough information regarding lessons observed from AI uses to begin addressing potential military applications within Air Force doctrine.

Second, and even more important, doctrine may be authoritative in nature, but it also provides a common definition, understanding, and frame of reference. A common understanding of what AI means is severely lacking. Ask four service members what AI is, and you’ll get four different answers that most likely will not match industry’s definition. This is extremely problematic, especially during the acquisition process. We wouldn’t ask industry to build us a new airplane without specifying the capabilities we need, yet that is almost exactly what the DoD does when it asks for AI-related capabilities. Doctrine can fix this problem.

Third, doctrine isn’t meant to be rigid and unchanging. We can’t afford to study specific AI capabilities and lessons learned over the next 5-10 years before implementing changes to doctrine. By then, our competitors will have already fielded capabilities far exceeding ours. Right now, we can produce our best assessment of what AI can and should be doing for us and push those assessments out through published doctrine. As AI capabilities evolve, so should our assessments, and so should our doctrine. To do this, our way of updating doctrine has to become more dynamic and responsive. To quote General James Mattis: “If we fail to adapt…at the speed of relevance, then our military forces, our Air Force, will lose the very technical and tactical advantages we’ve enjoyed since World War II.”

The question becomes: are we going to allow history to repeat itself? Do we have to reach the point where we run missions that are today’s version of Schweinfurt before we explore more effective and efficient methods of updating our doctrine? If the answer is no, then we must drive for doctrinal change, and those in the field must push for it. Those who employ AI techniques must be willing to gather best practices and push them to the doctrine POCs at their respective MAJCOMs and staffs for inclusion in doctrine. I don’t doubt that various AI employment concepts and tactics, techniques, and procedures exist at the tactical level, or even in strategy documents I am not privy to, but the call from the field to incorporate AI into doctrine has yet to happen. Doctrine may be published at the headquarters, but that is not where it is generated. Ideas generated in the field must be pushed through senior leadership to doctrine development leadership. Once there is demand from the field, change will occur. When that call for change comes, the challenge will be accepting some risk that we won’t have the perfect, definitive doctrinal answer the first, second, or even third go-around. Then we may be able to get past our traditional mindsets, climb out of our doctrinal rut, drive the environment of change, and start welcoming the onboarding of AI.

Lieutenant Colonel Darin M. Gregg is the Chief of Intelligence Surveillance and Reconnaissance (ISR) Education at the LeMay Center for Doctrine Development and Education.  He is responsible for organizing and teaching intelligence related electives, including Distance-Learning courses, and serves as an advisor for Air War College and Air Command and Staff College research papers, including those of the ISR Research Task Force.  Additionally, Lt Col Gregg supports classified and unclassified research and publication of materials on key intelligence issues facing the Air Force, joint warfighter, and the nation.

Disclaimer: The views expressed are those of the author and do not necessarily reflect the official policy or position of the Department of the Air Force or the U.S. Government.
