Estimated Time to Read: 20 minutes
By Dan “Plato” Morabito
Defining Information Warfare
Abstract: This second of four essays derives a taxonomy, attack vectors, definition, and theory of victory for information warfare from first principles of information theory.
Using information for military advantage is as old as the earliest recorded battles, yet defining the phenomenon as a type of warfare has proven frustratingly elusive.1 The phrases “Information Warfare” and “Information Operations (IO)” are often used interchangeably, with little clarity as to what they mean and how they manifest across the competition continuum.2 US joint doctrine provides no definition for IW and defines IO as “The integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own [emphasis added].”3 This definition is lacking as it constrains IO to “military operations” and describes the phenomenon using presupposed “information-related capabilities.”
Similarly, the US Air Force recently described IW as “The employment of military capabilities in and through the information environment to deliberately affect adversary human and system behavior and preserve friendly freedom of action during cooperation, competition, and conflict [emphasis added].”4 This description is also lacking because it defines IW based on presupposed “military capabilities.”
Both definitions seek to define the phenomena they describe from military perspectives within the system they seek to understand. This is a mistake as, according to military theorist John Boyd, “one cannot determine the character or nature of a system within itself;” such efforts generate confusion and disorder, ultimately impeding action and magnifying friction.5 The result is that both definitions do little to illuminate how the United States and others might compete within the information environment using novel capabilities across the continuum of military conflict.
The United States military lacks a sufficient, comprehensive doctrinal understanding of IW, relegating IO to a mere tertiary function supporting the primary focus of large-scale combat operations. For example, the December 2020 release of Joint Publication 5-0, Joint Planning, makes only a single reference to IO, describing it as an example of “requested military flexible deterrent options” without elaborating on what that means or how it should be integrated into joint planning.6 It goes on to make meager but laudable efforts to include information environment considerations during joint planning by adding a statement that “the joint force synchronizes operations in the information environment to shape the perceptions, decisions, and actions of relevant actors” along with adding “information environment (including cyberspace), and electromagnetic spectrum” considerations within the Course of Action Development step of the Joint Planning Process.7 Meanwhile, China and Russia have already operationalized IW theory and integrated it into their operational art, considering it sufficient in its own right to triumph in competition below the threshold of armed conflict.8
US doctrine must define IW based on the phenomenon’s basic elements and emergent properties so that it informs capability development and employment based on the broader nature and character of the information environment, rather than unnecessarily constraining IW thought to expressions of preexisting military capabilities.
The next section posits a theory of IW from its most basic elements through its implementation as a weapon used to support national interests. It reveals IW as a manifestation of the Clausewitzian clash of wills expressed through competing narratives and shaped by access, trust, and cognition.9 Finally, it concludes with a proposed novel IW taxonomy, definition, and theory of victory.
In order to define IW, one must understand how data, information, and knowledge interact within information ecosystems to create individual and shared perceptions of reality.
Data is the most abstract form of information and is derived from individual processes of observation, measurement, or sensing. Data can be quantitative or qualitative but has minimal to no relational information or context. The binary encoding of information used by computers and the internet is an excellent example of data that is unintelligible until it is converted into information through the addition of context.
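The point can be made concrete with a short sketch (mine, not drawn from doctrine): the same raw bytes yield different information depending entirely on the interpretive context applied to them.

```python
# A minimal sketch of the data-to-information step described above:
# the same raw bytes are unintelligible "data" until context is applied.
raw = bytes([0x48, 0x69, 0x21])   # data: bare values with no relational context

as_text = raw.decode("ascii")     # context 1: treat the bytes as ASCII text
as_ints = list(raw)               # context 2: treat the bytes as integers

print(as_text)   # -> Hi!
print(as_ints)   # -> [72, 105, 33]
```

Neither interpretation is latent in the bytes themselves; the context supplies the relational meaning that converts data into information.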
Information is less abstract and consists of data that is organized by relational context through processes of sorting, classifying, or indexing. This process of the relational grouping of data based on context is the most primitive form of intelligence. As such, the informational content of each data object is higher than pure data alone. Information paired with an intended receiver is called a message.
Knowledge exists in the thought-world of the observer as a theoretical description of a phenomenon under study.10 It is a mental model of an observed phenomenon or interpretation of information.11 As something observed or studied, access to the phenomenon or information about it is a requirement for knowledge creation. Knowledge is formed by cognition of both the static and dynamic relationships of information, informed by context, emotion, and exposure to past observations.12 The accuracy of knowledge is probabilistic and must be continuously assessed against new observations to infer its relative validity, a measure of trust. Valid knowledge confers predictability of the observed phenomenon, presenting a kind of foresight.
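One way to make the “continuously assessed” claim concrete is a simple Bayesian update, sketched below. The function name and the likelihood values are illustrative assumptions, not taken from the essay or any doctrinal source.

```python
# A hedged sketch: treating trust in a source as a Bayesian probability
# that is revised each time one of its reports can be checked against
# direct observation. The likelihood parameters are assumed for illustration.
def update_trust(prior, confirmed, p_true_given_valid=0.9, p_true_given_invalid=0.3):
    """Revise P(source is valid) after one report is confirmed or refuted."""
    likelihood = p_true_given_valid if confirmed else 1 - p_true_given_valid
    alt = p_true_given_invalid if confirmed else 1 - p_true_given_invalid
    return (likelihood * prior) / (likelihood * prior + alt * (1 - prior))

trust = 0.5                                 # start agnostic about the source
for outcome in [True, True, False, True]:   # confirmations and one refutation
    trust = update_trust(trust, outcome)
print(round(trust, 3))   # -> 0.794
```

Each confirmed report raises trust in the source and each refutation lowers it, so knowledge derived from that source carries a revisable probability rather than a settled validity.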
Cognition, the conversion of information to knowledge, is continuous and occurs through both conscious and unconscious reasoning, phenomena described by behavioral psychologist and economist Daniel Kahneman’s two-systems theory. According to Kahneman, System 1 uses cognitive shortcuts called heuristics to quickly filter information and reach conclusions subconsciously and with minimal effort. In contrast, System 2 is deliberate, conscious thinking that requires one’s attention and effort and which produces some level of cognitive strain.13 Although fast and less effortful, System 1 thinking is especially problematic as it actively filters out information that does not fit one’s preconceptions of reality, reducing one’s likelihood of discovery and reinforcing preconceived notions. Finally, it must be noted that cognition includes emotive factors and is capable of answering both what one feels about what one thinks and what one knows about what one feels. As illusionists have known for centuries, the cognitive features of human biology can be hacked or tricked to induce people to reach perceptions in their thought-world that are entirely unsupported by reality.
Access, trust, and cognition are necessary for knowledge creation and are therefore fundamental to the information environment. This suggests a novel model for visualizing the information environment with knowledge as the emergent property of the interaction of the fundamental elements (figure 1).
This is a superior model because it defines the information environment using the fundamental elements of knowledge rather than defining it as a combination of “dimensions” paired with pre-existing military capabilities as is seen in Russian, Chinese, and American military conceptions.14
Data, information, and knowledge exist within a global super information ecosystem that consists of all the smaller information ecosystems, which may overlap with or exist independently of each other. These information ecosystems are the physical and social information environments that people interact with and inhabit. The physical information ecosystems are the world people inhabit and can directly and immediately observe. The social ecosystems extend people’s perceptions to the broader world, well beyond their immediate environment, through social interactions and access enabled by means of communication, such as writing and the internet. Fragmentation of information ecosystems occurs when access between ecosystems is reduced or does not exist.
It is important to emphasize that the preponderance of people’s individual knowledge about the broader world is obtained from others through social interactions. This concept is often referred to as “the sociology of knowledge,” where the individual’s perceived reality, apart from that personally experienced, is “socially constructed.”15 This social construction of knowledge requires access to the social ecosystems of others, along with trust in the validity of shared information to reduce uncertainty. Finally, the persistence of shared knowledge creates norms that can harden within people’s mental models into heuristics that may or may not accurately fit one’s continuously evolving environment, creating bias. The attributes of fragmentation, uncertainty, and bias comprise three of the six problems of knowing.
The Problems of Knowing
The problems of knowing emerge from the dysfunction or denial of the three fundamental elements of the information environment: access, trust, and cognition. Three of the problems (fragmentation, uncertainty, and bias) directly counter the elements of access, trust, and cognition. Three additional problems emerge from vulnerabilities within the interplay of overlapping fundamental elements: root of trust, misinformation, and filtering. Combined, the six problems of knowing define the vulnerability space within the information environment model (red arrows, figure 2). As such, they are also described as attack vectors and are required for theorization about IW capabilities.
The first problem of knowing is fragmentation. As previously noted, information ecosystems are fragmented relative to other ecosystems when they have few or no connection paths between them. Fragmentation is categorized as physical, socio-structural, or voluntary.
Physical fragmentation occurs as a consequence of the geographic separation of people groups. An instance of physical fragmentation resulting in surprise would be the “discovery” of the New World by Christopher Columbus. Similarly, the sight of a Western European was “new” to the indigenous North Americans as this knowledge was absent from their information ecosystem.
Socio-structural fragmentation occurs from efforts to control or deny information to others in order to preserve power hierarchies, worldviews, or paradigms. An example of this is the trade guilds of the Middle Ages that sought to reduce trade competition through the preservation of specialized knowledge and craftsmanship. In today’s information-centric society, it includes the use of multi-level information security policies that preserve confidentiality through access controls.16
Voluntary fragmentation occurs as an outward expression of rejecting unwanted information. Individuals may voluntarily attempt to avoid information from intruding into their ecosystems by deliberately cutting themselves off from it. Examples include ignoring or avoiding disturbing or degrading phenomena or by deliberately choosing to consume only news media that confirms or aligns with one’s preexisting worldview.
The second problem of knowing, filtering, emerges from the interaction between access to information and the heuristics that support cognition. Filtering occurs when a second party controls which information gets delivered to a person or when delivered information is ignored because of that person’s heuristics. This problem is especially challenging because it is the information a person has previously experienced that solidifies their heuristics. In turn, these heuristics can subconsciously filter out information that does not match preexisting mental models, a function of System 1 thinking also described as confirmation bias. Confirmation bias thus creates a reinforcement loop that continuously filters out information conflicting with preexisting bias until an event occurs that is jarring enough to demand System 2’s attention.
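The reinforcement loop described above can be sketched in a few lines. The following is a hypothetical toy model; the topics, weights, and engagement probabilities are invented for illustration and do not represent any real platform’s algorithm.

```python
# A hypothetical sketch of second-party filtering: content the user's
# existing bias rewards with engagement is ranked higher and shown more,
# which in turn reinforces the bias. All values here are invented.
from collections import Counter

def filter_feed(topics, weights, k=2):
    """Rank topics by learned weight and surface only the top k."""
    return sorted(topics, key=lambda t: weights[t], reverse=True)[:k]

topics = ["A", "B", "C", "D"]
weights = {t: 1.0 for t in topics}                      # filter starts neutral
user_bias = {"A": 0.9, "B": 0.6, "C": 0.2, "D": 0.1}    # engagement per topic

seen = Counter()
for _ in range(20):
    for topic in filter_feed(topics, weights):
        seen[topic] += 1
        weights[topic] += user_bias[topic]   # engagement boosts future ranking

print(seen.most_common())   # -> [('A', 20), ('B', 20)]
```

Because engagement feeds back into ranking, the filter converges on the user’s existing bias and the unengaging topics never surface at all, a machine analogue of System 1 filtering.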
The third problem of knowing is whether, and how much, a person can trust the validity of information gleaned from others, a problem that manifests as uncertainty. Since most knowledge comes from others rather than from one’s own observation and creation, trust is a measure of the validity of information received from others.17
Root of trust is the fourth problem of knowing. It exists within the interplay between the elements of access and trust, and the problems of fragmentation and uncertainty. The root of trust problem extends directly to the discipline of information management, where practitioners are concerned with the confidentiality, integrity, and availability of information. Among many threats, cybersecurity analysts concern themselves with preserving the integrity of data using checksum, hashing, and encryption algorithms to avoid data manipulation that could impact future information and knowledge. Of course, one must also trust that the algorithms themselves are effective and have not been tampered with, and then one must also trust the hardware that the algorithms use for their calculations, which means one must trust the hardware designers and manufacturers. This multi-layered trust hierarchy problem was foreseen as far back as 1984, when computer science pioneer Ken Thompson published his essay “Reflections on Trusting Trust,” and is often referred to as the “root of trust” problem.18 The theoretical answer to ensuring high trust and low uncertainty requires that the validity of information not be assumed if it was not personally created, and yet the overwhelming preponderance of information that people continuously rely on comes from and is created by others. Human perceptions are based on trusting information from others who, in turn, base their perceptions on trusting information from others. As several security researchers have metaphorically described trust, “It’s turtles, all the way down.”19
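A minimal integrity check of the kind the paragraph describes can be sketched with a standard hash; the message text is invented for illustration.

```python
import hashlib

# Sketch of an integrity check: a digest recorded at creation time via a
# trusted channel exposes any later manipulation of the message bytes.
message = b"Reinforcements arrive at dawn"
recorded_digest = hashlib.sha256(message).hexdigest()

tampered = b"Reinforcements arrive at dusk"
print(hashlib.sha256(message).hexdigest() == recorded_digest)    # -> True
print(hashlib.sha256(tampered).hexdigest() == recorded_digest)   # -> False
```

Note that the check only relocates the problem in exactly Thompson’s sense: one must still trust the SHA-256 implementation, the interpreter running it, and the hardware beneath both.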
A fifth problem of knowing, cognitive bias, is a consequence of how the human brain employs heuristics to rapidly and efficiently interpret the environment while minimizing distractions and cognitive strain. A heuristic is a cognitive shortcut that allows the subconscious, System 1, to reach a quick and reasonably accurate conclusion despite time constraints or limited information.20 Some heuristics are innate to human nature while others are developed through repeated exposure to ideology, phenomena, or emotional events.21 The problem of heuristics arises when the brain uses them to reach conclusions that are not supported by reality. Further, when heuristics fail, the failures are unlikely to be detected until a significant event forces one’s conscious thinking to recognize the mistake. This failure is called cognitive bias.
Social psychologist Jonathan Haidt identified an especially powerful group of heuristics that are relevant to IW due to their strong ability to motivate individuals and groups. In his Theory of Moral Foundations, Haidt asserts that there are “six psychological systems that comprise the universal foundations of the world’s many moral matrices.”22 Each of his six moral psychological systems is labeled with “value” and “anti-value” pairs, where values are desired or accepted traits and anti-values are traits or actions that moral intuition rejects. These six foundations are Care/Harm, Liberty/Oppression, Fairness/Cheating, Loyalty/Betrayal, Authority/Subversion, and Sanctity/Degradation.
What makes this theory especially significant is that it provides a framework for understanding how moral biases influence global populations along with how groups use morality to motivate and order their societies according to social systems.23 According to Haidt, “Moral systems are interlocking sets of values, virtues, norms, practices, identities, institutions, technologies, and evolved psychological mechanisms that work together to suppress or regulate self-interest and make cooperative societies possible [emphasis added].”24 When it comes to power, the concept of a “moral high ground” is an appropriate metaphor since moral foundation biases shape how people interpret the world, and motivate the actions they take within it, giving a moral “positional advantage” to some at the expense of others. These moral matrices shape people’s biases and bind them into cooperative groups with shared values. At the same time, they blind people to the perspectives of others.25 This is important because if one understands the moral heuristics which drive a group of people, one can selectively present them with information that exploits and amplifies their naturally occurring potential for biased thinking and manipulate their behavior. In this way one can weaponize bias to change behavior, potentially to violent extremes. Haidt’s moral framework-based heuristics are just some of many potential heuristics that may exist within a population. What makes them particularly relevant is their seemingly universal applicability to human behavior and potential for weaponization.
The sixth problem of knowing is misinformation. Misinformation is a term that broadly captures subcategories of incorrect information, regardless of intent. When used to refer to a specific incident of false information, misinformation is generally assumed to be false information that is created or shared without the intent of causing harm. However, when harm is intended, the subcategories of disinformation and malinformation are used. Disinformation “is an intentional spreading of misinformation in pursuit of a purpose-driven outcome.”26 Malinformation is data that reflects reality, but that is presented in a contextually misleading way.27 In each case, the information is shared in the form of a message, manifesting itself in many different forms to include oral or written stories, images, and videos.
The proliferation of social media creates a global IW battleground in which, according to some researchers, “the defining feature is that messages are the munition.”28 These messages shape knowledge to align with or counter narratives, that is, the individual and shared stories people use to establish and reinforce mental models while making sense of perceived information. Finally, propaganda is misinformation used to “promote or publicize a particular political cause, ideological perspective, or agenda.”29
Information Warfare Taxonomy
The elements of the IW theory outlined above are visualized beginning with the IW Trinity, which positions individual and group perceptions of knowledge in the center of three overlapping rings of trust, access, and cognition (figure 3). The six attack vectors of fragmentation, root of trust, uncertainty, misinformation, bias, and filtering are shown as red arrows pointing towards the IW elements that they exploit to create effects within the center. The resulting graphic depicts a taxonomy of IW.
The graphic posits two unique inferences. First, the information environment is a blend of two domains: the cognitive domain, imbued with trust and cognition, and the electromagnetic spectrum (EMS) domain, which serves as the medium for information access and extends cognitive expressions of trust into the EMS (green labels, figure 3). Second, there are three unique areas of overlap between each pair of rings that exclude the third ring. These areas possess unique characteristics and attack vectors. A non-exhaustive list of characteristics within each overlapping area is included in blue for clarity. Finally, all three rings exist simultaneously. The character of each ring is continuously shaped by its relationship and interactions with the other two.
This is a new way of conceptualizing IW based on its fundamental elements. These elements make up the IW Trinity and reveal six IW attack vectors that exist across the full spectrum of information conflict. The result is a theoretical foundation that supports and informs a richer definition of IW.
Information Warfare Defined
Given this theoretical foundation, the following working definition of IW is proposed: “The manipulation of knowledge through access, trust, and cognition to change the attitude or behavior of an individual or system.” The aim of this definition is attitude or behavioral change, a concept not captured in a single English word but expressed by the Greek word metanoia, defined as “a shift in mind” caused by new information or a new perspective, and corresponding to a shift in behavior.30 Metanoia is the nature of IW.
This definition is supported by the three fundamental elements of the information environment and, in contrast to the Air Force description, allows for development of capabilities across all instruments of power to achieve effects throughout the IW taxonomy, regardless of the level of competition. Notably, this definition accommodates current US military IW functions of Cyberspace; Intelligence, Surveillance, and Reconnaissance; Electromagnetic Warfare; Electromagnetic Spectrum Management; and IO while achieving overlap with the IW doctrine of America’s competitors, such as Russia’s informatsionnaya voyna (Information War) functions of Network Operations, Electronic Warfare, Psychological Operations, and IO, as well as China’s concept of “Informatized War,” which privileges information advantage within the cyber, space, and electromagnetic domains.31
Perhaps most significant is the discovery that the secondary areas of overlap reveal a conspicuous area of the triad that is not currently captured as a US doctrinal IO function or information-related capability. This region, defined by the overlap of the fundamental elements of access and cognition, is where both physical and cognitive filtering mechanisms are at play. This is significant because “the highest forms of communicative-based power in networked societies are the abilities to set the parameters for and guide the directional flow of discussions taking place within the network.”32 This is the area within the triad where that occurs, and where external filtering trains cognitive heuristics which, in turn, filter out information that does not correspond to current mental models. This suggests a role within IW for managing this battle space, which manipulates the relationship between fragmentation and bias and which can be heavily influenced by human-machine filtering, i.e., machine learning algorithms. This is an IW function that the United States’ adversaries, particularly Russia and China, are already aggressively pursuing.
Information Warfare Theory of Victory
Like conventional warfare, the objective of IW is to achieve political objectives by coercing the enemy to do one’s will.33 However, in contrast to the direct violence associated with conventional war, IW seeks to achieve its objective primarily through the manipulation of the fundamental elements of access, trust, and cognition.
Similarly, as the ultimate aim of conventional war is to disarm the enemy to impose one’s will, the ultimate aim of IW is to disable the enemy’s ability to use data, information, and knowledge to achieve their objective.34 This is achieved when “the previous direction of messages [which inform and motivate] a political or military effect is . . . changed,” thereby establishing a strategic, operational, or tactical information advantage.35 China’s theorists seem to agree, having stated in their 2013 Science of Military Strategy publication that information dominance is achieved when friendly forces can “seize and preserve the freedom and initiative to use information [while] simultaneously depriving an opponent” of the same.36
Lieutenant Colonel Daniel B. Morabito is an Air Force cyberspace operations officer and recent graduate of the US Army School of Advanced Military Studies (SAMS) at Fort Leavenworth, Kansas. Lieutenant Colonel Morabito has an undergraduate degree in Computer Science from Baylor University and holds master’s degrees in Leadership and Information Technology, Cyberspace Operations, and Military Operational Art and Science from Duquesne University, the Air Force Institute of Technology, and the USAF Air Command and Staff College. He is a graduate of the USAF Air Command and Staff College Joint All Domain Strategist concentration. He can be reached at firstname.lastname@example.org.
Disclaimer: The views expressed are those of the author and do not necessarily reflect the official policy or position of the Department of the Air Force or the United States Government.
 Edward Waltz, Information Warfare Principles and Operations (Boston, MA: Artech House, 1998), 19-30.
 US Department of Defense, Joint Staff, Joint Doctrine Note (JDN) 1-19, Competition Continuum (Washington, DC: Government Publishing Office, 2019), 2-4; Bradley Young and Jonathan Wood, “The Army’s Information Operations Profession Has an Identity Crisis,” Proceedings 147, no. 3 (March 2021), accessed March 24, 2021, https://www.usni.org/magazines/proceedings/2021/march/armys-information-operations-profession-has-identity-crisis.
 US Department of Defense, Joint Staff, Joint Publication (JP) 3-13, Information Operations (Washington, DC: Government Publishing Office, 2014), GL-3.
 US Department of the Air Force, “Sixteenth Air Force Fact Sheet,” Department of the Air Force, August 27, 2020, accessed January 19, 2021, https://www.16af.af.mil/About-Us/Fact-Sheets/Display/Article/1957318/sixteenth-air-force-air-forces-cyber/; Joint Staff, JP 3-13, Information Operations, ix. Note that joint doctrine defines the information environment as “the aggregate of individuals, organizations, and systems that collect, process, disseminate, or act on information.”
 John Boyd, A Discourse on Winning and Losing (Maxwell Air Force Base, AL: Air University Press, 2018), 237; see also, Carl von Clausewitz, On War, ed. and trans. Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1989), 75. Boyd’s observation is supported by Clausewitz, who states that “in war more than in any other subject we must begin by looking at the nature of the whole; for here more than elsewhere the part and the whole must always be thought of together [emphasis added].”
 US Department of Defense, Joint Staff, Joint Publication (JP) 5-0, Joint Planning (Washington, DC: Government Publishing Office, 2020); US Department of Defense, Joint Staff, Joint Publication (JP) 5-0, Joint Planning (Washington, DC: Government Publishing Office, 2017). The 2017 version of JP 5-0 also only provided one reference to information operations. The 2020 version goes on to make seventeen references to the “information environment,” an improvement over the only five references included in the 2017 version.
 Joint Staff, JP 5-0, Joint Planning (2020), II-10, III-33. Note that the Chairman of the Joint Chiefs of Staff approved information as a joint function only as recently as July 2017. Additionally, joint doctrine still describes the EMS as separate from the information environment. This is confusing since access to and through the EMS is a fundamental requirement for access to the information environment.
 Michael Connell and Sarah Vogler, Russia’s Approach to Cyber Warfare (Arlington, VA: Center for Naval Analysis, 2016), 3; Edmund Burke, Kristen Gunness, Cortez Cooper, and Mark Cozad, People’s Liberation Army Operational Concepts (Arlington, VA: RAND Corporation, 2020), 6-8, accessed February 6, 2020, https://www.rand.org/content/dam/rand/pubs/research_reports/RRA300/RRA394-1/
 Iain King, “Toward an Information Warfare Theory of Victory,” Modern War Institute, October 19, 2020, accessed November 13, 2020, https://mwi.usma.edu/toward-an-information-warfare-theory-of-victory/.
 Venkatesh Rao, Tempo: Timing, Tactics and Strategy in Narrative Decision-Making (La Vergne, TN: Ribbonfarm, 2011), 42.
 Waltz, Information Warfare Principles and Operations, 83-85.
 Daniel Kahneman, Thinking, Fast and Slow (New York: Allen Lane, 2011), 21-24.
 Bryan Clark, Dan Patt, and Harrison Schramm, Mosaic Warfare: Exploiting Artificial Intelligence and Autonomous Systems to Implement Decision-Centric Operations (Washington, DC: Center for Strategic and Budgetary Assessments, 2020), 22; Joint Staff, JP 3-13, Information Operations, I-1–I-3.
 Peter Berger and Thomas Luckmann, The Social Construction of Reality: A Treatise in the Sociology of Knowledge (New York: Anchor Books, 1967), 3.
 Matt Bishop, Computer Security: Art and Science (Upper Saddle River, NJ: Addison-Wesley, 2002), 124.
 Berger and Luckmann, The Social Construction of Reality, 61.
 Ken Thompson, “Reflections on Trusting Trust,” Communications of the ACM 27, no. 8 (August 1984): 763, accessed September 13, 2020, https://www.cs.cmu.edu/~rdriley/487
 Jonathan M. McCune, Adrian Perrig, Arvind Seshadri, and Leendert van Doorn, “Turtles All the Way Down: Research Challenges in User-Based Attestation” (Conference Paper, Carnegie Mellon University, Pittsburgh, PA, 2007).
 Michael Janser, Cognitive Biases in Military Decision Making (Carlisle Barracks, PA: US Army War College, 2007), 1.
 Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion (New York: Vintage, 2012), 153.
 Ibid., 211.
 Ibid., 16-17.
 Ibid., 314.
 Haidt, The Righteous Mind, 221-222.
 Zachery Kluver, Skye Cooley, Robert Hinck, and Asya Cooley, “Quick Look: Propaganda: Indexing and Framing the Tools of Disinformation,” The Media Ecology and Strategic Analysis Group, December 2020, 3, accessed February 1, 2021, https://nsiteam.com/social/wp-content/uploads
 Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making,” Council of Europe Report 27, Harvard Kennedy School, September 2017, 20, accessed December 13, 2021, https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c.
 King, “Toward an Information Warfare Theory of Victory,” 4.
 Kluver et al., “Propaganda: Indexing and Framing the Tools of Disinformation,” 1.
 Peter M. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization, rev ed. (New York: Currency, 2006), 13-14.
 Connell and Vogler, Russia’s Approach to Cyber Warfare, 3; Burke et al., People’s Liberation Army Operational Concepts, 6-8.
 Kluver et al., “Propaganda: Indexing and Framing the Tools of Disinformation,” 1.
 Carl von Clausewitz, On War, 75.
 Carl von Clausewitz, On War, 77.
 King, “Toward an Information Warfare Theory of Victory,” 5; Timothy D. Haugh, Nicholas J. Hall, and Eugene H. Fan, “16th Air Force and Convergence for the Information War,” The Cyber Defense Review 5, no. 2 (Summer 2020): 29, accessed February 1, 2021, https://www.jstor.org/stable/26923520.
 Shou Xiaosong, ed., The Science of Military Strategy (Beijing: Military Science Press, 2013), 245.