Control the Information Environment Narrative…or the Threat Will
By Mario Hoffmann
Source: Small Wars Journal
“The advent of the internet, the expansion of information technology, the widespread availability of wireless communication, and the far-reaching impact of social media have dramatically impacted operations and changed the character of modern warfare”1
–James Mattis, Former Secretary of Defense
Strategic competitors like Russia and China are using old technologies in new ways while also employing new advanced technology to fight their enemies in all domains (space, cyber, air, sea, and land). This has required the U.S. Army to evolve and adapt the way it wants to fight by publishing “Multi-Domain Operations (MDO) 2028” as the cornerstone for the Joint force to militarily compete, penetrate, dis-integrate, and exploit future adversaries.2 While the air, land, and sea domains have been prevalent since World War II, the relative newcomers of cyber and space are still establishing their doctrinal foundation in modern warfare.
US adversaries have demonstrated they will use offensive cyber and electronic warfare (EW) capabilities within cyberspace and the electromagnetic spectrum (EMS) to complicate a commander’s decisions and degrade his or her ability to employ the full range of warfighting capabilities to gain an advantage. Our adversaries can or will soon be able to:
- Intercept and disrupt advanced voice and data communications;
- Degrade air defense and target acquisition radars;
- Deny/deceive Global Positioning Systems to influence navigation and timing data;
- Deny or degrade friendly intelligence, surveillance, and reconnaissance.
While all these multi-domain challenges are real and relevant, an already contentious battle exists in the “information environment” (IE). The Chairman of the Joint Chiefs of Staff approved “Information” as a seventh Joint Function, consistent with the 2016 DoD Strategy for Operations in the IE (SOIE), affirming its significance at the strategic, operational, and tactical levels of military operations.1 Adversaries within this IE will attempt to deceive our strategic leaders and senior commanders about the realities of their Operational Environment (OE). This misdirection is intended to sway their decisions, forestall desired outcomes, and promote false public perceptions that undermine our goals and support for our troops. It is a relentless engagement within the “competition” phase of MDO, meant to achieve their interests, or at least to gain a position of relative advantage in shaping potential future “armed conflicts.”
To describe the IE, the U.S. Army tends to favor terms like cyber, space, electronic warfare, and information operations that imply categorized approaches that must be synchronized, whereas our opponents see these as mere ways and means to achieving a desired end state via “Information Warfare (IW).” Adversaries conduct IW to deny or manipulate information trusted by users, without their awareness, so that they make decisions not in their own interest but rather for the benefit of the adversary.3
Opposing Force (OPFOR) doctrine emphasizes the importance of IW from tactical through strategic engagements. As a tactical combat multiplier, it enhances leadership decisions and magnifies maneuver, firepower, and protection at decisive points. The U.S. Army’s OPFOR Training Circular 7-100 describes seven elements of IW (EW, Computer Warfare, Deception, Physical Destruction, Protection/Security, Perception Management, and Information Attack), which neither exist in isolation from each other nor are mutually exclusive.4
Much of today’s IE encompasses social media applications on the internet, though social media by itself does not make up or define the IE. Internet-based social media tools provide a plethora of information often used by global state and non-state intelligence communities. Facebook alone adds approximately 250 million photos per day, Twitter adds 200 million tweets, and YouTube reports 4 billion video views per day.5 These postings provide intelligence communities (tactical through strategic) near real-time situational awareness of indicators and events as they unfold, as demonstrated by ‘bursts’ of tweets that pre-empt conventional reporting. Sites like Facebook also provide insights into group behaviors and activities.5 Social media gives users an ideal platform to voice non-attributional comments, but also the ability to publish false information that can shape global perceptions. According to an MIT study, false information spreads on average six times faster than real information.6
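The ‘burst’ phenomenon described above can be sketched as a simple stream-processing check: flag the moment when post volume inside a sliding time window jumps past normal background chatter. The window size, threshold, and sample timestamps below are illustrative assumptions, not parameters from any fielded SOCMINT system.

```python
from collections import deque

def detect_bursts(timestamps, window_s=60, threshold=10):
    """Return the times at which post volume in a sliding window first
    exceeds the threshold -- a possible indicator of a breaking event.

    timestamps: sorted post times in seconds (illustrative data).
    """
    window = deque()
    bursts = []
    in_burst = False
    for t in timestamps:
        window.append(t)
        # Drop posts older than the sliding window.
        while window and t - window[0] > window_s:
            window.popleft()
        if len(window) >= threshold and not in_burst:
            bursts.append(t)  # record only the burst onset
            in_burst = True
        elif len(window) < threshold:
            in_burst = False
    return bursts

# Quiet background chatter (one post every 30 s), then a sudden spike:
quiet = [i * 30 for i in range(20)]   # t = 0 .. 570
spike = [600 + i for i in range(15)]  # 15 posts in 15 seconds
print(detect_bursts(quiet + spike))   # → [608]
```

The onset time, not the whole spike, is what matters to an analyst: it marks when social reporting begins outpacing the background rate, typically well before conventional media coverage.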
Exacerbating the spread of false information are social bots (computer-based impersonators), which in one example accounted for one-fifth of a conversation encompassing roughly 2.8 million tweets. This included real humans retweeting bots, so that what appear to be friends, family, and co-workers lend added credibility to disinformation. While creating bot profiles is cheap, quick, and easy for use on social forums and chat rooms, including Twitter and Facebook, they become extremely effective when endorsed with advertising dollars. This was demonstrated by the Kremlin-linked Internet Research Agency (IRA), which invested $100,000 to reach more than 126 million users during the 2016 presidential election.7
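As a rough illustration of how analysts screen for automated accounts, the toy heuristic below combines behavioral features often associated with bots: sustained posting rate, duplicated content, and follower/following asymmetry. The features, weights, and thresholds here are invented for illustration and are not any platform’s actual detection logic.

```python
def bot_score(posts_per_day, duplicate_ratio, followers, following):
    """Toy bot-likelihood heuristic; returns a score in [0, 1].

    All features and weights are illustrative assumptions:
    - posts_per_day: humans rarely sustain 100+ posts per day
    - duplicate_ratio: fraction of posts that are near-identical
    - followers / following: asymmetry typical of throwaway accounts
    """
    rate = min(posts_per_day / 100.0, 1.0)
    dup = min(duplicate_ratio, 1.0)
    asym = min(following / max(followers, 1) / 50.0, 1.0)
    return round((rate + dup + asym) / 3.0, 2)

# A hyperactive account reposting near-identical content scores high:
print(bot_score(posts_per_day=400, duplicate_ratio=0.8,
                followers=12, following=900))   # → 0.93
# A typical human account scores low:
print(bot_score(posts_per_day=10, duplicate_ratio=0.05,
                followers=300, following=280))  # → 0.06
```

Real detection systems (e.g., the supervised classifiers in the Bessi and Ferrara study cited above) use hundreds of such features with learned weights; the point here is only that automated behavior leaves measurable statistical fingerprints.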
To create emotional responses to disinformation, Russia is also known to sponsor internet trolls that are well versed in multiple languages and customary behaviors. Their intent is to start quarrels by posting inflammatory and aggressive comments that provoke readers into emotional reactions. These trolls often spoof their locations to appear local and affiliated with local groups, but are far more likely to be operating from Ukraine, Russia, or other Eastern European countries.8

Looking ahead, artificial intelligence (AI) enabled technologies are becoming weaponized disinformation tools that include:9
- Deepfake Productions – Videos constructed to make a person appear to say or do something they never said or did. AI has improved this capability so greatly that deepfakes are extremely difficult to discern from real video or imagery (e.g., imagery altered to add or destroy bridges) with the naked eye and ear.
- Generative Adversarial Networks (GANs) – AI-driven technologies used to create entirely original yet fake faces and bodies, often for commercial applications (e.g., video games), but which can also have a profound impact by providing visual reassurance for troll and bot armies.
- Text Generation Tools – AI-assisted tools that compose original text in realistic prose, able to generate mass-produced and convincing headlines, posts, articles, and comments entirely free from human input. These tools have already demonstrated their ability to create false pretenses for war (“roads to war”) and highlighted the dangers of creating ‘Black Mirror’ scenarios.10
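To make the text-generation threat concrete, the sketch below uses a word-level Markov chain, a far simpler ancestor of the neural language models described above, to show how even trivial statistics can produce fluent-seeming text with no human author. The corpus and seed are invented examples.

```python
import random
from collections import defaultdict

def train(corpus, order=2):
    """Build a word-level Markov model: (w1, w2) -> possible next words."""
    model = defaultdict(list)
    words = corpus.split()
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, seed, n_words=20, rng=None):
    """Emit text by repeatedly sampling a plausible next word."""
    rng = rng or random.Random(0)  # fixed seed for repeatability
    out = list(seed)
    for _ in range(n_words):
        candidates = model.get(tuple(out[-2:]))
        if not candidates:
            break  # dead end: no observed continuation
        out.append(rng.choice(candidates))
    return " ".join(out)

corpus = ("the troops advanced to the river and the troops secured "
          "the bridge and the troops advanced to the town")
model = train(corpus)
print(generate(model, ("the", "troops")))
```

Modern tools replace the lookup table with a neural network trained on billions of words, which is why their output is convincing at headline and article length rather than merely grammatical at sentence length.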
Russia uses these IE tools in every phase of its operations, including covert disinformation during peace, which U.S. military policy does not allow.11 For example, to set operational conditions for the invasion of Ukraine, military officers in the Baltics claimed that Russia began its information operations 12 years prior to the annexation of Crimea by asserting that eastern Ukraine historically belonged to Russia.12 To influence tactical aspects of the IE, Russia broadcast false news of gangs and fascists terrorizing Kiev and fighting to ban the Russian language, which convinced ethnic Russians to flee Ukraine for fear of persecution, creating localized discord.13
Leaders must be savvy in shaping proactive narratives within their IE to overcome the old cliché that “perception is nine-tenths of reality.” This is critical because Army units are the only armed force offering direct and continuous interaction with populations and opposition forces. Social media influences these interactions and shapes perceptions of the operational variables (PMESII-PT), which military planners must address during mission analysis to inform the mission variables (METT-TC).14 Integrating a competitive IE into brigade-and-above collective training exercises not only creates more realistic and relevant MDO training conditions but also provides commanders and staffs the situational awareness and experience of operating within this new warfighting function. To help our commanders, TRADOC provides two key resources:
Network Engagement Team (NET): Working with the Center for Strategic Leadership and the Army War College, the NET is developing courseware that will train senior leaders in cognitive maneuver, an effort directed at more effectively crafting and shaping narratives and delivering them within the IE. This understanding is essential for integrating information as a warfighting function and maximizing the effectiveness of operations in the IE. For more information or to request assistance, please send an email to firstname.lastname@example.org.
Information Operations Network (ION): Replicates the social media aspects of the internet; it is Decisive Action Training Environment (DATE) compliant and can be adjusted for specific exercise needs. This government-developed and -operated tool is globally accessible via the NIPRnet but can also be hosted on local networks, such as a Multinational Partner Environment (MPE). It provides replicated feeds and information from government websites, international news agencies, Twitter, Facebook, YouTube, etc. (https://oedata.army.mil/ion-browser/). TRADOC G2 provides units exclusive access to dedicated ION partitions, for which the units themselves extend controlled access to users for updating, modifying, and posting new information relevant to their scenario, including the OPFOR.
The views and opinions presented here are the author’s and not those of the US Army or US Department of Defense.
“Information is such a powerful tool that it is recognized as an instrument of national power… [which] impacts all operations.”1
— James Mattis, Former Secretary of Defense
1. Mattis, James N. Memorandum for Commanders, “Information as a Joint Function.” U.S. Secretary of Defense, Department of Defense, 15 September 2017.
2. TRADOC Pamphlet 525-3-1, “The U.S. Army in Multi-Domain Operations 2028.” U.S. Army Training and Doctrine Command (TRADOC), Department of the Army, 6 December 2018.
3. Glenn, Jerome. “Defense” (Chapter 9), Future Mind. Washington, DC: Acropolis Books, 1989.
4. Training Circular 7-100, “Hybrid Threat.” U.S. Army Training and Doctrine Command, September 2012. https://odin.tradoc.army.mil/TC/TC_7-100_Hybrid_Threat
5. Omand, David, Jamie Bartlett, and Carl Miller. “Introducing Social Media Intelligence (SOCMINT).” Intelligence and National Security 27, no. 6 (2012): 801-823. https://www.researchgate.net/profile/David_Omand/publication/262869934_Introducing_social_media_intelligence_SOCMINT/links/5703ebaf08ae74a08e245b3c/Introducing-social-media-intelligence-SOCMINT.pdf
6. Vosoughi, Soroush, Deb Roy, and Sinan Aral. “The spread of true and false news online.” Science 359, no. 6380 (2018): 1146-1151.
7. Bessi, Alessandro, and Emilio Ferrara. “Social bots distort the 2016 US Presidential election online discussion.” First Monday 21, no. 11 (2016).
8. Blondel, Vincent D., Jean-Loup Guillaume, Renaud Lambiotte, and Etienne Lefebvre. “Fast unfolding of communities in large networks.” Journal of Statistical Mechanics: Theory and Experiment (2008). http://arxiv.org/abs/0803.0476
9. Mad Scientist Program, “The Death of Authenticity: New Era Information Warfare.” U.S. Army Training and Doctrine Command, 30 May 2019. https://madsciblog.tradoc.army.mil/149-the-death-of-authenticity-new-era-information-warfare/
10. Knight, Will. “An AI that writes convincing prose risks mass-producing fake news.” MIT Technology Review, 14 February 2019. https://www.technologyreview.com/s/612960/an-ai-tool-auto-generates-fake-news-bogus-tweets-and-plenty-of-gibberish/
11. Giles, Keir. “Handbook of Russian Information Warfare.” Research Division, NATO Defense College, 10 October 2017, pages 10-11; references a quote by Mark Laity, Chief of Strategic Communications, Supreme Headquarters Allied Powers Europe (SHAPE). http://www.ndc.nato.int/news/news.php?icode=995
12. Graham-Harrison, Emma, and Daniel Boffey. “Lithuania fears Russian propaganda is prelude to eventual invasion.” The Guardian, 3 April 2017. https://www.theguardian.com/world/2017/apr/03/lithuania-fears-russian-propaganda-is-prelude-to-eventual-invasion
13. Warner, Gregory. “‘Rough Translation’: What Americans can learn from fake news in Ukraine.” NPR, audio, 21 August 2018. https://www.npr.org/2017/08/21/544952989/rough-translation-what-americans-can-learn-from-fake-news-in-ukraine
14. Army Doctrine Reference Publication (ADRP) 3-0, “Unified Land Operations.” Headquarters, Department of the Army, May 2012.