Joe Palank is a Captain in the U.S. Army Reserve, where he leads a Psychological Operations Detachment. He previously served as an assistant to former Secretary of Homeland Security Jeh Johnson. He can be found on Twitter at @JoePalank. Divergent Options' content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.
National Security Situation: Disinformation as a cognitive threat poses a risk to the U.S.
Date Originally Written: January 17, 2022.
Date Originally Published: February 14, 2022.
Author and / or Article Point of View: The author is a U.S. Army Reservist specializing in psychological operations and information operations. He has also worked on political campaigns and for the U.S. Department of Homeland Security. He has studied psychology, political communications, and disinformation, and holds master's degrees in Political Management and in Public Policy, with a focus on national security.
Background: Disinformation as a non-lethal weapon for both state and non-state actors is nothing new. However, the rise of the internet age and social media, paired with cultural change in the U.S., has given this once-fringe capability new salience. Russia, China, Iran, North Korea, and violent extremist organizations pose the most pervasive and significant risks to the United States through their increasingly weaponized use of disinformation[1].
Significance: Due to the nature of disinformation, this cognitive threat poses a risk to U.S. foreign and domestic policy-making, undercuts a foundational principle of democracy, and has already caused significant disruption to the U.S. political process. Disinformation can be used tactically alongside military operations, operationally to shape the information environment within a theater of conflict, and strategically by potentially dissuading the U.S. or its allies from joining international coalitions.
Option #1: The U.S. focuses domestically.
The U.S. could combat the threat of disinformation defensively, by looking inward, and take a two-pronged approach to prevent the effects of disinformation. First, the U.S. could adopt new laws and policies to make social media companies, the primary distributors of disinformation, more aligned with U.S. national security objectives related to disinformation. The U.S. has an asymmetric advantage in serving as the home to the largest social media companies, but thus far has treated those platforms with the same laissez-faire approach other industries enjoy. In recent years, these companies have begun to fight disinformation, but they are still motivated by profits, which are in turn driven by clicks and views, which disinformation can increase[2]. Policy options might include defining disinformation and passing a law making its deliberate spread illegal, or holding social media platforms accountable for the spread of disinformation posted by their users.
Simultaneously, the U.S. could embark on widescale media literacy training for its populace. Raising awareness of disinformation campaigns, teaching media consumers how to vet information for authenticity, and educating them on the biases within media and their own psychology are effective lines of defense against disinformation[3]. In a meta-analysis of recommendations for improving awareness of disinformation, improved media literacy training was the single most common suggestion among experts[4]. Equipping end users to distinguish real news from fake would render most disinformation campaigns ineffective.
Risk: Legal – The United States enjoys a nearly absolute tradition of "free speech," which may prevent the passage of laws combatting disinformation.
Political – Passing laws holding individuals criminally liable for speech, even disinformation, would assuredly be unpopular. Additionally, cracking down on social media companies, which are both politically powerful and broadly popular, would be a political hurdle for lawmakers concerned with re-election.
Feasibility – Media literacy training would be expensive and time-consuming to implement at scale, and the same U.S. agencies that currently combat disinformation are ill-equipped to focus on domestic audiences for broad-scale educational initiatives.
Gain: A U.S. public that is immune to disinformation would make for a healthier polity and a more durable democracy, directly and potentially permanently thwarting some of the aims of disinformation campaigns. More heavily regulated social media companies would drastically reduce the dissemination of disinformation worldwide, benefiting the entire liberal economic order.
Option #2: The U.S. focuses internationally.
Strategically, the U.S. could choose to target foreign suppliers of disinformation. This targeting is currently being done tactically and operationally by U.S. DoD elements, the intelligence community, and the State Department. The latter agency also houses the coordinating mechanism for the country's handling of disinformation, the Global Engagement Center, which has no actual tasking authority within the Executive Branch. A similar but more aggressive agency, such as the proposed Malign Foreign Influence Response Center (MFIRC), could bring the fight directly to purveyors of disinformation[5].
The U.S. has been slow to catch up to its rivals' disinformation capabilities, responding to disinformation campaigns only occasionally, and with a varied mix of sanctions, offensive cyber attacks, and even kinetic strikes (the latter only against non-state actors)[6]. National security officials benefit from institutional knowledge and "playbooks" for responding to various other threats to U.S. sovereignty or the liberal economic order. These playbooks are valuable for responding quickly, in kind, and proportionately, while also giving both sides "off-ramps" to de-escalate. An MFIRC could develop playbooks for disinformation and build the institutional memory for this emerging type of warfare. Disinformation campaigns are popular among U.S. adversaries due to the relative capabilities advantage they enjoy, as well as their low financial and diplomatic costs[7]. Creating a basket of response options plays to the national security apparatus's current capabilities and poses fewer legal and political hurdles than passing U.S. laws that would infringe on free speech. Moreover, an MFIRC would make the U.S. a more equal adversary in this sphere and raise the costs of conducting such operations, making them less palatable options for adversaries.
Risk: Geopolitical – Disinformation via the internet is still a new kind of warfare; responding disproportionately carries a significant risk of escalation, possibly turning a meme into an actual war.
Effectiveness – Going after the suppliers of disinformation could be akin to a whack-a-mole game, constantly chasing the next threat without addressing the underlying domestic problems.
Gain: Adopting this approach would likely have faster and more obvious effects. A drone strike on the headquarters of Russia's Internet Research Agency, for example, would send a very clear message about how seriously the U.S. takes disinformation. At relatively little cost in money and time, amounting more to a shifting of priorities and resources, the U.S. could significantly blunt its adversaries' advantages and make disinformation prohibitively expensive to undertake at scale.
Other Comments: There is no reason why both options could not be pursued simultaneously, save for costs or political appetite.
Recommendation: None.
Endnotes:
[1] Nemr, C. & Gangware, W. (2019, March). Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age. Park Advisors. Retrieved January 16, 2022 from https://2017-2021.state.gov/weapons-of-mass-distraction-foreign-state-sponsored-disinformation-in-the-digital-age/index.html
[2] Cerini, M. (2021, December 22). Social media companies beef up promises, but still fall short on climate disinformation. Fortune.com. Retrieved January 16, 2022 from https://fortune.com/2021/12/22/climate-change-disinformation-misinformation-social-media/
[3] Kavanagh, J. & Rich, M.D. (2018) Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation. https://www.rand.org/t/RR2314
[4] Helmus, T. & Keep, M. (2021). A Compendium of Recommendations for Countering Russian and Other State-Sponsored Propaganda. Research Report. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA894-1.html
[5] Press Release. (2020, February 14). Following Passage of their Provision to Establish a Center to Combat Foreign Influence Campaigns, Klobuchar, Reed Ask Director of National Intelligence for Progress Report on Establishment of the Center. Office of Senator Amy Klobuchar. https://www.klobuchar.senate.gov/public/index.cfm/2020/2/following-passage-of-their-provision-to-establish-a-center-to-combat-foreign-influence-campaigns-klobuchar-reed-ask-director-of-national-intelligence-for-progress-report-on-establishment-of-the-center
[6] Goldman, A. & Schmitt, E. (2016, November 24). One by One, ISIS Social Media Experts Are Killed as Result of F.B.I. Program. New York Times. Retrieved January 15, 2022 from https://www.nytimes.com/2016/11/24/world/middleeast/isis-recruiters-social-media.html
[7] Stricklin, K. (2020, March 29). Why Does Russia Use Disinformation? Lawfare. Retrieved January 15, 2022 from https://www.lawfareblog.com/why-does-russia-use-disinformation