Options to Address Disinformation as a Cognitive Threat to the United States

Joe Palank is a Captain in the U.S. Army Reserve, where he leads a Psychological Operations Detachment. He has also previously served as an assistant to former Secretary of Homeland Security Jeh Johnson. He can be found on Twitter at @JoePalank. Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization or any group.


National Security Situation:  Disinformation as a cognitive threat poses a risk to the U.S.

Date Originally Written:  January 17, 2022.

Date Originally Published:  February 14, 2022.

Author and / or Article Point of View:  The author is a U.S. Army Reservist specializing in psychological operations and information operations. He has also worked on political campaigns and for the U.S. Department of Homeland Security. He has studied psychology, political communications, and disinformation, and holds Master's degrees in Political Management and in Public Policy with a focus on national security.

Background:  Disinformation as a non-lethal weapon for both state and non-state actors is nothing new.  However, the rise of the internet age and social media, paired with cultural change in the U.S., has given this once-fringe capability new salience. Russia, China, Iran, North Korea, and violent extremist organizations pose the most pervasive and significant risks to the United States through their increasingly weaponized use of disinformation[1].

Significance:  Due to the nature of disinformation, this cognitive threat poses a risk to U.S. foreign and domestic policy-making, undercuts a foundational principle of democracy, and has already caused significant disruption to the U.S. political process. Disinformation can be used tactically alongside military operations, operationally to shape the information environment within a theater of conflict, and strategically by potentially sidelining the U.S. or allies from joining international coalitions.

Option #1:  The U.S. focuses domestically. 

The U.S. could combat the threat of disinformation defensively, by looking inward, and take a two-pronged approach to prevent the effects of disinformation. First, the U.S. could adopt new laws and policies to make social media companies—the primary distributors of disinformation—more aligned with U.S. national security objectives related to disinformation. The U.S. has an asymmetric advantage in serving as the home of the largest social media companies, but thus far has treated those platforms with the same laissez-faire approach other industries enjoy. In recent years, these companies have begun to fight disinformation, but they are still motivated by profits, which are in turn driven by clicks and views, which disinformation can increase[2]. Policy options might include defining disinformation and passing a law making its deliberate spread illegal, or holding social media platforms accountable for the spread of disinformation posted by their users.

Simultaneously, the U.S. could embark on wide-scale media literacy training for its populace. Raising awareness of disinformation campaigns, teaching media consumers how to vet information for authenticity, and educating them on the biases within media and our own psychology are effective lines of defense against disinformation[3]. In a meta-analysis of recommendations for improving awareness of disinformation, improved media literacy training was the single most common suggestion among experts[4]. Equipping end users to distinguish real news from fake would render most disinformation campaigns ineffective.

Risk:  Legal – The United States enjoys a nearly absolute tradition of free speech, which may prevent the passage of laws combatting disinformation.

Political – Passing laws holding individuals criminally liable for speech, even disinformation, would be assuredly unpopular. Additionally, cracking down on social media companies, who are both politically powerful and broadly popular, would be a political hurdle for lawmakers concerned with re-election. 

Feasibility –  Media literacy training would be expensive and time-consuming to implement at scale, and the same U.S. agencies that currently combat disinformation are ill-equipped to focus on domestic audiences for broad-scale educational initiatives.

Gain:  A U.S. public that is immune to disinformation would make for a healthier polity and a more durable democracy, directly and potentially permanently thwarting some of the aims of disinformation campaigns. More heavily regulated social media companies would drastically reduce the dissemination of disinformation campaigns worldwide, benefiting the entire liberal economic order.

Option #2:  The U.S. focuses internationally. 

Strategically, the U.S. could choose to target foreign suppliers of disinformation. This targeting is currently being done tactically and operationally by U.S. Department of Defense elements, the intelligence community, and the State Department. The latter agency also houses the coordinating mechanism for the country's handling of disinformation, the Global Engagement Center, which has no actual tasking authority within the Executive Branch. A similar but more aggressive agency, such as the proposed Malign Foreign Influence Response Center (MFIRC), could take the fight directly to purveyors of disinformation[5].

The U.S. has been slow to catch up to its rivals' disinformation capabilities, responding to disinformation campaigns only occasionally, and with a varied mix of sanctions, offensive cyber attacks, and even kinetic strikes (the latter only against non-state actors)[6]. National security officials benefit from institutional knowledge and “playbooks” for responding to various other threats to U.S. sovereignty or the liberal economic order. These playbooks are valuable for responding quickly, in kind, and proportionately, while also giving both sides “off-ramps” to de-escalate. An MFIRC could develop playbooks for disinformation and the institutional memory for this emerging type of warfare. Disinformation campaigns are popular among U.S. adversaries due to the relative capabilities advantage they enjoy, as well as their low costs, both financial and diplomatic[7]. Creating a basket of response options lends itself to the national security apparatus's current capabilities and poses fewer legal and political hurdles than changing U.S. laws in ways that might infringe on free speech. Moreover, an MFIRC would make the U.S. a more equal adversary in this sphere and raise the costs of conducting such operations, making them less palatable options for adversaries.

Risk:  Geopolitical – Disinformation via the internet is still a new kind of warfare; responding disproportionately carries a significant risk of escalation, possibly turning a meme into an actual war.

Effectiveness – Going after the suppliers of disinformation could be akin to a whack-a-mole game, constantly chasing the next threat without addressing the underlying domestic problems.

Gain:  Adopting this approach would likely have faster and more obvious effects. A drone strike on the Internet Research Agency's headquarters in Russia, for example, would send a very clear message about how seriously the U.S. takes disinformation. At relatively little cost and time—more a shifting of priorities and resources—the U.S. could significantly blunt its adversaries' advantages and make disinformation prohibitively expensive to undertake at scale.

Other Comments:  There is no reason why both options could not be pursued simultaneously, save for costs or political appetite.

Recommendation:  None.


Endnotes:

[1] Nemr, C. & Gangware, W. (2019, March). Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age. Park Advisors. Retrieved January 16, 2022 from https://2017-2021.state.gov/weapons-of-mass-distraction-foreign-state-sponsored-disinformation-in-the-digital-age/index.html 

[2] Cerini, M. (2021, December 22). Social media companies beef up promises, but still fall short on climate disinformation. Fortune.com. Retrieved January 16, 2022 from https://fortune.com/2021/12/22/climate-change-disinformation-misinformation-social-media/

[3] Kavanagh, J. & Rich, M.D. (2018) Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation. https://www.rand.org/t/RR2314

[4] Helmus, T. & Keep, M. (2021). A Compendium of Recommendations for Countering Russian and Other State-Sponsored Propaganda. Research Report. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA894-1.html

[5] Press Release. (2020, February 14). Following Passage of their Provision to Establish a Center to Combat Foreign Influence Campaigns, Klobuchar, Reed Ask Director of National Intelligence for Progress Report on Establishment of the Center. Office of Senator Amy Klobuchar. https://www.klobuchar.senate.gov/public/index.cfm/2020/2/following-passage-of-their-provision-to-establish-a-center-to-combat-foreign-influence-campaigns-klobuchar-reed-ask-director-of-national-intelligence-for-progress-report-on-establishment-of-the-center

[6] Goldman, A. & Schmitt, E. (2016, November 24). One by One, ISIS Social Media Experts Are Killed as Result of F.B.I. Program. New York Times. Retrieved January 15, 2022 from https://www.nytimes.com/2016/11/24/world/middleeast/isis-recruiters-social-media.html

[7] Stricklin, K. (2020, March 29). Why Does Russia Use Disinformation? Lawfare. Retrieved January 15, 2022 from https://www.lawfareblog.com/why-does-russia-use-disinformation


Options to Counter Foreign Influence Operations Targeting Servicemember and Veterans

Marcus Laird has served in the United States Air Force. He presently works at Headquarters Air Force Reserve Command as a Strategic Plans and Programs Officer. He can be found on Twitter @USLairdForce.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.  Divergent Options is not affiliated with the Department of Defense or the U.S. Air Force. The following opinion is of the author only and is not official Air Force or Department of Defense policy. This publication was reviewed by AFRC/PA and is cleared for public release and unlimited distribution.


National Security Situation:  Foreign Actors are using Social Media to influence Servicemember and Veteran communities. 

Date Originally Written:  December 2, 2021.

Date Originally Published:  January 3, 2022.

Author and / or Article Point of View:  The author is a military member who has previously researched the impact of social media on US military internal dialogue for professional military education and graduate courses. 

Background:  During the lead-up to the 2016 election, members of the U.S. Army Reserve were specifically targeted at least ten times by advertisements on Facebook purchased by Russia’s Internet Research Agency[1]. In 2017, the Vietnam Veterans of America (VVA) also detected social media profiles that were sophisticated mimics of their official web pages. These web pages were created for several purposes, including identity theft, fraud, and the dissemination of disinformation favorable to Russia. Further investigation revealed a network of fake personas attempting to make inroads within online military and veteran communities for the purpose of bolstering persona credibility to spread disinformation. Because these mimics used VVA logos, VVA was able to have the web pages deplatformed after two months on the grounds of trademark infringement[2].

Separately, some military influencers, after building a substantial following, have chosen to sell their personas as a means of monetizing their social media brands. While foreign adversary networks have not yet incorporated this technique for building an audience, the purchase of an established persona is essentially an opportunity to acquire a turnkey information operation platform.

Significance:  Servicemembers and veterans are trusted voices within their communities on matters of national security. The special trust society places on these communities makes them a particularly lucrative target for an adversary seeking to influence public opinion and shape policy debates[3]. Social media is optimized for advertising, allowing specific demographics to be targeted with unprecedented precision. Unchecked, adversaries can use this capability to sow mistrust, degrade unit cohesion, and spread disinformation through advertisements, mimicking legitimate organizations, or purchasing a trusted persona. 

Option #1:  Closing Legislative Loopholes 

Currently, foreign entities are prohibited from directly contributing to campaigns. However, there is no legal prohibition on the purchase of advertising by foreign entities for the purpose of influencing elections. Using legislative means to close this loophole would deny adversaries the ability to abuse platforms’ microtargeting capabilities for political influence[4].

Risk:  Enforcement – As evidenced during inquiries into election interference, enforcement could prove difficult. Enforcement relies on good faith efforts by platforms to conduct internal assessments of sophisticated actors’ affiliations and intentions and report them. Additionally, government agencies have neither backend system access nor adequate resources to forensically investigate every potential instance of foreign advertising.

Gain:  Such a solution would protect society as a whole, to include the military and veteran communities. Legislation would include reporting and data retention requirements for platforms, allowing for earlier detection of potential information operations. Ideally, regulation would prompt platforms to tailor their content moderation standards around political advertising to create additional barriers for foreign entities.  

Option #2:  Deplatforming on the Grounds of Trademark Infringement

Should a foreign adversary attempt to use sophisticated mimicry of official accounts to achieve a veneer of credibility, then the government may elect to request a platform remove a user or network of users on the basis of trademark infringement. This technique was successfully employed by the VVA in 2017. Military services have trademark offices, which license the use of their official logos and can serve as focal points for removing unauthorized materials[5].

Risk:  Resources – since trademark offices are self-funded and rely on royalties for operations, they may not be adequately resourced to challenge large-scale trademark infringement by foreign actors.

Personnel – personnel in trademark offices may not have adequate training to determine whether or not a U.S. person or a foreign entity is using the organization’s trademarked materials. Failure to adequately delineate between U.S. persons and foreign actors when requesting to deplatform a user potentially infringes upon civil liberties. 

Gain:  Developing agency response protocols using existing intellectual property laws ensures responses are coordinated between the government and the platforms rather than improvised during an ongoing operation. Regular deplatforming can also help develop signatures for sophisticated mimicry, allowing for more rapid detection and mitigation by the platforms.

Option #3:  Subject the Sale of Influence Networks to Review by the Committee on Foreign Investment in the United States (CFIUS) 

Inform platform owners of the intent of CFIUS to review the sale of all influence networks and credentials which specifically market to military and veteran communities. CFIUS review has been used to prevent the acquisition of applications by foreign entities. Specifically, in 2019 CFIUS retroactively reviewed the purchase of Grindr, an LGBTQ+ dating application, due to national security concerns about the potential for the Chinese firm Kunlun to pass sensitive data to the Chinese government.  Data associated with veteran and servicemember social networks could be similarly protected[6]. 

Risk:  Enforcement – Due to the large number of influencers and the lack of knowledge of the scope of the problem, enforcement may be difficult in real time. In the event a sale does occur, ex post facto CFIUS review would provide a remedy.

Gain:  Such a notification should prompt platforms to craft governance policies around the sale and transfer of personas to allow for more transparency and reporting.

Other Comments:  None.

Recommendation:  None.


Endnotes:

[1] Goldsmith, K. (2020). An Investigation Into Foreign Entities Who Are Targeting Servicemembers and Veterans Online. Vietnam Veterans of America. Retrieved September 17, 2019, from https://vva.org/trollreport/, 108.

[2] Ibid, 6-7.

[3] Gallacher, J. D., Barash, V., Howard, P. N., & Kelly, J. (2018). Junk news on military affairs and national security: Social media disinformation campaigns against us military personnel and veterans. arXiv preprint arXiv:1802.03572.

[4] Wertheimer, F. (2019, May 28). Loopholes allow foreign adversaries to legally interfere in U.S. elections. Just Security. Retrieved December 10, 2021, from https://www.justsecurity.org/64324/loopholes-allow-foreign-adversaries-to-legally-interfere-in-u-s-elections/.

[5] Air Force Trademark Office. (n.d.). Retrieved December 3, 2021, from https://www.trademark.af.mil/Licensing/Applications.aspx.

[6] Kara-Pabani, K., & Sherman, J. (2021, May 11). How a Norwegian government report shows the limits of Cfius Data Reviews. Lawfare. Retrieved December 10, 2021, from https://www.lawfareblog.com/how-norwegian-government-report-shows-limits-cfius-data-reviews.


Analyzing Social Media as a Means to Undermine the United States

Michael Martinez is a consultant who specializes in data analysis, project management, and community engagement. He has an M.S. in Intelligence Management from the University of Maryland University College. He can be found on Twitter @MichaelMartinez. Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Analyzing Social Media as a Means to Undermine the United States

Date Originally Written:  November 30, 2021.

Date Originally Published:  December 27, 2021.

Author and / or Article Point of View:  The author believes that social media is not inherently good nor bad, but a tool to enhance discussion. Unless the national security apparatus understands how to best utilize Open Source Intelligence to achieve its stated goals, i.e. engaging the public on social media and public forums, it will lag behind its adversaries in this space.

Summary:  Stopping online radicalization of all varieties is complex and involves the individual, the government, social media companies, and Internet Service Providers. Artificial intelligence reviewing information online and flagging potential threats may not be adequate. Only through public-private partnerships can an effective system be created to support anti-radicalization endeavors.

Text:  The adage, “If you’re not paying for the product, you are the product[1],” has never been more true than in the age of social media. Every user’s click and purchase is recorded by private entities such as Facebook and Twitter. These records can be utilized by other nations to gather information on the United States economy and intellectual property, as well as on government personnel and agencies. This data can be packaged together and used to inform operations that prey on U.S. personnel. Examples include extortion through ransomware, an adversary intelligence service probing an employee for specific national security information by appealing to their subject matter expertise, and online influence and radicalization.

It is crucial to accept that the United States and its citizens are more heavily reliant on social media than ever before. Social media entities such as Meta (formerly Facebook) have new and yet-to-be-released products for children (e.g., the “Instagram for Kids” product), enabling adversaries to prey upon potential targets of any age. Terrorist organizations such as Al-Qaeda utilize cartoons on outlets like YouTube and Instagram to entice vulnerable youth to carry out attacks or help radicalize potential suicide bombers[2].

While Facebook and YouTube are the most common platforms across most age groups, TikTok and Snapchat have undergone a meteoric rise among users under thirty[3]. Intelligence services and terrorist organizations have vastly improved their online recruiting techniques, including video and media, as the platforms themselves have become more sophisticated. Unless federal, state, and local governments strengthen their public-private partnerships to stay ahead of growth in next-generation social media platforms, this adversary behavior will continue. The national security community has tools at its disposal to help protect Americans from being coerced into cybercrime, or from being radicalized by overseas entities such as the Islamic State to potentially carry out domestic attacks.

To counter such trends within social media radicalization, the National Institute of Justice (NIJ) worked with the National Academies to identify traits and agendas that would facilitate disruption of these efforts. The items identified include functional databases, considering links between terrorism and lesser crimes, and exploring the culture of terrorism, including its structure and goals[4]. While a solid federal infrastructure and deterrence mechanism is vital, it is equally important for the social media platforms themselves to eliminate radical media that may influence at-risk individuals.

According to the NIJ, there are several characteristics that contribute to social media radicalization: being unemployed, a loner, having a criminal history, a history of mental illness, and having prior military experience[5]. These are only potential factors that do not apply to all who are radicalized[6]. However, these factors do provide a base to begin investigation and mitigation strategies. 

As a long-term solution, the Bipartisan Policy Center recommends enacting and teaching media literacy so people can understand and spot internet radicalization[7]. Social media algorithms are not foolproof. These algorithms require the cyberspace equivalent of “see something, say something,” with users reporting suspicious activity to the platforms. The risk that these companies will not act is real: their main goal is monetization, and content moderation does not directly help them make more money. Where such inaction impedes national security, the government can step in to ensure that private enterprise does not undermine it.

Creating a system that works will balance the rights of the individual with the national security of the United States. It will also respect the rights of private enterprise and the pipelines that carry the information to homes, the Internet Service Providers. Until this system can be created, the radicalization of Americans will be a pitfall for the entire National Security apparatus. 


Endnotes:

[1] Oremus, W. (2018, April 27). Are You Really the Product? Retrieved on November 15, 2021, from https://slate.com/technology/2018/04/are-you-really-facebooks-product-the-history-of-a-dangerous-idea.html. 

[2] Thompson, R. (2011). Radicalization and the Use of Social Media. Journal of Strategic Security, 4(4), 167–190. http://www.jstor.org/stable/26463917 

[3] Pew Research Center. (2021, April 7). Social Media Use in 2021. Retrieved from https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/ 

[4] Aisha Javed Qureshi, “Understanding Domestic Radicalization and Terrorism,” August 14, 2020, nij.ojp.gov: https://nij.ojp.gov/topics/articles/understanding-domestic-radicalization-and-terrorism.

[5] The National Counterintelligence and Security Center. Intelligence Threats & Social Media Deception. Retrieved November 15, 2021, from https://www.dni.gov/index.php/ncsc-features/2780-ncsc-intelligence-threats-social-media-deception. 

[6] Schleffer, G., & Miller, B. (2021). The Political Effects of Social Media Platforms on Different Regime Types. Austin, TX. Retrieved November 29, 2021, from http://dx.doi.org/10.26153/tsw/13987. 

[7] Bipartisan Policy Center. (2012, December). Countering Online Radicalization in America. Retrieved November 29, 2021, from https://bipartisanpolicy.org/download/?file=/wp-content/uploads/2019/03/BPC-_Online-Radicalization-Report.pdf 


Assessing Russian Use of Social Media as a Means to Influence U.S. Policy

Alex Buck is a currently serving officer in the Canadian Armed Forces. He has deployed twice to Afghanistan, once to Ukraine, and is now working towards an MA in National Security.  Alex can be found on Twitter @RCRbuck.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group. 


Title:  Assessing Russian Use of Social Media as a Means to Influence U.S. Policy

Date Originally Written:  August 29, 2021.

Date Originally Published:  December 13, 2021.

Author and / or Article Point of View: The author believes that without appropriate action, the United States’ political climate will continue to be exploited by Russian influence campaigns. These campaigns will have broad impacts across the Western world, and potentially generate an increased competitive advantage for Russia.

Summary:  To achieve a competitive advantage over the United States, Russia uses social media-based influence campaigns to influence American foreign policy. Political polarization makes the United States an optimal target for such campaigns. 

Text:  Russia aspires to regain the influence over the international system that it once had as the Soviet Union. To achieve this aim, Russia’s interest lies in building a stronger economy and expanding its regional influence over Eastern Europe[1]. Following the Cold War, Russia recognized that these national interests were at risk of being completely destroyed by Western influence. The Russian economy was threatened by the United States’ unipolar hegemony over the global economy[2]. A strong North Atlantic Treaty Organization (NATO) has threatened Russia’s regional influence in Eastern Europe. NATO’s collective security agreement was originally conceived to counter the Soviet threat following World War II and continues to do so to this day. Through the late 1990s and early 2000s, NATO expanded its membership to include former Soviet states in Eastern Europe, in an effort to reduce Russian regional influence[1]. Russia perceives these actions as a threat to its survival as a state and needs a method to regain competitive advantage.

Following the Cold War, Russia began to identify opportunities they could exploit to increase their competitive advantage in the international system. One of those opportunities began to develop in the early-2000s as social media emerged. During this time, social media began to impact American culture in such a significant way that it could not be ignored. Social media has two significant impacts on society. First, it causes people to create very dense clusters of social connections. Second, these clusters are populated by very similar types of people[3]. These two factors caused follow-on effects to American society in that they created a divided social structure and an extremely polarized political system. Russia viewed these as opportunities ripe for their exploitation. Russia sees U.S. social media as a cost-effective medium to exert influence on the United States. 

In the late 2000s, Russia began experimenting with its concept of exploiting the cyber domain as a means of exerting influence on other nation-states. After the successful use of cyber operations against Ukraine, Estonia, Georgia, and again Ukraine in 2004, 2007, 2008, and 2014 respectively, Russia was poised to attempt the same concept against the United States and NATO[4]. In 2014, Russia slowly built a network of social media accounts that would eventually begin sowing disinformation amongst American social media users[3]. The significance of the Russian information campaign leading up to the 2016 U.S. presidential election cannot be overstated. The Russian Internet Research Agency propagated roughly 10.4 million tweets on Twitter, 76.5 million engagements on Facebook, and 187 million engagements on Instagram[5]. Although within the context of 200 billion tweets sent annually this may seem like a small-scale effort, the targeted nature of the tweets contributed to their effectiveness. This Russian social media campaign was estimated to expose between 110 and 130 million American social media users to misinformation aimed at skewing the results of the presidential election[3]. For perspective, the 2000 presidential election was decided by 537 votes in the state of Florida. To change the result of an election that close, a Russian information campaign would need to sway only about 0.00049% of the exposed audience.
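That 0.00049% figure follows from simple arithmetic. The back-of-the-envelope sketch below assumes the low-end estimate of 110 million exposed users cited above and the 537-vote Florida margin:

```python
# Back-of-the-envelope check: what share of exposed users would an
# influence campaign need to sway to overturn a 537-vote margin?
florida_margin_2000 = 537        # votes deciding the 2000 election
exposed_users = 110_000_000      # low-end estimate of exposed Americans [3]

required_share_pct = florida_margin_2000 / exposed_users * 100
print(f"{required_share_pct:.5f}%")  # prints 0.00049%
```

Using the higher 130-million exposure estimate instead, the required share drops further, to roughly 0.00041%.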

The bifurcated nature of the current American political arena has created the perfect target for Russian attacks via the cyber domain. Due to the persistently slim margins of electoral results, Russia will continue to exploit this opportunity until it achieves its national aims and gains a competitive advantage over the United States. Social media influence offers Russia a cost-effective and highly impactful tool that has the potential to sway American policies in its favor. Without coherent strategies to protect national networks and decrease Russian social influence, the United States, and the broader Western world, will continue to be subject to Russian influence.


Endnotes:

[1] Arakelyan, L. A. (2017). Russian Foreign Policy in Eurasia: National Interests and Regional Integration (1st ed.). Routledge. https://doi.org/10.4324/9781315468372

[2] Blank, S. (2008). Threats to and from Russia: An Assessment. The Journal of Slavic Military Studies, 21(3), 491–526. https://doi.org/10.1080/13518040802313746

[3] Aral, S. (2020). The hype machine: How social media disrupts our elections, our economy, and our health–and how we must adapt (First edition). Currency.

[4] Geers, K. & NATO Cooperative Cyber Defence Centre of Excellence. (2015). Cyber war in perspective: Russian aggression against Ukraine. https://www.ccdcoe.org/library/publications/cyber-war-in-perspective-russian-aggression-against-ukraine/

[5] DiResta, R., Shaffer, K., Ruppel, B., Sullivan, D., & Matney, R. (2019). The Tactics & Tropes of the Internet Research Agency. US Senate Documents.


Assessing a Situation where the Mission is a Headline

Samir Srivastava is serving in the Indian Armed Forces. The views expressed and suggestions made in the article are solely those of the author in his personal capacity and do not have any official endorsement.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Assessing a Situation where the Mission is a Headline

Date Originally Written:  July 5, 2021.

Date Originally Published:  July 26, 2021.

Author and / or Article Point of View:  The author is serving with the Indian Armed Forces.   The article is written from the point of view of India in its prevailing environment.

Summary:  While headlines in news media describe the outcome of military operations, in this information age the world could now be heading towards a situation where military operations are the outcome of a desired headline.  In such situations, goals can be achieved by taking into account assured success, the target audience, connectivity in a retaliatory context, verifiability, and deniability.

Text:  When nations fight each other, there will be news media headlines. Through various mediums and platforms, headlines will travel to everyone – the belligerents, their allies and supporters, and also neutral parties. Conflict will be presented as a series of headlines culminating in one headline that describes the final outcome. Thus, when operations happen, headlines also happen. Yet to be considered is when an operation is planned and executed to make a headline happen.

In nation versus nation conflict, the days of large-scale wars are certainly not over, but trends suggest these will be the exception rather than the rule. The future war will in all likelihood be fought below the threshold of a formal war declaration and be quite localised. The world has seen wars where each side endeavours to prevail upon the adversary’s bodies and materiel, but greater emphasis is already being laid on prevailing upon the enemy’s mind. In that case, a decision will be required regarding what objective is being pursued: attrition, territory, or just a headline.

Today, a military operation is more often than not planned at the strategic level and executed at the tactical level. This model is likely to become the norm because if a strategic outcome is achievable through a standalone tactical action, there is no reason to let the fight grow bigger and more costly in terms of blood and treasure. The Balakot airstrike[1] by the Indian Air Force is a case in point. More than two years after that strike, there is nothing to show a change in the attitude of Pakistan, which continues to harbour terrorists on its soil who may well be plotting the next strike on India. What has endured, however, are the headlines of February 26-28, 2019, which carried different messages for different audiences, including one for Pakistan.

Unlike propaganda, where a story is made out of nothing, if the mission is to make a headline, then that particular operation will actually have taken place on the ground.  In this context, headline selection and target selection are two sides of the same coin, but the former is the driving force.  Beyond this, success is enhanced by taking into account the probability of success, the target audience, connectivity in a retaliatory context, verifiability, and deniability.

Without assured success, the outcome will be a mismatch between the desired headline and  target selection. Taking an example from movies, in the 1997 film  “Tomorrow Never Dies[2],” the entire plot focuses on  the protagonist, Agent 007,  spoiling antagonist Carver’s scheme of creating headlines to be beamed by his media network. Once a shot is fired or ordnance dropped, there will be a headline and it is best to make sure it is the desired one.

Regarding the target audience, it is not necessary that an event gains the interest of the masses. The recipient population may be receptive, non-receptive or simply indifferent.  A headline focused on  the largest receptive group who can further propagate it has the best chance of success. 

If the operation is carried out in a retaliatory context, it is best to connect the enemy action and the friendly reaction. For example, while cyber-attacks or economic sanctions may be an apt response to an armed attack, the likelihood of achieving the desired headline is enhanced if there is something connecting the two: action and reaction.

The headline will have much more impact if the event and its effects can be easily verified, preferably by neutral agencies and individuals. A perfect headline is one that an under-resourced freelance journalist can easily report. To that end, targets in inaccessible locations or at places that do not strike a chord with the intended audience will be of little use. No amount of satellite photos can match one reporter on the ground.

The headline cannot lend itself to any possibility of denial, because even a feeble denial can lead to credibility being questioned. The choice of target and mode of attack should therefore leave no room for denial. During U.S. Operation NEPTUNE SPEAR[3], the raid on Osama bin Laden’s compound in Abbottabad, Pakistan, the first sliver of publicly available information was a tweet by someone nearby. This tweet could very well have closed any avenue for denial by Pakistan or Al Qaeda.

A well thought out headline can be the start point when planning an operation or even a campaign. This vision of a headline, however, needs different thinking tempered with a lot of imagination and creativity. Pre-planned headlines, an understanding of the expertise of journalists, and platforms kept at the ready can all be of value.

Every field commander, division and above, should have pre-planned headlines that their organization can create if given the opportunity. These headlines include both national headlines flowing from the higher commander’s intent, and local headlines that are more focused on the immediate engagement area.

There is benefit to be gained from the expertise of journalists, both Indian and foreign. Their practical experience will be invaluable when deciding on the correct headline and pinpointing a target audience. Journalists are already seen in war zones and media rooms as reporters; getting them into the operations room as planners is worthy of consideration.

An array of reporters, platforms, and mediums can be kept ready to carry the desired headline far and wide. Freelance journalists in foreign countries, coupled with the internet, will be a potent combination. In addition, the military’s public information organization cannot succeed in this new reality without restructuring.

Every battle in military history has the name of some commander attached to it: Hannibal crossing the Alps, U.S. General George S. Patton’s exploits during the Battle of the Bulge, Indian Colonel Desmond Hayde in the Battle of Dograi. The day is not far when some field commander will etch his or her name in history fighting the Battle of the Headline or, more aptly, the Battle for the Headline.


Endnotes:

[1] BBC. (2019, February 26). Balakot: Indian air strikes target militants in Pakistan. BBC News. https://www.bbc.com/news/world-asia-47366718.

[2] IMDb.com. (1997, December 19). Tomorrow Never Dies. IMDb. https://www.imdb.com/title/tt0120347.

[3] Olson, P. (2011, August 11). Man Inadvertently Live Tweets Osama Bin Laden Raid. Forbes. https://www.forbes.com/sites/parmyolson/2011/05/02/man-inadvertently-live-tweets-osama-bin-laden-raid.

Tags: Assessment Papers, India, Influence Operations, Information and Intelligence, Samir Srivastava, Social Media

An Australian Perspective on Identity, Social Media, and Ideology as Drivers for Violent Extremism

Kate McNair has a Bachelor’s Degree in Criminology from Macquarie University and is currently pursuing a Master’s Degree in Security Studies and Terrorism at Charles Sturt University.  You can follow her on Twitter @kate_amc.  Divergent Options’ content does not contain information of any official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  An Australian Perspective on Identity, Social Media, and Ideology as Drivers for Violent Extremism

Date Originally Written:  December 2, 2017.

Date Originally Published:  January 8, 2018.

Summary:  Countering Violent Extremism (CVE) is a leading initiative by many western sovereigns to reduce home-grown terrorism and extremism.  Social media, ideology, and identity are just some of the issues that fuel violent extremism for various individuals and groups and are thus areas that CVE must be prepared to address.

Text:  On March 7, 2015, two brothers aged 16 and 17 were arrested after they were suspected of leaving Australia through Sydney Airport to fight for the Islamic State[1].  The young boys fooled their parents and forged school letters.  They then presented themselves to Australian Immigration and Border Protection shortly after purchasing tickets to an unknown Middle Eastern country with a small amount of funds, claiming to be on their way to visit family for three months.  They were arrested after admitting to intending to become foreign fighters for the Islamic State.  On October 2, 2015, Farhad Khalil Mohammad Jabar, 15 years old, approached Parramatta police station in Sydney’s west and shot civilian police accountant Curtis Cheng in the back[2].  It was later discovered that Jabar was inspired and influenced by two older men aged 18 and 22, who manipulated him into becoming a lone wolf attacker and supplied the gun he used to kill the civilian worker.

In November 2016, the Australian Parliament passed the Counter-Terrorism Legislation Amendment Bill (No. 1) 2016, with the government stating that “Keeping Australians safe is the first priority of the Turnbull Government, which committed to ensuring Australian law enforcement and intelligence agencies have the tools they need to fight terrorism[3].”  More recently, the Terrorism (Police Powers) Act of 2002 was extensively amended to become the Terrorism Legislation Amendment (Police Powers and Parole) Act of 2017, which gives police more powers during investigations and puts stronger restrictions and requirements on parolees integrating back into society.  Although these governing documents focus on law enforcement and the investigative side of counterterrorism efforts, in 2014 the Tony Abbott Government implemented a nation-wide initiative called Living Safe Together[4].  Living Safe Together moved away from a law enforcement-centric approach and instead focused on community-based initiatives to address the growing appeal of violent extremist ideologies to young people.

Levi West, a well-known academic in the field of terrorism in Australia, highlighted that the aforementioned individuals have lived their entire lives in a world where the war on terror has existed.  These young men were part of a Muslim minority and grew up witnessing a war that has been painted by some as the West versus Islam.  They were influenced by many voices between school, work, social events, and home[5].  This leads to the question of whether these young individuals are driven to violent extremism by the ideology, or whether they are trying to find their identity and purpose in the world.

For young adults in Australia, social media is a strong driver for violent extremism.  Young adults are vulnerable and uncertain about various things in their lives.  When people feel uncertain about who they are, or about the accuracy of their perceptions, beliefs, and attitudes, they seek out people who are similar to them in order to make comparisons that largely confirm the veracity and appropriateness of their own attitudes.  Social media is being weaponised by violent extremist organizations such as the Islamic State.  Social media, and other communicative peer-to-peer sharing platforms, are ideal for facilitating virtual learning and virtual interactions between young adults and violent extremists.  While young adults who interact within these online forums may be less likely to engage in a lone wolf attack, these forums can reinforce prior beliefs and slowly manipulate people over time.

Is it violent extremist ideology that is inspiring young individuals to become violent extremists and participate in terrorism and political violence?  Decentralized command and control within violent extremist organizations, also referred to as leaderless resistance, is a technique to inspire young individuals to take it upon themselves, with no direction from leadership, to commit attacks against western governments and communities[6].  In the case of the Islamic State and its use of this strategy, its ideology is already known to be extreme and violent, and its interpretation of leaderless resistance is nothing less.  Decentralization has been implemented internationally as the Islamic State continues to provide information, through sites such as Insider, on how to acquire the materiel needed to conduct attacks.  Not only does the Islamic State provide training and skill information, it encourages others to spread its ideology through lone wolf attacks and glorifies these acts as a divine right.  Together with the vulnerability of young individuals, this strategy of decentralized command and control, paired with an extreme ideology, has so far been successful.  Based upon this success, CVE’s effectiveness is likely tied to being equally focused on combating identity as a driver for violent extremism, in addition to extreme ideology, and on the strategies and initiatives that can prevent individuals from becoming violent extremists.

The leading strategies in CVE have been social media, social cohesion, and identity focused.  Policy leaders and academics have identified that young individuals are struggling with the social constraints of labels and identity, and therefore a community-based approach is needed when countering violent extremism.  The 2015 CVE Regional Summit revealed various recommendations and findings relating to the use of social media, its effects on young, vulnerable individuals, and the realities that Australia must face as a country and as a society.  With the growing threat of homegrown violent extremism and the return of foreign fighters who fought with the Islamic State, violent extremism will continue to be a problem without programs that address individual identity and social cohesion.  The Australian Federal Police (AFP) have designated Community Liaison Team members whose role is to develop partnerships with community leaders to tackle the threat of violent extremism and enhance community relations, and the AFP has also adopted strategies to improve dialogue with Muslim communities.  The AFP’s efforts, combined with the participation of young local leaders, are paramount to the success of these strategies and initiatives to counter the violent extremist narrative.


Endnotes:

[1] Lanai Scarr, ‘Immigration Minister Peter Dutton said two teenage brothers arrested while trying to leave Australia to fight with ISIS were ‘saved’’ March 8, 2015 http://www.news.com.au/national/immigration-minister-peter-dutton-said-two-teenage-brothers-arrested-while-trying-to-leave-australia-to-fight-with-isis-were-saved/news-story/90b542528076cbdd02ed34aa8a78d33a Accessed December 1, 2017.

[2] Nick Ralston, ‘Parramatta shooting: Curtis Cheng was on his way home when shot dead’ October 3, 2015 http://www.smh.com.au/nsw/parramatta-shooting-curtis-cheng-was-on-his-way-home-when-shot-dead-20151003-gk0ibk.html Accessed December 1, 2017.

[3] Australian Government media release, Parliament passes Counter Terrorism Legislation Amendment Bill No 1 2016. https://www.attorneygeneral.gov.au/Mediareleases/Pages/2016/FourthQuarter/Parliament-passes-Counter-Terrorism-Legislation-Amendment-Bill-No1-2016.aspx Accessed December 1, 2017.

[4] Australian Government, Living Safe Together: Building community resilience to violent extremism. https://www.livingsafetogether.gov.au/pages/home.aspx Accessed December 1, 2017.

[5] John W. Little, Episode 77 Australian Approaches to Counterterrorism Podcast, Covert Contact. October 2, 2017.

[6] West, L. 2016. ‘#jihad: Understanding social media as a weapon’, Security Challenges 12 (2): pp. 9-26.

Tags: Assessment Papers, Australia, Cyberspace, Islamic State Variants, Kate McNair, Social Media, Violent Extremism