Assessing the Cognitive Threat Posed by Technology Discourses Intended to Address Adversary Grey Zone Activities

Zac Rogers is an academic from Adelaide, South Australia. He has published in journals including International Affairs, The Cyber Defense Review, Joint Force Quarterly, and Australian Quarterly, and regularly communicates with a wider audience across various multimedia platforms. Parasitoid is his first book.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.

Title:  Assessing the Cognitive Threat Posed by Technology Discourses Intended to Address Adversary Grey Zone Activities

Date Originally Written:  January 3, 2022.

Date Originally Published:  January 17, 2022.

Author and / or Article Point of View:  The author is an Australia-based academic whose research combines a traditional grounding in national security, intelligence, and defence with emerging fields of social cybersecurity, digital anthropology, and democratic resilience.  The author works closely with industry and government partners across multiple projects. 

Summary:  Military investment in war-gaming, table-top exercises, scenario planning, and future force design is increasing.  Some of this investment focuses on adversary activities in the “cognitive domain.”  While this investment is necessary, it may fail because it anchors on data-driven machine learning and automation for both offensive and defensive purposes without a clear understanding of where those tools are appropriate.

Text:  In 2019 the author wrote a short piece for the U.S. Army’s MadSci website titled “In the Cognitive War, the Weapon is You!”[1] The article attempted to spur self-reflection among the national security, intelligence, and defence communities in Australia, the United States, Canada, Europe, and the United Kingdom.  At the time, these communities were beginning to incorporate discussion of “cognitive” security and insecurity into their near-future threat assessments and future force design discourses. The article is cited in the North Atlantic Treaty Organization (NATO) Cognitive Warfare document of 2020[2], both in ways that demonstrate the misunderstanding directly and as part of a wider context in which the point of that title is thoroughly misinterpreted. The self-reflection the author hoped for has not been forthcoming. Instead, and not unexpectedly, the discourse on the cognitive aspects of contemporary conflict has consumed and regurgitated a familiar sequence of errors which, if not addressed head-on, will perpetuate rather than mitigate the problem.

What the cognitive threat is

The primary cognitive threat is us[3]. The threat is driven by a combination of factors. First, the techno-futurist hubris that exists as a permanently recycling feature of late-modern military thought. Second, a precipitous slide into scientism which military thinkers and the organisations they populate have not avoided[4]. Third, the commercial and financial rent-seeking which overhangs military affairs as a by-product of private-sector-led R&D and of government dependence on, and cultivation of, that activity since the 1990s[5]. Lastly, adversary awareness of these dynamics and an increasing willingness and capacity to manipulate and exacerbate them via the multitude of vulnerabilities ushered in by digital hyper-connectivity[6]. In other words, before the cognitive threat is an operational and tactical menace to be addressed and countered by the joint force, it is a central feature of the deteriorating epistemic condition of the late-modern societies in which those forces operate and from which their personnel, funding, R&D pathways, doctrine and operating concepts, epistemic communities, and strategic leadership emerge.

What the cognitive threat is not   

The cognitive threat is not what adversary military organisations and their patrons are doing in and to the information environment outside of kinetic military operations. Terms for adversarial activities occurring outside of conventional lethal/kinetic combat operations – such as the “grey zone” and “below the threshold” – describe time-honoured tactics by which interlocutors seek to weaken and sow dysfunction in the social and political fabric of competitor or enemy societies.  These tactics are used to gain advantage in areas not directly involving military conflict, or in areas likely to be critical to military preparedness and mobilization in times of war[7]. A key stumbling block is obvious: it is often difficult to know which intentions such tactics express. This is not cognitive warfare. It is merely typical of contention across and between cross-cultural communities, and of the permanent unwillingness of contending societies to accord with each other’s rules. Information warfare – particularly influence operations traversing the Internet and exploiting the dominant commercial operations found there – is part of this mix of activities belonging to the normal paradigm of competition between states for strategic advantage. Active measures – influence operations designed to self-perpetuate – have found fertile new ground on the Internet but are not new to the arsenals of intelligence services and, as Thomas Rid has warned, while they proliferate, they are more unpredictable and difficult to control than they were in the pre-Internet era[8]. None of this is cognitive warfare either. Unfortunately, current and recent discourse has lapsed into the error of treating it as such[9], leading to all manner of self-inflicted confusion[10].

Why the distinction matters

Two trends emerging from this confusion represent the most immediate threat to the military enterprise[11]. First, private-sector vendors and the consulting and lobbying industry they employ are busily pitching technological solutions based on machine learning and automation developed in commercial business settings where sensitivity to error is low[12]. While militaries experiment with this raft of technologies – eager to be seen at the vanguard of emerging tech, to justify R&D budgets and stave off defunding, or simply out of habit – they incur opportunity costs. One is stultified investment in the human potential that strategic thinkers have long identified as the real key to actualizing new technologies[13]; another is entry into path dependencies with behemoth corporate actors whose strategic goal is the cultivation of rentier relations, not excluding the ever-lucrative military sector[14].

Second, to the extent that automation and machine learning technologies enter the operational picture, cognitive debt accrues as the military enterprise becomes increasingly dependent on fallible tech solutions[15]. Under battle conditions, the first assumption is contestation of the electromagnetic spectrum on which all digital information technologies depend for basic functionality. Automated data gathering and analysis tools rely heavily on data availability and integrity.  When these tools are unavailable, any joint multinational force will require multiple redundancies – not only in technology but, more importantly, in leadership and personnel competencies. It remains unclear where the military enterprise draws the line on the likely cost-benefit ratio of experimenting with automated machine learning tools, and in which contexts they ought to be applied[16]. Unfortunately, experimentation is never cost-free. With civilian / military boundaries blurred to the extent they now are by the digital transformation of society, such experimentation requires consideration in light of all of its implications, including for the integrity and functionality of the open democracy being defended[17].

The first error – misinterpreting the meaning and bounds of cognitive insecurity – is compounded by a second: what the military enterprise chooses to invest time, attention, and resources in tomorrow[18]. Path dependency, technological lock-in, and opportunity cost all loom large if digital information age threats are misinterpreted. This is the solipsistic nature of the cognitive threat at work – the weapon really is you! Putting oneself in the adversary’s shoes, nothing could be more pleasing than watching that threat self-perpetuate. As a first step, militaries could organise and invest immediately in a strategic technology assessment capacity[19] free from the biases of rent-seeking vendors and lobbyists who, by definition, not only will not pay the costs of mission failure but stand to benefit from the rentier-like dependencies that emerge as the military enterprise pays the corporate sector to play in the digital age.


[1] Zac Rogers, “158. In the Cognitive War – The Weapon Is You!,” Mad Scientist Laboratory (blog), July 1, 2019.

[2] Francois du Cluzel, “Cognitive Warfare” (Innovation Hub, 2020).

[3] “us” refers primarily but not exclusively to the national security, intelligence, and defence communities taking up discourse on cognitive security and its threats including Australia, the U.S., U.K., Europe, and other liberal democratic nations. 

[4] Henry Bauer, “Science in the 21st Century: Knowledge Monopolies and Research Cartels,” Journal of Scientific Exploration 18 (December 1, 2004); Matthew B. Crawford, “How Science Has Been Corrupted,” UnHerd, December 21, 2021; William A. Wilson, “Scientific Regress,” First Things, May 2016; Philip Mirowski, Science-Mart (Harvard University Press, 2011).

[5] Dima P. Adamsky, “Through the Looking Glass: The Soviet Military-Technical Revolution and the American Revolution in Military Affairs,” Journal of Strategic Studies 31, no. 2 (2008): 257–94; Linda Weiss, America Inc.?: Innovation and Enterprise in the National Security State (Cornell University Press, 2014); Mariana Mazzucato, The Entrepreneurial State: Debunking Public vs. Private Sector Myths (Penguin UK, 2018).

[6] Timothy L. Thomas, “Russian Forecasts of Future War,” Military Review, June 2019; Nathan Beauchamp-Mustafaga, “Cognitive Domain Operations: The PLA’s New Holistic Concept for Influence Operations,” China Brief, The Jamestown Foundation 19, no. 16 (September 2019).

[7] See Peter Layton, “Social Mobilisation in a Contested Environment,” The Strategist, August 5, 2019; Peter Layton, “Mobilisation in the Information Technology Era,” The Forge (blog), n.d.

[8] Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare, illustrated edition (New York: Macmillan USA, 2020).

[9] For example see Jake Harrington and Riley McCabe, “Detect and Understand: Modernizing Intelligence for the Gray Zone,” CSIS Briefs (Center for Strategic & International Studies, December 2021); du Cluzel, “Cognitive Warfare”; Kimberly Underwood, “Cognitive Warfare Will Be Deciding Factor in Battle,” SIGNAL Magazine, August 15, 2017; Nicholas D. Wright, “Cognitive Defense of the Joint Force in a Digitizing World” (Pentagon Joint Staff Strategic Multilayer Assessment Group, July 2021).

[10] Zac Rogers and Jason Logue, “Truth as Fiction: The Dangers of Hubris in the Information Environment,” The Strategist, February 14, 2020.

[11] For more on this see Zac Rogers, “The Promise of Strategic Gain in the Information Age: What Happened?,” Cyber Defense Review 6, no. 1 (Winter 2021): 81–105.

[12] Rodney Brooks, “An Inconvenient Truth About AI,” IEEE Spectrum, September 29, 2021.

[13] Michael Horowitz and Casey Mahoney, “Artificial Intelligence and the Military: Technology Is Only Half the Battle,” War on the Rocks, December 25, 2018.

[14] Jathan Sadowski, “The Internet of Landlords: Digital Platforms and New Mechanisms of Rentier Capitalism,” Antipode 52, no. 2 (2020): 562–80.

[15] For a problematic example see Ben Collier and Lydia Wilson, “Governments Try to Fight Crime via Google Ads,” New Lines Magazine (blog), January 4, 2022.

[16] Zac Rogers, “Discrete, Specified, Assigned, and Bounded Problems: The Appropriate Areas for AI Contributions to National Security,” SMA Invited Perspectives (NSI Inc., December 31, 2019).

[17] Emily Bienvenue and Zac Rogers, “Strategic Army: Developing Trust in the Shifting Strategic Landscape,” Joint Force Quarterly 95 (November 2019): 4–14.

[18] Zac Rogers, “Goodhart’s Law: Why the Future of Conflict Will Not Be Data-Driven,” Grounded Curiosity (blog), February 13, 2021.

[19] For expansion see Zac Rogers and Emily Bienvenue, “Combined Information Overlay for Situational Awareness in the Digital Anthropological Terrain: Reclaiming Information for the Warfighter,” The Cyber Defense Review, Summer 2021.
