Assessment of the Virtual Societal Warfare Environment of 2035

Editor’s Note:  This article is part of our Civil Affairs Association and Divergent Options Writing Contest which took place from April 7, 2020 to July 7, 2020.  More information about the contest can be found by clicking here.


James Kratovil is a Civil Affairs Officer in the United States Army, currently working in the Asia-Pacific region.

Hugh Harsono is currently serving as an Officer in the United States Army. He writes regularly for multiple publications about cyberspace, economics, foreign affairs, and technology. He can be found on LinkedIn @HughHarsono.

Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Assessment of the Virtual Societal Warfare Environment of 2035

Date Originally Written:  April 30, 2020.

Date Originally Published:  June 3, 2020.

Author and / or Article Point of View:  Both authors believe that emerging societal warfare is a risk to U.S. interests worldwide.

Summary:  The world of 2035 will see the continued fracturing of the online community into distinctive tribes, exacerbated by sophisticated disinformation campaigns designed to manipulate these siloed groups. Anonymity on the internet will erode, personally exposing individuals to the masses mobilized by this new form of Virtual Societal Warfare and creating an entirely new set of rules for interaction in the digital human domain.

Text:  The maturation of several emerging technologies will intersect with the massive expansion of online communities and social media platforms by 2035 to create historic conditions for the conduct of Virtual Societal Warfare. Virtual Societal Warfare is defined by the RAND Corporation as a “broad range of techniques” with the aim of changing “people’s fundamental social reality[1].” This form of warfare will see governments and other organizations influencing public opinion with increasing precision. Where narratives were once shaped by professional journalists, unaltered videos, and fact-checked sources, the actors of 2035 will be able to convincingly alter history itself in real time. Citizens will be left with the increasingly difficult task of discerning reality from fantasy, accelerating the rate at which people retreat to whatever source of news best fits their ideology.

By 2035, the maturation of artificial intelligence (AI) will transform the information landscape. Drawing on lessons learned from experiences such as Russia’s interference in the 2016 U.S. elections[2], actors will use AI to proliferate deepfakes to the point where identifying disinformation on the internet becomes substantially more challenging, increasing the effectiveness of disinformation campaigns. These AI systems will be able to churn out news stories and video clips containing fabricated footage in a remarkably convincing fashion.

With the global population continuing to grow, an increasing number of individuals will seek information from popular social media platforms. Current figures for social media growth support this notion, with Facebook alone logging almost 2.5 billion monthly active users[3] and Tencent’s WeChat claiming an ever-growing user base of over 1.16 billion[4]. This explosion in the online population will solidify the complete fracturing of traditional news outlets into ones that cater to specific ideologies and preferences in order to maintain profits. This siloed collection of tailored realities will allow the disinformation campaigns of the future to target key demographics with surgical precision, making such efforts increasingly effective.

Where understanding of social media, the information environment, and online disinformation was once in its infancy, by 2035 these arenas will constitute a significant portion of organizational warfare. States and individuals will war in the information environment over every potentially significant piece of news, establishing multiple realities of ever starker contrast, with a body politic unable to discern the difference. This environment will encompass digital participation from governments and organizations alike. Every action taken by an organizational representative, be it a public affairs officer, a Department of Defense spokesperson, or a key leader, will have to account for engagement with online communities, with every move carefully planned to synchronize messaging across all web-based platforms. Organizations will need to invest considerable resources into understanding how these different communities interact and react to news.

A digital human domain will arise, one as tangible in its culture and nuances as the physical, and organizations will have to prepare their personnel to act appropriately in it. Ostracization from an online community could have rippling effects in the physical world. One could imagine a situation in which running afoul of an influential group or individual damages the offender’s social credit score more than is currently appreciated. Witness the power of WeChat, which not only serves as a messaging app but continually evolves to encompass a multitude of everyday transactions; everything from buying movie tickets to financial services exists on a super application that is home to its own ecosystem of sub-applications[5]. By 2035, such an application will constitute a person’s identity, blurred and merged across the digital space into one unified identity for social interactions. The result will be the death of online anonymity. Offend a large enough group of people, and your social rating could plummet, affecting everything from who will do business with you to your interactions with government security forces.

Enter the new-age disinformation campaign. While the internet of 2035 has become less anonymous, it has not become any less wild, even within the intranets of certain countries. Communities set up in their own bubbles of reality are readily excited by certain touchpoints, flocking to news organizations and individuals that cater to their specific dopamine rush of familiar news. A sophisticated group wanting to harass a rival organization could unleash massive botnets pushing AI-generated deepfakes to manufacture the appearance of mass negative reaction, crashing the social score of an individual and cutting them off from society.

Though the outlook is grim, several trends are emerging to give digital practitioners and the average person a fighting chance. Much of the digital realm can be viewed as a never-ending arms race between adversarial actors and those looking to protect information and the privacy of individuals. Recognizing the growing problem of deepfakes, researchers are already developing AI to detect them, with a consortium of companies recently announcing the “Deepfake Detection Challenge[6].” Meanwhile, the privacy industry has continued to develop increasingly sophisticated forms of anonymity, much of it freely available to a tech-savvy public. The proliferation of virtual machines, Virtual Private Networks, onion routing, blockchain[7], and encryption has prolonged a cat-and-mouse game with governments that will continue into the future.
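As a concrete illustration of the detection side of this arms race, the sketch below shows one common pattern for deepfake detection: score individual video frames with a classifier, then aggregate the per-frame scores into a video-level verdict. It is a minimal sketch assuming PyTorch is available; the tiny network, the 0.5 threshold, and the random stand-in frames are illustrative placeholders, not the method of any particular Deepfake Detection Challenge entrant.

```python
# Minimal sketch of frame-level deepfake scoring (assumes PyTorch is installed).
# A small, untrained CNN stands in for a real detector; the point is the pattern:
# per-frame "probability synthetic" scores averaged into a video-level decision.
import torch
import torch.nn as nn


class FrameClassifier(nn.Module):
    """Toy CNN mapping a 3x224x224 frame to a probability that it is synthetic."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        x = self.features(frames).flatten(1)            # (N, 32) pooled features
        return torch.sigmoid(self.head(x)).squeeze(1)   # (N,) per-frame scores


def score_video(model: nn.Module, frames: torch.Tensor, threshold: float = 0.5):
    """Average per-frame scores into a single video-level verdict."""
    with torch.no_grad():
        scores = model(frames)
    mean_score = scores.mean().item()
    return mean_score, mean_score > threshold


if __name__ == "__main__":
    model = FrameClassifier().eval()            # untrained; weights are placeholders
    frames = torch.rand(8, 3, 224, 224)         # stand-in for 8 decoded video frames
    score, flagged = score_video(model, frames)
    print(f"mean synthetic score: {score:.3f}, flagged as deepfake: {flagged}")
```

A detector of this kind is only as good as the examples it was trained on, which is why the arms-race framing above matters: each new generation of synthesis tools tends to erode the performance of yesterday’s detectors.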

Where understanding of social media, the information environment, and online disinformation was once in its infancy, by 2035 these will be key elements used by governments and organizations in the conduct of Virtual Societal Warfare. The merging and unmasking of social media identities will leave individuals critically exposed to these online wars, with casualties on both sides weighed not in lives lost but in everyday lives suppressed by the masses. Ultimately, it will be up to individuals, corporations, and governments working together to even the odds, even as they advance the very technology they seek to counter.


Endnotes:

[1] Mazarr, M., Bauer, R., Casey, A., Heintz, S., & Matthews, L. (2019). The Emerging Risk of Virtual Societal Warfare: Social Manipulation in a Changing Information Environment. Santa Monica, CA: RAND Corporation.

[2] Mayer, J. (2018, September 24). How Russia Helped to Swing the Election for Trump. Retrieved April 16, 2020, from https://www.newyorker.com/magazine/2018/10/01/how-russia-helped-to-swing-the-election-for-trump

[3] Clement, J. (2020, January 30). Number of Facebook users worldwide 2008-2019. Retrieved April 18, 2020, from https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-

[4] Thomala, L. L. (2020, March 30). Number of active WeChat messenger accounts Q2 2011-Q4 2019. Retrieved April 18, 2020, from https://www.statista.com/statistics/255778/number-of-active-wechat-messenger-accounts

[5] Feng, J. (2019, September 26). What is WeChat? The super-app you can’t live without in China. Retrieved April 25, 2020, from https://signal.supchina.com/what-is-wechat-the-super-app-you-cant-live-without-in-china

[6] Thomas, E. (2019, November 25). In the Battle Against Deepfakes, AI Is Being Pitted Against AI. Retrieved April 30, 2020, from https://www.wired.co.uk/article/deepfakes-ai

[7] Ray, S. (2018, May 4). How Blockchains Will Enable Privacy. Retrieved April 30, 2020, from https://towardsdatascience.com/how-blockchains-will-enable-privacy-1522a846bf65
