Options to Mitigate Cognitive Threats

John Chiment is a strategic threat intelligence analyst and has supported efforts across the Department of Defense and U.S. Intelligence Community. The views expressed herein are those of the author and do not reflect the official policy or position of the LinQuest Corporation, any of LinQuest’s subsidiaries or parents, or the U.S. Government.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group. 


National Security Situation:  Cognitive attacks target the defender’s ability to accurately perceive the battlespace and react appropriately. If successful, these attacks may permit an attacker to defeat better equipped or positioned defenders. Defenders who deploy defenses poorly matched against the incoming threat – either due to mischaracterizing that threat or by rushing to respond – likely will suffer greater losses. Mitigation strategies for cognitive attacks all carry risks.

Date Originally Written:  January 31, 2022.

Date Originally Published:  March 7, 2022.

Author and / or Article Point of View:  The author is an American threat intelligence analyst with time in uniform, as a U.S. government civilian, and as a DoD contractor. 

Background:  Effectively countering an attack requires the defender to detect its existence, recognize the danger posed, decide on a course of action, and implement that action before the attack completes its engagement. An attacker can improve the odds of a successful strike by increasing the difficulty in each of these steps (via stealth, speed, deception, saturation, etc.) while defenders can improve their chances through preparation, awareness, and technical capabilities. Correct detection and characterization of a threat enables decision-makers to decide which available defense is the most appropriate. 

Significance:  A defender deploying a suboptimal or otherwise inappropriate defense benefits the attacker. Attackers who target the defender’s understanding of the incoming attack and their decision-making process may prompt defenders to select inappropriate defenses. Technological superiority – long a goal of western militaries – may be insufficient against such cognitive manipulations that target human decision-making processes rather than the capabilities the defender controls.

Option #1:  Defenders increase their number of assets collecting Intelligence, Surveillance, and Reconnaissance (ISR) data in order to more rapidly detect threats.

Risk:  Increasing ISR data collection consumes industrial and financial resources and may worsen relationships with other powers and the general public. Increasing collection may also overwhelm analytic capabilities by providing too much data [1].

Gain:  Event detection begins the defender’s process and earlier detection permits the defender to develop more options in subsequent stages. By increasing the number of ISR assets that can begin the defender’s decision-making process, the defender increases their opportunities to select an appropriate defense.

Option #2:  The defender increases the number of assets capable of analyzing information in order to more rapidly identify the threat.

Risk:  Increasing the number of assets capable of accurately processing, exploiting, and disseminating (PED) information consumes intellectual and financial resources. Threat characterization decisions can also be targeted in the same ways as defense deployment decisions [2].

Gain:  A larger network of available PED analysts may better address localized spikes in attacks, more evenly distribute stress among analysts and analytic networks within supporting agencies, and lower the risk of mischaracterizing threats, likely improving decision-makers' chances of selecting an appropriate defense.

Option #3:  The defender automates defense deployment decisions in order to rapidly respond with a defense.

Risk:  Automated systems may possess exploitable logical flaws that can be targeted in much the same way as the defender's existing decision-making process. Automated systems also operate at greater speeds, limiting opportunities for the defender to detect and correct inappropriate decisions [3].

Gain:  Automated systems operate at high speed and may mitigate time lost to late detection or initial mischaracterization of threats. Automating decisions also reduces the immediate cognitive load on the defender by permitting defensive software designers to explore and plan for complex potentials without the stress of an incoming attack.

Option #4:  The defender increases the number of assets authorized to make defense deployment decisions in order to more likely select an appropriate defense.

Risk:  Increasing the available pool of authorized decision-makers consumes communication bandwidth and financial resources. Larger communication networks have larger attack surfaces and increase the risk of both data leaks and attackers maliciously influencing decisions into far-off engagements. Attacking the network segment may produce delays resulting in defenders not deploying appropriate defenses in time [4].

Gain:  A larger network of authorized decision-makers may better address localized spikes in attacks, more evenly distribute stress among decision-making personnel, and lower the risk of rushed judgements that may prompt inappropriate defense deployments.

Option #5:  The defender trains authorized decision-makers to operate at higher cognitive loads in order to more likely select an appropriate defense.

Risk:  Attackers can likely increase the volume of attacks and overwhelm even extremely well-trained decision-makers, making this option a short-term solution. Increasing the cognitive load on an already limited resource pool will likely increase burnout rates, lowering the overall supply of experienced decision-makers [5].

Gain:  Improving decision-maker training can likely be achieved with minimal new investment, as it focuses on better utilization of existing resources.

Option #6:  The defender prepositions improved defenses and defense response options in order to better endure attacks regardless of decision-making timelines.

Risk:  Prepositioned defenses and response options consume logistical and financial resources. Actions made prior to conflict risk being detected and planned for by adversaries, reducing their potential value. Rarely used defenses have maintenance costs that can be difficult to justify [6].

Gain:  Prepositioned defenses may mitigate attacks not detected before impact by improving the targeted asset’s overall endurance, and attackers knowledgeable of the defender’s defensive capabilities and response options may be deterred or slowed when pursuing goals that will now have to contend with the defender’s assets.

Other Comments:  Risks to the decision-making processes cannot be fully avoided. Options #3 and #6 attempt to make decisions before any cognitive attacks target decision-makers while Options #2 and #4 attempt to mitigate cognitive attack impact by spreading the load across a larger pool of assets. Options #1 and #2 may permit decision-makers to make better decisions earlier in an active attack while Option #5 attempts to improve the decision-making abilities of existing decision-makers. 

Recommendation:  None.


Endnotes:

[1] Krohley, N. (2017, 24 October). The Intelligence Cycle is Broken. Here’s How To Fix It. Modern War Institute at West Point. https://mwi.usma.edu/intelligence-cycle-broken-heres-fix/

[2] Corona, I., Giacinto, G., & Roli, F. (2013, 1 August). Adversarial attacks against intrusion detection systems: Taxonomy, solutions and open issues. Information Sciences, 239, 201-225. https://doi.org/10.1016/j.ins.2013.03.022

[3] Eykholt, K., Evtimov, I., Fernandes, E., Li, B., Rahmati, A. Xiao, C., Prakash, A., Kohno, T., & Song, D. (2018). Robust Physical-World Attacks on Deep Learning Visual Classification [Paper Presentation]. Conference on Computer Vision and Pattern Recognition. https://arxiv.org/abs/1707.08945v5

[4] Joint Chiefs of Staff. (2016, 21 December). Countering Threat Networks (JP 3-25). https://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/jp3_25.pdf

[5] Larsen, R. P. (2001). Decision Making by Military Students Under Severe Stress. Military Psychology, 13(2), 89-98. https://doi.org/10.1207/S15327876MP1302_02

[6] Gerritz, C. (2018, 1 February). Special Report: Defense in Depth is a Flawed Cyber Strategy. Cyber Defense Magazine. https://www.cyberdefensemagazine.com/special-report-defense-in-depth-is-a-flawed-cyber-strategy/


Options to Address Disinformation as a Cognitive Threat to the United States

Joe Palank is a Captain in the U.S. Army Reserve, where he leads a Psychological Operations Detachment. He has also previously served as an assistant to former Secretary of Homeland Security Jeh Johnson. He can be found on Twitter at @JoePalank. Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization or any group.


National Security Situation:  Disinformation as a cognitive threat poses a risk to the U.S.

Date Originally Written:  January 17, 2022.

Date Originally Published:  February 14, 2022.

Author and / or Article Point of View:  The author is a U.S. Army Reservist specializing in psychological operations and information operations. He has also worked on political campaigns and for the U.S. Department of Homeland Security. He has studied psychology, political communications, disinformation, and has Masters degrees in Political Management and in Public Policy, focusing on national security.

Background:  Disinformation as a non-lethal weapon for both state and non-state actors is nothing new. However, the rise of the internet age and social media, paired with cultural change in the U.S., has given this once-fringe capability new salience. Russia, China, Iran, North Korea, and violent extremist organizations pose the most pervasive and significant risks to the United States through their increasingly weaponized use of disinformation[1].

Significance:  Due to the nature of disinformation, this cognitive threat poses a risk to U.S. foreign and domestic policy-making, undercuts a foundational principle of democracy, and has already caused significant disruption to the U.S. political process. Disinformation can be used tactically alongside military operations, operationally to shape the information environment within a theater of conflict, and strategically by potentially sidelining the U.S. or allies from joining international coalitions.

Option #1:  The U.S. focuses domestically. 

The U.S. could combat the threat of disinformation defensively, by looking inward, taking a two-pronged approach to prevent the effects of disinformation. First, the U.S. could adopt new laws and policies to make social media companies—the primary distributors of disinformation—more aligned with U.S. national security objectives related to disinformation. The U.S. has an asymmetric advantage in serving as the home of the largest social media companies, but thus far has treated those platforms with the same laissez-faire approach other industries enjoy. In recent years, these companies have begun to fight disinformation, but they are still motivated by profits, which are in turn driven by clicks and views, which disinformation can increase[2]. Policy options might include defining disinformation and passing a law making its deliberate spread illegal, or holding social media platforms accountable for the spread of disinformation posted by their users.

Simultaneously, the U.S. could embark on wide-scale media literacy training for its populace. Raising awareness of disinformation campaigns, teaching media consumers how to vet information for authenticity, and educating them on the biases within media and our own psychology are effective lines of defense against disinformation[3]. In a meta-analysis of recommendations for countering disinformation, improved media literacy training was the single most common suggestion among experts[4]. Equipping end users to distinguish real from fake news would render most disinformation campaigns ineffective.

Risk:  Legal – The United States enjoys a nearly pure tradition of “free speech,” which may prevent the passage of laws combatting disinformation.

Political – Passing laws holding individuals criminally liable for speech, even disinformation, would be assuredly unpopular. Additionally, cracking down on social media companies, who are both politically powerful and broadly popular, would be a political hurdle for lawmakers concerned with re-election. 

Feasibility – Media literacy training would be expensive and time-consuming to implement at scale, and the same U.S. agencies that currently combat disinformation are ill-equipped to focus on domestic audiences for broad-scale educational initiatives.

Gain:  A U.S. public that is immune to disinformation would make for a healthier polity and more durable democracy, directly thwarting some of the aims of disinformation campaigns, and potentially permanently. Social media companies that are more heavily regulated would drastically reduce the dissemination of disinformation campaigns worldwide, benefiting the entire liberal economic order.

Option #2:  The U.S. focuses internationally. 

Strategically, the U.S. could choose to target foreign suppliers of disinformation. This targeting is currently being done tactically and operationally by U.S. DoD elements, the intelligence community, and the State Department. The latter agency also houses the coordinating mechanism for the country’s handling of disinformation, the Global Engagement Center, which has no actual tasking authority within the Executive Branch. A similar but more aggressive agency, such as the proposed Malign Foreign Influence Response Center (MFIRC), could bring the fight directly to purveyors of disinformation[5].

The U.S. has been slow to catch up to its rivals’ disinformation capabilities, responding to disinformation campaigns only occasionally, and with a varied mix of sanctions, offensive cyber attacks, and even kinetic strikes (the latter only against non-state actors)[6]. National security officials benefit from institutional knowledge and “playbooks” for responding to various other threats to U.S. sovereignty or the liberal economic order. These playbooks are valuable for responding quickly, in kind, and proportionately, while also giving both sides “off-ramps” to de-escalate. An MFIRC could develop playbooks for disinformation and the institutional memory for this emerging type of warfare. Disinformation campaigns are popular among U.S. adversaries due to the relative capabilities advantage they enjoy, as well as their low costs, both financial and diplomatic[7]. Creating a basket of response options suits the national security apparatus’s current capabilities and poses fewer legal and political hurdles than changing U.S. laws in ways that might infringe on free speech. Moreover, an MFIRC would make the U.S. a more equal adversary in this sphere and raise the costs of conducting such operations, making them less palatable for adversaries.

Risk:  Geopolitical – Disinformation via the internet is still a new kind of warfare; responding disproportionately carries a significant risk of escalation, possibly turning a meme into an actual war.

Effectiveness – Going after the suppliers of disinformation could be akin to a whack-a-mole game, constantly chasing the next threat without addressing the underlying domestic problems.

Gain:  Adopting this approach would likely have faster and more obvious effects. A drone strike on the headquarters of Russia’s Internet Research Agency, for example, would send a very clear message about how seriously the U.S. takes disinformation. At relatively little cost and time—more a shifting of priorities and resources—the U.S. could significantly blunt its adversaries’ advantages and make disinformation prohibitively expensive to undertake at scale.

Other Comments:  There is no reason why both options could not be pursued simultaneously, save for costs or political appetite.

Recommendation:  None.


Endnotes:

[1] Nemr, C. & Gangware, W. (2019, March). Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age. Park Advisors. Retrieved January 16, 2022 from https://2017-2021.state.gov/weapons-of-mass-distraction-foreign-state-sponsored-disinformation-in-the-digital-age/index.html 

[2] Cerini, M. (2021, December 22). Social media companies beef up promises, but still fall short on climate disinformation. Fortune.com. Retrieved January 16, 2022 from https://fortune.com/2021/12/22/climate-change-disinformation-misinformation-social-media/

[3] Kavanagh, J. & Rich, M.D. (2018) Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation. https://www.rand.org/t/RR2314

[4] Helmus, T. & Keep, M. (2021). A Compendium of Recommendations for Countering Russian and Other State-Sponsored Propaganda. Research Report. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA894-1.html

[5] Press Release. (2020, February 14). Following Passage of their Provision to Establish a Center to Combat Foreign Influence Campaigns, Klobuchar, Reed Ask Director of National Intelligence for Progress Report on Establishment of the Center. Office of Senator Amy Klobuchar. https://www.klobuchar.senate.gov/public/index.cfm/2020/2/following-passage-of-their-provision-to-establish-a-center-to-combat-foreign-influence-campaigns-klobuchar-reed-ask-director-of-national-intelligence-for-progress-report-on-establishment-of-the-center

[6] Goldman, A. & Schmitt, E. (2016, November 24). One by One, ISIS Social Media Experts Are Killed as Result of F.B.I. Program. New York Times. Retrieved January 15, 2022 from https://www.nytimes.com/2016/11/24/world/middleeast/isis-recruiters-social-media.html

[7] Stricklin, K. (2020, March 29). Why Does Russia Use Disinformation? Lawfare. Retrieved January 15, 2022 from https://www.lawfareblog.com/why-does-russia-use-disinformation


Assessing a Situation where the Mission is a Headline

Samir Srivastava is serving in the Indian Armed Forces. The views expressed and suggestions made in the article are solely of the author in his personal capacity and do not have any official endorsement.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Assessing a Situation where the Mission is a Headline

Date Originally Written:  July 5, 2021.

Date Originally Published:  July 26, 2021.

Author and / or Article Point of View:  The author is serving with the Indian Armed Forces.   The article is written from the point of view of India in its prevailing environment.

Summary:  While headlines in news media describe the outcome of military operations, in this information age the world could now be heading towards a situation where military operations are the outcome of a desired headline. In such situations, goals can be achieved by taking into account assured success, the target audience, connectivity in a retaliatory context, verifiability, and deniability.

Text:  When nations fight each other, there will be news media headlines. Through various mediums and platforms, headline(s) will travel to everyone – the belligerents, their allies/supporters, and also neutral parties. Conflict will be presented as a series of headlines culminating in one headline that describes the final outcome. Thus, when operations happen, headlines also happen. Yet to be considered is when an operation is planned and executed to make a headline happen.

In nation-versus-nation conflict, the days of large-scale wars are certainly not over, but trends suggest these will be more the exception than the rule. The future war will in all likelihood be fought without a formal war declaration and remain quite localised. The world has seen wars where each side endeavours to prevail upon the adversary’s bodies and materiel, but greater emphasis is already being laid on prevailing upon the enemy’s mind. In that case, a decision will be required regarding what objective is being pursued – attrition, territory, or just a headline.

Today, a military operation is more often than not planned at the strategic level and executed at the tactical level. This model is likely to become the norm because, if a strategic outcome is achievable through a standalone tactical action, there is no reason to let the fight grow bigger and more costly in blood and treasure. The Balakot airstrike[1] by the Indian Air Force is a case in point. It has been over two years since that strike took place, but there is nothing to show a change in the attitude of Pakistan, which continues to harbour terrorists on its soil who may well be plotting the next strike on India. What has endured, however, is the headlines of February 26-28, 2019, which carried different messages for different audiences, including one for Pakistan.

Unlike propaganda, where a story is made out of nothing, if the mission is to make a headline, then that particular operation will have taken place on the ground. In this context, headline selection and target selection are two sides of the same coin, but the former is the driving force. Beyond this, success is enhanced by taking into account the probability of success, the target audience, connectivity in a retaliatory context, verifiability, and deniability.

Without assured success, the outcome will be a mismatch between the desired headline and target selection. Taking an example from the movies, the entire plot of the 1997 film “Tomorrow Never Dies[2]” focuses on the protagonist, Agent 007, spoiling the antagonist Carver’s scheme of creating headlines to be beamed by his media network. Once a shot is fired or ordnance dropped, there will be a headline, and it is best to make sure it is the desired one.

Regarding the target audience, it is not necessary that an event gains the interest of the masses. The recipient population may be receptive, non-receptive, or simply indifferent. A headline focused on the largest receptive group that can further propagate it has the best chance of success.

If the operation is carried out in a retaliatory context, it is best to connect the enemy action and the friendly reaction. For example, while cyber-attacks or economic sanctions may be an apt response to an armed attack, the likelihood of achieving the desired headline is enhanced if there is something connecting the two: action and reaction.

The headline will have much more impact if the event and its effects can be easily verified, preferably by neutral agencies and individuals. A perfect headline is one that an under-resourced freelance journalist can easily report. To that end, targets in inaccessible locations or at places that do not strike a chord with the intended audience will be of little use. No amount of satellite photos can match one reporter on the ground.

The headline cannot lend itself to any possibility of denial, because even a feeble denial can lead to credibility being questioned. The choice of target and mode of attack should therefore preclude denial. During U.S. Operation NEPTUNE SPEAR[3], the raid on Osama bin Laden’s compound in Abbottabad, Pakistan, the first sliver of publicly available information was a tweet by someone nearby. That tweet could very well have closed any avenue for denial by Pakistan or Al Qaeda.

A well-thought-out headline can be the start point when planning an operation or even a campaign. This vision of a headline, however, needs different thinking tempered with a lot of imagination and creativity. Pre-planned headlines, an understanding of the expertise of journalists, and having platforms at the ready can be of value.

Every field commander, division and above, could maintain pre-planned headlines that their organization can create if given the opportunity. These include both national headlines flowing from the higher commander’s intent and local headlines focused on the immediate engagement area.

There is benefit to be gained from the expertise of journalists, both Indian and foreign. Their practical experience will be invaluable when deciding on the correct headline and pinpointing a target audience. Journalists are already seen in war zones and media rooms as reporters; getting them into the operations room as planners is worthy of consideration.

An array of reporters, platforms, and mediums can be kept ready to carry the desired headline far and wide. Freelance journalists in foreign countries coupled with the internet will be a potent combination. In addition, the military’s public information organization cannot succeed in this new reality without restructuring.

Every battle in military history has the name of some commander attached to it: Hannibal crossing the Alps, U.S. General George S. Patton’s exploits during the Battle of the Bulge, Indian Colonel Desmond Hayde in the Battle of Dograi. The day is not far when some field commander will etch his or her name in history fighting the Battle of the Headline or, more aptly, the Battle for the Headline.


Endnotes:

[1] BBC. (2019, February 26). Balakot: Indian air strikes target militants in Pakistan. BBC News. https://www.bbc.com/news/world-asia-47366718.

[2] IMDb.com. (1997, December 19). Tomorrow Never Dies. IMDb. https://www.imdb.com/title/tt0120347.

[3] Olson, P. (2011, August 11). Man Inadvertently Live Tweets Osama Bin Laden Raid. Forbes. https://www.forbes.com/sites/parmyolson/2011/05/02/man-inadvertently-live-tweets-osama-bin-laden-raid.


Alternative Future: The Perils of Trading Artificial Intelligence for Analysis in the U.S. Intelligence Community

John J. Borek served as a strategic intelligence analyst for the U.S. Army and later as a civilian intelligence analyst in the U.S. Intelligence Community.  He is currently an adjunct professor at Grand Canyon University where he teaches courses in governance and public policy. Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Alternative Future: The Perils of Trading Artificial Intelligence for Analysis in the U.S. Intelligence Community

Date Originally Written:  June 12, 2020.

Date Originally Published:  August 12, 2020.

Author and / or Article Point of View:  The article is written from the point of view of an excerpt from a U.S. Congressional inquiry into an intelligence failure and the loss of Taiwan to China in 2035.

Summary:  The growing reliance on Artificial Intelligence (AI) to provide situational awareness and predictive analysis within the U.S. Intelligence Community (IC) resulted in an opportunity for China to execute a deception plan.  This successful deception plan resulted in the sudden and complete loss of Taiwan’s independence in 2035.

Text:  The U.S. transition away from humans performing intelligence analysis to the use of AI was an inevitable progression, as the amount of data collected for analysis reached levels humans could not hope to manage[1] while machine learning and artificial neural networks simultaneously developed to the level where they could match, if not outperform, human reasoning[2]. The integration of data scientists with analytic teams, which began in 2020, resulted in the attrition of both regional and functional analysts and the transformation of the duties of those remaining to those of editor and briefer[3][4].

Initial successes in the transition led to increasing trust and complacency. The “Black Box” program demonstrated its first major success in identifying terrorist networks and forecasting terrorist actions fusing social media, network analysis, and clandestine collection; culminating in the successful preemption of the 2024 Freedom Tower attack. Moving beyond tactical successes, by 2026 Black Box was successfully analyzing climatological data, historical migration trends, and social behavior models to correctly forecast the sub-Saharan African drought and resulting instability, allowing the State Department to build a coalition of concerned nations and respond proactively to the event, mitigating human suffering and unrest.

The cost advantages and successes large and small resulted in the IC transitioning from a community of 17 coordinating analytic centers into a group of user agencies. In 2028, despite the concerns of this Committee, all analysis was centralized at the Office of the Director of National Intelligence under Black Box. Testimony at the time indicated that there was no longer any need for competitive or agency-specific analysis; the algorithms of Black Box considered all likely possibilities more thoroughly and efficiently than human analysts could. Beginning that Fiscal Year, the data scientists of the different agencies of the IC accessed Black Box for the analysis their decision makers needed. Also that year, the coordination process for National Intelligence Estimates and Intelligence Community Assessments was eliminated; as the intelligence and analysis were uniform across all agencies of government, there was no longer any need for contentious, drawn-out analytic sessions that only delayed delivery of the analysis to policy makers.

Regarding the current situation in the Pacific, there was never a doubt that China sought unification with Taiwan under its own terms, and the buildup and modernization of Chinese forces over the last several decades caused concern within both the U.S. and Taiwan governments[5]. This Committee could find no fault with the priority that China had been given within the National Intelligence Priorities Framework. The roots of this intelligence failure lie in the IC’s inability to factor the possibility of deception into the algorithms of the Black Box program[6].

AI relies on machine learning, and it was well known that machines could learn biases based on the data they were given and their algorithms[7][8]. Given the Chinese lead in AI development and applications, and their experience in using AI to manage people and their perceptions[9][10], the Committee believes the IC should have anticipated the potential for the virtual grooming of Black Box. As a result of this intelligence postmortem, we now know that four years before the loss of Taiwan, the People’s Republic of China began their deception operation in earnest through the piecemeal release of false plans and strategy through multiple open and clandestine sources. As reported in the National Intelligence Estimate published just six months before the attack, China’s military modernization and procurement plan “confirmed” to Black Box that China was preparing to invade and reunify with Taiwan using overwhelming conventional military force in 2043 to commemorate the 150th anniversary of Mao Zedong’s birth.

What was hidden from Black Box and the IC was that China was also embarking on a parallel plan adapting the lessons learned from Russia’s invasions of Georgia and Ukraine. Using their own AI systems, China rehearsed and perfected a plan to use previously infiltrated special operations forces, airborne and heliborne forces, information warfare, and other asymmetric tactics to overcome Taiwan’s military superiority and geographic advantage. Individual training of these small units went unnoticed and was categorized as unremarkable and routine.

We now know that three months prior to the October 2035 attack, North Korea, at China's request, began a series of escalating provocations in the Sea of Japan which alerted Black Box to a potential crisis and diverted U.S. military and diplomatic resources. At the same time, biometric tracking and media surveillance of key personalities in Taiwan previously identified as crucial to a defense of the island were stepped up, allowing for their quick elimination by Chinese Special Operations Forces (SOF).

While we cannot determine with certainty when the first Chinese SOF infiltrated Taiwan, we know that by October 20, 2035, their forces were in place and Operation Homecoming received the final go-ahead from the Chinese President. The asymmetric tactics, combined with limited precision kinetic strikes and the inability of the U.S. to respond due to its preoccupation 1,300 miles away, resulted in a surprisingly quick collapse of Taiwanese resistance. Within five days enough conventional forces had been ferried to the island to secure China's hold on it and make any attempt to liberate it untenable.

Unlike our 9/11 report which found that human analysts were unable to “connect the dots” of the information they had[11], we find that Black Box connected the dots too well. Deception is successful when it can either increase the “noise,” making it difficult to determine what is happening; or conversely by increasing the confidence in a wrong assessment[12]. Without community coordination or competing analysis provided by seasoned professional analysts, the assessment Black Box presented to policy makers was a perfect example of the latter.


Endnotes:

[1] Barnett, J. (2019, August 21). AI is breathing new life into the intelligence community. Fedscoop. Retrieved from https://www.fedscoop.com/artificial-intelligence-in-the-spying

[2] Silver, D., et al. (2016). Mastering the game of GO with deep neural networks and tree search. Nature, 529, 484-489. Retrieved from https://www.nature.com/articles/nature16961

[3] Gartin, G. W. (2019). The future of analysis. Studies in Intelligence, 63(2). Retrieved from https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol-63-no-2/Future-of-Analysis.html

[4] Symon, P. B., & Tarapore, A. (2015, October 1). Defense intelligence in the age of big data. Joint Force Quarterly 79. Retrieved from https://ndupress.ndu.edu/Media/News/Article/621113/defense-intelligence-analysis-in-the-age-of-big-data

[5] Office of the Secretary of Defense. (2019). Annual report to Congress: Military and security developments involving the People’s Republic of China 2019. Retrieved from https://media.defense.gov/2019/May/02/2002127082/-1/-1/1/2019_CHINA_MILITARY_POWER_REPORT.pdf

[6] Knight, W. (2019). Tainted data can teach algorithms the wrong lessons. Wired. Retrieved from https://www.wired.com/story/tainted-data-teach-algorithms-wrong-lessons

[7] Boghani, P. (2019). Artificial intelligence can be biased. Here's what you should know. PBS / Frontline. Retrieved from https://www.pbs.org/wgbh/frontline/article/artificial-intelligence-algorithmic-bias-what-you-should-know

[8] Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

[9] Fanning, D., & Docherty, N. (2019). In the age of AI. PBS / Frontline. Retrieved from https://www.pbs.org/wgbh/frontline/film/in-the-age-of-ai

[10] Westerheide, F. (2020). China – the first artificial intelligence superpower. Forbes. Retrieved from https://www.forbes.com/sites/cognitiveworld/2020/01/14/china-artificial-intelligence-superpower/#794c7a52f053

[11] National Commission on Terrorist Attacks Upon the United States. (2004). The 9/11 Commission report. Retrieved from https://govinfo.library.unt.edu/911/report/911Report_Exec.htm

[12] Betts, R. K. (1980). Surprise despite warning: Why sudden attacks succeed. Political Science Quarterly 95(4), 551-572. Retrieved from https://www.jstor.org/stable/pdf/2150604.pdf

Alternative Futures / Alternative Histories / Counterfactuals Artificial Intelligence / Machine Learning / Human-Machine Teaming Assessment Papers China (People's Republic of China) Information and Intelligence John J. Borek Taiwan

Assessment of Civilian Next-Generation Knowledge Management Systems for Managing Civil Information

This article is published as part of the Small Wars Journal and Divergent Options Writing Contest which runs from March 1, 2019 to May 31, 2019.  More information about the writing contest can be found here.


Ray K. Ragan, MAd (PM), PMP is a Civil Affairs Officer in the U.S. Army Reserve and an Assistant Vice President of Project Management for a large Credit Union.  As a civilian, Ray worked in the defense and financial technology industries, bringing machine learning, intelligence systems, and speech and predictive analytics to enterprise scale.  Ray holds a Master's degree in Administration from Northern Arizona University and a Certificate in Strategic Decision and Risk from Stanford University. He is a credentialed Project Management Professional (PMP) and has several Agile Project Management certifications.  Ray has served small and big war tours in Iraq and the Philippines with multiple mobilizations around the world, working in the U.S. national interest.  Divergent Options' content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Assessment of Civilian Next-Generation Knowledge Management Systems for Managing Civil Information 

Date Originally Written:  May 25, 2019.

Date Originally Published:  August 19, 2019.

Summary:  Current Civil Information Management Systems are not taking advantage of the leaps in knowledge management technology, specifically in the realms of predictive analytics, Natural Language Processing, and Machine Learning. This creates a time cost that commanders must pay in real time in their operating environment, a cost felt particularly in small wars. It also diverts resources away from direct mission-enabling operations.

Text:  Currently, Civil Information Management (CIM) systems employed by the U.S. Military are not keeping pace with the revolution seen in civilian next-generation knowledge management systems (KMS)[1][2]. These KMS are possible through the convergence of modern computing, predictive analytics, Natural Language Processing (NLP), and Machine Learning (ML)[3]. This CIM limitation is unnecessary and self-imposed, as a KMS offers persistent and progressing inputs to the common operating picture. This assessment explores how civilian businesses harnessed this revolution and how to apply it to CIM.

Generally, CIM represents the operational variables (OV) of an operational environment (OE) and, as practiced today, resides in the domain of information rather than knowledge[4]. The DIKW pyramid framework, named for its Data, Information, Knowledge, Wisdom structure, informs the structure of learning[5]. Further, one can infer that traversing each step represents time and effort, a price paid by commanders in real time during operations. Small wars demand speed and agility. Current CIM takes time to gather data, input it into a database, run queries, overlay results on maps, and eventually infer some knowledge to inform decision-making by the commander[6].
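The DIKW traversal, and the effort each step costs, can be pictured as successive transformations over raw reports. A schematic sketch in Python (the records, fields, and categories are hypothetical, chosen only to illustrate the data-to-information-to-knowledge steps):

```python
# Hypothetical raw field reports (the "data" layer).
data = [
    {"village": "A", "wells": 0, "incidents": 7},
    {"village": "B", "wells": 2, "incidents": 1},
    {"village": "C", "wells": 0, "incidents": 5},
]

# Data -> information: filter and keep only what is operationally relevant.
information = [r for r in data if r["incidents"] >= 5]

# Information -> knowledge: an inference that can support a decision.
knowledge = {r["village"]: "unstable, no wells"
             for r in information if r["wells"] == 0}

print(knowledge)
# {'A': 'unstable, no wells', 'C': 'unstable, no wells'}
```

Each arrow in the pipeline is where analyst time is spent today; the article's argument is that a persistent KMS keeps these transformations continuously up to date instead of rebuilding them per mission.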

Using the Cynefin Framework, developed in 1999 to aid decision-making, commanders needlessly leave many of the OVs in the chaotic domain[7]. To move from the chaotic to the complex domain, the OVs must come from a KMS that is persistent and automatically progressing. Current CIMs do not automatically update by gathering information from public sources such as broadcast, print, and digital media digitized with NLP and speech/text analytics[8]. Instead, human operators, typically located in the OE, manually update these sources. Because of this, today's CIMs go stale after the operators complete their mission or shift priorities: the information gathered devolves into historic data, and the OE's fog of war reverts to chaos[9].

The single biggest advantage a quality KMS provides to a commander is time for decision-making in the OE[10]. Implemented as a simple search engine that is persistent and progressing for all OEs, a KMS would mean a commander does not need to spend operational time and effort on basic data-gathering missions. Rather, a commander can focus operational resources on direct mission-enabling operations. Enticingly, this simple search-engine KMS allows for the next advancement, one that businesses around the world are busily employing: operationalizing big data.

Business systems, such as search engines and other applications, scour open sources like court records and organize them through a myriad of solutions. Data organized through taxonomies and algorithms allows businesses to offer their customers usable information[11]. The advent of ML permits the conversion of information to knowledge[12]. Civilian businesses use all of these tools in their call centers to capture not only what customers are saying, but also the broader meta-conversation: what most customers are not saying, but revealing through their behavior[13].
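As a toy illustration of organizing open-source records "through taxonomy and algorithms," the following sketch scores each record against a small hand-built taxonomy of trigger terms. All records, categories, and terms are invented; production systems use far richer taxonomies and learned models rather than literal keyword matching:

```python
from collections import Counter

# Hypothetical open-source snippets (e.g., court records, news items).
docs = {
    "rec1": "land dispute over well access in district nine",
    "rec2": "market reopened after flood repairs",
    "rec3": "second land dispute filed over water rights",
}

# A tiny hand-built taxonomy: category -> trigger terms.
taxonomy = {
    "water_conflict": {"well", "water", "dispute"},
    "commerce": {"market", "trade"},
}

def categorize(text):
    """Score each category by how many of its terms appear in the text."""
    words = Counter(text.lower().split())
    scores = {cat: sum(words[t] for t in terms)
              for cat, terms in taxonomy.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "uncategorized"

# Organize raw records into an information index.
index = {rec: categorize(text) for rec, text in docs.items()}
print(index)
# {'rec1': 'water_conflict', 'rec2': 'commerce', 'rec3': 'water_conflict'}
```

Even this crude index turns a pile of unstructured records into something queryable, which is the step the article argues current CIM systems leave to manual effort.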

This leap in the application of informatics, which civilian businesses use today, is absent from today's CIM systems. The current model of CIM is not well adapted for tomorrow's battlefield, which will almost certainly be a data-rich environment fed by robotics, signals, and public information[14]. Even the largest team of humans cannot keep up with the overwhelming deluge of data, let alone conduct analysis and make recommendations to the commander on how the civilian terrain will affect the OE[15].

In civilian business, empiricism is replacing the older model of eminence-based decision-making. No longer is it acceptable to take the word of the highest-paid person in the room; business decisions need evidence, especially at multi-billion-dollar companies[16]. A KMS enables hypothesis, experimentation, and evidence. Applied to the civilian terrain, if the hypothesis is that drilling a well reduces insurgency, a global KMS will reveal the truth through metrics that cannot be influenced, a shortcoming former U.S. Secretary of State Condoleezza Rice criticized[17].

Using text preprocessing with speech analytics and NLP, the KMS would solve an OE problem of data quality: operators, when supplementing the KMS with OE reports, would use speech whenever possible. This overcomes the persistent garbage-in, garbage-out problem that plagues military and business systems alike. Rather than re-typing field notes into a form, the human operator would simply use an interactive spoken dialog for input where feasible[18].
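Text preprocessing of the kind this paragraph describes typically means normalizing, tokenizing, and filtering transcribed speech before it reaches any downstream model. A minimal sketch (the sample report and stop-word list are illustrative only, not from any real system):

```python
import re

# A deliberately tiny stop-word list; real pipelines use larger ones.
STOPWORDS = {"the", "a", "an", "of", "in", "at", "and", "to", "was"}

def preprocess(raw):
    """Normalize a transcribed field report for downstream NLP:
    lowercase, strip punctuation, tokenize, drop stop-words."""
    text = raw.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)   # strip punctuation
    tokens = text.split()
    return [t for t in tokens if t not in STOPWORDS]

# A hypothetical spoken field report, as transcribed.
report = "The well at Village A was repaired; turnout, in fact, doubled."
print(preprocess(report))
# ['well', 'village', 'repaired', 'turnout', 'fact', 'doubled']
```

Steps like these, applied uniformly at ingest, are what keep dictated reports from becoming the "garbage in" that corrupts everything downstream.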

A persistent and progressive KMS also addresses a problem of expertise. During Operation Iraqi Freedom, the U.S. State Department could not find enough experts and professionals to fill the voids in transitional governance, a problem such that then-Secretary of Defense Robert Gates volunteered to send Department of Defense civilians in their place[19]. With a KMS, commanders and policymakers can draw on a home-based cadre of experts to assess the system's data models and offer contextualized insights to commanders in the field.

As the breadth and quality of the data grow, system administrators can experiment with new algorithms and models on the data in a relentless drive to speed the delivery of OV-derived insights into operations planning. Within two years, this KMS data would be among the richest political science datasets ever compiled, inviting academia to write and test new hypothetical models. In turn, this will assist policymakers in sensing where new sources of instability are emerging before they reveal themselves in action[20].

"How do you put the genie of knowledge back in the bottle?" P. W. Singer asked rhetorically in his book Wired for War about the prospect of a robotic, data-enabled OE[21]. This genie will not conveniently return to its bottle for robotics or data; instead, commanders and policymakers will have to learn to manage the data-enabled battlefield. While it may seem a herculean task to virtually recreate OEs in a future KMS, it is a necessary one. Working through the fog of war with a candle and ceding OVs to chaos is no longer acceptable. Civilian business has already addressed this problem with next-generation knowledge management systems, which are ready for today's OE.


Endnotes:

[1] APAN Staff (n.d.) Tools. Retrieved May 9, 2019, from https://www.apan.org/(S(12adofim0n1ranvobqiyfizu))/pages/tools-communities

[2] Williams, Gregory (2016, December 2). WFX 16 tests Civil Affairs Soldiers. Retrieved May 12, 2019, from https://www.dvidshub.net/news/189856/wfx-16-tests-civil-affairs-soldiers

[3] Szilagyi and P. Wira (2018) An intelligent system for smart buildings using machine learning and semantic technologies: A hybrid data-knowledge approach, 2018 IEEE Industrial Cyber-Physical Systems (ICPS), St. Petersburg, pp. 22-24.

[4] Chief, Civil Affairs Branch et al. (2011). Joint Civil Information Management Tactical Handbook, Tampa, FL, pp. 1-3 – 2-11.

[5] Fricke, Martin (2018, June 7). Encyclopedia of Knowledge Organization: Knowledge pyramid The DIKW hierarchy. Retrieved May 19, 2019, from http://www.isko.org/cyclo/dikw

[6] Chief, Civil Affairs Branch et al. (2011). Joint Civil Information Management Tactical Handbook, Tampa, FL, pp. 5-5, 5-11.

[7] Kopsch, Thomas and Fox, Amos (2016, August 22). Embracing Complexity: Adjusting Processes to Meet the Challenges of the Contemporary Operating Environment. Retrieved May 19, 2019, from https://www.armyupress.army.mil/Journals/Military-Review/Online-Exclusive/2016-Online-Exclusive-Articles/Embracing-Complexity-Adjusting-Processes/

[8] APAN Staff (n.d.) Tools. Retrieved May 9, 2019, from https://www.apan.org/(S(12adofim0n1ranvobqiyfizu))/pages/tools-communities

[9] Neubarth, Michael (2013, June 28). Dirty Email Data Takes Its Toll. Retrieved May 20, 2019, from https://www.towerdata.com/blog/bid/116629/Dirty-Email-Data-Takes-Its-Toll

[10] Marczewski, Andrzey (2013, August 5). The Effect of Time on Decision Making. Retrieved May 20, 2019, from https://www.gamified.uk/2013/08/05/the-effect-of-time-on-decision-making/

[11] Murthy, Praveen et al. (2014, September). Big Data Taxonomy, Big Data Working Group, Cloud Security Alliance, pp. 9-29.

[12] Edwards, Gavin (2018, November 18). Machine Learning | An Introduction. Retrieved May 25, 2019, from https://towardsdatascience.com/machine-learning-an-introduction-23b84d51e6d0

[13] Gallino, Jeff (2019, May 14). Transforming the Call Center into a Competitive Advantage. Retrieved May 25, 2019, from https://www.martechadvisor.com/articles/customer-experience-2/transforming-the-call-center-into-a-competitive-advantage/

[14] Vergun, David (2018, August 21). Artificial intelligence likely to help shape future battlefield, says Army vice chief.  Retrieved May 25, 2019, from https://www.army.mil/article/210134/artificial_intelligence_likely_to_help_shape_future_battlefield_says_army_vice_chief

[15] Snibbe, Alana Conner (2006, Fall). Drowning in Data. Retrieved May 25, 2019, from https://ssir.org/articles/entry/drowning_in_data

[16] Frizzo-Barker, Julie et al. An empirical study of the rise of big data in business scholarship, International Journal of Information Management, Burnaby, Canada, pp. 403-413.

[17] Rice, Condoleezza (2011) No Higher Honor. New York, NY, Random House Publishing, pp. 506-515.

[18] Ganesan, Kavita (n.d.) All you need to know about text preprocessing for NLP and Machine Learning. Retrieved May 25, 2019, from https://www.kdnuggets.com/2019/04/text-preprocessing-nlp-machine-learning.html

[19] Gates, Robert (2014). Duty. New York, NY, Penguin Random House Publishing, pp. 347-348.

[20] Lasseter, Tom (2019, April 26). ‘Black sheep’: The mastermind of Sri Lanka’s Easter Sunday bombs. Retrieved May 25, 2019, from https://www.reuters.com/article/us-sri-lanka-blasts-mastermind-insight/black-sheep-the-mastermind-of-sri-lankas-easter-sunday-bombs-idUSKCN1S21S8

[21] Singer, Peter Warren (2009). Wired for War. The Penguin Press, New York, NY, pp. 11.

2019 - Contest: Small Wars Journal Assessment Papers Information and Intelligence Information Systems Insurgency & Counterinsurgency Ray K. Ragan

An Assessment of the Small Wars Manual as an Implementation Model for Strategic Influence in Contemporary and Future Warfare

This article is published as part of the Small Wars Journal and Divergent Options Writing Contest which runs from March 1, 2019 to May 31, 2019.  More information about the writing contest can be found here.


Bradley L. Rees is a retired United States Army Lieutenant Colonel, retiring in March 2013 as a Foreign Area Officer, 48D (South Asia).  He has served in general purpose and special operations forces within the continental United States and in numerous combat deployments to Iraq and Afghanistan.  He is a graduate of the United States Marine Corps Command and Staff College and their School of Advanced Warfighting, and the Army War College’s Defense Strategy Course.  He presently works at United States Cyber Command where he is the Deputy Chief, Future Operations, J35.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.  The opinions expressed in this assessment are those of the author, and do not represent those of the United States Government, Department of Defense, Air Force, or Cyber Command.

Title:  An Assessment of the Small Wars Manual as an Implementation Model for Strategic Influence in Contemporary and Future Warfare

Date Originally Written:  March 17, 2019.

Date Originally Published:  April 29, 2019.

Summary:  A disparity between how most within the U.S. Department of Defense (DoD) understand 20th-century information operations and 21st-century information warfare and strategic influence has produced a cognitive dissonance.  If not addressed quickly, this dichotomy will further exacerbate confusion about Information as the Seventh Joint Function[1] and long-term strategic competition[2] in and through the Information Environment[3].

Text:  The United States has ceded the informational initiative to our adversaries.  As Shakespeare wrote, "Whereof what's past is prologue."  If the DoD is to (re)gain and maintain the initiative against our adversaries, its actions are best informed by such a prologue.  An analogy exists between how, in Shakespeare's The Tempest, Antonio encourages Sebastian to kill his brother Alonso, the King of Naples, so that Sebastian may become king, and how most within the DoD think about responsive and globally integrated[4] military information and influence activities.  Antonio's attempt to convey to Sebastian that all past actions are purely contextual – an introduction or prologue – is meant to narrow Sebastian's focus on the future rather than the past[5].

The Department likely finds value in viewing that anecdote as entirely relevant when attempting to answer what it means for Information to be a Joint Function in contemporary and future warfare.  If the Department seeks to (re)gain and maintain the initiative, appreciating history is a valuable first step; while critically important from a contextual perspective, it is second only to recognizing that society today holds operational and strategic information and influence activities at a much higher premium than in years past.  With that, there is much to learn from the U.S. Marine Corps' (USMC) development of its Small Wars Manual (SWM).

Today, many may question the relevance and utility of a 1940 USMC reference publication that focuses on peacekeeping and counterinsurgency (COIN) best practices collected from the turn of the 20th century, particularly in relation to contemporary and future warfare framed by Information as a Joint Function, strategic influence operations and their nexus with technology, and long-term strategic competition.  However, the SWM is one of those rare documents that is distinct within the broader chronicles of military history, operational lessons learned, and best practices.  It is not doctrine; it is not an operational analysis of expeditionary operations, nor is it necessarily a strategy.  Its uniqueness, however, lies in how it conveys a philosophy – an underlying theory – that addresses complexity, the necessity for adaptability, and the criticality of understanding the social, psychological, and informational factors that affect conflict.  The SWM reflects how ill-defined areas of operations, open-ended operational timelines, and shifting allegiances are just as relevant today, if not more so, than relative combat power analyses and other more materially oriented planning factors have been in most of two centuries' worth of war planning.  More so, the SWM places significant weight on how behavior, emotions, and perception management are central in shaping decision-making processes.

Currently, the DoD does not have the luxury of time to develop new philosophies and theories associated with military information and influence as did the USMC regarding small wars.  Similarly, the Department cannot wait an additional 66 years to develop relevant philosophies, theories, strategies, and doctrine relating to information warfare as did the U.S. Army and the USMC when they released COIN doctrine in 2006.  The Department does, however, have within the SWM a historiographic roadmap that can facilitate the development of relevant theory relating to Information as a Joint Function and strategic influence relative to long-term strategic competition.

The DoD does not intrinsically rest the development of defense and military strategies on an overarching philosophy or theory.  However, it does link such strategies to higher-level guidance; this guidance rests on a broader, more foundational American Grand Strategy, which academia has addressed extensively[6][7][8], and on what has been termed the "American Way of War" and the broader institutional thinking behind such American ways of warfighting for more than a century[9].  Such grand strategies and ways of warfighting are best informed by deductive reasoning.  Conversely, in the absence of deductive reasoning, practitioners usually rely on induction to guide sound judgment and decisive action[10].  Despite this, a considerable dearth of DoD-wide organizational, institutional, and operational observations and experience burdens the Department's ability to fully embrace, conceptualize, and operationalize globally integrated information and influence-related operations.

While the USMC did not have a century's worth of thinking on small wars, its three decades of experience in peacekeeping and COIN served as the foundation of the SWM.  Throughout those three decades, the Marine Corps paid particular attention to the psychological and sociological aspects of the environment that impacted operations.  It realized that military action was doomed to failure if undertaken absent a well-rounded understanding of what the DoD now refers to as systems within the Operational Environment[11][12].  The SWM has an entire section dedicated to the psychological and sociological aspects that potentially motivate or cause insurrection[13].  Such considerations are just as relevant today as they were in 1940.

Today, the DoD lacks a straightforward and applicable information and influence roadmap that can be used to navigate long-term strategic competition.  The SWM provides such a navigational guide.  Studying it can provide insights into the wide variety of factors that the Marine Corps recognized as having a significant influence on the ever-changing character of the conduct of war, the relationship between a philosophy or theory and military practice, and how its understanding of small wars shaped the development of strategy and campaign planning.  The SWM can inform the DoD on how to quickly and effectively address Information as the Seventh Joint Function, strategic influence, and long-term strategic competition in contemporary and future warfare.


Endnotes:

[1] Joint Staff, Joint Publication 3-0, Operations, pp. xiii, III-1, III-17 through III-27, (Washington, D.C., United States Printing Office, October 22, 2018).

[2] Office of the Secretary of Defense, The 2018 National Defense Strategy of the United States of America, (Washington, D.C., United States Government Printing Office, January 19, 2018).

[3] Joint Staff.  Joint Publication 2-01.3, Joint Intelligence Preparation of the Operational Environment, pp. III-19 to III-26, (Washington, D.C., United States Government Printing Office, May 21, 2014).

[4] Joint Staff. (2018), Chairman’s Vision of Global Integration [Online] briefing.  Available:  www.jcs.mil\Portals\36\Documents\Doctrine\jdpc\11_global_integration15May.pptx [accessed March 17, 2019].

[5] Shakespeare, W. (1610), The Tempest, Act II, Scene 1 [Online]. Available:  https://www.folgerdigitaltexts.org/html/Tmp.html#line-2.1.0 [accessed March 16, 2019].

[6] Weigley, R. F., The American Way of War:  A History of United States Strategy and Policy, (Bloomington, Indiana, Indiana University Press, 1978).

[7] Biddle, T. M., “Strategy and Grand Strategy:  What Students and Practitioners Need to Know,” Advancing Strategic Thought Series, (Carlisle Barracks, Pennsylvania:  Strategic Studies Institute and Army War College Press, 2015).

[8] Porter, P., “Why America’s Grand Strategy has not Changed:  Power, Habit, and the U.S. Foreign Policy Establishment,” International Security, Vol. 42, No. 4 (Spring 2018), pp. 9–46, (Cambridge, Massachusetts, MIT Press, 2018).

[9] Weigley.

[10] Bradford, A. (2017), Deductive Reasoning vs. Inductive Reasoning [Online]. Available:  https://www.livescience.com/21569-deduction-vs-induction.html [accessed March 17, 2019].

[11] Joint Staff.  Joint Publication 2-01.3, Joint Intelligence Preparation of the Operational Environment, pp. III-38 to III-40, (Washington, D.C., United States Government Printing Office, May 21, 2014).

[12] Ibid, p. xi.

[13] Department of the Navy, Headquarters United States Marine Corps. Fleet Marine Force Reference Publication 12-15, Small Wars Manual, (Washington, D.C., United States Government Printing Office, 1940).

2019 - Contest: Small Wars Journal Assessment Papers Bradley L. Rees Information and Intelligence United States

#NatSecGirl Squad: The Conference Edition White Paper — The Role of the Intelligence Community in National Security and Defense

Editor’s Note:  On November 15, 2018, #NatSecGirlSquad hosted a conference in Washington D.C. at the International Institute for Strategic Studies.  Over the coming months Divergent Options, as a partner for this event, will be deviating from our traditional content and publishing a series of white papers in various formats that capture each panel at this event.


Abigail P. Gage is a U.S. Army Veteran.  She recently earned a Master of Arts from Johns Hopkins SAIS.  Previously, Abigail worked for the House Armed Services Committee and served on active duty in Iraq and Germany.  She continues to serve today in the U.S. Army National Guard. Find her on Twitter @AbigailPGage. Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Panel Title:  Creative Problem Solving: The Role of the Intelligence Community (IC) in National Security and Defense

Author and / or Article Point of View:  Abigail P. Gage is a national security policy professional with defense experience in the military and on Capitol Hill. This paper is the result of her work with the #NatSecGirlSquad Conference edition, where she designed a panel exploring modern challenges faced by the IC.

Background:  When most people imagine life inside the intelligence community, they picture either fast-paced, action-filled adventures or the tedious life of a computer-bound analyst. In reality, the IC is constantly and simultaneously shaping U.S. international relations, responding to the ever-changing diplomatic environment around the globe. There is an unprecedented opportunity to harness new technology and digital communication platforms, including social media, to solve traditional problems while also anticipating new issues before they become problems.

Exploring these issues at #NatSecGirlSquad Conference edition, Erin Simpson, Director of Strategic Analysis at Northrop Grumman, led three panelists through a discussion on the modern IC: Paula Doyle, Professor at Georgetown Security Studies and former Associate Deputy Director of Operations, Central Intelligence Agency; Kirsten Gnipp, Chief, Homeland and Prevention Planning Group, Directorate of Strategic Operational Planning, National Counterterrorism Center; Cortney Weinbaum, Management Scientist, RAND Corporation.

Significance:  The IC plays a pivotal role in shaping U.S. foreign policy, enabling diplomats and policymakers to respond to ever-changing global conditions by providing crucial information on both U.S. allies and enemies. The IC can analyze a world leader's Twitter feed to understand how she makes decisions, can use YouTube and Instagram to track a terrorist group's growth, and can even track financial transactions to understand how a rogue state is financing itself. But how do these new tools support decision-making? The key question is: "How is the IC using new technology and traditional tradecraft for creative problem-solving in the modern world?"

Issue #1: Optimization of Technology for the IC mission.

Whether it is a question of using technology to enhance the IC’s recruitment cycle, improve information processing workflows, or uncover new threats, the IC can use technology to increase efficiency and accuracy. Technology enhances the IC’s recruitment cycle by enabling operations officers abroad to pre-identify individuals prime for targeting – using big data to find foreign assets who are accessible to local case officers. Once recruited, technology also allows the IC to validate people and their information. In this way, the IC is already successfully harnessing technology to improve productivity and accuracy in their work.

On the other hand, the IC might be missing a major opportunity to create new, unclassified workflows – theoretically decreasing insider-threat risk by distributing the open-source workforce away from centralized Sensitive Compartmented Information Facilities. Given the growing length of the security clearance process, reducing the number of people who must be cleared because they work in a specific space, rather than because of the work they do, could lead to increased productivity and decreased costs for the taxpayer. It could also increase the pool of analysts – after all, does everyone want to live in the Washington DC commuting area? Despite these benefits, the IC would have significant cultural barriers to overcome to implement a distributed workforce, including reconsidering how it processes human resources information such as time-cards. More significantly, open-source intelligence work would have to be siphoned off, generating an additional silo operating in isolation from the rest of the IC.

Finally, there are the ever-present questions: How do we discover whether there is a new threat? How do we know whether there is a new organization? The IC was slow to recognize the threat posed by the emerging Afghan Taliban, which started out as a student group. Then the U.S. national security apparatus collectively misunderstood the nature of the Islamic State until it was advancing rapidly through Iraq in the spring and summer of 2014. In both cases, the organizations were savvy about their use of digital media. The latter literally announced its moves play-by-play through social media, spreading its regime of terror like the wake before a storm. If the IC is to remain a player in today's modern, digital environment, it too will have to master social media and information operations – both in following them and in delivering them.

Issue #2: Cyber Operations

The opportunity for the IC to increase its social media and digital presence leads to a second issue: managing cyber operations in national security. This challenge can take the form of overwhelming amounts of data or complicated legal limitations. Leading cyber operations during the height of the Iraq War, Paula Doyle recalled rarely, if ever, having a name to match the large data sets her team tracked – and yet the team was confident that it was tracking enemy operations. They found, over time and with increasingly refined data sets, that they could positively identify a target even without a name. The IC had to adjust culturally, to learn to manage without names. Historically, a target was identified by name, home address, and work address. Now it was the opposite: a target was no longer a name – the team might never know “who” it was – but an email address or cell phone to target and track.

This increasing power of cyber operations is not without limits or risks. Boundaries do not exist on the worldwide internet as they do on the geographic globe. Most intelligence agencies, with rare exception, are barred from domestic intelligence work. When it comes to running sources and traditional, physical collection techniques, these guidelines work well; after all, most were created in a pre-cyber era. But when faced with technology and national security efforts domestically, members of the IC have to weigh ethical and legal questions, ensuring they respect national borders on a platform that has no clear boundaries.

The biggest limitation could be considered consent. For example, Virginia uses drive-by boxes to test vehicle emissions. This is a form of government surveillance, but people consent because it is easier than going for an inspection at a pre-designated location. As government surveillance expands, some experts worry about a slippery slope, envisioning a world where the psychological thriller Minority Report could become a reality. Digital data companies are already building comprehensive online pictures of each internet user: who they are, what they want. We already give away much of our privacy to private sector giants like Amazon and Google. How much of our privacy will we eventually grant to the government?

Furthermore, if we can already surmise, through social media, emails, or even Google searches, what an individual is thinking about doing, how long will it be before the IC, law enforcement, or Google figures out how to predict what people will do? Then a new question arises: can the government ethically pull information on individuals to predict and prevent violence? Historically this is a nationwide Catch-22: when the government is known to invade our privacy, Americans demand a rollback of intelligence programs, but if the government fails to uncover and prevent an attack, we demand answers to the perceived intelligence failure.


An Assessment of Information Warfare as a Cybersecurity Issue

Justin Sherman is a sophomore at Duke University double-majoring in Computer Science and Political Science, focused on cybersecurity, cyberwarfare, and cyber governance. Justin conducts technical security research through Duke’s Computer Science Department; he conducts technology policy research through Duke’s Sanford School of Public Policy; and he’s a Cyber Researcher at a Department of Defense-backed, industry-intelligence-academia group at North Carolina State University focused on cyber and national security – through which he works with the U.S. defense and intelligence communities on issues of cybersecurity, cyber policy, and national cyber strategy. Justin is also a regular contributor to numerous industry blogs and policy journals.

Anastasios Arampatzis is a retired Hellenic Air Force officer with over 20 years’ worth of experience in cybersecurity and IT project management. During his service in the Armed Forces, Anastasios was assigned to various key positions in national, NATO, and EU headquarters, and he’s been honored by numerous high-ranking officers for his expertise and professionalism, including a nomination as a certified NATO evaluator for information security. Anastasios currently works as an informatics instructor at AKMI Educational Institute, where his interests include exploring the human side of cybersecurity – psychology, public education, organizational training programs, and the effects of cultural, cognitive, and heuristic biases.

Paul Cobaugh is the Vice President of Narrative Strategies, a coalition of scholars and military professionals involved in the non-kinetic aspects of counter-terrorism, defeating violent extremism, irregular warfare, large-scale conflict mediation, and peace-building. Paul recently retired from a distinguished career in U.S. Special Operations Command, and his specialties include campaigns of influence and engagement with indigenous populations.

Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  An Assessment of Information Warfare as a Cybersecurity Issue

Date Originally Written:  March 2, 2018.

Date Originally Published:  June 18, 2018.

Summary:  Information warfare is not new, but the evolution of cheap, accessible, and scalable cyber technologies greatly enables it.  The U.S. Department of Justice’s February 2018 indictment of the Internet Research Agency – one of the Russian groups behind disinformation in the 2016 American election – establishes that information warfare is not just a global problem from the national security and fact-checking perspectives, but a cybersecurity issue as well.

Text:  On February 16, 2018, U.S. Department of Justice Special Counsel Robert Mueller indicted 13 Russians for interfering in the 2016 United States presidential election [1]. Beyond the important legal and political ramifications of this event, this indictment should make one thing clear: information warfare is a cybersecurity issue.

It shouldn’t be surprising that Russia created fake social media profiles to spread disinformation on sites like Facebook.  This tactic had been demonstrated for some time, and the Russians have used it in numerous other countries as well[2].  Instead, what’s noteworthy about the investigation’s findings is that Russian hackers also stole the identities of real American citizens to spread disinformation[3].  Whether the Russian hackers compromised accounts through technical hacking, social engineering, or other means, this technique proved remarkably effective; masquerading as American citizens lent significantly greater credibility to trolls (who purposely sow discord on the Internet) and bots (automated information-spreaders) that pushed Russian narratives.

Information warfare has traditionally been viewed as an issue of fact-checking or information filtering, which it certainly still is today.  Nonetheless, traditional information warfare was conducted before the advent of modern cyber technologies, which have greatly changed the ways in which information campaigns are executed.  Whereas historical campaigns took time to spread information and did so through in-person speeches or printed news articles, social media enables instantaneous, low-cost, and scalable access to the world’s populations, as does the simplicity of online blogging and information forgery (e.g., using software to manufacture false images).  Those looking to wage information warfare can do so with relative ease in today’s digital world.

The effectiveness of modern information warfare, then, is heavily dependent upon the security of these technologies and platforms – or, in many cases, the total lack thereof.  In this case, the success of the Russian hackers was propelled by the average U.S. citizen’s ignorance of basic cyber “hygiene” rules, such as strong password creation.  Had cybersecurity mechanisms kept these hackers out, Russian “agents of influence” would have gained access to far fewer legitimate social media profiles – making their overall campaign significantly less effective.
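The “hygiene” failure described above is measurable: platforms commonly screen new passwords against breached-password lists and rough entropy estimates. A minimal sketch of such a check follows; the five-entry wordlist and the 60-bit threshold are illustrative assumptions for this sketch, not figures from the article or from any particular platform.

```python
import math
import re

# Tiny stand-in for a real breached-password corpus (illustrative only).
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "iloveyou"}

def estimate_entropy_bits(password: str) -> float:
    """Crude upper-bound entropy: length * log2(character-pool size)."""
    pool = 0
    if re.search(r"[a-z]", password):
        pool += 26
    if re.search(r"[A-Z]", password):
        pool += 26
    if re.search(r"[0-9]", password):
        pool += 10
    if re.search(r"[^a-zA-Z0-9]", password):
        pool += 32  # rough count of printable symbols
    return len(password) * math.log2(pool) if pool else 0.0

def is_weak(password: str, min_bits: float = 60.0) -> bool:
    """Reject known-breached passwords and low-entropy strings."""
    return password.lower() in COMMON_PASSWORDS or estimate_entropy_bits(password) < min_bits
```

A real deployment would check against a full breached-credential corpus rather than a five-entry set; the point is that the same weak credentials that enable routine account takeover also enable the identity theft the indictment describes.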

To be clear, this is not to blame the campaign’s effectiveness on specific end users; with over 160,000 Facebook accounts hacked every single day, it wouldn’t be difficult for any other country to use this same technique[4].  However, it’s important to understand the relevance of cybersecurity here.  User access control, strong passwords, mandated multi-factor authentication, fraud detection, and identity theft prevention are just some of the cybersecurity best practices that failed to combat Russian disinformation, just as much as fact-checking mechanisms or counter-narrative strategies did.

These technical and behavioral failures didn’t just compromise the integrity of information, a pillar of cybersecurity; they also made the campaign considerably more effective.  As the hackers planned to exploit the polarized election environment, access to American profiles made this far easier: by manipulating and distorting information to make it seem legitimate (i.e., opinions coming from actual Americans), these Russians undermined law enforcement operations, election processes, and more.  We are quick to ask: how much of this information was correct and how much of it wasn’t?  Who can tell whether the information originated from un-compromised, credible sources or from credible sources that had actually been hacked?

However, we should also consider another angle: what if the hackers hadn’t gained access to those American profiles in the first place?  What if the hackers had been forced to rely almost entirely on fraudulent accounts, which are prone to detection by Facebook’s algorithms?  It is for these reasons that information warfare is so critical for cybersecurity, and why Russian information warfare campaigns of the past cannot be directly compared to the digital information wars of the modern era.

The global cybersecurity community can take an even greater, active role in addressing the account access component of disinformation.  Additionally, those working on information warfare and other narrative strategies could leverage cybersecurity for defensive operations.  Without a coordinated and integrated effort between these two sectors of the cyber and security communities, the inability to effectively combat disinformation will only continue as false information penetrates our social media feeds, news cycles, and overall public discourse.

More than ever, a demand signal is present to educate the world’s citizens on cyber risks and basic cyber “hygiene,” and to even mandate the use of multi-factor authentication, encrypted Internet connections, and other critical security features.  The security of social media and other mass-content-sharing platforms has become an information warfare issue, both within respective countries and across the planet as a whole.  When rhetoric and narrative can spread (or at least appear to spread) from within, the effectiveness of a campaign is amplified.  The cybersecurity angle of information warfare, in addition to the misinformation, disinformation, and rhetoric itself, will remain integral to effectively combating the propaganda and narrative campaigns of the modern age.
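The multi-factor authentication advocated above most often takes the form of time-based one-time passwords (RFC 6238), which can be computed with nothing beyond a shared secret and a clock. Below is a minimal standard-library sketch; the parameters follow the RFC’s defaults, and the secret shown is the RFC’s published test key, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, t=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: key "12345678901234567890", T = 59 s -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))
```

Because the code changes every 30 seconds, a stolen password alone no longer suffices to hijack an account – closing exactly the account-takeover path the campaign exploited.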


Endnotes:

[1] United States of America v. Internet Research Agency LLC, Case 1:18-cr-00032-DLF. Retrieved from https://www.justice.gov/file/1035477/download

[2] Wintour, P. (2017, September 5). West Failing to Tackle Russian Hacking and Fake News, Says Latvia. Retrieved from https://www.theguardian.com/world/2017/sep/05/west-failing-to-tackle-russian-hacking-and-fake-news-says-latvia

[3] Greenberg, A. (2018, February 16). Russian Trolls Stole Real US Identities to Hide in Plain Sight. Retrieved from https://www.wired.com/story/russian-trolls-identity-theft-mueller-indictment/

[4] Callahan, M. (2015, March 1). Big Brother 2.0: 160,000 Facebook Pages are Hacked a Day. Retrieved from https://nypost.com/2015/03/01/big-brother-2-0-160000-facebook-pages-are-hacked-a-day/


Assessing Al Suri’s Individual Terrorism Jihadist Against Lone Wolves

Cory Newton served as a Machinegunner in the United States Marine Corps from 1996-2000 and earned a B.S. in Philosophy, Politics, and Economics from Eastern Oregon University in 2012.  Cory authored Constitutional Capitalism and Common Defense in 2014 and can be found on Twitter @corynewton78 or on the web at www.corynewton.com.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


Title:  Assessing Al Suri’s Individual Terrorism Jihadist Against Lone Wolves

Date Originally Written:  December 11, 2017.

Date Originally Published:  February 19, 2018.

Summary:  Terrorism is a tactic and often results in dead or wounded civilians.  Both individual terrorism jihadists and lone wolves use this tactic.  Despite this tactic producing similar results by whoever uses it, there is a distinct difference between individual terrorism jihadists and lone wolves.  Until governments understand and accept this difference, data related to attacks that use terrorism tactics will be skewed.

Text:  The Global Islamic Resistance Call was published by Abu Mus’ab al-Suri in January 2005[1].  The military theory of the Resistance Call is based on applying two forms of jihad.  The first form is individual terrorism jihad and secret operational activity of small units totally separated from each other.  The second form is participation in jihad at the open fronts wherever the necessary preconditions exist.  The individual terrorism jihadist differs from an open front jihadist in that the individual jihadist is unable to make it to the open front.  The individual terrorism jihadist also differs from the small cell jihadist in that their actions are truly independent.  Individual terrorism jihad was specifically designed to maximize feelings of helplessness of the targeted population by unleashing the innovation, initiative, and creativity inherent in a decentralized structure.

Individual terrorism jihad enables anyone, anywhere, at any time to wage jihad using terrorism without formally being affiliated with a terrorist organization.  All the individual terrorism jihadist must do is be properly motivated to take action in the name of jihad, identify a weakness or vulnerability, and apply force to exploit it.  Although the attacker does not have any direct ties to a terrorist organization, the attacker has rationally chosen to wage jihad using terrorism in a manner which they expect the attack to produce more benefits than costs.

There is a clear distinction between participation in what Al-Suri identified as individual terrorism jihad and lone wolf violent extremists who use terrorist tactics in the name of their cause.

Suppose a person who is inspired by, but not directly affiliated with, any one of the 917 hate groups in the United States identified by the Southern Poverty Law Center (SPLC)[2] carries out a lone wolf terrorist attack.  Despite the violent extremist’s non-affiliation with an SPLC-identified hate group, the attack will likely be investigated as an act of terror.

On the other hand, suppose a marginalized person is seduced by an Islamist organization outside of the mainstream.  The person lacks affiliation with a terrorist organization but possesses “a resolute, personal decision to perform the individual duty of jihad[1]” which motivates them to conduct an active shooting, knife attack, or vehicular ramming assault in which they verbalize their intentions with an Allahu Akbar war cry.  Despite the attacker’s non-affiliation with a terrorist organization, the attack will likely be investigated as an act of terror.

One difference between the two acts of terror described above is that the former is carried out by a lone wolf using terrorism to wage war on a local scale, while the latter is performed by an individual terrorism jihadist locally waging war on a global scale.  The lone wolf who carries out a terrorist attack does not belong to a decentralized military theory of global Islamist resistance, as the individual terrorism jihadist does.  Individual terrorism jihad is similar to an independent franchise.  A lone wolf attack is independent, but usually does not occur within the context of a global resistance movement.

The individual terrorism jihadist and the lone wolf are two different threats.  As terroristic violence that specifically originates from the concept of individual terrorism jihad differs from terroristic violence that originates from the lone wolf, consideration should be given to classifying each differently in order to measure the frequency and severity of individual terrorism jihadist attacks.  If the frequency and severity of terrorist attacks by lone wolves is measured separately, terrorism data will be more accurate.  Both types of terrorist attacks will often have identical consequences.  The carnage wrought by an individual terrorism jihadist may very well be indistinguishable from the carnage wrought by a lone wolf white nationalist or lone wolf ecological extremist.  One is the result of global jihad attacking locally.  The other is a localized attack seeking national media attention.

As individual terrorism jihad and lone wolf attacks continue to increase, it is important to properly identify and categorize each.  Theodore Kaczynski is the best example of a lone wolf who waged war using terrorism.  The threat posed by a person in that category is significantly different from an individual jihadist locally attacking a variety of soft targets using rifles, blades, explosives, or vehicles in the context of a global resistance movement.

Both individual terrorism jihad attacks and lone wolf attacks will continue to increase and evolve.  To combat these attacks in the future, it is best if government officials understand whether the terrorist actions are part of a global resistance movement or based on a personal or localized motivation.  In the case of individual terrorism jihad, these attacks will continue until the costs far exceed the benefits.  The U.S. is very effective at determining the amount of force necessary to destroy enemy personnel and equipment.  Unfortunately, the U.S. still has a long way to go in determining the fine line between the amount of force necessary to destroy the enemy’s will to fight and the amount of force that will galvanize the enemy’s will to resist.


Endnotes:

[1] Lia, Brynjar (2008) Columbia University Press, Architect of Global Jihad, The Global Islamic Resistance Call (Key Excerpts), Military Theory of The Global Islamic Resistance Call, Page 371

[2] Southern Poverty Law Center Hate Map. (n.d.). Retrieved December 13, 2017, from https://www.splcenter.org/hate-map


Evolution of U.S. Cyber Operations and Information Warfare

Brett Wessley is an officer in the U.S. Navy, currently assigned to U.S. Pacific Command.   The contents of this paper reflect his own personal views and are not necessarily endorsed by U.S. Pacific Command, Department of the Navy or Department of Defense.  Connect with him on Twitter @Brett_Wessley.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.  


National Security Situation:  Evolving role of cyber operations and information warfare in military operational planning.

Date Originally Written:  April 19, 2017.

Date Originally Published:  May 25, 2017.

Author and / or Article Point of View:  This article is intended to present options to senior level Department of Defense planners involved with Unified Command Plan 2017.

Background:  Information Warfare (IW) has increasingly gained prominence throughout defense circles, with both allied and adversarial militaries reforming and reorganizing IW doctrine across their force structures.  Although not doctrinally defined by the U.S. Department of Defense (DoD), IW has been embraced with varying degrees by the individual branches of the U.S. armed forces[1].  For the purposes of this paper, the definition of IW is: the means of creating non-kinetic effects in the battlespace that disrupt, degrade, corrupt, or influence the ability of adversaries or potential adversaries to conduct military operations while protecting our own.

Significance:  IW has been embraced by U.S. near-peer adversaries as a means of asymmetrically attacking U.S. military superiority.  Russian Defense Minister Sergei Shoigu recently acknowledged the existence of “information warfare troops,” who conduct military exercises and real-world operations in Ukraine demonstrating the fusion of intelligence, offensive cyber operations, and information operations (IO)[2].  The People’s Republic of China has also reorganized its armed forces to operationalize IW, with the newly created People’s Liberation Army Strategic Support Force drawing from existing units to combine intelligence, cyber, electronic warfare (EW), IO, and space forces into a single command[3].

Modern militaries increasingly depend on sophisticated systems for command and control (C2), communications and intelligence.  Information-related vulnerabilities have the potential for creating non-kinetic operational effects, often as effective as kinetic fires options.  According to U.S. Army Major General Stephen Fogarty, “Russian activities in Ukraine…really are a case study for the potential for CEMA, cyber-electromagnetic activities…It’s not just cyber, it’s not just electronic warfare, it’s not just intelligence, but it’s really effective integration of all these capabilities with kinetic measures to actually create the effect that their commanders [want] to achieve[4].”  Without matching the efforts of adversaries to operationalize IW, U.S. military operations risk vulnerability to enemy IW operations.

Option #1:  United States Cyber Command (USCYBERCOM) will oversee Military Department efforts to man, train, and equip IW and IW-related forces to be used to execute military operations under Combatant Command (CCMD) authority.  Additionally, USCYBERCOM will synchronize IW planning and coordinate IW operations across the CCMDs, as well as execute some IW operations under its own authority.

Risk:  USCYBERCOM, a still relatively new sub-unified command under United States Strategic Command (USSTRATCOM), has limited experience coordinating intelligence, EW, space, and IO capabilities within coherent IW operations.  USSTRATCOM is tasked with responsibility for DoD-wide space operations, and the Geographic Combatant Commands (GCCs) are tasked with intelligence, EW, and IO operational responsibility[5][6][7].  Until USCYBERCOM gains experience supporting GCCs with full-spectrum IW operations, previously GCC-controlled IO and EW operations will operate at elevated risk relative to similar support provided by USSTRATCOM.

Gain:  USCYBERCOM overseeing Military Department efforts to man, train, and equip IW and IW-related forces will ensure that all elements of successful non-kinetic military effects are ready to be imposed on the battlefield.  Operational control of IW forces will remain with the GCC, but USCYBERCOM will organize, develop, and plan support during crisis and war.  Much as United States Special Operations Command’s (USSOCOM) creation as a unified command consolidated core special operations activities and tasked USSOCOM to organize, train, and equip special operations forces, a fully optimized USCYBERCOM would do the same for IW-related forces.

This option is a similar construct to the Theater Special Operations Commands (TSOCs) which ensure GCCs are fully supported during execution of operational plans.  Similar to TSOCs, Theater Cyber Commands could be established to integrate with GCCs and support both contingency planning and operations, replacing the current Joint Cyber Centers (JCCs) that coordinate current cyber forces controlled by USCYBERCOM and its service components[8].

Streamlined C2 and co-location of IW and IW-related forces would have a force multiplying effect when executing non-kinetic effects during peacetime, crisis and conflict.  Instead of cyber, intelligence, EW, IO, and space forces separately planning and coordinating their stove-piped capabilities, they would plan and operate as an integrated unit.

Option #2:  Task GCCs with operational responsibility over aligned cyber forces, and integrate them with current IW-related planning and operations.

Risk:  GCCs lack the institutional cyber-related knowledge and expertise that USCYBERCOM maintains, largely gained by Commander, USCYBERCOM traditionally being dual-hatted as Director of the National Security Agency (NSA).  While it is plausible that in the future USCYBERCOM could develop equivalent cyber-related tools and expertise of NSA, it is much less likely that GCC responsibility for cyber forces could sustain this relationship with NSA and other Non-Defense Federal Departments and Agencies (NDFDA) that conduct cyber operations.

Gain:  GCCs are responsible for theater operational and contingency planning, and would be best suited for tailoring IW-related effects to military plans.  During all phases of military operations, the GCC would C2 IW operations, leveraging the full spectrum of IW to both prepare the operational environment and execute operations in conflict.  While the GCCs would be supported by USSTRATCOM/USCYBERCOM, in addition to the NDFDAs, formally assigning Cyber Mission Teams (CMTs) as the Joint Force Cyber Component (JFCC) to the GCC would enable the Commander to influence the manning, training, and equipping of forces relevant to the threats posed by their unique theater.

GCCs are already responsible for theater intelligence collection and IO, and removing administrative barriers to integrating cyber-related effects would improve the IW capabilities in theater.  Although CMTs currently support GCCs and their theater campaign and operational plans, targeting effects are coordinated instead of tasked[9].  Integration of the CMTs as a fully operational JFCC would more efficiently synchronize non-kinetic effects throughout the targeting cycle.

Other Comments:  The current disjointed nature of DoD IW planning and operations prevents the full impact of non-kinetic effects from being realized.  While cyber, intelligence, EW, IO, and space operations are carried out by well-trained and equipped forces, these planning efforts remain stove-piped within their respective forces.  Until these operations are fully integrated, IW will remain a strength for adversaries who have organized their forces to exploit this military asymmetry.

Recommendation:  None.


Endnotes:

[1]  Richard Mosier, “NAVY INFORMATION WARFARE — WHAT IS IT?,” Center for International Maritime Security, September 13, 2016. http://cimsec.org/navy-information-warfare/27542

[2]  Vladimir Isachenkov, “Russia military acknowledges new branch: info warfare troops,” The Associated Press, February 22, 2017. http://bigstory.ap.org/article/8b7532462dd0495d9f756c9ae7d2ff3c/russian-military-continues-massive-upgrade

[3]  John Costello, “The Strategic Support Force: China’s Information Warfare Service,” The Jamestown Foundation, February 8, 2016. https://jamestown.org/program/the-strategic-support-force-chinas-information-warfare-service/#.V6AOI5MrKRv

[4]  Keir Giles, “The Next Phase of Russian Information Warfare,” The NATO STRATCOM Center of Excellence, accessed April 20, 2017. http://www.stratcomcoe.org/next-phase-russian-information-warfare-keir-giles

[5]  U.S. Joint Chiefs of Staff, “Joint Publication 2-0: Joint Intelligence”, October 22, 2013, Chapter III: Intelligence Organizations and Responsibilities, III-7-10.

[6]  U.S. Joint Chiefs of Staff, “Joint Publication 3-13: Information Operations”, November 20, 2014, Chapter III: Authorities, Responsibilities, and Legal Considerations, III-2; Chapter IV: Integrating Information-Related Capabilities into the Joint Operations Planning Process, IV-1-5.

[7]  U.S. Joint Chiefs of Staff, “Joint Publication 3-12 (R): Cyberspace Operations”, February 5, 2013, Chapter III: Authorities, Roles, and Responsibilities, III-4-7.

[8]  Ibid.

[9]  U.S. Cyber Command News Release, “All Cyber Mission Force Teams Achieve Initial Operating Capability,” U.S. Department of Defense, October 24, 2016.  https://www.defense.gov/News/Article/Article/984663/all-cyber-mission-force-teams-achieve-initial-operating-capability/


Options for United States Intelligence Community Analytical Tools

Marvin Ebrahimi has served as an intelligence analyst.  Divergent Options’ content does not contain information of an official nature nor does the content represent the official position of any government, any organization, or any group.


National Security Situation:  Centralization of United States Intelligence Community (USIC) Analytical Tools.

Date Originally Written:  March 6, 2017.

Date Originally Published:  May 1, 2017.

Author and / or Article Point of View:  The author has served as an all-source analyst.  The author has personal experience with the countless tool suites, programs, and platforms used by intelligence analysts within the USIC to perform their analytical function, and with the absence of a centralized collaborative environment from which such tools can be learned, evaluated, or selected for unit-specific mission sets tailored to the end-user.

Background:  Various USIC agencies and components have access to countless tool suites, some of which are unique to that agency and some of which are available across the community.  These tools vary according to intelligence function, and are available across the multiple information systems used by intelligence components within the community[1].  While a baseline of tools is available to all, there is little centralization of these tools.  This lack of centralization requires analysts to learn and retain knowledge of how to manipulate the information systems and search engines at their disposal in order to find specific tools required for specific analytical functions.

Significance:  Digital collocation or compilation of analytical tool suites, programs, and platforms in a collaborative space would benefit all-source analysts who require access to a diverse set of tools required of their broadly focused function.  The knowledge and ability required to conduct tool-specific searches, i.e. manipulating a basic search engine in order to navigate to a landing page or agency-specific site where tool-specific information can hopefully be found, is time-consuming and detracts from analysts’ time available to conduct and employ analytical tradecraft.  This loss of time from an analyst’s daily schedule creates an opportunity cost that has yet to be realized.

Option #1:  Centralize analytical training, visibility, and accessibility to tool suites, programs, and platforms through creation of a USIC-wide collaborative analytical environment such as the Director of National Intelligence’s Intelligence Community IT Enterprise initiative[2].  Centralization of analytical training is used here to describe the necessity to train analysts on how to manipulate various databases, search engines, and environments effectively to find the specific tool suite or program required for their function.  Option #1 does not refer to centralizing all USIC member analytical training as outlined in the Intelligence Community Directive 203, “Analytical Standards.”

Risk:  Centralizing analytical training and accessibility conflicts primarily with respective USIC members' unique identities and analytical functions.  The tools used by one agency may not work for, or be required by, another.  Agencies perform different functions, and centralizing access to tools requires further integration of all USIC agencies into the same, or at least a compatible, digital space.

Creation of a USIC-wide entity to collate, evaluate, and manage access to the multitude of tool suites creates yet another potentially bureaucratic element in an already robust community.  Such an entity would have to be controlled by the Office of the Director of National Intelligence unless delegated to a subordinate element.

Compartmentalization required to protect agencies’ sources and methods, analytical tradecraft included, may preclude community-wide collaborative efforts and information sharing necessary to create an effective space.

Gain:  Shared information is a shared reality.  Creation of a community-wide space to collocate analytical tool suites enables and facilitates collaboration, information exchange, and innovation across agencies.  These are positive factors in an information-driven age where rapid exchange of information is expected.

Effectiveness of end-user analysts increases with greater access to collective knowledge, skills, and abilities used by other professionals who perform similar specific analytical functions.

Community-wide efforts to improve efficiency in mission execution, development of analytical tradecraft, and associated tools or programs can be captured and codified for mass implementation or dissemination, thereby improving the overall capability of the USIC analytical workforce.

Option #2:  Improve analyst education and training to increase the knowledge, skills, and abilities needed to better manipulate existing tool suites, programs, and platforms.

Risk:  USIC components continue to provide function-specific training to their own analysts, while partner agencies do not benefit from agency-specific tools (due to inaccessibility or ignorance of the tools), thereby decreasing the analytical effectiveness of the community as a whole.  Worded differently, community-wide access to the full range of available analytical tools does not reach all levels or entities.

Analysts rely on unstructured or personal means, such as informal on-the-job training and personal networks, to learn current tools, while possibly remaining unaware of, or untrained on, other analytical tools at their disposal elsewhere in the community.  Analysts rely on ingenuity and resolve to accomplish tasks but are less efficient and effective than they could be.

Gain:  Analytical tradecraft unique to specific community members remains protected and inaccessible to analysts who do not possess the required access, thereby preserving developed tradecraft.

Other Comments:  The author does not possess expert knowledge of collaborative spaces or the digital infrastructure required to create such an information environment; however, the author does have deployment experience as an end-user of several tool suites and programs used to perform specific analytical functions.  The majority of these had proliferated across only a small portion of USIC components and were not widely accessible or discoverable beyond basic search engine functions.

Recommendation:  None.


Endnotes:

[1] Treverton, G. F. (2016). New Tools for Collaboration. Retrieved March 2016, from https://csis-prod.s3.amazonaws.com/s3fs-public/legacy_files/files/publication/160111_Treverton_NewTools_Web.pdf

[2] Office of the Director of National Intelligence IC IT Enterprise Fact Sheet. Retrieved March 2017, from https://www.dni.gov/files/documents/IC%20ITE%20Fact%20Sheet.pdf

Information and Intelligence Information Systems Marvin Ebrahimi Option Papers United States