BECO: Behavioral Economics of Cyberspace Operations


Posted: February 9, 2016 | By: Victoria Fineberg

BECO Mitigation

Biased decisions are frequently made when individuals are in a “hot” state, i.e., their reflexive thinking dominates their logical thinking (Ariely, 2009, pp. 120-121). This paper proposes a structured mitigation approach for preparing friendly-side cyberactors for potential hot states as depicted in Figure 9.

Figure 9. The BECO mitigation framework.

The mitigation framework covers a range of approaches, starting with hot-state avoidance (1), proceeding to switching to a cold state at different parts of the process (2 and 3), and then moving to various approaches for preparing for and managing the hot state itself (4 through 8). This framework formalizes recommendations found in discrete sources as follows:

  1. Avoid some hot states altogether (Ariely, 2009, pp. 130-131), because once these states are entered, resisting temptation becomes extremely difficult. A BECO example of such hot-state avoidance is blocking access to Internet pornography sites.
  2. When a choice is difficult to make because all options have approximately the same utility even as the details vary, choose one option and stick to it instead of prolonging the analysis and becoming paralyzed by choice (Ariely, 2009, pp. 194-196). In BECO, this corresponds to choosing certain critical operational responses in advance.
  3. Powerful technical and process controls must be defined to activate cyberactors’ cold state at the point where they are likely to make critical mistakes, thus satisfying Kahneman’s wish to have “a warning bell that rings loudly whenever we are about to make a serious error” (2011, p. 417). An existing example of such a control is an operating system that asks users to confirm that they want to delete a file. In BECO, Tactics, Techniques, and Procedures (TTPs) must be defined for anticipated critical decision points, forcing cyber warriors to invoke their System 2 thinking.
  4. Nudges (Thaler &amp; Sunstein, 2009) and defaults are used to suggest a preferred option without forcing it. In BECO, this approach is more applicable to cyber users, who are free to choose, than to warfighters, who can be compelled to act in certain ways by their organizations.
  5. Taleb (2010) urges us to consider Black Swan events not as exceptions that are explained a posteriori, but as a class of low-probability, high-impact events that cannot be individually predicted. The best preparation for potential Black Swans is to cultivate antifragility (Taleb, 2012), which protects an entity from a broad range of calamities. A current example of such preparation is Continuity of Operations Planning (COOP), which Fineberg (2012) recommends enhancing with random stress testing. Stress testing for developing antifragility should also be incorporated into a variety of BECO scenarios, e.g., cyber flag exercises (Alexander, 2012, p. 14).
  6. Behavioral economists warn that knowledge of cognitive biases does not prevent people from committing them. Kahneman admits that his intuitive thinking is just as prone to overconfidence and other System 1 manifestations as it was before he started studying these issues. However, he has improved his ability to recognize situations in which errors are likely and, once he recognizes them, to slow down and invoke System 2 (2011, p. 417). It is also easier to recognize the errors of others than one’s own, because “observers are less cognitively busy and more open to information than actors.” Kahneman recommends having water-cooler discussions to take advantage of group influences. BECO training should include developing the recognition of error-prone situations, and BECO CONOPS should include activities that activate group influences.
  7. Conditioning for specific situations prepares people to take the correct action when the situation arises, as practiced, for example, by psychiatrists in the behavioral therapy of Exposure and Response Prevention (ERP) for treating conditions such as panic. ERP practitioners select the appropriate frequency, duration, rate of build-up, and escape prevention to achieve high levels of effectiveness. Likewise, certain combat situations require a single instantaneous decisive action. For physical combat, Grossman and Christensen recommend operant conditioning, i.e., realistic training until a warrior performs the required actions automatically without thinking, because “whatever you rehearse is what you do under stress” (2007, p. 47). For example, practice shooting at moving targets shaped as human silhouettes increased the front-line firing rate from 15-20 percent in World War II to 90 percent during the Vietnam War. The BECO counterpart of such conditioning is General Alexander’s request for a single standard for taking action (2012, p. 14).
  8. People can be prepared for decision making through a process called “priming,” which is widely used in cognitive psychology experiments to affect the choices people make in a hot state. For example, Ariely (2009, p. 283) shows how reciting the Ten Commandments prior to exams significantly reduced student cheating. Likewise, in BECO, cyber warfighters can be primed with reminders of their honor code.
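Approach 3 above, a control that interrupts reflexive action at a predefined critical decision point, can be sketched in code. This is a minimal illustration, not an actual TTP; the action names and the `confirm` callback are hypothetical stand-ins for whatever second-decision mechanism (typed confirmation phrase, supervisor sign-off) a real TTP would mandate.

```python
# Hypothetical "cold-state" gate: critical actions cannot proceed on a
# reflexive (System 1) impulse alone; they require an explicit, separate
# confirmation step that forces deliberate (System 2) engagement.
CRITICAL_ACTIONS = {"delete_logs", "disable_ids", "release_exploit"}

def execute(action, confirm):
    """Run `action`; actions at critical decision points demand confirmation.

    `confirm` is a callable standing in for the TTP-mandated check.
    """
    if action in CRITICAL_ACTIONS:
        if not confirm(action):  # the "warning bell": stop and think
            return "aborted"
    return "executed"

# Even an actor in a hot state gets a structured pause at the critical point:
print(execute("delete_logs", confirm=lambda a: False))  # → aborted
print(execute("scan_subnet", confirm=lambda a: False))  # → executed
```

The design point is that the interruption is structural, built into the tooling at anticipated decision points, rather than relying on the operator's self-awareness in the moment.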

While cognitive biases have been extensively identified and thoroughly studied, their mitigation is challenging. A critical issue is that a mitigation may work in laboratory experiments but not in real-life scenarios. Another problem is that any given mitigation may work in the short term but wear off with repetition. Nevertheless, the principal reason for identifying cognitive biases in BECO is the potential to develop effective responses. To facilitate mitigation research, a full-scope cyber force such as USCYBERCOM can use its defense forces to test its attackers and its attack forces to test its defenders, as depicted in Figure 10.


Figure 10. Bias testing architectures.

The top part of Figure 10 illustrates how Red Team (RT) probing of vulnerabilities of the friendly-side defenders can be used to strengthen friendly defenders (fD) and weaken adversary defenders (aD). For example, RTs may discover that defenders get accustomed to false alarms and start neglecting them. To mitigate this tendency with fDs, new TTPs will be implemented to vary the strength and appearance of the alarms using psychological techniques of irregular reinforcement. To exploit this tendency with aDs, friendly attackers (fA) will stage multiple false attacks before launching the actual attack.
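The irregular-reinforcement idea can be sketched as code: instead of presenting every alarm identically (which invites habituation), the presentation is randomized per event. This is an illustrative sketch only; the channels, jitter range, and escalation probability are hypothetical parameters, not values from the paper.

```python
import random

# Hypothetical presentation channels for an alarm.
CHANNELS = ["banner", "audio", "email", "dashboard"]

def present_alarm(severity, rng=random):
    """Return a randomized presentation for one alarm event.

    Varying channel and intensity on an irregular schedule denies
    defenders a fixed pattern they could learn to ignore.
    """
    return {
        "channel": rng.choice(CHANNELS),
        # jitter the displayed intensity so identical events look different
        "intensity": min(10, max(1, severity + rng.randint(-2, 2))),
        # occasionally escalate a routine alarm to probe for neglect
        "escalated": rng.random() < 0.1,
    }

rng = random.Random(42)
for _ in range(3):
    print(present_alarm(severity=5, rng=rng))
```

This mirrors variable-ratio reinforcement schedules from behavioral psychology, where unpredictable presentation sustains attention far longer than a fixed pattern does.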

The bottom part of Figure 10 illustrates how cognitive biases revealed by RTs can be used to strengthen friendly attackers (fA) and weaken adversary attackers (aA). For example, an attacker may be affected by the paradox of choice, i.e., becoming paralyzed with indecision when confronted with too many options (Iyengar &amp; Lepper, 2000). To exploit it, fDs can present aAs with many enticing choices. To mitigate it, fAs can be required to follow strict decision-making processes for selecting their targets and to abandon exploits upon reaching certain thresholds, thus eliminating the perils of choice.
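The strict decision process described above can be sketched as a pair of pre-committed rules, one for target selection and one for abandonment, so that no deliberation loop is left open in the hot state. The scores, threshold values, and attempt limit below are hypothetical illustrations, not doctrine.

```python
# Hypothetical pre-committed decision rules that remove the paradox of
# choice: a fixed eligibility cutoff, a deterministic selection rule, and
# a hard abandonment threshold decided in advance (in a cold state).
MAX_ATTEMPTS = 3   # abandon an exploit after this many failed attempts
MIN_SCORE = 0.7    # consider only targets above this utility score

def select_target(candidates):
    """Pick the single highest-scoring eligible target; no open-ended deliberation."""
    eligible = [c for c in candidates if c["score"] >= MIN_SCORE]
    return max(eligible, key=lambda c: c["score"])["name"] if eligible else None

def should_abandon(failed_attempts):
    """The abandonment decision is mechanical, not re-litigated under stress."""
    return failed_attempts >= MAX_ATTEMPTS

targets = [{"name": "t1", "score": 0.9}, {"name": "t2", "score": 0.85},
           {"name": "t3", "score": 0.4}]
print(select_target(targets))  # → t1
print(should_abandon(3))       # → True
```

Because both rules are fixed before the operation, the operator's in-the-moment choice set collapses to executing the rule, which is exactly the mitigation the paradox-of-choice literature suggests.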

These mitigating approaches and their details must be thoroughly researched and carefully implemented to provide the friendly side with tangible advantages in cyber warfare. An important part of this research is the role of leaders and groups, who serve as psychological weapons (Grossman &amp; Christensen, 2007, pp. 205-208) and, like most organizations, “naturally think more slowly … and impose orderly procedures” (Kahneman, 2011, p. 418), thus mitigating the quirks of individual human cognition.

Conclusions

This paper proposes BECO, a novel framework for applying behavioral economics (BE) models of cognitive biases in judgment and decision making to hardening cyberspace operations (CO). BE adapts psychology research to economic models, thus creating more accurate representations of human interactions. BEC (Fineberg, 2014) uses BE discoveries to modify the risk management framework of cybersecurity by introducing a new class of vulnerabilities corresponding to persistent human biases. BECO, in turn, applies the BEC framework to cyberspace operations by providing an overarching approach to the cognitive characteristics of the full spectrum of CO actors and scenarios. Cyberspace operations are exemplified by USCYBERCOM’s mission, and cyberactors include attackers, defenders, and users on both the friendly and adversary sides. The paper reviews selected BE biases applicable to CO and offers a structured approach to cognitive bias mitigation.

BECO provides an asymmetric advantage to cyber superpowers that have the resources to research cognitive biases in their operations and implement effective controls. While non-state actors may obtain technologies developed by major states, they cannot replicate the unique operational environment of a cyber power. Furthermore, full-scope forces, such as USCYBERCOM, can use their attack and defense capabilities to cross-test and strengthen the cognitive aspects of both. BECO’s goals are to define interdisciplinary research of cognition in cyberoperations, develop cyberoperations policies and strategies, and train the cyber workforce.

References

Alexander, K. B. (2012). Statement before the Senate Committee on Armed Services. Retrieved from http://www.airforcemag.com/SiteCollectionDocuments/Reports/2012/March2012/Day28/032812alexander.pdf.

Ariely, D. (2009). Predictably irrational: The hidden forces that shape our decisions. Revised and expanded edition. New York, NY: Harper Perennial.

Ariely, D. (2012). The (honest) truth about dishonesty: How we lie to everyone—Especially ourselves. New York, NY: HarperCollins Publishers.

Ariely, D., Loewenstein, G., & Prelec, D. (2000). Coherent arbitrariness: Duration-sensitive pricing of hedonic stimuli around an arbitrary anchor. SSRN. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=243109.

Ariely, D. & Norton, M. I. (2008). How actions create – not just reveal – preferences. Trends in Cognitive Sciences, 12 (1), 13-16.

Festinger, L. (1962). Cognitive dissonance. Scientific American, 207(4), 93-107.

Fineberg, V. (2012). COOP hardening against Black Swans. The Business Continuity and Resiliency Journal, 1(3), 14-24.

Fineberg, V. (2014). BEC: Applying behavioral economics to harden cyberspace. Journal of Cybersecurity and Information Systems, 2(1), 27-33.

Grossman, D. & Christensen, L. W. (2007). On combat: The psychology and physiology of deadly conflict in war and in peace. 2nd Edition. PPCT Research Publications.

Holton, J. W. (2011). The Pashtun behavior economy: An analysis of decision making in tribal society. Master’s Thesis. Naval Postgraduate School. Monterey, CA.

Iyengar, S. S. &amp; Lepper, M. R. (2000). When choice is demotivating: Can one desire too much of a good thing? Journal of Personality and Social Psychology, 79(6), 995-1006.

JP 3-13. (2012). Information operations. Joint Publication 3-13. Retrieved from http://www.dtic.mil/doctrine/new_pubs/jp3_13.pdf.

Kahneman, D. (2006). [Video File]. History and rationality lecture series. Hebrew University. Retrieved from http://www.youtube.com/watch?v=3CWm3i74mHI.

Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

Kahneman, D. (2013). [Video File]. Annual Hans Maeder lecture with Nobel Prize-winning psychologist Daniel Kahneman. The New School. Retrieved from http://www.youtube.com/watch?v=l91ahHR5-i0&list=PLUWrLGgGJAm9pm4ANtiGk4VVflf45Hz0P&index=7.

Kahneman, D. &amp; Renshon, J. (2009). Hawkish biases. Expanded version of an essay that appeared in American Foreign Policy and the Politics of Fear: Threat Inflation Since 9/11. New York, NY: Routledge Press, 79-96. Retrieved from http://www.princeton.edu/~kahneman/docs/Publications/Hawkish%20Biases.pdf.

Mackay, A. & Tatham S. (2011). Behavioural conflict: Why understanding people and their motives will prove decisive in future conflict. Saffron Walden, Essex, UK: Military Studies Press.

NIST 800-39. (2011). Managing information security risk: Organization, mission, and information system view. NIST Special Publication 800-39. Gaithersburg, MD: Computer Security Division, Information Technology Laboratory, National Institute of Standards and Technology. Retrieved from http://csrc.nist.gov/publications/nistpubs/800-39/SP800-39-final.pdf.

Pellerin, C. (2013a). Cyber Command adapts to understand cyber battlespace. U.S. Department of Defense. Retrieved from http://www.defense.gov/news/newsarticle.aspx?id=119470.

Pellerin, C. (2013b). DOD readies elements crucial to Cyber Operations. U.S. Department of Defense. Retrieved from http://www.defense.gov/news/newsarticle.aspx?id=120381.

Rabin, M. (1996). Psychology and Economics. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.42.9558&rep=rep1&type=pdf.

Stavridis, J. G. & Parker, E. C. III. (2012). Sailing the cyber sea. JFQ, 65(2), 61-67.

Taleb, N. N. (2010). The Black Swan: The impact of the highly improbable. New York, NY: Random House.

Taleb, N. N. (2012). Antifragile: Things that gain from disorder. New York, NY: Random House.

Thaler, R. H. & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth, and happiness. London, England: Penguin Books.

U.S. Cyber Command. (2013). United States Strategic Command factsheet: U.S. Cyber Command. Retrieved from http://www.stratcom.mil/factsheets/Cyber_Command/.
