Critical Thinking: Decisions and System 1 System 2 Thinking


Figure 1 Integrated System 1 and System 2 Approach

Daniel Kahneman identified System 1 and System 2 as two different ways to think through issues and solve problems. System 1 is quick and intuitive. System 2 is slower and driven by process and knowledge. Good teams fuse the two approaches in a way that leverages the strengths of both, as shown in Figure 1. Stanovich and West (2000, p. 658) define System 1 and System 2 as:


“System 1 is characterized as automatic, largely unconscious, and relatively undemanding of computational capacity. Thus, it conjoins properties of automaticity and heuristic processing as these constructs have been variously discussed in the literature.”

“System 2 conjoins the various characteristics that have been viewed as typifying controlled processing. System 2 encompasses the processes of analytic intelligence that have traditionally been studied by information processing theorists trying to uncover the computational components underlying intelligence.”

As part of their discussion of how Knowledge Management facilitates decision-making, McKenzie et al. identify three types of decisions (McKenzie et al., 2011, p. 406):

  1. “Simple decisions are not necessarily easy decisions, but cause-and-effect linkages are readily identifiable and action produces a foreseeable outcome.
  2. Complicated decisions arise less frequently. Cause and effect linkages are still identifiable, but it is harder. It takes expertise to make sense of the situation and evaluate options.
  3. Complex decisions have no right answers. Although infrequent, they have big consequences. Cause and effect are indeterminable because of the interdependent factors and influences. The outcome of actions is unpredictable; patterns can only be identified in retrospect.”

These three categories provide a useful way to bin decisions and potentially govern how decision systems manage and process them.

The System 1 decisions are based on heuristics, which are tied to intuition. That is why the figure in the piece on critical thinking and time uses the Hermit icon. The Hermit seeks to cast light on the darkness and understand our intuition and the value of the heuristics it implicitly uses. Effective organizations understand they make many System 1 decisions and try to understand the heuristics and whether the situation is within their relevant range.
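The "relevant range" idea can be sketched as a guard around a heuristic: trust the shortcut only where it has been validated. The cost heuristic and range bounds below are purely illustrative, not drawn from any cited work.

```python
def within_relevant_range(value: float, low: float, high: float) -> bool:
    """Only trust a heuristic inside the range where it was validated."""
    return low <= value <= high

def estimate_cost(units: int) -> float:
    # Hypothetical System 1 heuristic: linear cost-per-unit,
    # validated only for orders of 10 to 1000 units.
    if not within_relevant_range(units, 10, 1000):
        raise ValueError("Outside the heuristic's relevant range; "
                         "fall back to System 2 analysis")
    return 4.5 * units
```

An organization that knows its heuristics this explicitly can tell when a quick System 1 answer is safe and when the situation demands slower analysis.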

The System 2 decisions are based on decision science and processes. However, they almost always include assumptions to span information gaps. The Jester icon reminds us to test and validate these assumptions. In a medieval court, the Jester was often the only one who could question a decision, because he did so through humor; the ruler could deflect the humor but still get the message. Culture is critical to how the Jester functions in an organization. If no one questions decisions, the organization may rely on flawed assumptions that lead to poor decisions.

Table 1 Decision Types and Approaches

Decision Type | System Type           | Approach
Simple        | System 1              | Validate relevant range.
Complicated   | System 1 and System 2 | Establish relevant ranges. Establish hypothesis with System 1. Analyze and test hypothesis with System 2.
Complex       | System 1 and System 2 | Set solution boundaries. Develop scenarios with System 1. Test and assess scenarios with System 2. Pick optimum scenario. May require multiple iterations.
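The mapping in Table 1 can be sketched as a simple dispatch table. This is a minimal illustration of binning decisions by type; the function and step names are ours, not part of any cited framework.

```python
# Hypothetical sketch: route a decision to its approach per Table 1.
APPROACHES = {
    "simple": ["Validate relevant range (System 1)"],
    "complicated": [
        "Establish relevant ranges",
        "Establish hypothesis (System 1)",
        "Analyze and test hypothesis (System 2)",
    ],
    "complex": [
        "Set solution boundaries",
        "Develop scenarios (System 1)",
        "Test and assess scenarios (System 2)",
        "Pick optimum scenario; iterate if needed",
    ],
}

def plan_approach(decision_type: str) -> list[str]:
    """Return the ordered steps for a decision type, per Table 1."""
    try:
        return APPROACHES[decision_type.lower()]
    except KeyError:
        raise ValueError(f"Unknown decision type: {decision_type!r}")
```

A decision system built this way makes the governance explicit: a decision must first be binned before anyone picks a method for it.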

The Cognitive Readiness Framework shown in Figure 2 illustrates how System 1 and System 2 fit into an integrated decision-support approach to address complicated and complex problems.

The foundational components on the left are based on a paper by Morrison and Fletcher on Cognitive Readiness (Morrison & Fletcher, 2002). This framework updates them somewhat for current research, shows how they fit into a System 1 and System 2 model, and integrates them into a holistic approach to decision-making.

Figure 2 Cognitive Readiness Framework

The goal is to use critical thinking to test biases and the relevant ranges of heuristics. Ideally, we first engage System 1 thinking to generate initial solutions and then System 2 thinking to refine and test them and select the best, time permitting. Figure 2 also reflects the effects of Design Thinking and scientific/operational/technical decision-making. While Design Thinking has a process, for many it is a softer process, so it potentially blends elements of the System 1 side of the decision-making process. Depending upon organizational culture, the Design Thinking spiral icon could sit between the System 1 and System 2 icons or on either side. We cannot discount the impact of organizational culture.
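This generate-then-refine loop can be sketched in code. The candidate generator and scoring function below are hypothetical stand-ins for System 1 intuition and System 2 analysis; the time budget captures the "time permitting" caveat.

```python
import time

def decide(generate_fast, evaluate_slow, time_budget_s: float):
    """Sketch of the integrated approach: System 1 proposes candidates
    quickly; System 2 scores them deliberately until the time budget
    runs out; the best-scored candidate wins."""
    candidates = generate_fast()          # System 1: quick, heuristic
    if not candidates:
        raise ValueError("System 1 produced no candidates")
    best, best_score = candidates[0], float("-inf")
    deadline = time.monotonic() + time_budget_s
    for c in candidates:                  # System 2: slow, analytic
        if time.monotonic() > deadline:
            break                         # time-boxed: keep best so far
        score = evaluate_slow(c)
        if score > best_score:
            best, best_score = c, score
    return best

# Illustrative use with toy stand-ins:
choice = decide(
    generate_fast=lambda: [3, 1, 4, 1, 5],
    evaluate_slow=lambda x: -abs(x - 4),  # prefer values near 4
    time_budget_s=1.0,
)
```

When the budget expires before System 2 finishes, the sketch falls back on the best candidate evaluated so far, which mirrors how deadline pressure pushes real decisions back toward System 1.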

Literature Review

Akinci and Sadler-Smith (2019) write about the intuitive aspects of decision-making, which can bypass structured decision-making processes. They focus on the collaborative aspects of knowledge-building that can create group intuition and learning. They wrote, “Organizational learning occurs as a result of cognitions initiated by individuals’ intuitions transcending to the group level by way of articulations and interactions between the group members through the processes of interpreting (cf. ‘externalization’) and integrating, and which become institutionalized at the organization level” (Akinci & Sadler-Smith, 2019, p. 560). Akinci and Sadler-Smith then bring in Weick’s work and the concept of System 1 and System 2 thinking. Intuitive decision-making is a System 1 approach, and the structured approach is a System 2 approach.

Lawson et al. (2020) and Arnott and Gao (2019) bring in System 1 and System 2, also discussed as thinking fast and thinking slow. Lawson et al. focus more on individual decisions, which may or may not require a decision-making system. They find that having the “right” rules and cognitive faculties is key to effective decision-making and to mitigating decision bias. Arnott and Gao discuss these ideas in terms of behavioral economics (BE) and ways to incorporate BE into structured decision-making and Decision Support Systems (DSS). They cite the work of Nobel Prize winner Daniel Kahneman in this field, including his work on bounded rationality and how it affects decision-making. In this work, Kahneman also discusses System 1 and System 2 thinking and the concept of cognitive biases. An effective decision-making system must recognize cognitive biases.

McKenzie et al. (2011) discuss how knowledge management (KM) may help mitigate bias and facilitate other aspects of decision-making. Like Lawson et al. and Arnott and Gao, they cite Kahneman a great deal. They discuss the impact of emotions and biases on rational choice and how KM can mitigate them. Table II in their paper maps KM approaches to common cognitive biases (McKenzie et al., 2011, p. 408). Ghasemaghaei (2019) concurs that KM is an integral part of decision-making and discusses how KM can help manage data’s variety, volume, and velocity. Their findings also show that data analytics tools enhance knowledge sharing, which improves decision-making quality.

Cognitive biases impact both deliberate and intuitive decision-making and individual and group decision-making. Both Lawson et al. and Arnott and Gao note that rules may facilitate decision-making and mitigate bias. Rules may be heuristics or automatically applied rules. Heuristics are highly correlated with System 1 decisions, but groups can use them in System 2 as well. They can also be part of the model. The model’s rules may be static or adaptive. In a static model, planners and data scientists must manually adjust the rules. In an adaptive model, Machine Learning (ML) or Artificial Intelligence (AI) may change the rules as it learns the system. However, as Ntoutsi et al. (2020) and Righetti et al. (2019) note, even AI systems may have their own sets of biases.
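The static-versus-adaptive distinction can be illustrated with a minimal sketch. The threshold rule below is a toy stand-in for far richer ML-learned rules, and the names are ours.

```python
class StaticRule:
    """A fixed decision rule: planners must change the threshold by hand."""
    def __init__(self, threshold: float):
        self.threshold = threshold

    def decide(self, value: float) -> bool:
        return value >= self.threshold

class AdaptiveRule(StaticRule):
    """A rule that nudges its threshold toward observed outcomes, a toy
    stand-in for ML/AI systems that learn the rules. What it learns
    reflects the data it sees: biased feedback produces a biased rule
    (cf. Ntoutsi et al., 2020)."""
    def __init__(self, threshold: float, learning_rate: float = 0.1):
        super().__init__(threshold)
        self.learning_rate = learning_rate

    def update(self, observed_value: float):
        # Move the threshold a fraction of the way toward the observation.
        self.threshold += self.learning_rate * (observed_value - self.threshold)

rule = AdaptiveRule(threshold=10.0)
for obs in [20.0, 20.0, 20.0]:
    rule.update(obs)
# The threshold drifts from 10.0 toward 20.0 without manual intervention.
```

The same mechanism that removes the manual adjustment burden is what makes auditing the rule's drift, and the data driving it, essential.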

Shrestha et al. (2019) note AI’s impact on organizational decision-making and discuss key factors, among which are large data sets and cost. Samek and Müller note that AI and ML can also become black boxes if users do not understand how they work and the assumptions built into them. While AI and ML have potential, their costs and complexity may exceed the capabilities and capacities of smaller organizations.

Dogma, belief, and mental models are essentially System 1 thinking masquerading as System 2 thinking. In a paper on policy legitimization, Jensen (2003, pp. 524–525) notes, “Cognitive legitimacy can occur when a policy is “taken for granted,” when it is viewed as necessary or inevitable (Suchman 1995). In some cases, cultural models that provide justification for the policy and its objectives may be in place. In other cases, “taken-for-granted” legitimacy rises to a level where dissent is not possible because an organization, innovation, or policy is part of the social structure (Scott 1995).” The cognitive framework must help decision-makers understand when System 1 thinking drives System 2-type decisions and when the biases and heuristics inherent in System 1 thought shape or even cut off discourse and dialog. This is also vital to an effective Community of Practice.

Pöyhönen (2017) and Baggio et al. (2019) both show that cognitive diversity is valuable in complex and difficult problem sets but less so in more routine cases. However, they do not explicitly state whether they focus their studies on System 1 or System 2 decisions. Stanovich and West (2000, p. 658) define System 1 as subconscious decisions dominated by heuristics and System 2 as deliberate decisions based on an assessment of alternatives. Their research shows that 80% of the decisions people make are System 1 decisions. Therefore, these two papers potentially miss a critical area of decision-making: the everyday decisions that execute an operational plan. This is especially true with subconscious biases that can sway both System 1 and System 2 thinking. Without cognitive diversity, these biases may go unnoticed and unchallenged.

As we continue to refine our approaches to conflict, more research into trust, adaptive leadership, and adaptive cultures may help validate Olson et al.’s approach to competence-based trust. This could be combined with research on the relationship between cognitive diversity and complex problems, validating the value of cognitive diversity in mitigating the effects of bias on System 1 thinking. Does trust help identify hidden biases? If so, cognitive diversity is relevant to both System 1 and System 2 thinking, regardless of the level of complexity and difficulty.

Selected Bibliography

Akinci, C., & Sadler-Smith, E. (2019). Collective Intuition: Implications for Improved Decision Making and Organizational Learning. British Journal of Management, 30(3), 558–577. https://doi.org/10.1111/1467-8551.12269.

Arnott, D., & Gao, S. (2019). Behavioral economics for decision support systems researchers. Decision Support Systems, 122(February), 113063. https://doi.org/10.1016/j.dss.2019.05.003.

Aven, T. (2018). How the integration of System 1-System 2 thinking and recent risk perspectives can improve risk assessment and management. Reliability Engineering and System Safety, 180(July), 237–244. https://doi.org/10.1016/j.ress.2018.07.031.

Ghasemaghaei, M. (2019). Does data analytics use improve firm decision making quality? The role of knowledge sharing and data analytics competency. Decision Support Systems, 120(January), 14–24. https://doi.org/10.1016/j.dss.2019.03.004.

Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus, Giroux.

Jensen, J. L. (2003). Policy Diffusion through Institutional Legitimation: State Lotteries. Journal of Public Administration Research and Theory, 13(4), 521–541. https://doi.org/10.1093/jopart/mug033.

Lawson, M. A., Larrick, R. P., & Soll, J. B. (2020). Comparing fast thinking and slow thinking: The relative benefits of interventions, individual differences, and inferential rules. Judgment and Decision Making, 15(5), 660–684.

McKenzie, J., van Winkelen, C., & Grewal, S. (2011). Developing organisational decision-making capability: A knowledge manager’s guide. Journal of Knowledge Management, 15(3), 403–421.

Morrison, J. E., & Fletcher, J. D. (2002). Cognitive Readiness (IDA Paper P-3735). Institute for Defense Analyses. Retrieved from http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA417618.

Ntoutsi, E., Fafalios, P., Gadiraju, U., Iosifidis, V., Nejdl, W., Vidal, M. E., Ruggieri, S., Turini, F., Papadopoulos, S., Krasanakis, E., Kompatsiaris, I., Kinder-Kurlanda, K., Wagner, C., Karimi, F., Fernandez, M., Alani, H., Berendt, B., Kruegel, T., Heinze, C., … Staab, S. (2020). Bias in data-driven artificial intelligence systems—An introductory survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3), 1–14. https://doi.org/10.1002/widm.1356.

Pöyhönen, S. (2017). Value of cognitive diversity in science. Synthese, 194(11), 4519–4540. https://doi.org/10.1007/s11229-016-1147-4.

Righetti, L., Madhavan, R., & Chatila, R. (2019). Unintended Consequences of Biased Robotic and Artificial Intelligence Systems [Ethical, Legal, and Societal Issues]. IEEE Robotics and Automation Magazine, 26(3), 11–13. https://doi.org/10.1109/MRA.2019.2926996.

Scott, W. R. (1995). Institutions and Organizations. Sage Publications.

Shrestha, Y. R., Ben-Menahem, S. M., & von Krogh, G. (2019). Organizational Decision-Making Structures in the Age of Artificial Intelligence. California Management Review, 61(4), 66–83. https://doi.org/10.1177/0008125619862257.

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645–726. https://doi.org/10.1017/S0140525X00003435.

