Can computerized decision support reduce cognitive biases in decision making?
by Dan Power
Editor, DSSResources.COM
Individuals have predictable thinking patterns. Some patterns are positive and others lead to poor choices. These predictable thinking patterns are generally termed cognitive biases. Many of the patterns are learned, and their frequency and impact differ among individuals. Thinking patterns shape what a person likes and dislikes. A computerized decision support system may reinforce some known biases or change how a user thinks about a situation in a favorable way. People have limitations as information processors, and biases can and often do reduce the amount of thinking and processing a person does to make a choice. A decision aid or decision support system can also affect how much thinking and information gathering a person must do in a situation.
In the early days of computerized decision support, the Sabre Reservation system exploited human information processing biases and limitations to increase sales of tickets on American Airlines flights: American Airlines flights were listed first in output displays. Because of this system, the U.S. Department of Transportation and the U.S. courts restricted and ultimately prohibited such practices. The Computer Reservations System (CRS) Regulations originally adopted in 1984 prohibited display bias. The current regulation (2004) notes "Display bias has been a concern since the systems were first developed. Experience has demonstrated that travel agents are likely to book one of the first services displayed by a system in response to a travel agent's request for information, even if services shown later in the display would better satisfy the customer's needs. If systems give preferential display positions to one airline's services, that display bias will harm airline competition and cause consumers to be misled."
Cognitive biases exist. People are predisposed to make choices by the way information is presented and the way analyses are conducted. Yet debiasing, or unbiased presentation, has often been only a secondary motivation for building DSS. Managers may readily accept that some people are biased decision makers, but that does not mean they think their own decision making is biased, or at least not in the situation where a proposed DSS will be used. Also, DSS builders assume their targeted users are rational thinkers (cf. Power, 2004).
In general, cognitive bias has been an issue raised more by academic researchers than by industry consultants and practitioners. If DSS builders are consciously attempting to expand the boundary of rational managerial decision-making behavior, then they must be familiar with the cognitive biases that can impact human information processing. We MUST ask and explore how DSS can reduce or even eliminate significant cognitive biases.
Also, DSS can encourage and even promote biased decision making; building such systems may not, however, be ethical or legal. As DSS builders we must ask whether it is ever desirable and ethical to reinforce or exploit known cognitive biases when building a DSS. And if it is, when and in what circumstances? This Ask Dan! won't resolve or even offer specific guidance on these troubling questions. Rather, the focus is on exploring biases in the context of decision support.
Below is a list of common cognitive biases with comments related to building decision support systems. The following list is based upon various sources including Tversky and Kahneman (1974), Kahneman, Slovic, and Tversky (1982), and Wikipedia.
- Anchoring and adjustment - Decision-makers "anchor" on the initial information they receive, and that anchor influences how subsequent information is interpreted. For example, in a data-driven DSS for business performance management, the dashboard screen metrics will significantly impact how subsequent data and analyses are interpreted.
- Attribution - Decision-makers tend to attribute successes to their own actions and abilities, but attribute failures to bad luck and external factors. There is also a tendency to attribute a competitor's success to good luck and a competitor's failure to mistakes. In a data-driven DSS, managers should be encouraged to ask "why" questions about summary data values: Why did profit increase 25% in the most recent quarter? Why did the in-process inventory increase 20%?
- Availability - Decision-makers estimate the probability of an outcome based upon how easy that outcome is to imagine. In a model-driven DSS, when probabilities are elicited a DSS should encourage information gathering prior to the input of any probability estimates. Competing scenarios can potentially reduce this bias.
- Causal attribution - Decision-makers tend to ascribe causal explanations even when the evidence only suggests correlation. In data-driven DSS, cross-tabulation displays should be "interpreted" when possible or a disclaimer should be provided about this problem.
- Confirmation - Decision-makers tend to explain away inconsistent and negative evidence. Negative evidence is sometimes used to confirm a pre-existing hypothesis. A data-driven DSS should be used early in a decision making process and multiple decision-makers should have access to and use a specific DSS.
- Conservatism, tradition and inertia - Decision-makers repeat previously successful thought patterns and analyses when confronted with new circumstances. In a knowledge-driven DSS, it is important to periodically check that circumstances have not changed. Model-driven DSS also need to be periodically reviewed and updated. Decision makers need to monitor changes in situations and circumstances.
- Escalating commitment - Decision-makers often look at a decision as a small step in a sequential decision process, and this encourages commitment to a course of action. DSS that are tightly linked to a particular strategy reinforce this tendency. Also, the selection of critical success factors in data-driven DSS can reinforce commitment to a course of action. Managers need to periodically revisit the metrics used to monitor organization performance.
- Experience limitations - Decision-makers are often constrained by past experiences. A planning-oriented DSS should include a wide range of scenarios from multiple stakeholders to expand the experience horizon of decision-makers.
- Faulty generalizations - Decision-makers simplify complex interactions and group ideas, things and people. These generalizations influence decisions. A DSS builder should explicitly state assumptions and generalizations about the models in a DSS.
- Inconsistency - Decision-makers do not consistently apply the same decision criteria in similar decision situations. Screening and evaluation models in model-driven DSS can help ensure consistency (a minimal sketch of such a weighted-criteria model follows this list). Consistency is only desirable, however, when the criteria are appropriately identified and appropriately weighted.
- Premature closure - Decision-makers tend to terminate the search for evidence quickly and accept the first alternative that is feasible. Data and document-driven DSS can make search easier, and a user-friendly interface can encourage further search.
- Recency - Decision-makers tend to place the greatest attention on more recent information and either ignore or forget historical information. When possible, data-driven DSS should put recent information in the context of historical information. For example, the current month, prior month and year-ago month's sales data should be presented (see the sales-context sketch following this list).
- Repetition - Decision-makers often believe what they have been told repeatedly and by the greatest number of different sources. Data and document-driven DSS need to help identify the source of data and a single source should not be presented many times to bolster the same conclusion. In web-based search, the same source can often appear in many results.
- Representativeness - Decision-makers often judge events, people and things based upon how similar they are to a prior case example. This approach can work effectively in some situations and it is used in case-based reasoning in some knowledge-driven DSS. DSS builders need to monitor systems that rely on a representativeness heuristic.
- Role fulfillment - Decision-makers often conform to the expectations that others have of them. If the expectation is that a manager will use a specific DSS, then it is more likely s/he will use the DSS. The reverse of this also holds. DSS builders should examine the role of a decision maker/user as part of DSS analysis and design.
- Selective perception - Decision-makers actively screen out information that is considered irrelevant and unimportant. This perceptual bias helps reduce information load, but if the decision-maker is prejudiced about the decision outcome then important information will be ignored. A data-driven DSS can present predefined information, and the rationale for what information is presented can be examined and disclosed.
- Selective search for evidence - Decision-makers tend to gather facts that support certain conclusions, but ignore other facts that might support different conclusions. This tendency encourages some managers to use decision support to bolster previously made decisions and to rationalize their conclusions. When possible, DSS should attempt to encourage unbiased search. Often a review of search efforts can identify additional search topics.
- Source credibility - Decision-makers sometimes reject information because of the source. A healthy skepticism about source credibility should be encouraged in data and document-driven DSS. Information about a source's race, nationality, religion or other potentially prejudicial source information should not however be readily available to DSS users. Source information should focus on relevant qualifications.
- Underestimating uncertainty and having an illusion of control - Decision-makers tend to underestimate uncertainty about future events and outcomes. This occurs because people believe they have more control over outcomes than they often do. The tendency is to believe one has adequate control to minimize potential problems from decisions. If decision-makers will use DSS for contingency planning, such systems can potentially help reduce this bias.
- Wishful thinking and unwarranted optimism - Decision-makers tend to assume the "best" outcome will occur. It is a natural tendency to view events in a positive frame of reference and this bias can distort perception and thinking. DSS should present multiple scenarios when possible including "worst case" scenarios.
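To make the Inconsistency item above more concrete, here is a minimal sketch of a weighted-criteria evaluation model of the kind a model-driven DSS might apply identically to every alternative. The criteria names, weights, alternatives, and ratings are hypothetical illustrations, not taken from any particular DSS.

```python
# A weighted-criteria evaluation model applied identically to every
# alternative, so the same criteria and weights are used each time.
# Criteria names, weights, and ratings are hypothetical.

CRITERIA_WEIGHTS = {
    "expected_return": 0.5,
    "strategic_fit": 0.3,
    "implementation_risk": 0.2,  # rated so a higher score means lower risk
}

def score_alternative(ratings):
    """Return the weighted score for one alternative.

    `ratings` maps each criterion to a 0-10 rating; every criterion must be
    rated, so the same model is applied to every alternative.
    """
    missing = set(CRITERIA_WEIGHTS) - set(ratings)
    if missing:
        raise ValueError("Unrated criteria: %s" % sorted(missing))
    return sum(w * ratings[name] for name, w in CRITERIA_WEIGHTS.items())

alternatives = {
    "Expand plant A": {"expected_return": 7, "strategic_fit": 8, "implementation_risk": 5},
    "Outsource line B": {"expected_return": 6, "strategic_fit": 5, "implementation_risk": 8},
}

for name, ratings in sorted(alternatives.items()):
    print("%-17s %.1f" % (name, score_alternative(ratings)))
```

Because the criteria and weights are fixed in the model rather than chosen anew for each alternative, the same decision rule is applied every time; changing a weight becomes an explicit, reviewable act rather than an unnoticed inconsistency.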
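Similarly, the Recency item suggests putting a recent figure in historical context. The sketch below, using invented monthly sales figures, shows one simple way a data-driven DSS report might place the current month next to the prior month and the year-ago month.

```python
# Put a recent figure in historical context: show the current month next to
# the prior month and the same month a year earlier.
# The monthly sales figures below are invented for illustration.

monthly_sales = {            # (year, month) -> sales in dollars
    (2015, 8): 118000,
    (2016, 7): 121500,
    (2016, 8): 134200,
}

def sales_in_context(sales, year, month):
    """Return current, prior-month, and year-ago sales figures for display."""
    prior = (year, month - 1) if month > 1 else (year - 1, 12)
    return [
        ("current month", sales.get((year, month))),
        ("prior month", sales.get(prior)),
        ("year-ago month", sales.get((year - 1, month))),
    ]

for label, value in sales_in_context(monthly_sales, 2016, 8):
    print("%-15s %s" % (label, "n/a" if value is None else format(value, ",")))
```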
George Dvorsky (2013), a Canadian futurist, science writer, and ethicist, argues that there are 12 cognitive biases that prevent people from being rational. He briefly explains the biases he labels 1) confirmation bias, 2) in-group bias, 3) gambler's fallacy, 4) post-purchase rationalization, 5) neglecting probability, 6) observational selection bias, 7) status-quo bias, 8) negativity bias, 9) bandwagon effect, 10) projection bias, 11) the current moment bias, and 12) anchoring effect. Lists of cognitive biases are long, and the terminology varies from author to author. Computerized decision support can reduce the impact of some of the 70 or 80 identified biases. More attention must be given to cognitive biases when we design computerized analytical systems. Using a decision support or predictive analytics system does NOT guarantee rational decision making.
According to the Wikipedia entry on Decision Making (2005), "Due to the large number of considerations involved in many decisions, decision support systems have been developed to assist decision makers in considering the implications of various courses of action. They can help reduce the risk of errors."
Huber and Power (1985) identify four primary reasons why informants provide inaccurate or biased data: 1) they are motivated to do so; 2) their perceptual and cognitive limitations result in inadvertent errors; 3) they lack crucial information about the event of interest; and 4) they have been questioned with an inappropriate data elicitation technique (p. 172). Well-designed decision support can help overcome the last three reasons.
Davis, Kulick, and Egner (2005) recommend that "Decision support should appeal to both the rational-analytic and the intuitive capabilities of the decision-maker, with a balance of 'cold' and story-based presentation of analysis and recommendations. The particular balance should depend on characteristics of the decision, the decision environment, and the decision-maker. (p. xix)"
Computerized decision support systems can and do impact, eliminate, exploit, and reduce cognitive biases in decision making. While many DSS are intended to reduce the effects of judgmental biases, there has been little consideration or investigation of how using a DSS may contribute to biased decision making (cf. Davis, Kulick, and Egner, 2005). Bias is shaped, taught, learned and unlearned, avoided and enhanced. Perhaps there is no possibility of debiasing a decision maker or of structuring an unbiased decision process. Perhaps the most we can hope for is awareness of and sensitivity to our own biases and to biases in general.
References
Cognitive Technologies, "Biases in Decision Making", http://www.cog-tech.com/projects/Biases.htm.
Computer Reservations System (CRS) Regulations, http://www.dot.gov/affairs/Computer%20Reservations%20System.htm, Office of the Secretary, Department of Transportation, January 31, 2004.
Davis, P. K., J. Kulick, and M. Egner, "Implications of Modern Decision Science for Military Decision-Support Systems," RAND Corporation, 2005 at URL https://www.rand.org/pubs/monographs/MG360.html.
Dvorsky, G., "The 12 cognitive biases that prevent you from being rational," io9, 1/09/13, at URL http://io9.com/5974468/the-most-common-cognitive-biases-that-prevent-you-from-being-rational
Huber, G. P. and D. J. Power, "Retrospective reports of strategic-level managers: Guidelines for increasing their accuracy," Strategic Management Journal, 6 (2), 1985, pp. 171-180.
Kahneman, D., P. Slovic, and A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases, Cambridge, UK: Cambridge University Press, 1982.
Power, D., "Do DSS builders assume their targeted users are rational thinkers?" DSS News, Vol. 5, No. 21, October 10, 2004.
Psych Central, http://psychcentral.com/psypsych/List_of_cognitive_biases
Tversky, A. and D. Kahneman, "Judgment under uncertainty: Heuristics and biases," Science, 185, 1974, pp. 1124-1131.
Wikipedia, Cognitive bias, http://en.wikipedia.org/wiki/Cognitive_bias .
Wikipedia, Decision making, http://en.wikipedia.org/wiki/Decision_making .
Please cite as:
Power, D. J. "Can computerized decision support systems impact, eliminate, exploit, or reduce cognitive biases in decision making?" DSS News, Vol. 6, No. 20, September 11, 2005; updated September 13, 2014 for Decision Support News Vol. 15, No. 19; updated December 7, 2016 for Decision Support News 12-11-2016 Vol. 17 No. 25. On December 7, 2016 the title of this column was shortened to "Can computerized decision support reduce cognitive biases in decision making?"