Can computerized decision support reduce cognitive biases in decision making?

by Dan Power
Editor, DSSResources.COM

Individuals have predictable thinking patterns. Some patterns are helpful and others lead to poor choices. These predictable thinking patterns are generally termed cognitive biases. Most people exhibit many of the patterns, but their frequency and impact differ among individuals. Thinking patterns influence what a person likes and dislikes. A computerized decision support system may reinforce some known biases or change how a user thinks about a situation in a favorable way. People have limitations as information processors, and biases can and often do reduce the amount of thinking and processing a person does to make a choice. A decision aid or decision support system can also affect how much thinking and information gathering a person must do in a situation.

In the early days of computerized decision support, the Sabre Reservation system exploited human information processing biases and limitations to increase ticket sales on American Airlines flights. Flights on American Airlines were displayed first. Because of this system, the U.S. Department of Transportation and the U.S. courts restricted and then prohibited such practices. The Computer Reservations System (CRS) Regulations originally adopted in 1984 prohibited display bias. The current regulation (2004) notes, "Display bias has been a concern since the systems were first developed. Experience has demonstrated that travel agents are likely to book one of the first services displayed by a system in response to a travel agent's request for information, even if services shown later in the display would better satisfy the customer's needs. If systems give preferential display positions to one airline's services, that display bias will harm airline competition and cause consumers to be misled."

Cognitive biases exist. People are predisposed to make choices by the way information is presented and by the way analyses are conducted. Debiasing, or unbiased presentation, has usually been a secondary motivation for building DSS. Managers readily accept that some people are biased decision makers, but that does not mean they think their own decision making is biased, at least not in the situation where a proposed DSS will be used. Also, DSS builders assume their targeted users are rational thinkers (cf. Power, 2004).

In general, cognitive bias has been an issue raised more by academic researchers than by industry consultants and practitioners. If DSS builders are consciously attempting to expand the boundary of rational managerial decision-making behavior, then they must be familiar with the cognitive biases that can impact human information processing. We MUST ask and explore how DSS can reduce or even eliminate significant cognitive biases.

Also, DSS can encourage and even promote biased decision making; building such systems may not, however, be ethical or legal. As DSS builders we must ask whether it is ever desirable and ethical to reinforce or exploit known cognitive biases when building a DSS. And if it is, when and in what circumstances? This Ask Dan! won't resolve or even offer specific guidance on these troubling questions. Rather, the focus is on exploring biases in the context of decision support.

Below is a list of common cognitive biases with comments related to building decision support systems. The following list is based upon various sources including Tversky and Kahneman (1974), Kahneman, Slovic, and Tversky (1982), and Wikipedia.

  1. Anchoring and adjustment - Decision-makers "anchor" on the initial information they receive, and that influences how subsequent information is interpreted. For example, in a data-driven DSS for business performance management, the dashboard screen metrics will significantly impact how subsequent data and analyses are interpreted.

  2. Attribution - Decision-makers tend to attribute successes to their own actions and abilities, but attribute failures to bad luck and external factors. Also, there is a tendency to attribute a competitor's success to good luck and a competitor's failure to mistakes. In a data-driven DSS, managers should be encouraged to ask "why" questions about summary data values. Why did profit increase 25% in the most recent quarter? Why did in-process inventory increase 20%?

  3. Availability - Decision-makers estimate the probability of an outcome based upon how easy that outcome is to imagine. In a model-driven DSS, when probabilities are elicited, the DSS should encourage information gathering prior to the input of any probability estimates. Competing scenarios can potentially reduce this bias.

  4. Causal attribution - Decision-makers tend to ascribe causal explanations even when the evidence only suggests correlation. In data-driven DSS, cross-tabulation displays should be "interpreted" when possible or a disclaimer should be provided about this problem.

  5. Confirmation - Decision-makers tend to explain away inconsistent and negative evidence. Negative evidence is sometimes even reinterpreted to confirm a pre-existing hypothesis. A data-driven DSS should be used early in a decision-making process, and multiple decision-makers should have access to and use a specific DSS.

  6. Conservatism, tradition and inertia - Decision-makers repeat previously successful thought patterns and analyses when confronted with new circumstances. In a knowledge-driven DSS, it is important to periodically check that circumstances have not changed.  Model-driven DSS also need to be periodically reviewed and updated. Decision makers need to monitor changes in situations and circumstances.

  7. Escalating commitment - Decision-makers often look at a decision as a small step in a sequential decision process, and this encourages commitment to a course of action. DSS that are tightly linked to a particular strategy reinforce this tendency. Also, the selection of critical success factors in data-driven DSS can reinforce commitment to a course of action. Managers need to periodically revisit the metrics used to monitor organization performance.

  8. Experience limitations - Decision-makers are often constrained by past experiences. A planning-oriented DSS should include a wide range of scenarios from multiple stakeholders to expand the experience horizon of decision-makers.

  9. Faulty generalizations - Decision-makers simplify complex interactions and group ideas, things and people. These generalizations influence decisions. A DSS builder should explicitly state assumptions and generalizations about the models in a DSS.

  10. Inconsistency - Decision-makers do not consistently apply the same decision criteria in similar decision situations. Screening and evaluation models in model-driven DSS can help ensure consistency. Consistency is desirable, however, only when the criteria are appropriately identified and appropriately weighted.
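As an illustration, a minimal weighted-criteria evaluation model of the kind a model-driven DSS might embed could look like the sketch below. The criteria, weights, and vendor ratings are hypothetical; the point is that the same criteria and weights are applied to every alternative.

```python
# A minimal weighted-criteria evaluation model (hypothetical criteria and
# weights). Applying one fixed set of weighted criteria to every alternative
# is how a model-driven DSS can enforce consistent screening.

CRITERIA = {"cost": 0.40, "quality": 0.35, "delivery": 0.25}  # weights sum to 1

def score(alternative: dict) -> float:
    """Weighted sum of criterion ratings (each rated 0-10)."""
    return sum(alternative[c] * w for c, w in CRITERIA.items())

def rank(alternatives: dict) -> list:
    """Rank alternative names by score, highest first."""
    return sorted(alternatives, key=lambda name: score(alternatives[name]),
                  reverse=True)

vendors = {
    "Vendor A": {"cost": 7, "quality": 9, "delivery": 6},
    "Vendor B": {"cost": 9, "quality": 6, "delivery": 8},
}
```

Of course, as the item above notes, a consistent model is only as good as its criteria and weights, so those should be reviewed before the ranking is trusted.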

  11. Premature closure - Decision-makers tend to terminate the search for evidence quickly and accept the first alternative that is feasible. Data and document-driven DSS can make search easier and a user friendly interface can encourage further search.

  12. Recency - Decision-makers tend to place the greatest attention on more recent information and either ignore or forget historical information. When possible, data-driven DSS should put recent information in a context of historical information.  For example, the current month, prior month and the year ago month's sales data should be presented.
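A data-driven DSS report routine could assemble that historical context automatically. The sketch below, with a hypothetical `recency_context` helper, pairs the current month's figure with the prior month and the same month a year ago:

```python
# Hypothetical helper for a data-driven DSS report: it surrounds the most
# recent month's sales with prior-month and year-ago figures so recent data
# is always shown in historical context.

def recency_context(monthly_sales: list) -> dict:
    """monthly_sales: chronological list of (label, amount) pairs, newest last."""
    if len(monthly_sales) < 13:
        raise ValueError("need at least 13 months for a year-ago comparison")
    current = monthly_sales[-1]
    prior = monthly_sales[-2]
    year_ago = monthly_sales[-13]  # same month, one year earlier
    return {
        "current": current,
        "prior_month": prior,
        "year_ago": year_ago,
        "pct_vs_prior": round(100 * (current[1] - prior[1]) / prior[1], 1),
        "pct_vs_year_ago": round(100 * (current[1] - year_ago[1]) / year_ago[1], 1),
    }
```

The percent-change fields make the comparison explicit, rather than leaving the user to anchor on the most recent number alone.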

  13. Repetition - Decision-makers often believe what they have been told repeatedly and by the greatest number of different sources. Data and document-driven DSS need to help identify the source of data and a single source should not be presented many times to bolster the same conclusion. In web-based search, the same source can often appear in many results.

  14. Representativeness - Decision-makers often judge events, people and things based upon how similar they are to a prior case example. This approach can work effectively in some situations and it is used in case-based reasoning in some knowledge-driven DSS. DSS builders need to monitor systems that rely on a representativeness heuristic.
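A bare-bones version of such case-based retrieval might look like the sketch below (the case features and the `max_distance` cutoff are hypothetical). Flagging weak matches is one simple way a builder can monitor a system that relies on a representativeness heuristic:

```python
# A bare-bones case-based retrieval step of the kind used in some
# knowledge-driven DSS: judge a new situation by its most similar prior
# case, but flag weak matches instead of applying the heuristic blindly.

import math

def distance(a: dict, b: dict) -> float:
    """Euclidean distance over shared numeric features."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def most_similar_case(new_case: dict, case_base: list, max_distance: float = 5.0):
    """Return (best_case, distance), or (None, distance) if no case is close enough."""
    best = min(case_base, key=lambda c: distance(new_case, c["features"]))
    d = distance(new_case, best["features"])
    return (best, d) if d <= max_distance else (None, d)
```

Returning the distance alongside the retrieved case lets the DSS disclose how representative the prior case actually is.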

  15. Role fulfillment - Decision-makers often conform to the expectations that others have of them. If the expectation is that a manager will use a specific DSS, then it is more likely s/he will use the DSS.  The reverse of this also holds. DSS builders should examine the role of a decision maker/user as part of DSS analysis and design.

  16. Selective perception - Decision-makers actively screen out information that is considered irrelevant and unimportant. This perceptual bias helps reduce information load, but if the decision-maker is prejudiced about the decision outcome, then important information will be ignored. A data-driven DSS can present predefined information, and the rationale for what information is presented can be examined and disclosed.

  17. Selective search for evidence - Decision-makers tend to gather facts that support certain conclusions, but ignore other facts that might support different conclusions. This tendency encourages some managers to use decision support to bolster previously made decisions and to rationalize their conclusions.  When possible, DSS should attempt to encourage unbiased search. Often a review of search efforts can identify additional search topics.

  18. Source credibility - Decision-makers sometimes reject information because of the source. A healthy skepticism about source credibility should be encouraged in data and document-driven DSS. Information about a source's race, nationality, religion or other potentially prejudicial source information should not however be readily available to DSS users. Source information should focus on relevant qualifications.

  19. Underestimating uncertainty and having an illusion of control - Decision-makers tend to underestimate uncertainty about future events and outcomes. This occurs because people believe they have more control over outcomes than they often do. The tendency is to believe one has adequate control to minimize potential problems from decisions.  If decision-makers will use DSS for contingency planning, such systems can potentially help reduce this bias.

  20. Wishful thinking and unwarranted optimism - Decision-makers tend to assume the "best" outcome will occur. It is a natural tendency to view events in a positive frame of reference and this bias can distort perception and thinking.  DSS should present multiple scenarios when possible including "worst case" scenarios.

George Dvorsky (2013), a Canadian futurist, science writer, and ethicist, argues that there are 12 cognitive biases that prevent people from being rational. He briefly explains the biases labeled 1) confirmation bias, 2) ingroup bias, 3) gambler's fallacy, 4) post-purchase rationalization, 5) neglecting probability, 6) observational selection bias, 7) status-quo bias, 8) negativity bias, 9) bandwagon effect, 10) projection bias, 11) current moment bias, and 12) anchoring effect. Lists of cognitive biases are long, and the terminology varies from author to author. Computerized decision support can reduce the impact of some of the 70 or 80 identified biases. More attention must be given to cognitive biases when we design computerized analytical systems. Using a decision support or predictive analytics system does NOT guarantee rational decision making.

According to the Wikipedia entry on Decision Making (2005), "Due to the large number of considerations involved in many decisions, decision support systems have been developed to assist decision makers in considering the implications of various courses of action. They can help reduce the risk of errors." Do you agree? I do.

So YES, computerized decision support systems can and do impact, eliminate, exploit, and reduce cognitive biases in decision making.


Cognitive Technologies, "Biases in Decision Making."

Computer Reservations System (CRS) Regulations, Office of the Secretary, Department of Transportation, January 31, 2004.

Dvorsky, G., "The 12 cognitive biases that prevent you from being rational," io9, January 9, 2013.

Kahneman, D., P. Slovic, and A. Tversky (Eds.). Judgment under Uncertainty: Heuristics and Biases.  Cambridge, UK: Cambridge University Press, 1982.

Power, D., "Do DSS builders assume their targeted users are rational thinkers?" DSS News, Vol. 5, No. 21, October 10, 2004.

Psych Central.

Tversky, A. and Kahneman, D. "Judgment under uncertainty: Heuristics and biases".  Science, 185, 1974, 1124-1131.

Wikipedia, "Cognitive bias."

Wikipedia, "Decision making."

Please cite as:

Power, D. J. "Can computerized decision support systems impact, eliminate, exploit, or reduce cognitive biases in decision making?" DSS News, Vol. 6, No. 20, September 11, 2005; updated September 13, 2014 for Decision Support News Vol. 15, No. 19; updated December 7, 2016 for Decision Support News 12-11-2016 Vol. 17 No. 25. On December 7, 2016 the title of this column was shortened to "Can computerized decision support reduce cognitive biases in decision making?"

Last update: 2016-12-07 01:06
Author: Daniel Power
