Dr. Lee McGaan  

  Office: WH 308 (ph. 309-457-2155); email: lee@monmouthcollege.edu

Informational Biases and Heuristics that Damage Decision-making

 

  • Availability – information that is familiar or highly salient is weighed more heavily in decision-making than less available information, which can easily be ignored even when it is more important and useful (e.g. the "shared knowledge" effect).
  • Shared Knowledge Effect – information that is shared by many people in a group decision-making setting takes on more significance even when it is not the most important or relevant information (e.g. expert information is often ignored).

  • Confirmation Bias – starting with an answer and searching for evidence to back it up (e.g. Should the college move to a 4:4 calendar?).

  • Recency Bias – weighing the most recent evidence we encounter very heavily or even exclusively (e.g. "the GOP can’t win the presidency").

  • Representativeness – people tend to make judgments about unknown people or events by assuming they are representative of something they do know that is similar in some way.

  • Backfire Effect – strengthening your belief in a view after it has been seriously challenged with evidence (e.g. Karl Rove on Ohio and Romney).

  • Anchoring – letting a single piece of evidence govern the general view of an issue, especially allowing quantitative information to determine what is “normal, average, etc.” (e.g. a high price equals high quality or desirability; how many jelly beans are in the jar?).

  • Framing Bias – reacting differently to evidence depending on the context in which it is presented (MC football: "In 2012 we extended our record to not a single losing season in a decade" vs. "2012 was our worst season in 10 years").

  • Pessimism/Optimism Bias – wrongly estimating odds based on personal perspective, interest, or outlook (e.g. "I never do well on standardized tests").

 

  • Halo Effect – positive information about a previous, similar circumstance favorably colors our judgment of the current one (e.g. sibling comparisons by teachers).

  • Illusion of Control – crediting successes that were mostly luck to skill or wisdom (e.g. B. Schilling is a great candidate, having won the 17th district for the GOP after 28 years of Democratic control).

  • Escalation of Commitment – doubling down on an unsuccessful plan or position.

  • Ostrich Bias – failing to notice or accept readily available evidence (e.g. not attending to the fact that the stock market has gone up in the last three years).

  • Risk Perception Bias – exposing yourself unthinkingly to serious risks while focusing on avoiding other risks (e.g. vaccine vs. disease risks). We tend to underestimate risks in situations where we feel in control (e.g. flying vs. driving).