Overconfidence

The mother of all biases

Overconfidence is the mother of all psychological biases. I mean that in two ways. First, overconfidence is one of the largest and most ubiquitous of the many biases to which human judgment is vulnerable. For example, 93 percent of American drivers claim to be better than the median,[1] which is statistically impossible: no more than half of any group can be above its median.[2]

Another way people can express their confidence about something is by providing a 90 percent confidence interval around an estimate; when they do, the truth typically falls inside those intervals less than 50 percent of the time,[3] suggesting they did not deserve to be 90 percent confident of their accuracy. In his 2011 book, Thinking, Fast and Slow, Daniel Kahneman[4] called overconfidence “the most significant of the cognitive biases.”
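
To make the idea of calibration concrete, here is a minimal sketch (my own illustration, not material from the studies cited above) of how interval judgments can be scored: collect each person's low and high bounds along with the true answer, then compute the fraction of intervals that contain the truth. The function name and the toy numbers are hypothetical.

# Minimal sketch: scoring the calibration of 90 percent confidence intervals.
# A well-calibrated judge's intervals should contain the truth about 90 percent
# of the time; hit rates near 50 percent indicate overprecision.

def interval_hit_rate(judgments):
    """judgments: list of (low, high, truth) tuples.
    Returns the fraction of intervals that contain the true value."""
    hits = sum(1 for low, high, truth in judgments if low <= truth <= high)
    return hits / len(judgments)

# Hypothetical example: five 90% intervals, only two of which contain the truth.
example = [
    (10, 20, 25),     # miss
    (100, 150, 120),  # hit
    (0, 5, 7),        # miss
    (30, 60, 45),     # hit
    (200, 220, 300),  # miss
]
print(interval_hit_rate(example))  # 0.4, far below the 0.9 a calibrated judge would show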

Among many other things, overconfidence has been blamed for the sinking of the Titanic, the nuclear accident at Chernobyl, the loss of the Space Shuttles Challenger and Columbia, the subprime mortgage crisis of 2008 and the Great Recession that followed it, and the Deepwater Horizon oil spill in the Gulf of Mexico.[5] Overconfidence may contribute to excessive rates of trading in the stock market, high rates of entrepreneurial failure, legal disputes, political partisanship, and even war.[6]

The second way overconfidence earns its title as the mother of all biases is by giving the other decision-making biases teeth. If we were appropriately humble about our psychological vulnerabilities, we would be better able to protect ourselves from the errors to which human nature makes us prone.[7] Instead, an excessive faith in ourselves and our judgment means that we too often ignore our vulnerability to bias and error.[8] Decades of research on judgment and decision making have documented the heuristics people rely on and the biases those heuristics create. They include, but are not limited to, availability, representativeness, anchoring, framing, reference-dependence, and egocentrism.

This list will be familiar to anyone who has read popular books on decision making by Kahneman, Ariely, Bazerman, Gilovich, Heath, and others. Reading these books, with their warnings against overconfidence, one might conclude that it would be wise to reduce the confidence with which we go through life. If overconfidence can get us into so much trouble, it seems to follow that we should reduce it—but by how much? Should we minimize confidence entirely? That is a recipe for perpetual doubt and inaction.

If you instead turn to self-help books for guidance, you might be tempted to come to the opposite conclusion: the challenge is keeping your confidence up. These books come with exciting titles like Confidence: How to Overcome Your Limiting Beliefs and Achieve Your Goals and You Are a Badass: How to Stop Doubting Your Greatness and Start Living an Awesome Life. Books like these make greater confidence sound awfully inviting. But surely the right answer is not that we should be maximally confident. Maximal confidence about your future earning potential is likely to lead to unsustainable spending. Maximal confidence about your popularity is likely to make you insufferably annoying.[9] And if it leads you to take more risks, maximal confidence in your immortality may actually decrease your life expectancy.

There is another way—a middle way, between too much and not enough confidence. This Goldilocks zone of confidence is where rational beliefs meet reality. It is fundamentally based on truth and good sense. It is built on beliefs that can be justified by evidence and honest self-examination. It steers between the perilous cliff of overconfidence and the quicksand of underconfidence. It is not always easy to find this narrow path; it takes honest self-reflection, level-headed analysis, and the courage to resist wishful thinking.

This middle way is not the path to mediocrity—far from it. It is exceptionally rare to be well-calibrated in one’s confidence.[10] It requires that you understand yourself and what you are capable of achieving. It requires that you know your limitations and what opportunities are not worth pursuing. It requires that you act confidently based on what you know, even if it means taking a stand, making a bet, or speaking up for a viewpoint that is unpopular. But it also requires the willingness to consider the possibility that you are wrong, to listen to evidence, and to change your mind. This is a rare combination of courage and intellectual humility, which leads to actively open-minded thinking. It takes just the right amount of confidence.


References

[1] Ola Svenson, ‘Are We Less Risky and More Skillful than Our Fellow Drivers?’, Acta Psychologica, 47 (1981), 143–51.

[2] Provided everyone agrees on how to assess driving; Eric van den Steen, ‘Rational Overoptimism (and Other Biases)’, American Economic Review, 94.4 (2004), 1141–51.

[3] Marc Alpert and Howard Raiffa, ‘A Progress Report on the Training of Probability Assessors’, in Judgment under Uncertainty: Heuristics and Biases, ed. by Daniel Kahneman, Paul Slovic, and Amos Tversky (Cambridge: Cambridge University Press, 1982).

[4] Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).

[5] Ashraf Labib and Martin Read, ‘Not Just Rearranging the Deckchairs on the Titanic: Learning from Failures through Risk and Reliability Analysis’, Safety Science, 51.1 (2013), 397–413; Don A Moore and Samuel A Swift, ‘The Three Faces of Overconfidence in Organizations’, in Social Psychology of Organizations, ed. by Rolf Van Dick and J Keith Murnighan (Oxford: Taylor & Francis, 2010), pp. 147–84.

[6] Brad M Barber and Terrance Odean, ‘Trading Is Hazardous to Your Wealth: The Common Stock Investment Performance of Individual Investors’, Journal of Finance, 55.2 (2000), 773–806; Dominic D P Johnson, Overconfidence and War: The Havoc and Glory of Positive Illusions (Cambridge, MA: Harvard University Press, 2004).

[7] Emily Pronin, Daniel Y Lin, and Lee Ross, ‘The Bias Blind Spot: Perceptions of Bias in Self versus Others’, Personality and Social Psychology Bulletin, 28.3 (2002), 369–81.

[8] Max H Bazerman and Don A Moore, Judgment in Managerial Decision Making, 8th edn (New York: Wiley, 2013).

[9] Cameron Anderson and others, ‘Knowing Your Place: Self-Perceptions of Status in Face-to-Face Groups’, Journal of Personality and Social Psychology, 91.6 (2006), 1094–1110.

[10] Philip E Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction (New York: Signal, 2015).