Posted by & filed under Industrial Organizational Psychology, Intelligence, Language-Thought, Persuasion.

Description: How well do you think you would do in the following situation? You are asked to predict how likely a number of world events are to occur (from 0 to 100%). Questions might include: Will North Korea launch a nuclear missile in the next year? Is Greece going to leave the European Union in the next 6 months? Yes, you are an amateur predictor, not an analyst or diplomat, but let’s up the ante further. Imagine you are making your predictions as a participant in a forecasting contest that includes representatives of a number of large American government agencies, including the CIA (who have access to intelligence they themselves have collected). How well do you predict you would do in a competition like that? I bet you do not like your odds! Now, would it surprise you to hear that a group of amateur forecasters who participated in some research on forecasting with a psychologist (Phil Tetlock) beat the experts so badly that the conveners of the competition (who are interested in improving prediction of such events) kicked out ALL the experts and studied the processes used by Tetlock’s ‘Superforecasters’? You can read about this in the linked article below, or you can listen to an interview with Phil Tetlock regarding the work that led to his book “Superforecasting: The Art and Science of Prediction.”

Source: Predicting the Future is Possible. These ‘Superforecasters’ Know How. Phil Tetlock, The Ezra Klein Show and The New York Times

Date: December 3, 2021

Image by geralt from Pixabay

Article Link:


So, did it surprise you to hear that having a lot of knowledge in an area and being a good predictor of what will happen next in that area (a typical definition of an expert) are largely unrelated? Tetlock’s work, and the work of others that he draws upon, speaks to some of the ways through which Superforecasters succeed. The distinction between taking an inside versus an outside view of a forecasting problem ties directly into the line of research by Daniel Kahneman and Amos Tversky that won Kahneman a Nobel prize (e.g., humans tend to ignore base-rate data (the outside view) in favor of case-specific impressions (the inside view), even when they know little of substance about the particular case they are being asked to predict). It is good to know that we can be better forecasters, though the challenge is to actually do what we should be doing to improve.
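Forecasting tournaments like the ones Tetlock describes typically grade probability predictions with the Brier score (mean squared error between the stated probability and what actually happened). As a minimal sketch of why the outside view helps, the hypothetical numbers below (my own illustration, not data from the article) compare a forecaster who anchors on a base rate against an overconfident forecaster who swings to extremes on case details alone:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0..1) and
    binary outcomes (0 or 1). Lower is better: 0.0 is perfect, and
    always saying 50% earns 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical scenario: five events, one of which occurred.
outcomes = [0, 0, 1, 0, 0]

# "Outside view": start near an assumed ~20% base rate, adjust modestly.
outside_view = [0.20, 0.15, 0.35, 0.20, 0.10]

# "Inside view": confident swings driven by case-specific stories.
inside_view = [0.05, 0.60, 0.10, 0.70, 0.50]

print(brier_score(outside_view, outcomes))  # lower (better) score
print(brier_score(inside_view, outcomes))   # higher (worse) score
```

With these made-up numbers the base-rate-anchored forecaster scores about 0.107 while the overconfident one scores about 0.383, illustrating how extreme, story-driven probabilities are punished when the stories do not pan out.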

Questions for Discussion:

  1. Who beat the experts in the forecasting competition?
  2. What did the Superforecasters do that the experts did not do?
  3. What sorts of things can you take away from the discussion of Superforecasters and use in your own life predictions?

References (Read Further):

Tetlock, P. E., & Gardner, D. (2016). Superforecasting: The art and science of prediction. Random House. Publisher Link

Schoemaker, P. J., & Tetlock, P. E. (2016). Superforecasting: How to upgrade your company’s judgment. Harvard Business Review, 94(5), 73-78. Link

Katsagounos, I., Thomakos, D. D., Litsiou, K., & Nikolopoulos, K. (2021). Superforecasting reality check: Evidence from a small pool of experts and expedited identification. European Journal of Operational Research, 289(1), 107-117. Link

Karvetski, C. W. Superforecasters: A Decade of Stochastic Dominance. Link

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131. Link

Kahneman, D., & Tversky, A. (1977). Intuitive prediction: Biases and corrective procedures. Decisions and Designs Inc Mclean Va. Link