Superforecasters’ Toolbox: Beliefs as Hypotheses

Nine years after the conclusion of the IARPA forecasting tournament, one Good Judgment discovery remains the most consequential in today’s dynamic world of forecasting: the discovery of Superforecasters. At its heart, Superforecasting rests on a simple but transformative idea: The best-calibrated forecasters treat their beliefs not as sacrosanct truths, but as hypotheses to be tested.

Superforecasting emerged as a game-changer in the four-year, $20-million research tournament run by the US Office of the Director of National Intelligence to see whether crowd-sourced forecasting techniques could deliver more accurate forecasts than existing approaches. The answer was a resounding yes—and there was more. About 2% of the participants in the tournament were consistently better than others in calling correct outcomes early. What gave them the edge, the research team behind the Good Judgment Project (GJP) found, was not some supernatural ability to see the future but the way they approached forecasting questions. For example, they routinely engaged in what Tetlock calls in his seminal book on Superforecasting “the hard work of consulting other perspectives.”

Central to the practice of Superforecasters is a mindset that encourages a continual reassessment of assumptions in light of new evidence. It is an approach that prizes being actively open-minded, constantly challenging our own perspectives to improve decision-making and forecasting accuracy. As we continue to explore the tools Superforecasters use in their daily work at Good Judgment, we look at what treating beliefs as hypotheses means and how it can be done in practice.

Belief Formation

Beliefs are shaped by our experiences and generally reinforced by our desire for consistency. When we encounter new information, our cognitive processes work to integrate it with our existing knowledge and perspectives. Sometimes this leads to the modification of prior beliefs or the formation of new ones. More often, however, this process is susceptible to confirmation bias and the anchoring effect. (Both Daniel Kahneman’s Thinking, Fast and Slow and Noise, the latter co-authored with Olivier Sibony and Cass R. Sunstein, provide an accessible overview of how cognitive biases affect our thinking and belief formation.)

It is not surprising, then, that beliefs in forecasting have traditionally been viewed as conclusions drawn from existing knowledge or expertise. These beliefs tended to be steadfast and slow to change. Leaders and forecasters alike didn’t like being seen as flip-floppers.

During the GJP, Superforecasters challenged this notion. In forecasting, where accuracy and adaptability are paramount, they demonstrated that the ability to change one’s mind brought superior results.

The Superforecaster’s Toolkit

What does this mean in practice? Treating beliefs as hypotheses means being actively open-minded. That in turn requires an awareness and mitigation of cognitive biases to ensure a more balanced and objective evaluation of information.

  • To begin with, Superforecasters constantly ask themselves, and each other, whether their beliefs are grounded in evidence rather than assumption.
  • As practitioners of Bayesian thinking, they update their probabilities based on new evidence.
  • They also emphasize the importance of diverse information sources, ensuring a comprehensive perspective.
  • They have the courage to listen to contrasting viewpoints and integrate them into their analysis.
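
The Bayesian updating step can be made concrete with a short sketch. All numbers below are hypothetical illustrations, not a real forecast:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after seeing evidence E."""
    numerator = prior * p_e_given_h
    denominator = numerator + (1 - prior) * p_e_given_not_h
    return numerator / denominator

# Hypothetical example: a forecaster starts at 30% that an incumbent
# loses re-election. A new poll of this kind would be 60% likely if the
# incumbent were losing, but only 30% likely otherwise.
posterior = bayes_update(0.30, 0.60, 0.30)
print(round(posterior, 3))  # 0.462
```

Note that the belief moves from 30% to about 46%, not to certainty: one piece of evidence shifts the hypothesis, it does not settle it.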

This method demands rigorous evidence-based reasoning, but it is worth the effort: it transforms forecasting from mere guesswork into a systematic evaluation of probabilities. It is this willingness to engage in the “hard work of consulting other perspectives” that has enabled the Superforecasters to beat the otherwise effective futures markets in anticipating the US Fed’s policy changes.

Cultivating a Superforecaster’s Mindset

Adopting this mindset is not without challenges. Emotional attachments to long-held beliefs can impede objectivity, and the deluge of information available can be overwhelming. But a Superforecaster’s mindset can and should be cultivated wherever good calibration is the goal. Viewing beliefs as flexible hypotheses is a strategy that champions open-mindedness over rigidity, ensuring that our conclusions are always subject to revision and refinement. It allows for a more effective interaction with information, fostering a readiness to adapt when faced with new data.

It is the surest path to better decisions.

Good Judgment Inc offers public and private workshops to help your organization take your forecasting skills to the next level.

We also provide forecasting services via our FutureFirst™ dashboard.

Explore our subscription options ranging from the comprehensive service to select channels on questions that matter to your organization.

Superforecasters’ Toolbox: Fermi-ization in Forecasting

Although usually a very private person, Superforecaster Peter Stamp agreed, at Good Judgment’s request, to be interviewed by a major Polish daily, Rzeczpospolita. The reporter began the interview with a pop quiz: he asked Peter to estimate the number of tram cars that serve the city of Warsaw, Poland’s capital. Without using the internet, and without ever having been to Warsaw, Peter came up with a remarkably accurate answer in under three minutes (only 10% away from the actual number, according to the reporter, Marek Wierciszewski). All he needed for his calculations were the typical size of a Warsaw tram and the relative importance of this means of transportation.

The method Peter used was Fermi-ization, and it is one of the key techniques Superforecasters employ to tackle complex questions even with minimal information.

What Is Fermi-ization?

In his day, physicist Enrico Fermi (1901-1954) was known not only for his groundbreaking contributions to nuclear physics but also for his ability to come up with surprisingly accurate estimates from scarce information. The technique he used was elegant in its simplicity: He would break down grand, seemingly intractable questions into smaller sub-questions or components that could be analyzed or researched. He would then make educated guesses about each component until he arrived at his final estimate.

Many science and engineering faculties today teach this method, including through assignments like “estimate the number of square inches of pizza the students will eat during one semester.” Instead of blurting out a random number, students are expected to break the question down into smaller bits and engage with each one to produce a thoughtful answer (in this example, the estimate would depend on such factors as the number of students, the number of pizzas a student would eat per week, and the size of an average pizza).
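
The pizza assignment can be sketched in a few lines. Every input below is an assumed order-of-magnitude guess, not real data:

```python
# Assumed inputs for the classic pizza Fermi estimate.
students = 20_000                  # students on campus
pizzas_per_student_per_week = 0.5  # half a pizza a week on average
weeks_per_semester = 15
pizza_diameter_in = 14
pizza_area = 3.14159 * (pizza_diameter_in / 2) ** 2  # square inches

total_sq_inches = (students * pizzas_per_student_per_week
                   * weeks_per_semester * pizza_area)
print(f"{total_sq_inches:.1e}")  # on the order of 2e7 square inches
```

The point is not the exact figure but that each factor can be defended, questioned, and refined independently.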

Fermi-ization is a valuable tool in a Superforecaster’s toolbox. Since the days of the original Good Judgment Project and continuing in Good Judgment Inc’s work today, Superforecasters have proved the usefulness of this technique in producing accurate forecasts on seemingly impossible questions—from the scale of bird-flu epidemics, oil prices, and interest rates to election outcomes, regional conflict, and vaccinations during the Covid-19 pandemic.

Uses of Fermi-ization in Forecasting

In their seminal book Superforecasting, Philip Tetlock and Dan Gardner list Fermi-ization as the second of the Ten Commandments for Aspiring Superforecasters. This placement is not a coincidence. In the world of Superforecasters—experts known for their consistently accurate forecasts—Fermi-ization is a fundamental tool, enabling them to arrive at accurate predictions even in response to questions that initially seem impossible to quantify.

“Channel the playful but disciplined spirit of Enrico Fermi,” Tetlock and Gardner write. “Decompose the problem into its knowable and unknowable parts. Flush ignorance into the open. Expose and examine your assumptions. Dare to be wrong by making your best guesses. Better to discover errors quickly than to hide them behind vague verbiage.”

Depending on the question, this process can take just a few minutes, as it did when Peter worked out an estimate of the number of Warsaw’s tram cars, or it can be methodical, slow, and painstaking. Either way, it is an invaluable road map wherever accuracy is the goal.

Fermi-ization in forecasting has multiple uses:

    • It helps the forecaster to avoid the classic cognitive trap of relying on quick-and-easy—and often incorrect!—answers where more thought is called for.
    • It forces the forecaster to sort the relevant components from the irrelevant ones.
    • It enables the forecaster to separate the elements of the question that are knowable from those that are unknowable.
    • It makes the forecasters examine their assumptions more carefully and pushes them toward making educated—rather than blind—guesses.
    • It informs both the outside and the inside view in approaching the question.


Three Steps in Fermi-ization

Fermi-ization becomes easier and increasingly effective with practice. Keep these three steps in mind as you give it a try.

    1. Unpack the question by asking, “What would it take for the answer to be yes? What would it take for it to be no?” or “What information would allow me to answer the question?”
    2. Give each scenario your best estimate.
    3. Dare to be wrong.
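
The three steps can be sketched on a yes/no question. The question and probabilities below are hypothetical illustrations:

```python
# Step 1: unpack "Will the trade deal be signed this year?" into parts.
# Step 2: give each part a best estimate (hypothetical numbers).
p_talks_resume = 0.7     # chance negotiations restart at all
p_signed_if_talks = 0.6  # chance of a signature, given talks resume

# Step 3: dare to be wrong: commit to a number you can score later.
p_yes = p_talks_resume * p_signed_if_talks
print(round(p_yes, 2))  # 0.42
```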


Not the Only Tool

Of course, Fermi-ization is not the only tool in a Superforecaster’s toolbox. Mitigating cognitive biases, recognizing and minimizing noise, remaining actively open-minded, and keeping score are all crucial components of the Superforecasting process. You can learn these techniques during one of our Superforecasting Workshops, or you can pose your own questions for Superforecasters to engage with through a subscription to FutureFirst™.

Superforecasters: Still Crème de la Crème Six Years On

The multi-year geopolitical forecasting tournament sponsored by the research arm of the US Intelligence Community (IARPA) that led to the groundbreaking discovery of “Superforecasters” ended in 2015. Since then, public and private forecasting platforms and wisdom-of-the-crowd techniques have only proliferated. Six years on, are Good Judgment’s Superforecasters still more accurate than a group of regular forecasters? What, if anything, sets their forecasts apart from the forecasts of a large crowd?

From the paper: Superforecasters’ accuracy outstrips wisdom-of-the-crowd scores (bar graph of error scores, with the Superforecasters’ scores lower than those of regular forecasters).

A new white paper by Dr. Chris Karvetski, senior data and decision scientist with Good Judgment Inc (GJ Inc), compares six years’ worth of forecasts on the GJ Inc Superforecaster platform and the GJ Open public forecasting platform to answer these questions.

Key takeaway: Superforecasters, while a comparatively small group, are significantly more accurate than their GJ Open forecasting peers. The analysis shows they can forecast outcomes 300 days prior to resolution better than their peers do at 30 days from resolution.
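
Good Judgment’s tournaments have measured accuracy with Brier-style error scores, where lower is better. As an illustration, here is the standard two-outcome Brier score; the paper’s exact scoring rule may differ in detail:

```python
def brier_score(forecast_p, outcome):
    """Two-outcome Brier score: sum of squared errors over both
    outcomes. 0 is perfect; 2 is a maximally wrong, confident call."""
    o = 1.0 if outcome else 0.0
    return (forecast_p - o) ** 2 + ((1 - forecast_p) - (1 - o)) ** 2

# A confident forecast scores very well when right, very badly when wrong.
print(round(brier_score(0.9, True), 4))   # 0.02
print(round(brier_score(0.9, False), 4))  # 1.62
```

Averaging a score of this kind across questions, and across the days each question is open, yields overall error scores like those compared above.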

Who Are “Superforecasters”?

During the IARPA tournament, Superforecasters routinely placed in the top 2% for accuracy among their peers and were a winning component of the experimental research program of the Good Judgment Project, one of five teams that competed in the initial tournaments. Notably, these elite forecasters were over 30% more accurate than US intelligence analysts forecasting the same events with access to classified information.

Key Findings

From the paper: Regular forecasters tend to show overconfidence, whereas the Superforecasters are 79% closer to perfect calibration (calibration plot).

Dr. Karvetski’s analysis, presented in “Superforecasters: A Decade of Stochastic Dominance,” uses six years of forecasting data (2015-2021) on 108 geopolitical forecasting questions posted simultaneously on Good Judgment Inc’s Superforecaster platform (available to FutureFirst™ clients) and on Good Judgment Open (GJ Open), a public forecasting platform where anyone can sign up, make forecasts, and track their accuracy over time against their peers.

The data showed:

  • Despite being relatively small in number, the Superforecasters are much more prolific, making almost four times as many forecasts per question as GJ Open forecasters.
  • They are also much more likely to update their beliefs via small, incremental changes to their forecast.
  • Based on the Superforecasters’ daily average error scores, they are 35.9% more accurate than their GJ Open counterparts.
  • Aggregation has a notably larger effect on GJ Open forecasters; even so, the Superforecasters’ aggregate forecasts are, on average, 25.1% more accurate than the aggregate GJ Open forecasts.
  • The average error score for GJ Open forecasters at 30 days from resolution is larger than any of the average error scores of Superforecasters on any day up to 300 days prior to resolution.
  • GJ Open forecasters, in general, were overconfident in their forecasts. The Superforecasters, in contrast, are 79% better calibrated. “This implies a forecast from Superforecasters can be taken at its probabilistic face value,” Dr. Karvetski explains.
  • Finally, the amount of between-forecaster noise is minimal, implying the Superforecasters are better at translating the variety of different signals into a numeric estimate of chance.

You can read the full paper here.

Where Can I Learn More About Superforecasting?

Subscription to FutureFirst, Good Judgment’s exclusive monitoring tool, gives clients 24/7 access to Superforecasters’ forecasts to help companies and organizations quantify risk, improve judgment, and make better decisions about future events.

Our Superforecasting workshops incorporate Good Judgment research findings and practical Superforecaster know-how. Learn more about private workshops, tailored to the needs of your organization, or public workshops that we offer.

A journey to becoming a Superforecaster begins at GJ Open. Learn more about how to become a Superforecaster.