Beliefs as Hypotheses: The Superforecaster’s Mindset

Superforecasters’ Toolbox: Beliefs as Hypotheses

Nine years after the conclusion of the IARPA forecasting tournament, one Good Judgment discovery remains the most consequential idea in today’s dynamic world of forecasting: the Superforecasters themselves. At the heart of Superforecasting is a simple but transformative idea: the best-calibrated forecasters treat their beliefs not as sacrosanct truths, but as hypotheses to be tested.

Superforecasting emerged as a game-changer in the four-year, $20-million research tournament run by the US Office of the Director of National Intelligence to test whether crowd-sourced forecasting techniques could deliver more accurate forecasts than existing approaches. The answer was a resounding yes, and there was more. About 2% of the participants in the tournament were consistently better than the rest at calling outcomes correctly, and early. What gave them the edge, the research team behind the Good Judgment Project (GJP) found, was not some supernatural ability to see the future but the way they approached forecasting questions. For example, they routinely engaged in what Philip Tetlock, in his seminal book Superforecasting, calls “the hard work of consulting other perspectives.”

Central to the practice of Superforecasters is a mindset that encourages a continual reassessment of assumptions in light of new evidence. It is an approach that prizes being actively open-minded, constantly challenging our own perspectives to improve decision-making and forecasting accuracy. As we continue to explore the tools Superforecasters use in their daily work at Good Judgment, we look at what treating beliefs as hypotheses means and how it can be done in practice.

Belief Formation

Beliefs are shaped by our experiences and generally reinforced by our desire for consistency. When we encounter new information, our cognitive processes work to integrate it with our existing knowledge and perspectives. Sometimes this leads to the modification of prior beliefs or the formation of new ones. More often, however, this process is susceptible to confirmation bias and the anchoring effect. (Both Daniel Kahneman’s Thinking, Fast and Slow and Noise, the latter co-authored with Olivier Sibony and Cass R. Sunstein, provide an accessible overview of how cognitive biases affect our thinking and belief formation.)

It is not surprising, then, that beliefs in forecasting have traditionally been viewed as conclusions drawn from existing knowledge or expertise. These beliefs tended to be steadfast and slow to change. Leaders and forecasters alike did not want to be seen as flip-floppers.

During the GJP, Superforecasters challenged this notion. In forecasting, where accuracy and adaptability are paramount, they demonstrated that the ability to change one’s mind brought superior results.

The Superforecaster’s Toolkit

What does this mean in practice? Treating beliefs as hypotheses means being actively open-minded. That, in turn, requires an awareness and mitigation of cognitive biases to ensure a more balanced and objective evaluation of information.

  • To begin with, Superforecasters constantly ask themselves, and each other, whether their beliefs are grounded in evidence rather than assumption.
  • As practitioners of Bayesian thinking, they update their probabilities as new evidence arrives (a minimal sketch of such an update follows this list).
  • They also emphasize the importance of diverse information sources, ensuring a comprehensive perspective.
  • They have the courage to listen to contrasting viewpoints and integrate them into their analysis.

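To make the Bayesian step above concrete, here is a minimal sketch in Python. The scenario, the function name, and all of the probabilities are invented for illustration; they are not drawn from Good Judgment’s own materials.

```python
# A minimal, illustrative Bayesian update: revise the probability of a
# hypothesis after seeing one new piece of evidence. All numbers are invented.

def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Example: a forecaster starts at 30% that a central bank will cut rates this year.
# A new data release is judged twice as likely to appear if a cut is coming.
posterior = bayes_update(prior=0.30, p_evidence_if_true=0.60, p_evidence_if_false=0.30)
print(f"Updated probability: {posterior:.0%}")  # roughly 46%
```

The arithmetic is simple; the discipline lies in doing it every time relevant evidence arrives, and in weighting the update by how diagnostic that evidence really is.
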
This method demands rigorous evidence-based reasoning, but it is worth the effort: it transforms forecasting from mere guesswork into a systematic evaluation of probabilities. It is this willingness to engage in the “hard work of consulting other perspectives” that has enabled the Superforecasters to beat the otherwise effective futures markets in anticipating the US Fed’s policy changes.

Cultivating a Superforecaster’s Mindset

Adopting this mindset is not without challenges. Emotional attachments to long-held beliefs can impede objectivity, and the deluge of information available can be overwhelming. But a Superforecaster’s mindset can and should be cultivated wherever good calibration is the goal. Viewing beliefs as flexible hypotheses is a strategy that champions open-mindedness over rigidity, ensuring that our conclusions are always subject to revision and refinement. It allows for a more effective interaction with information, fostering a readiness to adapt when faced with new data.

It is the surest path to better decisions.

Good Judgment Inc offers public and private workshops to help your organization take your forecasting skills to the next level.

We also provide forecasting services via our FutureFirst™ dashboard.

Explore our subscription options, ranging from the comprehensive service to select channels covering the questions that matter to your organization.

Superforecasters vs FT Readers: Results of an Informal Contest in the Financial Times


In early 2023, the Financial Times launched an informal contest that pitted the predictive prowess of the FT’s readership against that of Good Judgment’s Superforecasters. The contest involved forecasting key developments in the year ahead, ranging from geopolitical events to economic indicators to heat waves to sport. The results? Illuminating.

“To help illustrate what makes a superforecaster worthy of the name, the FT asked both readers and Good Judgment superforecasters to make predictions on ten questions early this year, spanning from GDP growth rates to viewership of the Fifa women’s world cup final,” Joanna S Kao and Eade Hemingway explain in their article.

A total of 95 Superforecasters made forecasts on the questions, while the reader poll had about 8,500 respondents.

The Results

Nine of the ten questions have now been scored. On a scale where 0.5 equals guessing and 1 equals a perfect prediction, the Superforecasters scored an average of 0.91 across those nine questions, significantly outperforming the FT readers, who scored 0.73.

In this informal contest, the Superforecasters continued to work on the questions throughout the year, while the reader poll closed early. Based on everyone’s initial forecasts as of 2 February 2023, however, the Superforecasters outperformed the general crowd on eight out of nine questions.

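As a rough illustration only (the article excerpted above does not spell out the FT’s exact scoring rule, so the interpretation below is an assumption), one simple scale on which 0.5 corresponds to pure guessing and 1 to perfect prediction is the average probability a forecaster assigned to the outcome that actually occurred:

```python
# Hypothetical scoring sketch; the FT's actual scoring rule is not specified here.
# On this scale, always forecasting 50% on binary questions scores 0.5 (guessing),
# and assigning 100% to every realized outcome scores 1.0 (perfect).

def average_credited_probability(forecasts: list[float], outcomes: list[int]) -> float:
    """forecasts: probability assigned to 'yes'; outcomes: 1 if 'yes' occurred, else 0."""
    credited = [p if outcome == 1 else 1.0 - p for p, outcome in zip(forecasts, outcomes)]
    return sum(credited) / len(credited)

# Invented example with three binary questions:
print(average_credited_probability([0.90, 0.30, 0.85], [1, 0, 1]))  # (0.90 + 0.70 + 0.85) / 3 ≈ 0.82
```

Whatever the precise rule, the gap between 0.91 and 0.73 reflects probabilities that sat consistently closer to the outcomes that actually occurred.
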
Key to Forecasting Success

Key to the Superforecasters’ success, as the article notes, is their methodology. They focus on gathering comprehensive information, minimizing biases, and reducing the influence of irrelevant factors that only create noise. This methodological rigor stems from the foundational work of Philip Tetlock, a pioneer in the study of expert predictions and co-founder of Good Judgment Inc.

Read the full article in the FT for a fascinating glimpse into the realm of Superforecasting.

To benefit from Superforecasters’ insights on dozens of top-of-mind questions with probabilities that are updated daily, learn more about our forecast monitoring service, FutureFirst.

 



Ready to Elevate Your Forecasting Game in 2024?
Embrace the Future with Our Exclusive End-of-Year Offer!

As we approach the end of another dynamic year, it’s time to think ahead. Are you looking to empower your team’s professional development in 2024? Look no further. Our renowned Superforecasting® Public Workshop will help you make your forecasting more precise than ever.

Join Our January 2024 Public Workshop – Unleash the Power of Accurate Forecasts!

➡️ What to Expect in the Superforecasting Workshop?

  • Advanced Techniques: Dive into the latest forecasting methodologies.
  • Expert Insights: Learn directly from top forecasters and researchers.
  • Interactive Sessions: Engage in hands-on exercises for practical learning.

 

📈 Special Limited-Time Offer:
Claim your 20% discount on the January Public Workshop (16 & 18 January 2024). Use code GJ20XMS at registration. Hurry, offer valid while seats last!

Stop Guessing, Start Superforecasting with Good Judgment’s Superforecasting® Workshop. Register here.


 

Full Marks from The Economist


Good Judgment’s team of Superforecasters received full marks from The Economist for their forecasts published last year in “The World Ahead 2023” issue. Now that eight of the nine questions have resolved, The Economist’s editors were able to score the Superforecasters’ performance.

“The Good Judgment team had a good year in 2023, correctly forecasting the outcomes of the eight questions that were resolved,” the editorial team writes in the “The World Ahead 2024” print issue. “Global growth was 3%, China grew by 5%, ruling-party candidates won in Nigeria and Turkey, Vladimir Putin was not ousted, there was no election in Britain, no clash over Taiwan, and no nuclear device detonated by Russia.”

As to the ninth question in the 2023 publication, the Superforecasters continue to see a protracted conflict in Ukraine, likely going beyond 1 October 2024. That question remains open, and, as The Economist team notes, “Events in 2023 did not prove them wrong.”

“The World Ahead 2024” from The Economist is now available online and in print, and once again features the Superforecasters’ take on key questions for 2024. See their forecasts in the newspaper—or subscribe to FutureFirst™ to access all their forecasts that are updated daily.

About Good Judgment

Good Judgment Inc is the successor to the Good Judgment Project, a research team that emerged as the undisputed victor in a massive forecasting competition (the Aggregative Contingent Estimation, or ACE, tournament) sponsored by the Intelligence Advanced Research Projects Activity (IARPA) of the US Office of the Director of National Intelligence (ODNI). Spanning four years, 500 questions, and over a million forecasts, that research project confirmed and refined the methods that lead to the best possible forecast accuracy and is credited with the discovery of Superforecasters: people who are exceptionally skilled at assigning accurate probabilities to future outcomes. Good Judgment Inc is now making this winning approach to harnessing the wisdom of the crowd available for commercial use. Good Judgment’s Superforecasters are men and women around the world who have gone through a rigorous qualification process, demonstrating consistently high accuracy and quality commentary in their forecasts.

About FutureFirst™

FutureFirst, Good Judgment’s exclusive monitoring tool, gives 24/7 access to timely insights on top-of-mind questions from a diverse global team of professional Superforecasters. It combines the advantages of an expert network with model-friendly quantitative forecasts of likely outcomes of key events. Daily forecast updates from our subscription service allow clients to spot emerging risks earlier and see ahead of the competition.