Superforecasting® Explained in Podcasts and Videos

Superforecasting is a disciplined approach to forecasting that uses probabilistic thinking, continuous updating, and rigorous calibration to make well-informed judgments about future events. This approach is based on decades of research spearheaded by Dr. Philip Tetlock into what traits and tools make some people remarkably good at forecasting while others, including many experts, fall short. Since IARPA’s massive forecasting tournament of 2011-2015, Superforecasting has been proven to outperform traditional forecasting methods in many areas, including finance and policy decision-making (e.g., see our forecasting report on the early trajectory of Covid-19). Below, we’ve curated our favorite podcasts and videos that showcase the principles and real-world applications of Superforecasting.

Top 5 Podcasts and Videos

1. BBC Reel: Can You Learn to Predict the Future? (8:21)
This short video from BBC Reel introduces the concept of Superforecasting in an engaging and visual way. It explores the techniques that make accurate forecasting possible and discusses how anyone can improve their forecasting skills.

2. Quid Explore: Superforecasting with Dr. Warren Hatch (22:32)
In this detailed conversation, Dr. Warren Hatch, Superforecaster and CEO of Good Judgment Inc, explains the science behind Superforecasting. He discusses the traits of successful forecasters and shares practical tips for applying these skills in professional and personal decision-making.

3. More or Less: Superforecasting the Coronavirus (8:57)
Tim Harford talks to Terry Murray, GJ co-founder and project manager for the original Good Judgment Project (GJP), about how GJ Superforecasters tackled the uncertainties of the Covid-19 pandemic. This episode highlights how their methods and tools can be applied to making sense of real-world crises.

4. Talking Politics: David Spiegelhalter on Superforecasting (48:55)
In this longer conversation on the Talking Politics podcast, statistician Sir David Spiegelhalter discusses Superforecasting and what it takes to make well-calibrated judgments about uncertain events.

5. MarketWatch: Can an Ice Storm Predict the Next Meme Stocks? (25:38)
This podcast explores the intersection of forecasting and finance, showcasing how Superforecasting can shed light on trends in the stock market. In this episode, Dr. Hatch distinguishes a “prediction” from a “forecast,” describes the innate characteristics that make some people better at forecasting high-stakes world and financial events, and explains how anybody, with or without those traits, can get better at forecasting with practice.

Take a Deeper Dive: Edge Master Class on Superforecasting

For those seeking a more in-depth exploration of Superforecasting, consider the Edge Master Class on Superforecasting led by Dr. Philip Tetlock. This short course covers the foundational principles and techniques of Superforecasting and features discussions with renowned experts (including Dr. Daniel Kahneman, the Nobel laureate in economics and author of Thinking, Fast and Slow; Dr. Barbara Mellers, a leading researcher in decision-making and another key figure behind the GJP; Dr. Robert Axelrod, a political scientist specializing in international security, formal models, and complex adaptive systems), as well as entrepreneurs and journalists.

From Theory to Practice

Whether you’re new to Superforecasting or want to deepen your understanding, these podcasts and videos are a great place to start! Ready to take the next step? Superforecasters’ methods and traits can be learned and cultivated. Start building your own forecasting skills with our training programs.

Common Questions about Good Judgment Inc and Superforecasters

A Primer on Good Judgment Inc and Superforecasters

At Good Judgment Inc (GJI), the official home of Superforecasting®, we pride ourselves on our ability to provide well-calibrated and insightful forecasts. As we continue to partner with clients and media worldwide, it is worthwhile to address some of the common questions we receive about our work. Here is a primer on our story, probabilistic forecasts, and our team of Superforecasters.

What’s in a Name? GJP, GJI, and GJ Open

The Good Judgment Project (GJP)
In 2011, the Intelligence Advanced Research Projects Activity (IARPA) launched a massive tournament to identify the most effective methods for forecasting geopolitical events. Four years, 500 questions, and over a million forecasts later, the Good Judgment Project (GJP), led by Philip Tetlock and Barbara Mellers at the University of Pennsylvania, emerged as the clear winner of the tournament. The research project concluded in 2015, but its legacy lives on. The GJP is credited with the discovery of Superforecasters, people who are exceptionally skilled at assigning realistic probabilities to possible outcomes even on topics outside their primary subject-matter training.

Good Judgment Inc (GJI)
GJI is the commercial successor to the GJP and the official home of Superforecasting® today. We leverage the lessons learned during the IARPA tournament, along with insights gained in our subsequent work with Phil Tetlock and his research colleagues and with leading companies, academic institutions, governments, and non-governmental organizations, to provide the best and the latest in forecasting and training services. Our goal is to help organizations make better decisions by harnessing the power of accurate forecasts. GJI relies on a team of Superforecasters, as well as data and decision scientists, to provide forecasting and training to clients.

Good Judgment Open (GJ Open)
GJO, or GJ Open, is our public platform, open to anyone interested in making forecasts. Unlike GJI, which involves professional Superforecasters, GJO welcomes participation from the public. The “Open” in GJ Open not only signifies that it’s accessible to all but also draws a parallel to golf tournaments. Forecasting questions vary in their complexity, so there is no absolute score to indicate a “good” forecast. We use the median of participants’ scores as a benchmark, similar to par in golf, where lower scores indicate better performance.
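
GJ Open’s exact scoring rules aren’t spelled out here, but the “par” analogy can be illustrated with a standard Brier score compared against the median of all participants. The minimal sketch below makes that assumption; the forecaster names and probabilities are invented for illustration.

```python
from statistics import median

def brier_score(forecast_probs, outcome_index):
    """Brier score for one question: the sum of squared differences between
    the forecast probabilities and the actual outcome (1 for what happened,
    0 for everything else). Lower is better."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

# Hypothetical forecasts on a binary question (probabilities for ["Yes", "No"]);
# suppose "Yes" is what actually happened.
forecasts = {
    "alice": [0.80, 0.20],
    "bob":   [0.55, 0.45],
    "carol": [0.30, 0.70],
}
scores = {name: brier_score(probs, outcome_index=0) for name, probs in forecasts.items()}

# The median score serves as "par" for the question: being under par means
# beating the typical participant on that question.
par = median(scores.values())
for name, score in scores.items():
    print(f"{name}: Brier {score:.3f} ({'under' if score < par else 'at or above'} par {par:.3f})")
```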

A Note on Spelling
You may have noticed that “judgment” is spelled without an “e” on all our platforms. This is a consistent choice across GJP, GJI, and GJ Open, reflecting our preference for the parsimonious American spelling of the word.

Understanding Probabilistic Forecasts
Sample forecast on FutureFirst™, 12 July 2024

Our forecasts are not polls. They are aggregated probabilistic predictions about specific events. For instance, Superforecasters gave Joe Biden an 82% chance of winning the 2020 US presidential election. In other words, if the election could somehow be run 100 times under identical conditions, Biden would be expected to win about 82 of them.

A common misconception is interpreting a probabilistic forecast as “X% of Superforecasters say a particular outcome will happen.” In reality, each Superforecaster provides their own probabilistic forecast, and we aggregate these individual predictions to reach a collective forecast. Therefore, an 82% forecast does not mean 82% of Superforecasters believe a certain outcome will occur. It is an aggregated probability of the outcome (an 82% probability of it occurring and an 18% probability of a different outcome) based on all individual forecasts.
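
The precise aggregation algorithm Good Judgment uses is not described here, so the sketch below simply takes the median of individual probabilities to illustrate the distinction: the aggregate is a combined probability of the outcome, not a head count of forecasters. All numbers are invented.

```python
from statistics import median

# Hypothetical individual probabilities (each forecaster's chance of outcome X).
individual_forecasts = [0.75, 0.80, 0.85, 0.90, 0.78]

# A simple aggregate: the median of the individual probabilities.
# (This stands in for the actual aggregation method, which is not described
# here; the point is that the result is a combined probability.)
aggregate = median(individual_forecasts)
print(f"Aggregated forecast: {aggregate:.0%} chance of the outcome")

# Contrast with the misreading "X% of Superforecasters say it will happen":
share_predicting_yes = sum(p > 0.5 for p in individual_forecasts) / len(individual_forecasts)
print(f"Share of forecasters above 50%: {share_predicting_yes:.0%} (a different number entirely)")
```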

Understanding Superforecasters’ Backgrounds

Good Judgment works with some 180 Superforecasters from around the world whose forecasting accuracy placed them in the top 1-2% of the more than 100,000 forecasters who took part in the GJP or qualified on GJ Open. Our Superforecasters come from a wide range of professional fields, including finance, IT, humanities, social sciences, engineering, and more. This allows them to approach forecasting questions in a well-rounded way, combining their exceptional forecasting skills with specialized knowledge in different areas.

Age and Geographic Diversity
Superforecasters range in age from their 20s to their 70s and hail from different parts of the world. This geographic and demographic diversity helps to ensure that our forecasts are informed by a broad spectrum of experiences and viewpoints.

The Wisdom of the Crowd
We emphasize the importance of the wisdom of the crowd. Our Superforecasters read different publications in various languages and bring diverse perspectives to the table. To borrow terminology from Tetlock’s training materials in the GJP, some are Intuitive Scientists, others are Intuitive Historians, while still others are Intuitive Data Scientists.

Collaborative Nature of Forecasting
Forecasting at GJI is a team effort. We focus on collective intelligence. It’s not about individual forecasting superheroes tackling challenges alone but about identifying people who bring unique strengths to the table as a team of Superforecasters.

Get actionable early insights on top-of-mind questions by subscribing to our forecasting monitoring tool, FutureFirst™!

Beliefs as Hypotheses: The Superforecaster’s Mindset

Superforecasters’ Toolbox: Beliefs as Hypotheses

Nine years after the conclusion of the IARPA forecasting tournament, one Good Judgment finding remains the most consequential idea in today’s dynamic world of forecasting: the discovery of Superforecasters. The concept of Superforecasting has at its heart a simple but transformative idea: The best-calibrated forecasters treat their beliefs not as sacrosanct truths, but as hypotheses to be tested.

Superforecasting emerged as a game-changer in the four-year, $20-million research tournament run by the US Office of the Director of National Intelligence to see whether crowd-sourced forecasting techniques could deliver more accurate forecasts than existing approaches. The answer was a resounding yes—and there was more. About 2% of the participants in the tournament were consistently better than others in calling correct outcomes early. What gave them the edge, the research team behind the Good Judgment Project (GJP) found, was not some supernatural ability to see the future but the way they approached forecasting questions. For example, they routinely engaged in what Tetlock calls in his seminal book on Superforecasting “the hard work of consulting other perspectives.”

Central to the practice of Superforecasters is a mindset that encourages a continual reassessment of assumptions in light of new evidence. It is an approach that prizes being actively open-minded, constantly challenging our own perspectives to improve decision-making and forecasting accuracy. As we continue to explore the tools Superforecasters use in their daily work at Good Judgment, we look at what treating beliefs as hypotheses means and how it can be done in practice.

Belief Formation

Beliefs are shaped by our experiences and generally reinforced by our desire for consistency. When we encounter new information, our cognitive processes work to integrate it with our existing knowledge and perspectives. Sometimes this leads to the modification of prior beliefs or the formation of new ones. More often, however, this process is susceptible to confirmation bias and the anchoring effect. (Both Daniel Kahneman’s Thinking, Fast and Slow and Noise, the latter co-authored with Olivier Sibony and Cass R. Sunstein, provide an accessible overview of how cognitive biases affect our thinking and belief formation.)

It is not surprising, then, that traditionally in forecasting, beliefs have been viewed as conclusions drawn from existing knowledge or expertise. These beliefs tended to be steadfast and were slow to change. Leaders and forecasters alike didn’t like being seen as flip-floppers.

During the GJP, Superforecasters challenged this notion. In forecasting, where accuracy and adaptability are paramount, they demonstrated that the ability to change one’s mind brought superior results.

The Superforecaster’s Toolkit

What does this mean in practice? Treating beliefs as hypotheses means being actively open-minded. That, in turn, requires awareness and mitigation of cognitive biases to ensure a more balanced and objective evaluation of information.

  • To begin with, Superforecasters constantly ask themselves—and each other—whether their beliefs are grounded in evidence rather than assumption.
  • As practitioners of Bayesian thinking, they update their probabilities based on new evidence (see the sketch after this list).
  • They also emphasize the importance of diverse information sources, ensuring a comprehensive perspective.
  • They have the courage to listen to contrasting viewpoints and integrate them into their analysis.
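
As a minimal illustration of the Bayesian updating mentioned above, the sketch below applies Bayes’ rule to a single forecast. The prior and likelihoods are invented for illustration and do not reflect any actual Good Judgment question.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule: revise the probability of an event after seeing new
    evidence, given how likely that evidence would be if the event will
    happen versus if it won't."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# Illustrative numbers only: a forecaster starts at 30% and then sees a report
# they judge twice as likely to appear if the event is going to occur.
prior = 0.30
posterior = bayes_update(prior, likelihood_if_true=0.6, likelihood_if_false=0.3)
print(f"Prior: {prior:.0%}  ->  Posterior after the new evidence: {posterior:.0%}")
```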

This method demands rigorous evidence-based reasoning, but it is worth the effort, as it transforms forecasting from mere guesswork into a systematic evaluation of probabilities. It is this willingness to engage in the “hard work of consulting other perspectives” that has enabled the Superforecasters to beat the otherwise effective futures markets in foreseeing the US Fed’s policy changes.

Cultivating a Superforecaster’s Mindset

Adopting this mindset is not without challenges. Emotional attachments to long-held beliefs can impede objectivity, and the deluge of information available can be overwhelming. But a Superforecaster’s mindset can and should be cultivated wherever good calibration is the goal. Viewing beliefs as flexible hypotheses is a strategy that champions open-mindedness over rigidity, ensuring that our conclusions are always subject to revision and refinement. It allows for a more effective interaction with information, fostering a readiness to adapt when faced with new data.

It is the surest path to better decisions.

Good Judgment Inc offers public and private workshops to help your organization take your forecasting skills to the next level.

We also provide forecasting services via our FutureFirst™ dashboard.

Explore our subscription options, ranging from our comprehensive service to select channels on questions that matter to your organization.