Good Judgment Inc: A Year in Review

From our headquarters in Manhattan to Canada to Brazil and points in between, the Good Judgment team had a productive and exciting year in 2021. Here are some of the key developments and projects we worked on in the past year.

FutureFirst Launched

One of the biggest additions to Good Judgment’s spectrum of services in 2021 was the launch of FutureFirst, a client-driven subscription platform that gives our user community unlimited access to all of Good Judgment’s Superforecasts.

In many ways, FutureFirst is a consolidation of insights from our scientific experiments and several years of successful client engagements. We designed FutureFirst to

    • offer clients one-click access to the collective wisdom of our international team of Superforecasters—to their predictions, rationales, and sources;
    • enable easy monitoring of the Superforecasters’ predictions on a wide range of topics (economy and finance, geopolitics, environment, technology, health, and more); and
    • allow clients to nominate and upvote new questions that matter to their organization, so that topics are crowd-sourced directly from our client community.
[Image: A sample of books that mention Superforecasters]

With the addition of the Class of 2022 Superforecasters, Good Judgment now works with more than 180 professional Superforecasters. They reside on every continent except Antarctica and have been identified through a rigorous process as being among the world’s most accurate forecasters.

There are currently some 80 active forecasts on FutureFirst, with new questions being added nearly every week. Taken together, the forecasts on the platform paint the big picture of global risk with accuracy not found anywhere else.

Improving Ways of Eliciting and Aggregating Forecasts

At the same time, we continue to crowdsource other ideas to enhance the value of our service for clients. In response to user feedback and innovations by our data science team, we:

    • now offer “continuous forecasts,” so that clients receive both a target forecast number and probabilities distributed across ranges;
    • provide “rolling forecasts” on a custom basis, with predictions that automatically advance each day so that the time horizon stays fixed (for instance, the probability of a recession in the next 12 months); and
    • will shortly launch API access, so that clients can feed the forecasts directly into their own models.


Superforecasters in the Media

[Image: Superforecasters predict Jerome Powell’s reappointment]

From questions about the Tokyo Olympics and the renomination of Jerome Powell to our early Covid-19 forecasts that closed and were scored in the past year, 2021 offered many examples of Good Judgment’s Superforecasters providing early and accurate insights. The European Central Bank and financial firms such as Goldman Sachs and T. Rowe Price all referenced our forecasts in their work. The year also brought both new and returning collaborations with some of the world’s leading media organizations and authors.

    • We worked with The Economist on their “What If” and “The World Ahead 2022” annual publications.
    • The Financial Times featured our forecast on Covid-19 vaccinations on their front page and on their Covid-19 data page.
    • Sky News launched an exciting current affairs challenge for the UK and beyond on our public platform GJ Open.
    • Best-selling authors Tim Harford and Adam Grant also ran popular forecasting challenges.
    • Adam Grant’s Think Again and Daniel Kahneman’s Noise (with coauthors Olivier Sibony and Cass R. Sunstein), both published in 2021, discuss the Superforecasters’ outstanding track record.
    • Magazines such as Luckbox and Entrepreneur published major articles about Good Judgment and the Superforecasters.


Training Better Forecasters

Our workshops continued to attract astute business and government participants seeking training on boosting in-house forecasting accuracy. Of the organizations that held a workshop in 2020, more than 90% came back for more in 2021, and they were joined by many more organizations in the public and private sectors throughout the year. Many of these firms now regularly send their interns and new hires through our workshops. Capstone LLC, a global policy analysis firm with headquarters in Washington, DC, London, and Sydney, went a step further: They made our workshops the cornerstone of multi-day mandatory training sessions for all their analysts.

“This led to the adoption of [S]uperforecasting techniques across all of our research and a more rigorous measuring of all our predictions,” Capstone CEO David Barrosse wrote on the company’s blog. “Ultimately the process means better predictions, and more value for clients.”

As many in our company are themselves Superforecasters, we start any forecast about Good Judgment in 2022 by first looking back. The science of Superforecasting has shown that establishing a base rate leads to more accurate predictions. If the developments of 2021 are a valid indication, next year will bring more exciting projects, fruitful collaborations, and effective ways to bring valuable early insight to our clients.

Superforecasters: Still Crème de la Crème Six Years On

The multi-year geopolitical forecasting tournament sponsored by the research arm of the US Intelligence Community (IARPA) that led to the groundbreaking discovery of “Superforecasters” ended in 2015. Since then, public and private forecasting platforms and wisdom-of-the-crowd techniques have only proliferated. Six years on, are Good Judgment’s Superforecasters still more accurate than a group of regular forecasters? What, if anything, sets their forecasts apart from the forecasts of a large crowd?

[Image: A bar graph showing the Superforecasters’ error scores are lower than those of regular forecasters]
From the paper: Superforecasters’ accuracy outstrips wisdom-of-the-crowd scores.

A new white paper by Dr. Chris Karvetski, senior data and decision scientist with Good Judgment Inc (GJ Inc), compares six years’ worth of forecasts on the GJ Inc Superforecaster platform and the GJ Open public forecasting platform to answer these questions.

Key takeaway: Superforecasters, while a comparatively small group, are significantly more accurate than their GJ Open forecasting peers. The analysis shows they can forecast outcomes 300 days prior to resolution better than their peers do at 30 days from resolution.

Who are “Superforecasters”?

During the IARPA tournament, Superforecasters routinely placed in the top 2% for accuracy among their peers and were a winning component of the Good Judgment Project, one of five research teams that competed in the initial tournaments. Notably, these elite forecasters were over 30% more accurate than US intelligence analysts forecasting the same events with access to classified information.

Key Findings

[Image: Calibration plot showing the Superforecasters are 79% closer to perfect calibration]
From the paper: Regular forecasters tend to show overconfidence, whereas the Superforecasters are close to perfect calibration.

Dr. Karvetski’s analysis, presented in “Superforecasters: A Decade of Stochastic Dominance,” uses forecasting data from a six-year period (2015-2021) on 108 geopolitical forecasting questions that were posted simultaneously on Good Judgment Inc’s Superforecaster platform (available to FutureFirst™ clients) and on Good Judgment Open (GJ Open), a public platform that allows anyone to sign up, make forecasts, and track their accuracy over time against their peers.

The data showed:

  • Despite being relatively small in number, the Superforecasters are much more prolific, making almost four times as many forecasts per question as GJ Open forecasters.
  • They are also much more likely to update their beliefs via small, incremental changes to their forecasts.
  • Based on daily average error scores, the Superforecasters are 35.9% more accurate than their GJ Open counterparts.
  • Aggregation has a notably larger effect on GJ Open forecasters; even so, the aggregate Superforecaster forecasts are, on average, 25.1% more accurate than the aggregate GJ Open forecasts.
  • The average error score for GJ Open forecasters at 30 days from resolution is larger than the Superforecasters’ average error score on any day up to 300 days prior to resolution.
  • GJ Open forecasters were, in general, overconfident in their forecasts. The Superforecasters, in contrast, are 79% better calibrated. “This implies a forecast from Superforecasters can be taken at its probabilistic face value,” Dr. Karvetski explains.
  • Finally, the amount of between-forecaster noise among the Superforecasters is minimal, implying they are better at translating a variety of different signals into a numeric estimate of chance.
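The “error scores” in these findings are Brier scores, the standard accuracy measure used in the IARPA tournament (0 is a perfect forecast; lower is better). As a rough illustration of how individual and aggregate scores can be compared (a simplified sketch with invented numbers, not the paper’s actual data or methodology), consider:

```python
# Minimal sketch: Brier scoring of individual and aggregated binary forecasts.
# All forecast numbers below are invented for illustration; they are not
# data from the white paper.

def brier(p: float, outcome: int) -> float:
    """Brier score for a binary question: lower is better.
    0.0 is a perfect forecast; 2.0 is maximally wrong."""
    return 2 * (p - outcome) ** 2

def mean(xs) -> float:
    xs = list(xs)
    return sum(xs) / len(xs)

# Probabilities two hypothetical groups assigned to an event that
# ultimately happened (outcome = 1).
group_a = [0.85, 0.90, 0.80, 0.88]  # consistent, well-calibrated
group_b = [0.55, 0.95, 0.40, 0.70]  # noisier, more spread out
outcome = 1

# Average of the individual scores vs. the score of the aggregate
# ("wisdom of the crowd") forecast, here taken as the group mean.
avg_score_a = mean(brier(p, outcome) for p in group_a)
agg_score_a = brier(mean(group_a), outcome)
avg_score_b = mean(brier(p, outcome) for p in group_b)
agg_score_b = brier(mean(group_b), outcome)

print(f"Group A: individual {avg_score_a:.4f}, aggregate {agg_score_a:.4f}")
print(f"Group B: individual {avg_score_b:.4f}, aggregate {agg_score_b:.4f}")
```

In this toy example, aggregation improves the noisier group far more than the consistent one, which mirrors the finding above that aggregation has a notably larger effect on GJ Open forecasters than on the Superforecasters.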

You can read the full paper here.

Where Can I Learn More About Superforecasting?

A subscription to FutureFirst, Good Judgment’s exclusive monitoring tool, gives clients 24/7 access to the Superforecasters’ forecasts, helping companies and organizations quantify risk, improve judgment, and make better decisions about future events.

Our Superforecasting workshops incorporate Good Judgment research findings and practical Superforecaster know-how. Learn more about private workshops, tailored to the needs of your organization, or public workshops that we offer.

A journey to becoming a Superforecaster begins at GJ Open. Learn more about how to become a Superforecaster.