Good Judgment Inc: A Year in Review

From our headquarters in Manhattan to Canada to Brazil and points in between, the Good Judgment team had a productive and exciting year in 2021. Here are some of the key developments and projects we worked on in the past year.

FutureFirst Launched

One of the biggest additions to Good Judgment’s spectrum of services in 2021 was the launch of FutureFirst, a client-driven subscription platform that gives subscribers unlimited access to all of Good Judgment’s Superforecasts.

In many ways, FutureFirst is a consolidation of our scientific experiments and several years of successful client engagements. We designed FutureFirst to

    • offer clients one-click access to the collective wisdom of our international team of Superforecasters—to their predictions, rationales, and sources;
    • enable easy monitoring of the Superforecasters’ predictions on a wide range of topics (economy and finance, geopolitics, environment, technology, health, and more); and
    • allow clients to nominate and upvote new questions that matter to their organization so that the topics are crowd-sourced from the community of clients directly.

With the addition of the Class of 2022 Superforecasters, Good Judgment now works with more than 180 professional Superforecasters. They reside on every continent except Antarctica and were identified through a rigorous process as being among the world’s most accurate forecasters.

There are currently some 80 active forecasts on FutureFirst, with new questions being added nearly every week. Taken together, the forecasts on the platform paint the big picture of global risk with accuracy not found anywhere else.

Improving Ways of Eliciting and Aggregating Forecasts

At the same time, we continue to crowdsource other ideas to enhance the value of our service for clients. In response to user feedback and innovations by our data science team, we:

    • now offer “continuous forecasts” so that clients can have a target forecast number as well as probabilities distributed across ranges;
    • provide “rolling forecasts” on a custom basis with predictions that automatically advance each day so that the time horizon is fixed—for instance, the probability of a recession in the next 12 months; and
    • will be launching API access shortly for clients to have a data feed directly into their models.
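To illustrate the difference between these formats, the sketch below contrasts a “continuous forecast” (probability mass spread across outcome ranges, alongside a target number) with a “rolling forecast” whose fixed horizon advances each day. All names and numbers are hypothetical illustrations, not the FutureFirst data model or API.

```python
from datetime import date, timedelta

# Hypothetical continuous forecast: a target ("point") number plus
# probability mass distributed across outcome ranges.
continuous_forecast = {
    "target": 2.4,  # e.g., expected GDP growth, in percent
    "ranges": {
        "< 1.5%": 0.10,
        "1.5% - 2.5%": 0.55,
        "2.5% - 3.5%": 0.30,
        "> 3.5%": 0.05,
    },
}

# The probabilities across the ranges should sum to 1.
assert abs(sum(continuous_forecast["ranges"].values()) - 1.0) < 1e-9


def rolling_window(as_of: date, horizon_days: int = 365) -> tuple[date, date]:
    """A rolling forecast keeps a fixed time horizon: the window's end
    advances with each new day instead of staying pinned to one calendar
    date, so the question is always "in the next 12 months"."""
    return as_of, as_of + timedelta(days=horizon_days)


start, end = rolling_window(date(2021, 12, 1))
print(end - start)  # the horizon stays fixed, whatever the as-of date
```

The design point is that a rolling question never “ages out”: re-evaluating `rolling_window` on a later date yields the same 12-month span, just shifted forward.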


Superforecasters in the Media

Superforecasters predict Jerome Powell’s reappointment

From questions about the Tokyo Olympics and the renomination of Jerome Powell to our early Covid-19 forecasts that closed and were scored in the past year, 2021 offered many examples of Good Judgment’s Superforecasters providing early and accurate insights. The European Central Bank and financial firms such as Goldman Sachs and T. Rowe Price all referenced our forecasts in their work. The year also brought both new and returning collaborations with some of the world’s leading media organizations and authors.

    • We worked with The Economist on their “What If” and “The World Ahead 2022” annual publications.
    • The Financial Times featured our forecast on Covid-19 vaccinations on their front page and on their Covid-19 data page.
    • Sky News launched an exciting current affairs challenge for the UK and beyond on our public platform GJ Open.
    • Best-selling authors Tim Harford and Adam Grant also ran popular forecasting challenges.
    • Adam Grant’s Think Again and Daniel Kahneman’s Noise (with coauthors Olivier Sibony and Cass R. Sunstein), both published in 2021, discuss the Superforecasters’ outstanding track record.
    • Magazines such as Luckbox and Entrepreneur published major articles about Good Judgment and the Superforecasters.


Training Better Forecasters

Our workshops continued to attract astute business and government participants who received the best training on boosting in-house forecasting accuracy. Of all the organizations that had a workshop in 2020, more than 90% came back for more in 2021. And they were joined by many more organizations in the public and private sectors throughout the year. Many of these firms now regularly send their interns and new hires through our workshops. Capstone LLC, a global policy analysis firm with headquarters in Washington, DC, London, and Sydney, went a step further: They made our workshops the cornerstone of multi-day mandatory training sessions for all their analysts.

“This led to the adoption of [S]uperforecasting techniques across all of our research and a more rigorous measuring of all our predictions,” Capstone CEO David Barrosse wrote on the company’s blog. “Ultimately the process means better predictions, and more value for clients.”

As many in our company are themselves Superforecasters, we start any forecast about Good Judgment in 2022 by first looking back. The science of Superforecasting has shown that establishing a base rate leads to more accurate predictions. If the developments of 2021 are any indication, next year will bring more exciting projects, fruitful collaborations, and effective ways to bring valuable early insight to our clients.
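The base-rate idea can be made concrete with a small Bayesian update: anchor on the historical frequency of an event, then adjust as case-specific evidence arrives. The numbers below are purely illustrative, not Good Judgment data.

```python
def bayes_update(base_rate: float,
                 likelihood_if_true: float,
                 likelihood_if_false: float) -> float:
    """Posterior probability of an event after observing one piece of
    evidence, starting from the historical base rate (Bayes' rule)."""
    numerator = base_rate * likelihood_if_true
    denominator = numerator + (1 - base_rate) * likelihood_if_false
    return numerator / denominator


# Illustrative: an event with a 20% historical base rate, and a signal
# three times as likely to appear when the event is coming than when not.
posterior = bayes_update(base_rate=0.20,
                         likelihood_if_true=0.60,
                         likelihood_if_false=0.20)
print(round(posterior, 3))  # 0.429: the evidence raises the forecast,
                            # but the base rate still anchors it
```

Starting from the "outside view" in this way guards against overreacting to a single vivid signal, which is one reason base rates improve accuracy.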

Superforecasters: Still Crème de la Crème Six Years On

The multi-year geopolitical forecasting tournament sponsored by the research arm of the US Intelligence Community (IARPA) that led to the groundbreaking discovery of “Superforecasters” ended in 2015. Since then, public and private forecasting platforms and wisdom-of-the-crowd techniques have only proliferated. Six years on, are Good Judgment’s Superforecasters still more accurate than a group of regular forecasters? What, if anything, sets their forecasts apart from the forecasts of a large crowd?

A bar graph showing the Superforecasters' error scores are lower than those of regular forecasters
From the paper: Superforecasters’ accuracy outstrips wisdom-of-the-crowd scores.

A new white paper by Dr. Chris Karvetski, senior data and decision scientist with Good Judgment Inc (GJ Inc), compares six years’ worth of forecasts on the GJ Inc Superforecaster platform and the GJ Open public forecasting platform to answer these questions.

Key takeaway: Superforecasters, while a comparatively small group, are significantly more accurate than their GJ Open forecasting peers. The analysis shows they can forecast outcomes 300 days prior to resolution better than their peers do at 30 days from resolution.

Who are “Superforecasters”?

During the IARPA tournament, Superforecasters routinely placed in the top 2% for accuracy among their peers and were a winning component of the Good Judgment Project, one of five research teams that competed in the initial tournaments. Notably, these elite forecasters were over 30% more accurate than US intelligence analysts forecasting the same events with access to classified information.

Key Findings

Calibration plot showing the Superforecasters are 79% closer to perfect calibration
From the paper: Regular forecasters tend to show overconfidence, whereas the Superforecasters are close to perfect calibration.

Dr. Karvetski’s analysis, presented in “Superforecasters: A Decade of Stochastic Dominance,” uses six years of forecasting data (2015-2021) on 108 geopolitical questions posted simultaneously on Good Judgment Inc’s Superforecaster platform (available to FutureFirst™ clients) and on Good Judgment Open (GJ Open), a public platform that allows anyone to sign up, make forecasts, and track their accuracy over time against their peers.

The data showed:

  • Despite being relatively small in number, the Superforecasters are much more prolific, making almost four times as many forecasts per question as GJ Open forecasters.
  • They are also much more likely to update their beliefs via small, incremental changes to their forecasts.
  • Based on their daily average error scores, the Superforecasters are 35.9% more accurate than their GJ Open counterparts.
  • Aggregation has a notably larger effect on GJ Open forecasts; even so, the Superforecasters’ aggregate forecasts are, on average, 25.1% more accurate than the aggregate of the GJ Open forecasts.
  • The average error score for GJ Open forecasters at 30 days from resolution is larger than the Superforecasters’ average error score on any day up to 300 days prior to resolution.
  • GJ Open forecasters, in general, were overconfident in their forecasts. The Superforecasters, in contrast, are 79% better calibrated. “This implies a forecast from Superforecasters can be taken at its probabilistic face value,” Dr. Karvetski explains.
  • Finally, the amount of between-forecaster noise among the Superforecasters is minimal, implying they are better at translating a variety of different signals into a numeric estimate of chance.
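The error scores behind findings like these are typically Brier-style scores: the squared difference between a probability forecast and the 0/1 outcome, where lower is better. The sketch below, using made-up forecasts rather than the paper’s data, shows how individual scores and a simple median-aggregate score are computed for one binary question.

```python
from statistics import median


def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome.
    Lower is better; a constant 50% forecast always earns 0.25."""
    return (forecast - outcome) ** 2


# Illustrative forecasts from four forecasters on one binary question
# that ultimately resolved "yes" (outcome = 1).
forecasts = [0.70, 0.85, 0.60, 0.90]
outcome = 1

individual_scores = [brier_score(p, outcome) for p in forecasts]
average_individual = sum(individual_scores) / len(individual_scores)

# A simple aggregate: score the median of the crowd's forecasts.
aggregate_score = brier_score(median(forecasts), outcome)

print(round(average_individual, 4))  # 0.0706
print(round(aggregate_score, 4))     # 0.0506: the aggregate beats
                                     # the average individual here
```

Real scoring averages such errors daily over a question’s life, which is how a “35.9% more accurate” comparison between two groups can be computed.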

You can read the full paper here.

Where Can I Learn More About Superforecasting?

A subscription to FutureFirst, Good Judgment’s exclusive monitoring tool, gives clients 24/7 access to the Superforecasters’ forecasts, helping companies and organizations quantify risk, improve judgment, and make better decisions about future events.

Our Superforecasting workshops incorporate Good Judgment research findings and practical Superforecaster know-how. Learn more about private workshops, tailored to the needs of your organization, or public workshops that we offer.

A journey to becoming a Superforecaster begins at GJ Open. Learn more about how to become a Superforecaster.

The Future of Health and Beyond: The Economist features Good Judgment’s Superforecasts

This summer, Good Judgment Inc collaborated with The Economist for the newspaper’s annual collection of speculative scenarios, “What If.” The theme this year was the future of health. In preparing the issue, The Economist asked the Superforecasters to work on several hypothetical scenarios—from America’s opioid crisis to the possibility of the Nobel prize for medicine being awarded to an AI. “Each of these stories is fiction,” the editors wrote in the 3 July edition, “but grounded in historical fact, current speculation, and real science.”

This was unlike most of the work that Good Judgment Inc does for clients. Our Superforecasters typically forecast concrete outcomes on a relatively short time horizon to inform decision- and policymakers about the key issues that matter to them today. The Economist’s “What If” project instead focused on a more speculative, distant future. To address the newspaper’s imaginative scenarios without sacrificing the rigor that Good Judgment’s Superforecasters and clients have become accustomed to, our question generation team crafted a set of relevant, forecastable questions to pair with each topic.

As a result, The Economist’s “What if America tackled its opioid crisis? An imagined scenario from 2025” was paired with our Superforecast: “How many opioid overdoses resulting in death will occur in the US in 2026?”

“What if biohackers injected themselves with mRNA? An imagined scenario from 2029” was paired with: “How many RNA vaccines and therapeutics for humans will be FDA-approved as of 2031?”

And “What if marmosets lived on the Moon? An imagined scenario from 2055” was paired with: “When will the first human have lived for 180 days on or under the surface of the moon?”

Karen Hagar, a Superforecaster, social scientist, and archaeologist from Tempe, Arizona, participated in forecasting these “far into the future” questions because, she says, she likes challenges.

“These questions were different than standard forecasting questions which typically resolve a year into the future,” she explains. “Both types of questions have inherent challenges. The questions with shorter resolution require extreme accuracy. One must research and mentally aggregate all incoming information. This includes any possible Black Swan events, current geopolitical and any social developments that may change within the short time frame. The dynamics of predicting outcomes of questions 10-20 years into the future required the same skill, but possibly even more research.”

The most exciting aspects of the “What If” project for Karen included learning the degree to which science has advanced. “For example, uncovering the scientific data regarding CRISPR technology and its application to Alzheimer’s research was amazing,” she says.

In making her forecasts for The Economist, she studied the questions from all angles and played devil’s advocate to challenge her colleagues’ thinking. This technique of red-teaming is frequently used by professional Superforecasters to confront groupthink and elicit more accurate predictions.

“What If” is only one of Good Judgment’s several collaborative projects with The Economist. The newspaper’s “World in 2021,” an annual series that has run since the “World in 2017” and forecasts key metrics for the year ahead, consisted of questions with shorter time horizons that were of immediate importance to decision-makers.

Superforecaster and Show Producer JuliAnn Blam says she is particularly interested in forecasting questions that focus on economic issues and the “World in 2021” project “didn’t disappoint.”

“The questions tended to be more pertinent to everyday life and issues that were of practical interest to me,” JuliAnn explains.

The “World in 2021” project included forecasting the world’s GDP growth, ESG (Environmental, Social, and Governance) investment, and work-from-home dynamics. But one of JuliAnn’s favorite questions was about racial diversity of board members in S&P 500 companies.

A screenshot of Good Judgment’s forecast monitor, FutureFirst, featuring the racial diversity forecast for The Economist’s “World in 2021” project.

“That one was hopeful, ‘woke’, and had me looking more closely at what a diversified board of directors can bring to a company’s outlook, marketing, product line, treatment of employees, etc.,” JuliAnn says. “It was a sort of stepping stone to looking into a lot more than just how many companies will appoint board members of color within the next year, and pushed the argument of why they should and what they would gain by doing so.”

Despite having a shorter time span than the “What If” forecasts, the “World in 2021” questions also required taking into account numerous factors, some of which weren’t even on the horizon when the questions launched in October 2020. Take, for instance, the global GDP question.

“There are so many factors to consider, between Xi and Evergrande and the resultant fallout of the cascade from that default, to new COVID variants stopping workforces, anti-vax movements, the infrastructure bill and the green new deal, and then inflation,” JuliAnn says. “Tons to balance and think about!”

Whether it’s a forecast of global GDP next year or a possibility of using the Moon as a base for space exploration in the following decade, the Superforecasters always apply their rigorous process and tested skills to provide thoughtful numeric forecasts on questions that matter. As for their reward, Karen puts it best: “The enjoyment from forecasting is honing and improving forecasting skill, acquiring new information, and interacting with intellectuals of the same knowledge base.”

You can find Good Judgment’s Superforecasts on the “What If” questions in The Economist’s print edition from 3 July 2021 or on their website. You can also ask us about a subscription to FutureFirst, Good Judgment’s forecast monitor, to view all current forecasts from our team of professional Superforecasters.