How to Become a Superforecaster®

A BBC listener asked the team behind the “CrowdScience” radio show and podcast whether she might, in fact, be a Superforecaster. She’s not the first to wonder whether she would qualify – and not the first to be curious about how Good Judgment spots and recruits superior forecasting talent.

For all curious souls out there, here’s the inside scoop.

Superforecaster Origins: the Good Judgment Project

Superforecasters were a surprise discovery of the Good Judgment Project (GJP), the research-and-development project that preceded Good Judgment Inc. GJP was the winner of the massive US-government-sponsored four-year geopolitical forecasting tournament known as ACE.

As our co-founder Phil Tetlock explains in his bestseller Superforecasting: The Art and Science of Prediction, we set up GJP as a controlled experiment with random assignment of participants. Results from the first year of the tournament established that teaming and training could boost forecasting accuracy. Therefore, we selected GJP superforecasters from the top 2% of forecasters in each experimental condition to account for the advantages of having been on a team and/or received training. To minimize the chance that outstanding accuracy resulted from luck rather than skill, we limited eligibility for GJP superforecaster status to those forecasters who participated in at least 50 forecasting questions during a tournament “season.”

Professional Superforecaster Selection

When the ACE tournament ended in mid-2015, Good Judgment Inc invited the forecasters with the best-established track records to become the core of our professional Superforecaster contingent. Most of Good Judgment Inc’s 150+ professional Superforecasters qualified through their relative accuracy during GJP. They had not only earned GJP superforecaster status over one tournament season, but also confirmed their accuracy over 50+ questions in a second year of forecasting.

Good Judgment has continued to identify and recruit new professional Superforecasters since the ACE tournament ended via our public forecasting platform, Good Judgment Open. Many of these new recruits never had a chance to participate in ACE – and they have performed every bit as well as the “original” GJP superforecasters.

Do You Have What It Takes?

Each autumn, Good Judgment identifies and recruits potential Superforecasters from the ranks of GJ Open forecasters. If you think you have what it takes to be a Superforecaster, put that belief to an objective test: forecast on at least 100 GJ Open questions (cumulatively, not necessarily in one year). We look at many metrics, but the one we watch most closely is average accuracy score per closed question. We also look at comment quality, because understanding the rationales behind forecasts is important to our partners and clients. Collegiality matters too, to ensure good teamwork with other Superforecasters. And it goes without saying that you’ll need to be in compliance with the GJ Open Terms of Service.
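The exact scoring formula isn’t spelled out here, but the standard accuracy metric for probabilistic forecasts is the Brier score – the squared error between the forecast probability and the 0/1 outcome, where lower is better. A minimal sketch for binary questions, using an entirely hypothetical track record:

```python
def brier_score(prob: float, outcome: int) -> float:
    """Squared error between the forecast probability and the 0/1 outcome."""
    return (prob - outcome) ** 2

def average_accuracy(forecasts) -> float:
    """Mean Brier score across closed questions (lower is better)."""
    return sum(brier_score(p, o) for p, o in forecasts) / len(forecasts)

# Hypothetical record: (probability assigned to "yes", actual outcome)
history = [(0.8, 1), (0.3, 0), (0.9, 1), (0.6, 0)]
print(average_accuracy(history))  # 0.125
```

A forecaster who always said 50% would score 0.25 on every question, so sustained scores well below that are one sign of genuine skill rather than hedging.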

Those who consistently outperform the crowd are automatically eligible for our annual professional Superforecaster selection process and, potentially, a trial engagement. Those who successfully complete a three-month probation period become full-fledged Superforecasters.

So, there you have it. Becoming a professional Superforecaster is the ultimate meritocracy. We don’t care where you live (as long as you have reliable Internet access!). When evaluating your qualifications for Superforecaster status, we ignore your gender, age, race, religion, and even education. (We are, however, keen to have an increasingly diverse pool of professional Superforecasters and encourage people of all backgrounds to test their skills on GJ Open!) We simply want to find the world’s most accurate forecasters – and to nurture their talents in a collaborative environment with other highly skilled professionals.

If you are not invited, please continue forecasting on GJ Open and keep honing your skills. We find the process often identifies GJ Open “veterans” who have recently elevated their performance into the top tier.

… If You’re Still Curious

While you’re pondering whether you have what it takes to be a Superforecaster, check out the BBC CrowdScience podcast episode that answers their listener’s question.

And, as always, you can visit our Superforecaster Analytics page to learn more about how Good Judgment’s professional Superforecasters provide early insights and well-calibrated probability estimates about key risks and opportunities to help governments, corporate clients, and NGOs make better decisions.

The Power of Groups Thinking Together … without Groupthink

“Teamwork, teamwork, that’s what it takes.”

The cheerleaders’ chant may spur high-school athletes to greater success. But those who study judgment and decision-making have found mixed results from teaming.

That’s why Good Judgment Project researchers weren’t sure what to expect eight years ago, when we launched a controlled experiment to see whether having forecasters work in small teams, rather than as individuals, would help or harm accuracy. Forecasting teams could share information and divide the workload among themselves; however, groupthink within teams could produce pressures for conformity and reduce the diversity of opinions.

Round 1 of IARPA’s (the US Intelligence Advanced Research Projects Activity’s) massive four-year forecasting tournament gave us compelling evidence in favor of teaming. With simple aggregations, our team opinion pools even outperformed prediction markets, previously considered the gold standard for crowdsourced forecasting accuracy. We replicated this result in each of the three subsequent years of the research project.
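The post doesn’t specify the aggregation used, but a common simple opinion pool is the median of team members’ probabilities, and GJP researchers found that “extremizing” the pooled value – pushing it away from 0.5 – helps offset the crowd’s tendency toward underconfidence. A sketch with hypothetical team forecasts; the extremizing exponent here is illustrative, not GJP’s calibrated value:

```python
import statistics

def median_pool(probs):
    """Simple opinion pool: the median of individual probability forecasts."""
    return statistics.median(probs)

def extremize(p: float, a: float = 2.0) -> float:
    """Push a pooled probability away from 0.5 (for a > 1) to offset
    the underconfidence typical of averaged crowd forecasts."""
    return p ** a / (p ** a + (1 - p) ** a)

team = [0.6, 0.7, 0.65, 0.8, 0.7]   # hypothetical individual forecasts
pooled = median_pool(team)           # 0.7
print(extremize(pooled))             # ≈ 0.845
```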

As we formed our commercial venture, we wondered whether the accuracy gains from teaming would be equally evident when forecasters met face-to-face, rather than online. There’s good news. Good Judgment’s experience with nearly 100 public and private workshops suggests that well-structured teaming techniques improve forecasting accuracy when teammates collaborate in person, not just when they work asynchronously online.

Workshop attendees receive an introduction to Superforecasting principles, including approaches to minimize cognitive biases and to break forecasting problems down into more tractable pieces. Then, they break into small groups, facilitated by professional Superforecasters and Good Judgment researchers, to practice their new skills by making forecasts on real-world questions that the group selects. Each participant makes an initial, private forecast, after which the entire group shares their forecasts and reasoning. Finally, they each post an anonymous update to their initial forecast. Good Judgment records those updates and tracks forecasters’ accuracy after the outcomes of the forecasting questions are known (i.e., the question has “resolved”).
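One natural way to quantify the round-over-round gains from this two-stage process is the percent reduction in the group’s mean Brier score between the private first-round forecasts and the post-discussion updates. A sketch with hypothetical numbers for a single binary question that resolved “yes”:

```python
def brier(p: float, outcome: int) -> float:
    """Squared error between forecast probability and the 0/1 outcome."""
    return (p - outcome) ** 2

def improvement(round1, round2, outcome: int) -> float:
    """Percent reduction in the group's mean Brier score from round 1 to round 2."""
    b1 = sum(brier(p, outcome) for p in round1) / len(round1)
    b2 = sum(brier(p, outcome) for p in round2) / len(round2)
    return 100 * (b1 - b2) / b1

# Hypothetical workshop question that resolved "yes" (outcome = 1):
first  = [0.5, 0.6, 0.4, 0.55]   # private forecasts before discussion
second = [0.7, 0.75, 0.6, 0.7]   # anonymous updates after discussion
print(round(improvement(first, second, 1), 1))  # 58.6
```

In this illustrative case the group moved toward the eventual outcome after discussion, so its mean Brier score fell; a group drifting away from the outcome (as in the Maduro question mentioned below) would produce a negative value.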

We looked at data from five workshops in the US and Europe in which forecasters made predictions on questions that resolved in 2019. Workshop participants came from both the private and public sectors. Three workshops mixed participants from different organizations; the other two were private workshops in which all participants came from a single entity.

The same pattern of improved accuracy after group discussion occurred in every workshop, with a mean improvement in group accuracy of over 20%. Only once (when forecasting whether Nicolas Maduro would remain in power in Venezuela) did participants become, on average, less accurate after the second round of forecasting. But for wide-ranging topics illustrated below, the structured teaming experience helped groups make better forecasts.

Labour Government in the UK: Accuracy Up 76%

AWS Share of Cloud Computing Market: Accuracy Up 88%

US Beef Consumption in 2018: Accuracy Up 97%

You can experience the power of groups thinking together at one of Good Judgment’s upcoming public workshops, whether virtual or in-person.

Or contact us to arrange a private workshop specially tailored for your organization.

Superforecasting the Fed

[This post is from our 2019 archives.]

You know you’re onto something when the world’s central banks start paying attention.

As reported by Bloomberg, researchers at the Federal Reserve Bank of New York are pioneering ways to apply Superforecasting® techniques in their work. It’s also a topic of discussion elsewhere in the Federal Reserve system.

Meanwhile, staffers at the Bank of England provided a tentative yes to the question posed in their blog post, “Can central bankers become Superforecasters?” Their chief economist recommends that central banks engage directly with the public via an online forecasting platform, which would “open central banks’ ears (and eyebrows) to a wider range of societal stakeholders when setting policy.”

As it happens, central bankers can already do so on Good Judgment Open in the “Finance Forecasting Challenge,” sponsored by CFA Societies in the U.S. and Canada. The challenge is open to all and features questions on Fed policy decisions, economic data, and asset class returns.

And then there are the Superforecasters themselves, who shadow-forecast the Fed’s own predictions for key macroeconomic variables, including growth, inflation, and unemployment, as well as the federal funds rate itself. Earlier this year, the Superforecasters began to signal a risk that inflation in 2019 would fall below the Fed’s expected inflation rate of 1.9% for the year, which would help set the stage for a shift by the Fed to start cutting interest rates again. Since then, the annual inflation rate has averaged 1.4%.

On the eve of the Fed’s July meeting, the Superforecasters put the probability of a rate cut at 98%, in line with most market observers. Looking ahead, the Superforecasters also project at least a 20% probability of additional cuts at each of the next two meetings. Subscribers to Superforecaster® Analytics can monitor updates to these forecasts on a dedicated dashboard.

If you would like to see what the Superforecasters are saying about the rest of 2019 and 2020, please drop us a line and we’ll be happy to send the latest report.