A BBC listener recently asked the team behind their “CrowdScience” radio show and podcast whether she might, in fact, be a Superforecaster. She’s not the first to wonder if she would qualify – and not the first to be curious about how Good Judgment spots and recruits superior forecasting talent.
For all curious souls out there, here’s the inside scoop.
Superforecasters were a surprise discovery of the Good Judgment Project (GJP), the research-and-development project that preceded Good Judgment Inc. GJP was the winner of the massive US-government-sponsored four-year geopolitical forecasting tournament known as ACE.
As our co-founder Phil Tetlock explains in his bestseller Superforecasting: The Art and Science of Prediction, we set up GJP as a controlled experiment with random assignment of participants. Results from the first year of the tournament established that teaming and training could boost forecasting accuracy. Therefore, we selected GJP superforecasters from the top 2% of forecasters in each experimental condition to account for the advantages of having been on a team and/or received training. To minimize the chance that outstanding accuracy resulted from luck rather than skill, we limited eligibility for GJP superforecaster status to those forecasters who forecast on at least 50 questions during a tournament “season.”
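The two-part rule above (top 2% of accuracy within each experimental condition, with a minimum of 50 questions) amounts to a simple filter. Here is a minimal illustrative sketch, assuming hypothetical forecaster records scored by Brier score; the field names and data are invented for illustration and are not GJP's actual records.

```python
from collections import defaultdict

def select_superforecasters(forecasters, top_fraction=0.02, min_questions=50):
    """Illustrative sketch: keep forecasters in the top fraction of accuracy
    within their own experimental condition, provided they met the
    minimum-questions threshold. Field names here are hypothetical."""
    by_condition = defaultdict(list)
    for f in forecasters:
        if f["questions_answered"] >= min_questions:
            by_condition[f["condition"]].append(f)

    selected = []
    for group in by_condition.values():
        # Lower Brier score = better accuracy, so sort ascending.
        group.sort(key=lambda f: f["brier_score"])
        # Take the top fraction of each condition, but at least one person.
        cutoff = max(1, round(len(group) * top_fraction))
        selected.extend(group[:cutoff])
    return selected

# Toy demonstration data (entirely made up):
demo = [
    {"name": "A", "condition": "team+training", "questions_answered": 60, "brier_score": 0.18},
    {"name": "B", "condition": "team+training", "questions_answered": 55, "brier_score": 0.31},
    {"name": "C", "condition": "solo", "questions_answered": 70, "brier_score": 0.22},
    {"name": "D", "condition": "solo", "questions_answered": 40, "brier_score": 0.10},  # too few questions
]
print([f["name"] for f in select_superforecasters(demo)])  # → ['A', 'C']
```

Note that forecaster D, despite the best raw accuracy, is excluded by the minimum-questions rule, which is exactly the luck-versus-skill safeguard the paragraph above describes.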
When the ACE tournament ended in mid-2015, Good Judgment Inc invited the forecasters with the best-established track records to become the core of our professional Superforecaster contingent. Most of Good Judgment Inc’s 150+ professional Superforecasters qualified through their relative accuracy during GJP. They had not only earned GJP superforecaster status over one tournament season, but also confirmed their accuracy over 50+ questions in a second year of forecasting.
Good Judgment has continued to identify and recruit new professional Superforecasters since the ACE tournament ended via our public forecasting platform, Good Judgment Open. Many of these new recruits never had a chance to participate in ACE, and they have performed every bit as well as the “original” GJP superforecasters.
Each autumn, Good Judgment identifies and recruits potential Superforecasters from the ranks of GJ Open forecasters. If you think you have what it takes to be a Superforecaster, put that belief to an objective test: forecast on at least 100 GJ Open questions (cumulatively, not necessarily in one year).
Those who consistently outperform the crowd are automatically eligible for our annual pro Super selection process and, potentially, a trial engagement. Those who successfully complete the three-month probation period become full-fledged Superforecasters.
So, there you have it. Becoming a professional Superforecaster is the ultimate meritocracy. We don’t care where you live (as long as you have reliable Internet access!). When evaluating your qualifications for Superforecaster status, we ignore your gender, age, race, religion, and even education. (We are, however, keen to have an increasingly diverse pool of professional Superforecasters and encourage people of all backgrounds to test their skills on GJ Open!) We simply want to find the world’s most accurate forecasters – and to nurture their talents in a collaborative environment with other highly skilled professionals.
While you’re pondering whether you have what it takes to be a Superforecaster, check out the BBC CrowdScience podcast episode that answers their listener’s question and interviews Good Judgment Managing Director and professional Superforecaster Michael Story.
And, as always, you can visit our Superforecaster Analytics page to learn more about how Good Judgment’s professional Superforecasters provide early insights and well-calibrated probability estimates about key risks and opportunities to help governments, corporate clients, and NGOs make better decisions.