Superforecasters represented the cream of the crop of the Good Judgment Project forecasters. And they’ve proven themselves time and again since turning professional in 2015. Below, we present data about their track record in both absolute and relative terms.
“Team Good Judgment, led by Philip Tetlock and Barbara Mellers of the University of Pennsylvania, beat the control group by more than 50%. This is the largest improvement in judgmental forecasting accuracy observed in the literature.”
Steven Rieber, Program Manager, IARPA
An analysis of Good Judgment Project forecasts by UC-Irvine decision scientist Mark Steyvers found that Superforecasters anticipated events 400 days in advance as accurately as regular forecasters could see those events 150 days ahead.
Three headline metrics summarize this track record:
- Forecasting questions posed to the Superforecasters that have "resolved"
- Average percentage of days on which our forecasts placed the highest probability on the "correct" outcome
- Average probability that Superforecasters assigned to the "correct" outcome over all forecasting days
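The last two metrics above can be sketched in code. The snippet below is an illustrative reconstruction, not Good Judgment's actual scoring methodology: it assumes each forecasting day's forecast is a mapping from outcome labels to probabilities, and the function name and data are hypothetical.

```python
def score_question(daily_forecasts, correct_outcome):
    """Score a resolved question against its correct outcome.

    Returns a tuple:
      - percentage of days on which the correct outcome had the
        highest assigned probability
      - average probability assigned to the correct outcome
        over all forecasting days
    """
    days = len(daily_forecasts)
    # Days where the forecast's top-probability outcome was the correct one
    top_days = sum(
        1 for f in daily_forecasts if max(f, key=f.get) == correct_outcome
    )
    avg_prob = sum(f[correct_outcome] for f in daily_forecasts) / days
    return top_days / days * 100, avg_prob

# Example: three days of forecasts on a yes/no question that resolved "yes"
forecasts = [
    {"yes": 0.55, "no": 0.45},
    {"yes": 0.70, "no": 0.30},
    {"yes": 0.40, "no": 0.60},
]
pct, avg = score_question(forecasts, "yes")
# pct ≈ 66.7 (correct outcome led on 2 of 3 days); avg = 0.55
```

Averaging these per-question scores across all resolved questions would yield the aggregate figures reported on the dashboard.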
We keep score on every question posted to our public Superforecasts dashboard, where you can see more granular detail about a subset of the questions included in the fuller analysis reported above.
Where readily available, we include comparisons to what others were saying on the same forecasting topics. Unlike the head-to-head competitions described at the top of this page, however, direct comparisons are not always possible, because the Superforecasters often provide early insight on questions that almost no one else has attempted to address.
Schedule a consultation to learn how our FutureFirst monitoring tool, custom Superforecasts, and training services can help your organization make better decisions.