
Superforecasters took on hybrid human-machine forecasting systems – and won

We live in an era in which human judgment is rapidly being replaced by artificial intelligence. But forecasting geopolitical and economic events is far more difficult than winning at Jeopardy or Go.

In a US-government-sponsored competition, Superforecasters took on three research teams, each with millions of dollars in funding, that built hybrid forecasting systems combining statistical models, automated tools, and judgments from more than 1,000 human forecasters. After 187 forecasting questions, the results were clear.

The Superforecasters were 20% more accurate than the closest competitor and 21% more accurate than the control group.

They achieved this impressive victory by being “justifiably confident but appropriately humble,” as we explain further below.

Our calculations use the public data from the Hybrid Forecasting Competition (HFC) published by IARPA, the US-government sponsor of this tournament. The forecasts for the 187 questions in this competition are not included in the summary statistics presented elsewhere for Good Judgment’s commercial forecasts.
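For readers who want to see how such a comparison is typically run: forecasting tournaments of this kind usually score accuracy with Brier scores, where lower is better, and “X% more accurate” is commonly read as an X% lower mean Brier score. Below is a minimal sketch under that assumption, using a hypothetical CSV of per-question scores (hfc_scores.csv with team and brier_score columns); it is an illustration, not Good Judgment’s actual scoring code.

```python
import csv
from collections import defaultdict

def mean_brier(rows):
    """Mean Brier score across resolved questions (lower is better)."""
    scores = [float(r["brier_score"]) for r in rows]
    return sum(scores) / len(scores)

def pct_more_accurate(candidate, baseline):
    """Percent reduction in mean Brier score relative to a baseline score."""
    return 100.0 * (baseline - candidate) / baseline

# Hypothetical input: one row per (team, question) with a resolved Brier score.
by_team = defaultdict(list)
with open("hfc_scores.csv", newline="") as f:  # hypothetical filename and schema
    for row in csv.DictReader(f):
        by_team[row["team"]].append(row)

control = mean_brier(by_team["control"])  # hypothetical label for the control group
for team, rows in by_team.items():
    score = mean_brier(rows)
    print(f"{team}: mean Brier {score:.3f}, "
          f"{pct_more_accurate(score, control):+.1f}% vs control")
```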

Intrigued?

Stop guessing. Start Superforecasting.

Schedule a consultation to learn how our FutureFirst monitoring tool, custom Superforecasts, and training services can help your organization make better decisions.

