As of November 18th, ten of the questions on our Public Dashboard have resolved. The average Brier score for our aggregate Superforecasts over the ten questions is 0.30 on a 0 to 2 scale, where 0 represents perfect prescience.
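For readers who want the arithmetic: a Brier score on this 0-to-2 scale is the sum of the squared differences between the forecast probabilities and the observed outcome across all of a question's answer bins. Below is a minimal sketch in Python; the forecast numbers are invented for illustration and are not our actual dashboard data.

```python
# Minimal sketch of the Brier score on the 0-2 scale used in this post:
# the sum of squared differences between forecast probabilities and the
# observed outcome across all answer bins. Example numbers are invented.

def brier_score(forecast_probs, outcome_index):
    """0 = perfect prescience; 2 = maximal error."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

# Binary question: 90% on "yes", and "yes" occurred -> a very low score.
print(brier_score([0.9, 0.1], outcome_index=0))       # 0.02

# Multi-bin question: greatest weight on a wrong bin -> a high score.
print(brier_score([0.2, 0.5, 0.3], outcome_index=2))  # 0.78
```

The 0.30 figure above is simply the mean of the ten per-question scores.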
Below, we present the full forecasting history for those questions, along with the Superforecasters’ Brier scores and (where available) comparative data about what other analysts and pundits were saying while these questions were still open.
Check out our Public Dashboard to see what the Superforecasters have to say about ongoing questions on the prospects for widely available vaccines, US politics, COVID case and mortality metrics, and both US and global economic and financial indicators.
The second wave of cases in Europe arrived earlier and was more severe than the Superforecasters had anticipated; a second wave of deaths followed, and our forecasting question resolved on November 14th. What was predictable is that the Brier score for our aggregate forecast on this question would reflect the difficulty of forecasting exponential growth months in advance: we received an unusually high (poor) Brier score of 1.0834.
Below, we display the time series of our forecasts.
Except for a brief period in late August, the Superforecasters were “on the right side of maybe” on this question throughout the forecasting period, converging on near-certainty in the final week. Accordingly, the Brier score for our aggregate forecast was a solid 0.2087.
The FDA’s decision to give Remdesivir full approval as a COVID-19 treatment came more rapidly than most Superforecasters had anticipated. Their commentary had correctly identified Remdesivir as the treatment most likely to be the first to receive full FDA approval for COVID-19; however, they expected the FDA to take more time reviewing the application, especially because the drug was already available for prescription via Emergency Use Authorization. Being off on timing resulted in a relatively high Brier score of 0.7455.
These questions presented “a tale of two forecasts.” On the one hand, the Superforecasters correctly anticipated from the day the question was launched that the US would have 200,000 or more confirmed new COVID cases in either Q4 2020 or Q1 2021. Their confident and accurate forecasts earned an extremely low Brier score of 0.0112 for this question.
On the other hand, they were caught by surprise when confirmed new COVID cases in Europe surged, and they gave greatest weight to the correct answer bin (“more than 59,163”) only after mid-September. This “October surprise” led to their worst Brier score to date for a COVID-related question: 0.5204.
We display the time series of their forecasts below.
This question resolved on September 21st when the New York City public schools resumed limited in-person instruction for pre-K and certain other students. The aggregate Superforecast over time, as displayed on the historical chart below, strongly projected this outcome from the first day the question was open for forecasting and accordingly received an extremely low 0.0068 Brier score.
This question resolved on September 19th. The aggregate Superforecast over time, as displayed on the historical chart below, received a 0.2935 Brier score.
This question resolved on August 10th when UCLA resumed limited in-person instruction for certain medical-school courses. The aggregate Superforecast over time, as displayed on the historical chart below, received a 0.1244 Brier score.
Good Judgment’s Superforecasters truly knocked this forecast out of the park, achieving a 0.007 Brier score on this question. As with the Magic Kingdom question, they showed early insight by:
The compound forecasting question makes direct comparisons difficult; however, we display other publicly available information on the graph below to provide context for evaluating the Superforecasters’ projections at each point in time.
Superforecasters achieved a 0.04 Brier score on this forecasting question, showing early insight by:
Their forecasts compared favorably to other publicly available forecasts and statements from Disney management, as displayed on the graph below.
Schedule a consultation to learn how our Superforecaster Analytics, Workshops, or Staffcasting services can help your organization make better decisions.