“Teamwork, teamwork, that’s what it takes.”
The cheerleaders’ chant may spur high-school athletes to greater success. But those who study judgment and decision-making have found mixed results from teaming.
That’s why Good Judgment Project researchers weren’t sure what to expect eight years ago, when we launched a controlled experiment to see whether having forecasters work in small teams, rather than as individuals, would help or harm accuracy. Forecasting teams could share information and divide the workload among themselves; on the other hand, teams risk groupthink, with pressure to conform reducing the diversity of opinions.
Round 1 of IARPA’s massive four-year forecasting tournament gave us compelling evidence in favor of teaming. With simple aggregations, our team opinion pools even outperformed prediction markets, previously considered the gold standard for crowdsourced forecasting accuracy. We replicated this result in each of the three subsequent years of the research project.
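To make “simple aggregations” concrete, here is a minimal sketch of one common opinion-pooling recipe: take the median of individual probability forecasts, then push it away from 0.5 (“extremizing”), a transform the forecasting-tournament literature found can improve crowd averages. The function name and the extremizing exponent are illustrative assumptions, not Good Judgment’s actual production method.

```python
from statistics import median

def pool_forecasts(probs, extremize_a=2.5):
    """Combine individual probability forecasts into one team forecast.

    Simple opinion pool: take the median of the individual probabilities,
    then extremize it with p^a / (p^a + (1-p)^a). The exponent `a` here
    is illustrative, not a tuned value.
    """
    p = median(probs)
    num = p ** extremize_a
    return num / (num + (1 - p) ** extremize_a)

# Five hypothetical teammates who all lean toward "yes":
team = [0.55, 0.60, 0.65, 0.70, 0.80]
pooled = pool_forecasts(team)  # more confident than the raw median of 0.65
```

The intuition: when several independent forecasters all lean the same way, the crowd’s shared information justifies a more confident forecast than any individual average.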
As we formed our commercial venture, we wondered whether the accuracy gains from teaming would be equally evident when forecasters met face-to-face, rather than online. There’s good news. Good Judgment’s experience with nearly 100 public and private workshops suggests that well-structured teaming techniques improve forecasting accuracy when teammates collaborate in person, not just when they work asynchronously online.
Workshop attendees receive an introduction to Superforecasting principles, including approaches to minimize cognitive biases and to break forecasting problems down into more tractable pieces. Then, they break into small groups, facilitated by professional Superforecasters and Good Judgment researchers, to practice their new skills by making forecasts on real-world questions that the group selects. Each participant makes an initial, private forecast, after which the entire group shares their forecasts and reasoning. Finally, they each post an anonymous update to their initial forecast. Good Judgment records those updates and tracks forecasters’ accuracy after the outcomes of the forecasting questions are known (i.e., the question has “resolved”).
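The two-round process above can be scored with the standard Brier score: the squared error between a probability forecast and the resolved outcome (lower is better). This sketch uses made-up numbers purely to show how an initial-vs-updated accuracy comparison works; it is not Good Judgment’s workshop data.

```python
def brier(prob, outcome):
    """Brier score for a binary question: squared error between the
    forecast probability and the resolved outcome (1 = happened)."""
    return (prob - outcome) ** 2

def mean_brier(forecasts, outcomes):
    """Average Brier score across a set of resolved questions."""
    return sum(brier(p, o) for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical numbers: private first-round forecasts vs. post-discussion updates
initial  = [0.60, 0.40, 0.70]
updated  = [0.80, 0.20, 0.75]
outcomes = [1, 0, 1]  # how each question resolved

# Relative accuracy gain: 1 minus the ratio of updated to initial error
improvement = 1 - mean_brier(updated, outcomes) / mean_brier(initial, outcomes)
```

A positive `improvement` means the post-discussion updates were, on average, closer to what actually happened than the initial private forecasts.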
We looked at data from five workshops in the US and Europe in which forecasters made predictions on questions that resolved in 2019. Workshop participants came from both the private and public sectors. Three workshops mixed participants from different organizations; the other two were private workshops in which all participants came from a single entity.
The same pattern of improved accuracy after group discussion occurred in every workshop, with a mean improvement in group accuracy of over 20%. Only once (when forecasting whether Nicolas Maduro would remain in power in Venezuela) did participants become, on average, less accurate after the second round of forecasting. But across a wide range of topics, the structured teaming experience helped groups make better forecasts.
You can experience the power of groups thinking together at one of Good Judgment’s upcoming public workshops, whether virtual or in person.
Or contact us to arrange a private workshop specially tailored for your organization.
The Power of Groups Thinking Together … without Groupthink