In Superforecasting: The Art and Science of Prediction, Good Judgment co-founder Philip Tetlock and his co-author Dan Gardner summarize the Good Judgment Project research findings in the form of “Ten Commandments for Aspiring Superforecasters.” These commandments describe behaviors that have been “experimentally demonstrated to boost [forecasting] accuracy.” You can learn more about these commandments—and practice applying them under the guidance of professional Superforecasters—at one of Good Judgment’s training workshops.
Focus on questions where your hard work is likely to pay off. Don’t waste time either on easy “clocklike” questions (where simple rules of thumb can get you close to the right answer) or on impenetrable “cloudlike” questions (where even fancy statistical models can’t beat the dart-throwing chimp). Concentrate on questions in the Goldilocks zone of difficulty, where effort pays off the most.
Channel the playful but disciplined spirit of Enrico Fermi, who—when he wasn’t designing the world’s first atomic reactor—loved ballparking answers to head-scratchers such as “How many extraterrestrial civilizations exist in the universe?” Break the problem apart into its knowable and unknowable parts. Flush ignorance into the open. Expose and examine your assumptions. Dare to be wrong by making your best guesses. Better to discover errors quickly than to hide them behind vague verbiage.
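Fermi’s approach can be made concrete in a few lines of code. The sketch below works through the extraterrestrial-civilizations question in the spirit of the Drake equation; every parameter value is an illustrative guess rather than data, and that is the point: writing the guesses down flushes the assumptions into the open where each one can be challenged separately.

```python
# A Fermi-style decomposition of "How many communicating civilizations
# might exist in our galaxy?" Every value below is an illustrative guess.
stars_in_galaxy = 2e11      # rough star count for the Milky Way
frac_with_planets = 0.5     # guess: share of stars that host planets
habitable_per_star = 0.1    # guess: habitable planets per planet-hosting star
frac_life_arises = 0.1      # guess: habitable planets where life appears
frac_intelligent = 0.01     # guess: life that develops intelligence
frac_communicating = 0.1    # guess: intelligent life that sends signals
frac_still_active = 1e-5    # guess: fraction of time such a civilization survives

estimate = (stars_in_galaxy * frac_with_planets * habitable_per_star
            * frac_life_arises * frac_intelligent * frac_communicating
            * frac_still_active)
print(f"Back-of-the-envelope estimate: {estimate:.0f} civilizations")  # ~10
```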
Superforecasters know that there is nothing new under the sun. Nothing is 100% “unique.” Language purists be damned: uniqueness is a matter of degree. So Superforecasters conduct creative searches for comparison classes even for seemingly unique events, such as the outcome of a hunt for a high-profile terrorist (Joseph Kony) or the standoff between a new socialist government in Athens and Greece’s creditors. Superforecasters are in the habit of posing the outside-view question: How often do things of this sort happen in situations of this sort?
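Once a comparison class is assembled, the outside view is simple arithmetic: the base rate is just a frequency. The records in the sketch below are invented placeholders; in practice, the hard, creative work is deciding which past cases count as “things of this sort.”

```python
# Outside view in code: the base rate of an outcome across a reference
# class of comparable past cases. These records are invented placeholders.
reference_class = [
    ("standoff A", True),   # True = resolved within, say, two years
    ("standoff B", False),
    ("standoff C", True),
    ("standoff D", True),
    ("standoff E", False),
]

base_rate = sum(outcome for _, outcome in reference_class) / len(reference_class)
print(f"Outside-view starting probability: {base_rate:.0%}")  # 60%
```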
Belief updating is to good forecasting as brushing and flossing are to good dental hygiene. It can be boring, occasionally uncomfortable, but it pays off in the long term. That said, don’t assume that because belief updating is sometimes easy, it always is. Skillful updating requires teasing subtle signals from noisy news flows—all the while resisting the lure of wishful thinking.
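One disciplined way to update is Bayes’ rule: shift your probability in proportion to how much more likely the new evidence is in worlds where the event will happen than in worlds where it won’t. A minimal sketch, with illustrative numbers only:

```python
def bayes_update(prior: float, p_signal_if_true: float, p_signal_if_false: float) -> float:
    """Posterior probability after observing one signal, via Bayes' rule."""
    joint_true = prior * p_signal_if_true
    joint_false = (1 - prior) * p_signal_if_false
    return joint_true / (joint_true + joint_false)

# Start at 30%, then observe a news item that is twice as likely in worlds
# where the event happens (0.6) as in worlds where it doesn't (0.3).
belief = bayes_update(prior=0.30, p_signal_if_true=0.6, p_signal_if_false=0.3)
print(f"Updated belief: {belief:.0%}")  # ~46%
```

A weakly diagnostic signal moves the needle modestly; only strong evidence justifies a large jump. That is the arithmetic behind resisting both overreaction and wishful thinking.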
For every good policy argument, there is typically a counterargument that is at least worth acknowledging. For instance, if you are a devout dove who believes that threatening military action never brings peace, be open to the possibility that you might be wrong about Iran. And the same advice applies if you are a devout hawk who believes that soft “appeasement” policies never pay off. Each side should list, in advance, the signs that would nudge it toward the other side.
As in poker, you have an advantage if you are better than your competitors at separating 60/40 bets from 40/60—or 55/45 from 45/55. Translating vague-verbiage hunches into numeric probabilities feels unnatural at first, but it can be done. It just requires patience and practice. The Superforecasters have shown what is possible.
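The poker analogy can be put in numbers. In the sketch below, the even-money stake is arbitrary; what matters is that a 55/45 bet and a 45/55 bet have equal and opposite expected values, so a forecaster who cannot tell them apart gives back every edge.

```python
# Expected profit of an even-money bet at various true win probabilities.
# The 100-unit stake is arbitrary; only the sign and spread of the EVs matter.
def expected_value(p_win: float, stake: float = 100.0) -> float:
    return p_win * stake - (1 - p_win) * stake

for p in (0.60, 0.55, 0.45, 0.40):
    print(f"true p = {p:.0%}: EV = {expected_value(p):+.0f} per bet")
# 60% -> +20, 55% -> +10, 45% -> -10, 40% -> -20
```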
Superforecasters understand the risks both of rushing to judgment and of dawdling too long near “maybe.” They routinely manage the trade-off between the need to take decisive stands (who wants to listen to a waffler?) and the need to qualify their stands (who wants to listen to a blowhard?). They realize that long-term accuracy requires getting good scores on both calibration and resolution—which requires moving beyond blame-game ping-pong. It is not enough just to avoid the most recent mistake. They have to find creative ways to tamp down both types of forecasting errors—misses and false alarms—to the degree a fickle world permits such uncontroversial improvements in accuracy.
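Calibration and resolution are not just metaphors; they are terms in the standard decomposition of the Brier score, the accuracy measure used in the forecasting tournaments Tetlock describes. A minimal sketch with invented forecasts:

```python
# Brier score: mean squared error between probability forecasts and outcomes.
# 0.0 is perfect; hedging everything at 50% scores 0.25. Data is invented.
forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]   # stated probability that each event occurs
outcomes  = [1,   1,   0,   0,   0]     # 1 = event happened, 0 = it didn't

brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
print(f"Brier score: {brier:.3f}")  # 0.102
```

Decisive forecasts near 0% or 100% earn excellent scores when right and heavy penalties when wrong: the waffler-versus-blowhard trade-off in numeric form.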
Don’t try to justify or excuse your failures. Own them! Conduct unflinching postmortems: Where exactly did I go wrong? And remember that although the more common error is to learn too little from failure and to overlook flaws in your basic assumptions, it is also possible to learn too much (you may have been basically on the right track but made a minor technical mistake that had big ramifications). Don’t forget to do postmortems on your successes, too. Not all successes imply that your reasoning was right. You may have just lucked out by making offsetting errors.
Master the fine art of team management, especially perspective taking (understanding the arguments of the other side so well that you can reproduce them to the other’s satisfaction), precision questioning (helping others to clarify their arguments so they are not misunderstood), and constructive confrontation (learning to disagree without being disagreeable). Wise leaders know how fine the line can be between a helpful suggestion and micromanagerial meddling, between a rigid group and a decisive one, or between a scatterbrained group and an open-minded one.
Implementing each commandment requires balancing opposing errors. Just as you can’t learn to ride a bicycle by reading a physics textbook, you can’t become a Superforecaster by reading training manuals. Learning requires doing, with good feedback that leaves no ambiguity about whether you are succeeding—“I’m rolling along smoothly!”—or failing—“crash!” Also remember that practice is not just going through the motions of making forecasts, or casually reading the news and tossing out probabilities. Like all other known forms of expertise, Superforecasting is the product of deep, deliberative practice.
For those ready to engage in such deep, deliberative practice, our public forecasting site, Good Judgment Open, offers friendly competition against other aspiring Superforecasters in topical forecasting challenges covering finance and economics, geopolitics, popular culture, and more. Top-performing Good Judgment Open forecasters also have the opportunity to join the ranks of Good Judgment’s professional Superforecasters, the most accurate forecasters in the business!
One final caveat: don’t treat these commandments as commandments. “It is impossible to lay down binding rules,” Helmuth von Moltke warned, “because two cases will never be exactly the same.”
Schedule a consultation to learn how our FutureFirst monitoring tool, custom Superforecasts, and training services can help your organization make better decisions.