Handicapping the odds: What gamblers can learn from Superforecasters

Successful gamblers, like good forecasters, need to be able to translate hunches into numeric probabilities. For most people, however, this skill is not innate. It requires cultivation and practice.

In Superforecasting: The Art and Science of Prediction, a best-selling book co-authored with Dan Gardner, Phil Tetlock writes: “Nuance matters. The more degrees of uncertainty you can distinguish, the better a forecaster you are likely to be. As in poker, you have an advantage if you are better than your competitors at separating 60/40 bets from 40/60—or 55/45 from 45/55.”

Good Judgment’s professional Superforecasters excel at this, but thinking in probabilities doesn’t come naturally to the majority of human beings. Daniel Kahneman and Amos Tversky, who studied decision making under risk, found that most people tend to overweight low probabilities (e.g., the odds of winning a lottery) and underweight outcomes that are probable but not certain. In other words, people on average evaluate probabilities incorrectly even when making critical decisions.
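To see what that mis-weighting looks like in numbers, here is a minimal sketch in Python using a single-parameter probability-weighting function of the kind common in the prospect theory literature; the functional form and the curvature value are illustrative assumptions, not figures taken from Kahneman and Tversky’s papers.

```python
# A rough sketch of how a decision weight can diverge from the true probability.
# The weighting function and the curvature value below are illustrative choices.

def decision_weight(p: float, gamma: float = 0.61) -> float:
    """Perceived weight given to an outcome with objective probability p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.001, 0.05, 0.40, 0.60, 0.95):
    print(f"true p = {p:.3f}  ->  perceived weight ~ {decision_weight(p):.3f}")

# Tiny probabilities (a lottery win) come out overweighted, while outcomes that
# are probable but not certain come out underweighted.
```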

Base rate neglect often leads to poor decisions in forecasting, finance, or gambling.

If you’ve participated in any Good Judgment training, you’ll know that the first step in estimating correct probabilities is to identify the base rate—the underlying likelihood of an event. This is also the step that the majority of decision makers tend to ignore. Base rate neglect is one of the most common cognitive biases we see in training programs and workshops, and it generally leads to poor investing, betting, and forecasting outcomes.

For those new to the concept, consider this classic example: “Steve is very shy and withdrawn, invariably helpful, but with little interest in people or the social world. A meek and tidy soul, he has a need for order and structure and a passion for detail.”

Is Steve more likely to be a librarian or a farmer? A librarian or a salesman?

While the description, offered in Daniel Kahneman’s Thinking, Fast and Slow, may be that of a stereotypical librarian, Steve is in fact 20 times more likely to be a farmer—and 83 times more likely to be a salesman—than a librarian. There are simply far more farmers and salespeople in the United States than male librarians.
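To make the arithmetic concrete, here is a minimal Bayes-rule sketch; the 20:1 base rate mirrors the example above, while the likelihoods for how well the description fits each occupation are purely hypothetical.

```python
# A minimal Bayes-rule sketch of why the base rate dominates. The counts and
# likelihoods below are hypothetical, chosen only to illustrate the arithmetic.

base_rate = {"librarian": 1, "farmer": 20}             # e.g., 20 farmers per male librarian
fits_description = {"librarian": 0.9, "farmer": 0.1}   # P(description | occupation), assumed

posterior_weight = {job: base_rate[job] * fits_description[job] for job in base_rate}
total = sum(posterior_weight.values())

for job, weight in posterior_weight.items():
    print(f"P({job} | description) ~ {weight / total:.2f}")

# Even though the description fits a librarian nine times better, the 20:1
# base rate still leaves Steve more likely to be a farmer (~0.69 vs ~0.31).
```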

Base rate neglect is the mind’s irrational tendency to disregard the underlying odds. Failure to account for the base rate could lead, for example, to the belief that participating in commercial forms of gambling is a good way of making money. Likewise, failure to factor in the house edge could lead to poor betting decisions.
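For the house edge, a back-of-the-envelope sketch shows how a small per-bet disadvantage compounds over repeated wagers; the 5% edge, stake, and number of bets are all hypothetical.

```python
# Expected loss from repeated betting against a house edge (illustrative numbers).
house_edge = 0.05      # hypothetical: the house keeps 5% of each wager on average
stake = 10.0           # wager per bet
n_bets = 200           # number of bets placed

expected_loss = house_edge * stake * n_bets
print(f"Expected loss after {n_bets} bets of ${stake:.0f}: ${expected_loss:.2f}")
# Each individual bet feels close to a coin flip, but a negative expected value
# per wager compounds into a near-certain loss over time.
```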

Fortunately, the mind’s tendency to overlook the base rate can be corrected with training and practice.

Recognition of bias and noise, and techniques to mitigate their detrimental effects, should be at the heart of any training on better decision making. In Good Judgment workshops, we have consistently observed tangible improvements in the quality of forecasting as a result of debiasing interventions.

The other essential component is practice. On Good Judgment Inc’s public platform, GJ Open, anyone can try their hand at forecasting—from predicting the next NBA winner to estimating the future price of a bitcoin. Unsurprisingly, those forecasters who use base rates and forecast on the platform regularly tend to have better results.

To stay on top, gamblers, like successful forecasters and professional Superforecasters, need to actively seek out the base rate and mitigate other cognitive biases that interfere with their judgment. While “Thinking in Bets,” as professional gambler Annie Duke puts it in her best-seller, does not come easily to most people, better decision making—in forecasting, investing, and gambling alike—is a skill that can be learned. With an awareness of cognitive biases, debiasing techniques, and regular practice, anyone can acquire the mental discipline to handicap the odds more effectively.

* This article originally appeared in Luckbox Magazine and is shared with their permission.

Forecasting the Tokyo Olympics

In late July 2020, a year ahead of the Tokyo Olympics (postponed in 2020 and scheduled to open 23 July 2021), we asked the Superforecasters whether the Games would begin as planned. By 7 September 2020, the Superforecasters had a clear answer. Back then, they gave the Games a 63% probability of proceeding and have hardly looked back.

The picture was by no means clear if you followed media reports around the Olympics throughout the past year. The IOC and the organizing committee were adamant that options such as a cancellation or delay were off the table. The Japanese public became increasingly opposed to the event. COVID-19 hit the country with a new wave. A range of dissonant headlines, speculations, public opinion polls, and even allegations that Japan had privately decided to cancel the Games (Times) all contributed to the noise surrounding the future of the event.

Good Judgment’s professional Superforecasters are skilled at separating the signal from the noise. They took into account such factors as the associated costs; the likelihood that a vaccine would be developed, tested, and become available by the time of the event (this was months before any COVID-19 vaccine was found to be effective—and safe—in a large clinical trial); and the increasing international experience with measures to contain risk. See how their forecast evolved over time against the backdrop of media reports throughout the year.

A list of media headlines and key events is provided at the bottom of this article, demonstrating both the signal and the noise surrounding the Tokyo Olympics.

Full access to the Superforecasts and commentary is available through subscription via our FutureFirst™ monitor.

Tokyo Olympics: A Sample of Media Headlines Since July 2020

21 July 2020: USA Today: “As COVID-19 pandemic rages on, experts say it’s unlikely Tokyo Olympics can be held next summer”

20 Aug 2020: Japan Times: “Majority of Japanese firms against holding Tokyo Olympics in 2021”

7 Sept 2020: BBC: “Games will go ahead ‘with or without Covid’, says IOC VP”

1 Oct 2020: The Diplomat: “The International Olympic Committee has ruled out postponing the Tokyo Games for a second time”

1 Dec 2020: “Report: Delay of 2020 Tokyo Olympics cost $3 billion”

15 Dec 2020: Japan Times: “Most in Japan oppose holding Olympics in 2021, polls show”

27 Dec 2020: Kyodo: “Pandemic causing uncertainty, unease for Tokyo Olympic ‘host towns’”

7 Jan 2021: BBC: “Tokyo 2020: No guarantee Olympics will go ahead, says IOC’s Pound”

10 Jan 2021: Kyodo: “About 80% favor canceling, postponing Tokyo Olympics in summer: poll”

11 Jan 2021: Japan declares a state of emergency

13 Jan 2021: Guardian: “Tokyo’s Covid outbreak adds to doubts over hosting Olympic Games”

15 Jan 2021: NYT: “Hopes for Tokyo’s Summer Olympics Darken”

CBS: “Tokyo Olympics 2021: Spike in COVID-19 cases has Japanese officials bracing for possible postponement”

19 Jan 2021: AP: “Tokyo Olympics Q&A: 6 months out and murmurs of cancellation”

BBC: “Tokyo Olympics ‘unlikely to go ahead in 2021’”

21 Jan 2021: Times: “The Japanese government has privately concluded that the Tokyo Olympics will have to be cancelled because of the coronavirus”

22 Jan 2021: Reuters: “Japan and IOC deny that Olympics will be cancelled”

11 Feb 2021: Guardian: “Tokyo 2020 Olympics president expected to resign over sexist comments”

15 Feb 2021: CNN: “An earthquake at the Olympic torch relay start point is just the beleaguered Tokyo 2020 Games’ latest crisis”

17 Feb 2021: Seiko Hashimoto becomes new president of the Tokyo Olympic organizing committee

9 March 2021: Kyodo: “Japan to stage Tokyo Olympics without overseas spectators”

25 March 2021: Tokyo Olympic torch relay begins

15 April 2021: Washington Post: “Olympics could be canceled because of virus, Japan ruling party figure admits”

1 May 2021: Washington Post: “Olympic officials are determined to have a Tokyo Games despite Japan’s growing doubts”

12 May 2021: BBC: “Tokyo 2020: United States track and field team cancels pre-Olympic training in Japan”

14 May 2021: NPR: “Opposition to Tokyo Games Grows Heated amid COVID Concerns”

Guardian: “Hospitals overwhelmed as Covid cases surge in Osaka”

18 May 2021: CNBC: “Tokyo medical association calls for cancellation of Tokyo Olympics due to spike in Covid cases”

19 May 2021: CNN: “Dozens of Japanese towns have canceled plans to host foreign athletes from around the world due to concern over Covid-19”

25 May 2021: CNN: “Canceling Tokyo Olympics is ‘essentially off the table,’ says IOC member Dick Pound”

2 June 2021: AP: “Yes. Tokyo Olympics are ‘a go’ despite opposition, pandemic”

13 June 2021: CBS: “Cancel the Tokyo Olympics? It’s unlikely. Here’s why”

Super Quiet: Kahneman’s Noise and the Superforecasters

Much is written about the detrimental role of bias in human judgment. Its companion, noise, on the other hand, often goes undetected or underestimated. Noise: A Flaw in Human Judgment, the new book by Nobel laureate Daniel Kahneman and his co-authors, Olivier Sibony and Cass R. Sunstein, exposes how noise—variability in judgments that should be identical—wreaks havoc in many fields, from law to medicine to economic forecasting.
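A toy decomposition may help keep the two error sources apart; the judgments below are invented, and the split of error into an average component (bias) and a spread component (noise) follows the book’s general framing rather than any specific dataset.

```python
# Toy illustration of bias vs. noise in a set of judgments that "should be identical".
# The numbers are invented; the point is the decomposition, not the data.
from statistics import mean, pstdev

true_value = 100.0                              # the correct answer everyone is estimating
judgments = [112, 95, 108, 120, 101, 90, 115]   # hypothetical judgments from seven judges

errors = [j - true_value for j in judgments]
bias = mean(errors)        # systematic error: the judgments are too high on average
noise = pstdev(judgments)  # variability: the judgments disagree with one another

print(f"bias  ~ {bias:.1f}")
print(f"noise ~ {noise:.1f}")
# A debiasing intervention shifts the whole set of judgments; a noise-reduction
# intervention (training, aggregation) pulls them closer together.
```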

Noise offers research-based insights into better decision-making and suggests remedies to reduce the titular source of error.

No book on making better judgments, of course, particularly better judgments in forecasting, would be complete without a mention of the Superforecasters, and certainly not one co-authored by such a luminary of human judgment as Kahneman.

Superforecasters (discussed in detail in chapters 18 and 21 of the book) are a select group who “reliably out-predict their less-than-super peers” because they are able to consistently overcome both bias and noise. One could say the Superforecasters are not only actively open-minded—they are also super quiet in their forecasts.

“What makes the Superforecasters so good?” the authors ask. For one, they are “unusually intelligent” and “unusually good with numbers.” But that’s not the whole story.

“Their real advantage,” according to Kahneman, Sibony, and Sunstein, “is not their talent at math; it is their ease in thinking analytically and probabilistically.”

Noise identifies other qualities that set the Superforecasters apart from regular forecasters:

    • Willingness and ability to structure and disaggregate problems;
    • Taking the outside view;
    • Systematically looking for base rates.

In short, it’s not just their natural intelligence. It’s how they use it.

Not everyone is a good forecaster, of course, and while crowds are usually better than individuals, not every crowd is equally wise.

“It is obvious that in any task that requires judgment, some people will perform better than others will. Even a wisdom-of-crowds aggregate of judgments is likely to be better if the crowd is composed of more able people,” the authors state.
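A quick simulation sketch of that point, with invented parameters: averaging helps any crowd, but a crowd of more able individuals produces a better aggregate.

```python
# Toy simulation of the point above: averaging a crowd helps, and a crowd of
# more able forecasters yields a better aggregate. All numbers are invented.
import random
from statistics import mean

random.seed(0)
true_value = 0.70          # the probability being forecast
n_forecasters, n_trials = 50, 2000

def aggregate_error(individual_error_sd: float) -> float:
    """Average absolute error of the crowd's mean forecast across many trials."""
    errors = []
    for _ in range(n_trials):
        forecasts = [min(max(random.gauss(true_value, individual_error_sd), 0.0), 1.0)
                     for _ in range(n_forecasters)]
        errors.append(abs(mean(forecasts) - true_value))
    return mean(errors)

print(f"aggregate error, noisier crowd:   {aggregate_error(0.25):.3f}")
print(f"aggregate error, more able crowd: {aggregate_error(0.08):.3f}")
# Averaging cancels much of the noise in both cases, but the crowd made up of
# more accurate individuals still produces the better aggregate.
```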

Among a myriad of individual forecasters and forecasting firms, Good Judgment’s Superforecasters stand out with an unbeaten track record. Kahneman, Sibony, and Sunstein are not surprised:

“Judgments are both less noisy and less biased when those who make them are well trained, are more intelligent, and have the right cognitive style.”

Good Judgment’s Training Reduces Noise

“Well trained” is the key phrase here. When the Superforecasters were discovered in “some of the most innovative work on the quality of forecasting”—the Good Judgment Project (GJP, 2011-2016)—they were the top 2% among thousands of volunteers. That doesn’t mean, however, that the rest of the world is doomed to drown in noisy decision-making. Forecasting well is not an either-you-have-it-or-you-don’t skill.

According to Kahneman, Sibony, and Sunstein, “people can be trained to be superforecasters or at least to perform more like them.”

Good Judgment Inc’s online training and workshops do just that. Based on the concepts taught in the GJP training, these workshops are designed to reduce psychological biases—which, in turn, results in less noise.

Kahneman and colleagues explain how this works, citing the BIN (bias, information, and noise) model for forecasting developed by Ville Satopää, Marat Salikhov, and Good Judgment’s co-founders Phil Tetlock and Barb Mellers:

“When they affect different individuals on different judgments in different ways, psychological biases produce noise. … As a result, training forecasters to fight their psychological biases works—by reducing noise.”
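A toy sketch of that mechanism, again with invented numbers: when each forecaster carries a different idiosyncratic bias, those biases show up as spread across the panel, and damping them tightens the forecasts.

```python
# Toy illustration of the quoted mechanism: biases that differ across individuals
# appear as noise in the panel of forecasts. All parameters are invented.
import random
from statistics import pstdev

random.seed(1)
true_probability = 0.60
n_forecasters = 40

def panel_noise(bias_spread: float) -> float:
    """Std. dev. across forecasters, each carrying an idiosyncratic bias."""
    forecasts = []
    for _ in range(n_forecasters):
        personal_bias = random.gauss(0.0, bias_spread)   # differs by individual
        forecasts.append(min(max(true_probability + personal_bias, 0.0), 1.0))
    return pstdev(forecasts)

print(f"noise before debiasing training: {panel_noise(0.15):.3f}")
print(f"noise after debiasing training:  {panel_noise(0.05):.3f}")
# Training that damps each person's idiosyncratic bias leaves the panel of
# forecasts tighter, i.e., less noisy.
```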

Good Judgment’s training also focuses on teaming, another effective method scientifically demonstrated to reduce noise.

According to Kahneman, Sibony, and Sunstein, both private and public organizations—and society at large—stand to gain much from reducing noise. “Should they do so, organizations could reduce widespread unfairness—and reduce costs in many areas,” the authors write. And the Superforecasters are an example for decision-makers to emulate in these efforts.

Learn Superforecasting from the pros at our Superforecasting Workshops
See upcoming workshops here