Superforecaster Tips: Dealing with Confirmation Bias in Election Forecasting
As the 2024 US election approaches, forecasters are faced with the daunting task of finding signal amid a cacophony of partisan noise, personal biases, and volatile public opinion. One significant challenge is confirmation bias—the tendency to search for, interpret, and recall information in a way that confirms one’s preconceptions. In this blog post, we draw on an internal discussion among seasoned Superforecasters to explore practical strategies forecasters can use to mitigate confirmation bias in election forecasting.
Diversifying Information Sources
“Assign yourself to spend some time reading (reasonably reputable) news sources that disagree with your general perspective on the question.”
Superforecasters highlight the importance of consuming a balanced diet of news sources, including those that challenge one's beliefs. This approach was systematized by Good Judgment Project (GJP) Superforecaster Doug Lorch, who wrote a program to randomize his news intake among a diverse set of sources.
“It certainly didn’t hurt,” recalls Terry Murray, CEO Emeritus of Good Judgment Inc and Project Manager for the GJP at UC–Berkeley. “He was the top forecaster in the whole IARPA tournament that year.”
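We don't have the details of Lorch's program, but the idea is simple enough to sketch. The snippet below is a minimal, hypothetical illustration in Python: it draws a daily reading list at random from a hand-curated set of outlets so that habit doesn't decide what you read. The source names and sample size are placeholders, not Lorch's actual setup.

```python
import random

# Placeholder outlets spanning different editorial perspectives.
# The actual sources Lorch used are not documented here.
SOURCES = [
    "left-leaning outlet A",
    "left-leaning outlet B",
    "centrist outlet C",
    "right-leaning outlet D",
    "right-leaning outlet E",
    "international outlet F",
]

def daily_reading_list(k: int = 3, seed: int | None = None) -> list[str]:
    """Return k sources drawn at random, so the mix is not chosen by habit."""
    rng = random.Random(seed)
    return rng.sample(SOURCES, k)

if __name__ == "__main__":
    for source in daily_reading_list(k=3):
        print(f"Read today: {source}")
```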
Engaging in Scenario Analysis and Premortems
“I try to run through various scenarios where [the expected winner] could end up losing.”
Superforecasters routinely consider alternative outcomes by rigorously testing their own assumptions and logic. This involves running through various scenarios where expected outcomes might not materialize and thinking critically about the conditions that would lead to different results. In a premortem, the forecaster assumes the forecast has already failed and works backward to identify the most plausible reasons why.
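To make this concrete, here is one hypothetical way to structure a scenario exercise: decompose the headline probability into mutually exclusive scenarios, assign each a weight and a conditional probability, and check what the pieces imply in aggregate. The scenarios and numbers below are invented for illustration only and do not represent any Superforecaster's actual estimates.

```python
# Hypothetical premortem-style decomposition: P(candidate wins) as a
# weighted sum over mutually exclusive scenarios. All figures are invented.
scenarios = {
    # name: (probability the scenario occurs, P(win | scenario))
    "high turnout among base voters":      (0.35, 0.70),
    "late-breaking economic shock":        (0.20, 0.40),
    "third-party candidate draws support": (0.15, 0.45),
    "status quo continues":                (0.30, 0.55),
}

# Sanity check: scenario probabilities should cover the space.
total_weight = sum(p for p, _ in scenarios.values())
assert abs(total_weight - 1.0) < 1e-9, "scenario weights must sum to 1"

# Law of total probability: P(win) = sum_i P(scenario_i) * P(win | scenario_i)
p_win = sum(p * p_cond for p, p_cond in scenarios.values())
print(f"Implied overall probability of winning: {p_win:.2f}")

# If this number differs sharply from your headline forecast, one of the
# scenario judgments (or the headline itself) deserves another look.
```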
Embracing Epistemic Humility
“One thing I know is that I don’t know much.”
Another tip the Superforecasters offer is to acknowledge the limits of one's knowledge and remain open to new information. This strategy is crucial for preventing overconfidence and staying receptive to counterarguments.
Red Teaming
“One of the most important duties for me, as a Red Team member, is not to convince a forecaster that they are wrong… Rather, it’s to test the confidence of the Superforecaster in their own forecast.”
Having a red team challenge forecasts helps forecasters re-evaluate their confidence in their own arguments and consider why they might be wrong. Red teaming is standard practice in all of Good Judgment's forecasting work.
Leveraging Collective Wisdom
“Sometimes, it pays to listen to the articulated reason of an outlier.”
Some Superforecasters use the median forecast of their group as a benchmark, particularly when their individual estimates deviate significantly from the consensus. This approach can provide a reality check against one's own extremes. It is important, however, to pay attention to outlier opinions as well, to avoid slipping into conformity and groupthink.
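As an illustration of using the median as a benchmark while still listening to outliers, the sketch below compares a hypothetical individual forecast against a group's median and flags large deviations in both directions. All figures and names are made up, and the 0.15 threshold is an arbitrary choice for the example.

```python
from statistics import median

# Hypothetical probabilities from a forecasting group for the same question.
group_forecasts = {
    "forecaster_a": 0.62,
    "forecaster_b": 0.58,
    "forecaster_c": 0.65,
    "forecaster_d": 0.30,   # outlier worth hearing out
    "forecaster_e": 0.60,
}
my_forecast = 0.80

benchmark = median(group_forecasts.values())
print(f"Group median: {benchmark:.2f}, my forecast: {my_forecast:.2f}")

# Flag my own estimate if it strays far from the consensus benchmark.
if abs(my_forecast - benchmark) > 0.15:
    print("Large deviation from the median: re-examine my reasoning.")

# But also surface outliers within the group rather than averaging them away.
outliers = {name: p for name, p in group_forecasts.items()
            if abs(p - benchmark) > 0.15}
print("Outliers whose reasoning deserves a closer look:", outliers)
```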
As we dive into another election cycle, the discipline of forecasting reminds us that remaining actively open-minded is more crucial than ever. Combating confirmation bias in election forecasting is no small feat, given the complexity and the emotionally charged nature of politics. However, by diversifying information sources, engaging in premortems, practicing epistemic humility, employing red teaming, and referencing the collective wisdom of peers, forecasters can enhance the accuracy and reliability of their predictions. Good Judgment's exclusive forecast monitoring tool FutureFirst™ offers daily forecast updates from professional Superforecasters on elections and many other topics.
Learn More about FutureFirst™!