When AI Becomes a False Prophet: A Cautionary Tale for Forecasters

With a nod to Taylor Swift and Travis Kelce, Superforecaster Ryan Adler discusses the gospel according to AI and why forecasters should always verify their sources.

Google’s AI Overview references an AI-generated video to support a false claim.

The promises of artificial intelligence have set up camp in media headlines over the past few years. ChatGPT has become a household name, billions are being spent just to power the equipment that runs these programs and models, and the cutting-edge technology is front and center in ongoing tensions between the US and China. AI has left no aspect of human activity untouched, forecasting included.

To be sure, the impacts already felt cannot be overstated. We are looking at the front end of what I’m confident will be a seismic shift in society, with large swaths of labor markets around the globe being shaken to their core. That said, we aren’t there yet.

Here’s an example of how AI recently took itself out at the knees on a forecasting question at Good Judgment Open. In late April 2025, the time came to close a question regarding potential nuptials between Kansas City Chiefs star Travis Kelce and pop superstar Taylor Swift: “Before 19 April 2025, will Travis Kelce and Taylor Swift announce or acknowledge that they are engaged to be married?” (It’s not my favorite subject matter, but we try to maintain a diverse pool of questions.)

As a moderately rabid Chiefs fan myself, I was confident the answer was no, because an engagement would have made headlines across media outlets. However, a key part of the job of running a forecasting platform is being in the habit of double- and triple-checking. So, I checked with Google. I entered “Are Travis Kelce and…” into the search field, which immediately autofilled to “are travis and taylor engaged?” (The first-name thing with pop culture stars annoys me to no end, but I digress.) To my surprise, Google’s AI Overview popped up immediately.

“Yes, according to reports, Travis Kelce and Taylor Swift are engaged.”

“Trust, but verify”

Skeptical, I looked at what the experimental generative AI response was using as a reference to return such a statement. That’s when things got fun.

The first link in the cited material was a YouTube video. Keep in mind that Google, the search engine I used to start my research, owns YouTube. The account that posted the video? DangerousAI. That alone raises more red flags than a May Day parade in Moscow circa 1974. The brief video, dated 24 February 2025, purported to show Travis Kelce announcing that he and Swift “got engaged last week.” However, as the video progressed, the absurdity of Kelce’s putative announcement became perfectly clear.

To sum up, Google’s AI system linked to search was fooled by an AI product posted on another Google platform to give a patently false response.

I don’t highlight this incident as a criticism of Google. However, it should serve as a warning. I’ve seen some GJ Open forecasters take AI responses as gospel. I’m here to tell you that in matters of fact vs fiction, AI is very capable of being a false prophet. This is not to say that AI isn’t an incredibly valuable tool. It certainly is! We are finding more and more uses for it at Good Judgment, but we put it through its paces long before we deem it reliable for a particular role. As the Russian proverb instructs, “Trust, but verify.” (No, President Reagan didn’t say it first.) When it comes to AI and everything else you see online, my suggestion is that you just verify.

Do you have what it takes to be a Superforecaster? Find out on GJ Open!

* Ryan Adler is a Superforecaster, GJ managing director, and leader of Good Judgment’s question team

Informed Practice and Superforecasting: Taking Your Forecasts to the Next Level

“Not all practice improves skill. It needs to be informed practice.”
– Phil Tetlock & Dan Gardner in Superforecasting

In any area of decision-making where uncertainty looms large, accuracy is the gold standard. Yet decision makers often find themselves in a frustrating cycle: sometimes they make the right call, but other times they miss the mark entirely. Inconsistency can be costly. So, what separates those who occasionally succeed from those who reliably deliver top-notch forecasts? The answer lies in informed practice, one of the concepts at the heart of Superforecasting.

What Is Informed Practice?

Informed practice is not just repetition. It’s a deliberate and thoughtful process of learning from each forecast, refining techniques, and continuously updating one’s beliefs based on new information. It’s about approaching forecasting with a Superforecaster’s mindset—an outlook geared toward improvement, with a consistent effort to mitigate one’s cognitive biases.

What Can Forecasters Learn from Superforecasters?

Superforecasters, known for their uncanny forecasting accuracy, exemplify informed practice. They don’t pull numbers out of a hat or look into a crystal ball for answers. For every question they face, they engage in a rigorous process of analysis, reflection, and adjustment. Here’s how informed practice gives them the edge:

1. Learning from Feedback: Superforecasters thrive on feedback. They meticulously track their forecasts, comparing them against the outcomes to identify where they went right and where they missed the mark. This feedback loop is crucial. It allows them to recalibrate their approach and avoid making the same mistakes twice. Over time, this leads to more refined and accurate forecasts.

2. Understanding Probability: A key aspect of informed practice is the understanding and effective use of probability. Superforecasters don’t think in black-and-white, yes-or-no terms. They consider a range of possible outcomes and assign probabilities to each. They also update these probabilities as new information becomes available, a process known as Bayesian reasoning (see the worked example after this list). This probabilistic thinking helps them navigate uncertainty with greater precision.

3. Continuous Learning: The world is constantly changing, and so too are the variables that influence forecasts. Superforecasters are voracious learners, continuously updating their knowledge base. They stay informed about the latest developments in multiple areas, thus grounding their forecasts in the most current data and insights.

4. Mitigating Cognitive Biases: Cognitive biases can cloud judgment and lead to poor forecasts. Superforecasters are keenly aware of these biases and actively work to mitigate their impact. Through informed practice, they develop strategies to counteract such biases as overconfidence, anchoring, confirmation bias, and more, to make well-calibrated forecasts.
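To make Bayesian reasoning concrete, here is a minimal sketch in Python of a single probability update. The scenario and every number in it are invented for illustration; this is simply the textbook update rule, not a procedure prescribed by Good Judgment.

```python
# A minimal Bayesian update: revise P(event) after new evidence arrives.
# The scenario and all numbers below are invented for illustration.

def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the posterior probability of the event given the evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Suppose you give a 30% chance that an incumbent loses re-election (the prior).
# A credible poll then shows the challenger ahead. You judge such a poll would
# appear 70% of the time if the incumbent really were losing, but only 20% of
# the time otherwise.
posterior = bayes_update(prior=0.30, p_evidence_if_true=0.70, p_evidence_if_false=0.20)
print(f"Updated probability: {posterior:.0%}")  # prints "Updated probability: 60%"
```

Note that the forecast moves substantially (30% to 60%) but not all the way to certainty: the size of the update is dictated by how diagnostic the evidence is, not by how dramatic it feels.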

What Is the Role of Collaboration in This?

Informed practice is not a solitary endeavor. Collaboration with other forecasters is a powerful tool for improving accuracy and keeping one’s reasoning on track. By engaging in discussions, comparing notes, and challenging each other’s assumptions, forecasters can gain new perspectives and insights. Good Judgment’s Superforecasters work in teams, leveraging the collective intelligence of the group to arrive at superior forecasts.

What Practical Steps Can I Take?

1. Keep Track: Keep a record of your forecasts and compare them with the outcomes. Analyze your hits and misses to identify patterns and areas for improvement. (A minimal example of such a record appears after this list.)

2. Seek Feedback: Seek out feedback from peers or through forecasting platforms such as GJ Open, which provides performance metrics. Use this feedback to refine your approach.

3. Diversify Your Sources of Information: Regularly update your knowledge on the topics you forecast and seek out diverse sources. This includes staying current with news, research, and expert opinions, including those you disagree with.

4. Practice Probabilistic Thinking: Assign probabilities to your forecasts and be willing to adjust them as new information emerges. This helps you avoid the trap of binary thinking.

5. Challenge Your Assumptions: Regularly question your assumptions and be open to changing your mind. This flexibility is crucial in a rapidly changing world.

6. Get a Head Start with GJ Superforecasting Workshops: Consider enrolling in a Superforecasting workshop. Good Judgment’s workshops, led by Superforecasters and GJ data scientists, leverage our years of experience in the field of elite forecasting as well as new developments in the art and science of decision-making to provide you with structured guidance on improving your forecasting skills. Our practical exercises will boost your informed practice, offering you lifelong benefits.
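As a concrete starting point for steps 1 and 2, the sketch below keeps a small forecast log and scores it with the Brier score, the standard accuracy metric in forecasting research (here in its simple squared-error form; tournament scoring sometimes uses a two-term variant that doubles these numbers). The sample forecasts are invented.

```python
# A minimal forecast log scored with the Brier score. 0.0 is a perfect score;
# always forecasting 50% yields 0.25. All entries below are invented examples.

forecasts = [
    # (question, forecast probability, outcome: 1 = happened, 0 = did not)
    ("Candidate A wins the primary", 0.80, 1),
    ("Central bank cuts rates in Q3", 0.65, 0),
    ("Treaty ratified before year-end", 0.10, 0),
]

def brier(prob: float, outcome: int) -> float:
    """Squared error between the forecast probability and the actual outcome."""
    return (prob - outcome) ** 2

for question, prob, outcome in forecasts:
    print(f"{question}: forecast {prob:.0%}, outcome {outcome}, Brier {brier(prob, outcome):.3f}")

mean_score = sum(brier(p, o) for _, p, o in forecasts) / len(forecasts)
print(f"Mean Brier score: {mean_score:.3f}")
```

Reviewing a log like this over dozens of questions is what turns raw repetition into informed practice: persistent overconfidence or underconfidence shows up as a pattern in the scores, not just a feeling.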

Informed practice is the cornerstone of good forecasting and one of the secrets behind the success of Superforecasters. By diligently applying the above principles, you can enhance your forecasting skills and make better-informed decisions. See the workshops we offer to help you and your team take your forecasting success to the next level.

Superforecaster Tips: Dealing with Confirmation Bias in Election Forecasting

As the 2024 US election approaches, forecasters are faced with the daunting task of finding signal amid a cacophony of partisan noise, personal biases, and volatile public opinion. One significant challenge is confirmation bias—the tendency to search for, interpret, and recall information in a way that confirms one’s preconceptions. In this blog post, we draw on an internal discussion among seasoned Superforecasters to explore practical strategies forecasters can use to mitigate confirmation bias in election forecasting.

Diversifying Information Sources

“Assign yourself to spend some time reading (reasonably reputable) news sources that disagree with your general perspective on the question.”

Superforecasters highlight the importance of consuming a balanced diet of news sources, including those that challenge one’s beliefs. This approach was systematized by Good Judgment Project (GJP) Superforecaster Doug Lorch, who wrote a program to randomize his news intake among a diverse set of sources.

“It certainly didn’t hurt,” recalls Terry Murray, CEO Emeritus of Good Judgment Inc and Project Manager for the GJP at UC–Berkeley. “He was the top forecaster in the whole IARPA tournament that year.”
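The post doesn’t describe how Lorch’s program actually worked, but the idea is easy to reproduce. Here is one hypothetical way to randomize a reading list in Python; the source URLs are placeholders, not an endorsement of any outlet.

```python
# A hypothetical news-intake randomizer in the spirit of Doug Lorch's program
# (his actual implementation isn't described here). Source URLs are placeholders.
import random

SOURCES = [
    "https://left-leaning-daily.example.com",
    "https://right-leaning-daily.example.com",
    "https://centrist-wire.example.com",
    "https://foreign-affairs-review.example.com",
    "https://financial-daily.example.com",
]

def todays_reading(n: int = 3) -> list[str]:
    """Pick n distinct sources at random, so habit can't steer the selection."""
    return random.sample(SOURCES, k=n)

for url in todays_reading():
    print("Read:", url)
```

The point is not the code but the commitment device: delegating the choice of sources to chance removes the quiet pull toward outlets that already agree with you.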

Engaging in Scenario Analysis and Premortems

“I try to run through various scenarios where [the expected winner] could end up losing.”

Superforecasters routinely consider alternative outcomes by rigorously testing their own assumptions and logic. This involves running through various scenarios where expected outcomes might not materialize and thinking critically about the conditions that would lead to different results.

Embracing Epistemic Humility

“One thing I know is that I don’t know much.”

Acknowledging the limits of one’s knowledge and being open to new information is another tip the Superforecasters offer. This strategy is crucial for preventing overconfidence and being receptive to counterarguments.

Red Teaming

“One of the most important duties for me, as a Red Team member, is not to convince a forecaster that they are wrong… Rather, it’s to test the confidence of the Superforecaster in their own forecast.”

Having a red team challenge forecasts helps forecasters re-evaluate their confidence in their arguments and consider why they might be wrong. Red teaming is a standard practice in all of Good Judgment’s forecasting work.

Leveraging Collective Wisdom

“Sometimes, it pays to listen to the articulated reason of an outlier.”

Some Superforecasters use the median forecast of their group as a benchmark, particularly when their individual estimates deviate significantly from the consensus. This approach can provide a reality check against one’s own extremes. It is important, however, to pay attention to outlier opinions too, to resist conformity and groupthink.
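As a rough illustration of using the median as a benchmark while still hearing out outliers, consider the sketch below. The forecaster names and probabilities are invented, and the 20-point threshold is an arbitrary choice for the example, not a Good Judgment rule.

```python
# Using the group median as a benchmark while flagging outliers for review.
# Names, probabilities, and the 0.20 threshold are invented for illustration.
from statistics import median

group_forecasts = {"alice": 0.72, "bob": 0.68, "carol": 0.75, "dan": 0.70, "erin": 0.25}

benchmark = median(group_forecasts.values())
print(f"Group median: {benchmark:.0%}")

# An outlier may have spotted something the group missed, so the right move
# is to read their reasoning, not to average them away.
for name, prob in group_forecasts.items():
    if abs(prob - benchmark) > 0.20:
        print(f"Outlier to review: {name} at {prob:.0%}")
```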

As we dive into another election cycle, the discipline of forecasting reminds us that remaining actively open-minded is more crucial than ever. Combating confirmation bias in election forecasting is no small feat, given the complexity and the emotionally charged nature of politics. However, by employing strategies such as diversifying information sources, engaging in premortems, practicing epistemic humility, employing red teaming, and referencing the collective wisdom of peers, forecasters can enhance the accuracy and reliability of their predictions. Good Judgment’s exclusive forecast monitoring tool FutureFirst™ offers daily forecast updates on election results and trends and many other topics, brought to you by professional Superforecasters.

Learn More about FutureFirst™!