Tribute to Daniel Kahneman: Insights and Memories from Superforecasters

“Superforecasters are better than others at finding relevant information—possibly because they are smarter, more motivated, and more experienced at making these kinds of forecasts… Superforecasters are less noisy than…even trained teams.”

—Kahneman, Sibony, & Sunstein, in Noise: A Flaw in Human Judgment

“For as long as I have been running forecasting tournaments, I have been talking with Daniel Kahneman about my work. For that, I have been supremely fortunate. … Talking to Kahneman can be a Socratic experience: energizing as long as you don’t hunker down into a defensive crouch. So in the summer of 2014, when it was clear that superforecasters were not merely superlucky, Kahneman cut to the chase: ‘Do you see them as different kinds of people, or as people who do different kinds of things?’ My answer was, ‘A bit of both.’”

—Phil Tetlock in Superforecasting: The Art and Science of Prediction


 

Daniel Kahneman, Nobel Prize winner and a friend of Good Judgment, died aged 90 on 27 March 2024. He was a true pioneer in the realms of judgment, decision-making, and the psychological underpinnings that affect our forecasting work. Beyond his monumental contributions to psychology and economics, Kahneman took a keen interest in the art and science of Superforecasting, discussing Superforecasters in his last published book Noise and gracing us with his presence at a Good Judgment workshop. In this tribute to Daniel Kahneman, Superforecasters share insights and memories on how his work influenced and enriched them.

Superforecaster Dan Mayland:

The best tribute I can give is to share a photo of my copy of Thinking, Fast and Slow, which I read around the same time I started forecasting. I have this (slightly!) obsessive habit of affixing small Post-It notes to book passages that I think are particularly insightful and don’t want to lose track of once I turn the page. (Even though, the mind being what it is, I do.) It’s my version of taking pictures of sunsets in an attempt to preserve the memory. The photo below speaks to the level of insight I thought Kahneman had to offer.

 

Co-founder and CEO Emeritus of Good Judgment Inc, Terry Murray:

In the early days of Good Judgment Inc, I had the privilege of being in an extended conversation that included our co-founders Phil Tetlock and Barb Mellers along with Danny Kahneman and a couple of his colleagues. I expected Danny to have great questions and insights on the substantive side of our forecasting work because of his own research interests and his past interest in the Good Judgment Project itself. What surprised me most at the time was that Danny’s questions about the business side of our fledgling company were among the best questions anyone had ever posed to me.

Looking back now, I shouldn’t have been surprised. The handful of opportunities I had to interact directly with Danny all confirmed what his writings on judgment and decision-making tell us: Danny was a master at cutting to the chase. Whatever the topic under discussion, he would quickly filter out the noise and home in on the signal. If I could create the ideal AI personal assistant, it would be one that would pose Danny-like questions to challenge me until I too filtered out the noise and focused on what is most important. I’d like to think that Danny would approve.

Superforecaster and CEO of Good Judgment Inc, Warren Hatch:

A while back, we ran a “Noise Challenge” on Good Judgment Open that was inspired by the launch of the book Daniel Kahneman co-wrote. He generously agreed to sign copies that we could present to the top performers at the end of the competition. So I went by his house to meet him in the lobby. I figured it would take five minutes of his time tops. Instead, we sat down and spent the better part of an hour going over some of the questions then posed to the Superforecasters. It was a masterclass.

Superforecaster Giovanni Ciriani:

I read Thinking, Fast and Slow in early 2013. On 21 March 2013, David Brooks wrote the NY Times column “Forecasting Fox,” which centered on Tetlock and the Good Judgment Project and credited Kahneman’s Thinking, Fast and Slow as training material. I read the column only because the book was mentioned. On 22 March 2013, I signed up for the GJP. Of course, one could play counterfactuals to see which factor was more important in my joining, but I see Kahneman’s book as the sine qua non, the condition without which it would not have happened.

Superforecaster and GJ Director Ryan Adler:

I will confess, somewhat sheepishly, that my knowledge of Danny Kahneman’s work was limited before I found myself in the Good Judgment Project, but immersing myself in it was and is an absolute pleasure. To borrow some phrasing from Robert Heinlein, Kahneman and Tversky studied what man is, not what they wanted him to be, and it is saddening to know that there will be less of that kind of thinking with his passing.

Superforecaster Vijay Karthik:

I have learned a lot from articles about him and from his book. Life has become much richer as I try to put his recommendations into action, to the best extent possible.

Superforecaster David Fisher:

I remember listening to him on a podcast. He was pessimistic that information alone could change opinions about global warming. He said people were likely to consider new information only if they identified with the person giving it. I have read Thinking, Fast and Slow and most of Noise and have nothing but respect. The idea that decisions are made purely out of economic self-interest? He disproved that. He deserved the Nobel.

Superforecaster Dwight Smith:

His influence on my life has been tremendous. It was through his work that I became a far better forecaster. And that has led to a multitude of adventures and good works. To be so profoundly influenced by such a great mind even once in a lifetime is quite a gift.

Beliefs as Hypotheses: The Superforecaster’s Mindset


Nine years after the conclusion of the IARPA forecasting tournament, one Good Judgment finding remains the most consequential idea in today’s dynamic world of forecasting: the discovery of Superforecasters. The concept of Superforecasting has at its heart a simple but transformative idea: The best-calibrated forecasters treat their beliefs not as sacrosanct truths, but as hypotheses to be tested.

Superforecasting emerged as a game-changer in the four-year, $20-million research tournament run by the US Office of the Director of National Intelligence to see whether crowd-sourced forecasting techniques could deliver more accurate forecasts than existing approaches. The answer was a resounding yes—and there was more. About 2% of the participants in the tournament were consistently better than others in calling correct outcomes early. What gave them the edge, the research team behind the Good Judgment Project (GJP) found, was not some supernatural ability to see the future but the way they approached forecasting questions. For example, they routinely engaged in what Tetlock calls in his seminal book on Superforecasting “the hard work of consulting other perspectives.”

Central to the practice of Superforecasters is a mindset that encourages a continual reassessment of assumptions in light of new evidence. It is an approach that prizes being actively open-minded, constantly challenging our own perspectives to improve decision-making and forecasting accuracy. As we continue to explore the tools Superforecasters use in their daily work at Good Judgment, we look at what treating beliefs as hypotheses means and how it can be done in practice.

Belief Formation

Beliefs are shaped by our experiences and generally reinforced by our desire for consistency. When we encounter new information, our cognitive processes work to integrate it with our existing knowledge and perspectives. Sometimes this leads to the modification of prior beliefs or the formation of new ones. More often, however, this process is susceptible to confirmation bias and the anchoring effect. (Both Daniel Kahneman’s Thinking, Fast and Slow and Noise, the latter co-authored with Olivier Sibony and Cass R. Sunstein, provide an accessible overview of how cognitive biases affect our thinking and belief formation.)

It is not surprising, then, that traditionally in forecasting, beliefs have been viewed as conclusions drawn from existing knowledge or expertise. These beliefs tended to be steadfast and were slow to change. Leaders and forecasters alike didn’t like being seen as flip-floppers.

During the GJP, Superforecasters challenged this notion. In forecasting, where accuracy and adaptability are paramount, they demonstrated that the ability to change one’s mind brought superior results.

The Superforecaster’s Toolkit

What does this mean in practice? Treating beliefs as hypotheses means being actively open-minded. That in turn requires an awareness and mitigation of cognitive biases to ensure a more balanced and objective approach to evaluation of information.

  • To begin with, Superforecasters constantly ask themselves, and each other, whether their beliefs are grounded in evidence rather than assumption.
  • As practitioners of Bayesian thinking, they update their probabilities based on new evidence.
  • They also emphasize the importance of diverse information sources, ensuring a comprehensive perspective.
  • They have the courage to listen to contrasting viewpoints and integrate them into their analysis.

This method demands rigorous evidence-based reasoning, but it is worth the effort, as it transforms forecasting from mere guesswork into a systematic evaluation of probabilities. It is this willingness to engage in the “hard work of consulting other perspectives” that has enabled the Superforecasters to beat the otherwise effective futures markets in foreseeing the US Fed’s policy changes.

Cultivating a Superforecaster’s Mindset

Adopting this mindset is not without challenges. Emotional attachments to long-held beliefs can impede objectivity, and the deluge of information available can be overwhelming. But a Superforecaster’s mindset can and should be cultivated wherever good calibration is the goal. Viewing beliefs as flexible hypotheses is a strategy that champions open-mindedness over rigidity, ensuring that our conclusions are always subject to revision and refinement. It allows for a more effective interaction with information, fostering a readiness to adapt when faced with new data.

It is the surest path to better decisions.

Good Judgment Inc offers public and private workshops to help your organization take your forecasting skills to the next level.

We also provide forecasting services via our FutureFirst™ dashboard.

Explore our subscription options ranging from the comprehensive service to select channels on questions that matter to your organization.

Superforecasters vs FT Readers: Results of an Informal Contest in the Financial Times


In early 2023, the Financial Times launched an informal contest that pitted the predictive prowess of the FT’s readership against that of Good Judgment’s Superforecasters. The contest involved forecasting key developments in the year ahead, ranging from geopolitical events to economic indicators to heat waves to sport. The results? Illuminating.

“To help illustrate what makes a superforecaster worthy of the name, the FT asked both readers and Good Judgment superforecasters to make predictions on ten questions early this year, spanning from GDP growth rates to viewership of the Fifa women’s world cup final,” Joanna S Kao and Eade Hemingway explain in their article.

A total of 95 Superforecasters made forecasts on the questions, while the reader poll had about 8,500 respondents.

The Results

Nine of the ten questions have now been scored. On a scale where 0.5 equals guessing and 1 equals perfect prediction, Superforecasters scored an average of 0.91 over nine questions, significantly outperforming FT readers who scored 0.73.
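For readers curious how such a scale can work: one simple mapping consistent with the anchors above (0.5 for guessing, 1 for a perfect forecast) is a linear rescaling of the Brier score for binary questions. This is an illustration of the idea, not necessarily the FT’s exact formula:

```python
def brier(forecast: float, outcome: int) -> float:
    """Brier score for one binary question: 0 is perfect, 0.25 is a coin flip."""
    return (forecast - outcome) ** 2

def rescaled_score(forecast: float, outcome: int) -> float:
    """Map the Brier score onto a scale where 0.5 = guessing and 1 = perfect."""
    return 1 - 2 * brier(forecast, outcome)

# A 50/50 guess scores 0.5 regardless of the outcome...
print(rescaled_score(0.5, 1))            # 0.5
# ...while a confident, correct forecast approaches 1.
print(round(rescaled_score(0.9, 1), 2))  # 0.98
```

On this scale, the Superforecasters’ 0.91 average sits much closer to the “perfect” end than the readers’ 0.73.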

In this informal contest, the Superforecasters continued to work on the questions throughout the year, while the reader poll closed early. Based on everyone’s initial forecasts as of 2 February 2023, however, the Superforecasters outperformed the general crowd on eight out of nine questions.

Key to Forecasting Success

Key to the Superforecasters’ success, as the article notes, is their methodology. They focus on gathering comprehensive information, minimizing biases, and reducing the influence of irrelevant factors that only create noise. This methodological rigor stems from the foundational work of Philip Tetlock, a pioneer in the study of expert predictions and co-founder of Good Judgment Inc.

Read the full article in the FT for a fascinating glimpse into the realm of Superforecasting.

To benefit from Superforecasters’ insights on dozens of top-of-mind questions with probabilities that are updated daily, learn more about our forecast monitoring service, FutureFirst.

 



Ready to Elevate Your Forecasting Game in 2024?
Embrace the Future with Our Exclusive End-of-Year Offer!

As we approach the end of another dynamic year, it’s time to think ahead. Are you looking to empower your team’s professional development in 2024? Look no further. Our renowned Superforecasting® Public Workshop will help you make your forecasting more precise than ever.

Join Our January 2024 Public Workshop – Unleash the Power of Accurate Forecasts!

➡️ What to Expect in the Superforecasting Workshop?

  • Advanced Techniques: Dive into the latest forecasting methodologies.
  • Expert Insights: Learn directly from top forecasters and researchers.
  • Interactive Sessions: Engage in hands-on exercises for practical learning.

 

📈 Special Limited-Time Offer:
Claim your 20% discount on the January Public Workshop (16 & 18 January 2024). Use code GJ20XMS at registration. Hurry, offer valid while seats last!

Stop Guessing, Start Superforecasting with Good Judgment’s Superforecasting® Workshop. Register here.