Tribute to Daniel Kahneman: Insights and Memories from Superforecasters

“Superforecasters are better than others at finding relevant information—possibly because they are smarter, more motivated, and more experienced at making these kinds of forecasts… Superforecasters are less noisy than…even trained teams.”

—Kahneman, Sibony, & Sunstein, in Noise: A Flaw in Human Judgment

“For as long as I have been running forecasting tournaments, I have been talking with Daniel Kahneman about my work. For that, I have been supremely fortunate. … Talking to Kahneman can be a Socratic experience: energizing as long as you don’t hunker down into a defensive crouch. So in the summer of 2014, when it was clear that superforecasters were not merely superlucky, Kahneman cut to the chase: ‘Do you see them as different kinds of people, or as people who do different kinds of things?’ My answer was, ‘A bit of both.’”

—Phil Tetlock in Superforecasting: The Art and Science of Prediction


 

Daniel Kahneman, Nobel Prize winner and a friend of Good Judgment, died aged 90 on 27 March 2024. He was a true pioneer in the realms of judgment, decision-making, and the psychological underpinnings that affect our forecasting work. Beyond his monumental contributions to psychology and economics, Kahneman took a keen interest in the art and science of Superforecasting, discussing Superforecasters in his last published book Noise and gracing us with his presence at a Good Judgment workshop. In this tribute to Daniel Kahneman, Superforecasters share insights and memories on how his work influenced and enriched them.

Superforecaster Dan Mayland:

The best tribute I can give is to share a photo of my copy of Thinking, Fast and Slow, which I read around the same time I started forecasting. I have this (slightly!) obsessive habit of affixing small Post-It notes to book passages that I think are particularly insightful and don’t want to lose track of once I turn the page. (Even though, the mind being what it is, I do.) It’s my version of taking pictures of sunsets in an attempt to preserve the memory. The photo below speaks to the level of insight I thought Kahneman had to offer.

 

Co-founder and CEO Emeritus of Good Judgment Inc, Terry Murray:

In the early days of Good Judgment Inc, I had the privilege of being in an extended conversation that included our co-founders Phil Tetlock and Barb Mellers along with Danny Kahneman and a couple of his colleagues. I expected Danny to have great questions and insights on the substantive side of our forecasting work because of his own research interests and his past interest in the Good Judgment Project itself. What surprised me most at the time was that Danny’s questions about the business side of our fledgling company were among the best questions anyone had ever posed to me.

Looking back now, I shouldn’t have been surprised. The handful of opportunities I had to interact directly with Danny all confirmed what his writings on judgment and decision-making tell us: Danny was a master at cutting to the chase. Whatever the topic under discussion, he would quickly filter out the noise and home in on the signal. If I could create the ideal AI personal assistant, it would be one that would pose Danny-like questions to challenge me until I too filtered out the noise and focused on what is most important. I’d like to think that Danny would approve.

Superforecaster and CEO of Good Judgment Inc, Warren Hatch:

A while back, we ran a “Noise Challenge” on Good Judgment Open that was inspired by the launch of the book Daniel Kahneman co-wrote. He generously agreed to sign copies that we could present to the top performers at the end of the competition. So I went by his house to meet him in the lobby. I figured it would take five minutes of his time tops. Instead, we sat down and spent the better part of an hour going over some of the questions then posed to the Superforecasters. It was a masterclass.

Superforecaster Giovanni Ciriani:

I read Thinking, Fast and Slow in early 2013; on 21 March 2013, David Brooks wrote the NY Times column “Forecasting Fox,” which was centered on Tetlock and the Good Judgment Project. It credited Kahneman’s Thinking, Fast and Slow as training material. I read the column only because the book was mentioned. On 22 March 2013, I signed up for the GJP. Of course, one could play counterfactuals to see which factor was more important in my joining, but I see Kahneman’s as the sine qua non, the condition without which it would not have happened.

Superforecaster and GJ Director Ryan Adler:

I will ashamedly confess that my knowledge of Danny Kahneman’s work was limited before I found myself in the Good Judgment Project, but immersing myself in it was and is an absolute pleasure. To borrow some phrasing from Robert Heinlein, Kahneman and Tversky studied what man is, not what they wanted him to be, and it is saddening to know that there will be less of that kind of thinking with his passing.

Superforecaster Vijay Karthik:

I have learned a lot from reading articles about him and from his book. My life has become much more enriched by trying to put his recommendations into action, to the best extent possible.

Superforecaster David Fisher:

I remember listening to him on a podcast. He was pessimistic that information alone could change opinions about global warming. He said people were likely to consider new information only if they identified with the person giving it. I have read Thinking, Fast and Slow and most of Noise and have nothing but respect. Decisions made purely on the basis of economic self-interest? He disproved that. He deserved the Nobel.

Superforecaster Dwight Smith:

His influence on my life has been tremendous. It was through his work that I became a far better forecaster. And that has led to a multitude of adventures and good works. To be so profoundly influenced by such a great mind even once in a lifetime is quite a gift.

Beliefs as Hypotheses: The Superforecaster’s Mindset

Superforecasters’ Toolbox: Beliefs as Hypotheses

Nine years after the conclusion of the IARPA forecasting tournament, one Good Judgment discovery remains the most consequential idea in today’s dynamic world of forecasting: the discovery of Superforecasters. The concept of Superforecasting has at its heart a simple but transformative idea: The best-calibrated forecasters treat their beliefs not as sacrosanct truths, but as hypotheses to be tested.

Superforecasting emerged as a game-changer in the four-year, $20-million research tournament run by the US Office of the Director of National Intelligence to see whether crowd-sourced forecasting techniques could deliver more accurate forecasts than existing approaches. The answer was a resounding yes—and there was more. About 2% of the participants in the tournament were consistently better than others in calling correct outcomes early. What gave them the edge, the research team behind the Good Judgment Project (GJP) found, was not some supernatural ability to see the future but the way they approached forecasting questions. For example, they routinely engaged in what Tetlock calls in his seminal book on Superforecasting “the hard work of consulting other perspectives.”

Central to the practice of Superforecasters is a mindset that encourages a continual reassessment of assumptions in light of new evidence. It is an approach that prizes being actively open-minded, constantly challenging our own perspectives to improve decision-making and forecasting accuracy. As we continue to explore the tools Superforecasters use in their daily work at Good Judgment, we look at what treating beliefs as hypotheses means and how it can be done in practice.

Belief Formation

Beliefs are shaped by our experiences and generally reinforced by our desire for consistency. When we encounter new information, our cognitive processes work to integrate it with our existing knowledge and perspectives. Sometimes this leads to the modification of prior beliefs or the formation of new ones. More often, however, this process is susceptible to confirmation bias and the anchoring effect. (Both Daniel Kahneman’s Thinking, Fast and Slow and Noise, the latter co-authored with Olivier Sibony and Cass R. Sunstein, provide an accessible overview of how cognitive biases affect our thinking and belief formation.)

It is not surprising, then, that in traditional forecasting, beliefs have been viewed as conclusions drawn from existing knowledge or expertise. These beliefs tended to be steadfast and were slow to change. Leaders and forecasters alike didn’t like being seen as flip-floppers.

During the GJP, Superforecasters challenged this notion. In forecasting, where accuracy and adaptability are paramount, they demonstrated that the ability to change one’s mind brought superior results.

The Superforecaster’s Toolkit

What does this mean in practice? Treating beliefs as hypotheses means being actively open-minded. That in turn requires an awareness and mitigation of cognitive biases to ensure a more balanced and objective approach to evaluation of information.

  • To begin with, Superforecasters constantly ask themselves—and each other—whether their beliefs are grounded in evidence rather than assumption.
  • As practitioners of Bayesian thinking, they update their probabilities based on new evidence.
  • They also emphasize the importance of diverse information sources, ensuring a comprehensive perspective.
  • They have the courage to listen to contrasting viewpoints and integrate them into their analysis.

This method demands rigorous evidence-based reasoning, but it is worth the effort, as it transforms forecasting from mere guesswork into a systematic evaluation of probabilities. It is this willingness to engage in the “hard work of consulting other perspectives” that has enabled the Superforecasters to beat the otherwise effective futures markets in foreseeing the US Fed’s policy changes.

Cultivating a Superforecaster’s Mindset

Adopting this mindset is not without challenges. Emotional attachments to long-held beliefs can impede objectivity, and the deluge of information available can be overwhelming. But a Superforecaster’s mindset can and should be cultivated wherever good calibration is the goal. Viewing beliefs as flexible hypotheses is a strategy that champions open-mindedness over rigidity, ensuring that our conclusions are always subject to revision and refinement. It allows for a more effective interaction with information, fostering a readiness to adapt when faced with new data.

It is the surest path to better decisions.

Good Judgment Inc offers public and private workshops to help your organization take your forecasting skills to the next level.

We also provide forecasting services via our FutureFirst™ dashboard.

Explore our subscription options ranging from the comprehensive service to select channels on questions that matter to your organization.

Looking Back at 2023: Good Judgment’s Year of Impact and Innovation


As we start another year at Good Judgment, it’s time to reflect on our achievements in 2023 and the journey ahead for 2024. The past year was a testament to our continued commitment to excellence in forecasting.

Superforecasters’ Track Record and Updates to FutureFirst™

In finance, our Superforecasters outperformed the CME FedWatch Tool by a staggering 66%. They also anticipated the June inflection point in US Federal Reserve policy notably earlier than the futures markets. This success was reported by the New York Times and Financial Times.

Our FutureFirst service expanded in 2023, offering direct subscriptions and specialized forecast channels, including “Superforecasting the Middle East” and “Superforecasting Ukraine.” With daily updates, API access, and dozens of Superforecasts on top-of-mind questions, it’s a comprehensive tool for the future-conscious decision-maker.

Forecasting Services and GJ Open Challenges

We actively engaged with our clients, enabling them to pose 79 new questions to the Superforecasters in 2023. We also launched exciting new challenges on our public forecasting platform, Good Judgment Open, complete with 475 new forecasting questions! Partnerships with industry giants and educational institutions like Harvard Kennedy School have been pivotal in broadening our reach and have offered organizations a unique way to spot in-house talent, train interns, and identify trends.

Training Services

Our workshops and the Forecasting Aptitude Screening Tool (FAST) have empowered organizations to refine their forecasting skills and assess their team’s capabilities. An analysis of workshop outcomes released by our data science team showed marked improvement in forecasting accuracy for workshop participants.

Other Highlights

Notable Mentions of Superforecasters in the Media in 2023

We’re excited to continue this journey in 2024 and invite you to stay connected through our newsletter. Cheers to a new year filled with insightful forecasts and strategic growth for our partners!