Common Questions about Good Judgment Inc and Superforecasters

A Primer on Good Judgment Inc and Superforecasters

At Good Judgment Inc (GJI), the official home of Superforecasting®, we pride ourselves on our ability to provide well-calibrated and insightful forecasts. As we continue to partner with clients and media worldwide, it is worthwhile to address some of the common questions we receive about our work. Here is a primer on our story, probabilistic forecasts, and our team of Superforecasters.

What’s in a Name? GJP, GJI, and GJ Open

The Good Judgment Project (GJP)
In 2011, the Intelligence Advanced Research Projects Activity (IARPA) launched a massive tournament to identify the most effective methods for forecasting geopolitical events. Four years, 500 questions, and over a million forecasts later, the Good Judgment Project (GJP), led by Philip Tetlock and Barbara Mellers at the University of Pennsylvania, emerged as the clear winner of the tournament. The research project concluded in 2015, but its legacy lives on. The GJP is credited with the discovery of Superforecasters, people who are exceptionally skilled at assigning realistic probabilities to possible outcomes even on topics outside their primary subject-matter training.

Good Judgment Inc (GJI)
GJI is the commercial successor to the GJP and the official home of Superforecasting® today. We build on the lessons learned during the IARPA tournament and on insights gained in our subsequent work with Phil Tetlock and his research colleagues, as well as with leading companies, academic institutions, governments, and non-governmental organizations, to provide the best and the latest in forecasting and training services. Our goal is to help organizations make better decisions by harnessing the power of accurate forecasts. GJI relies on a team of Superforecasters, as well as data and decision scientists, to provide forecasting and training to clients.

Good Judgment Open (GJ Open)
GJO, or GJ Open, is our public platform, open to anyone interested in making forecasts. Unlike GJI, which involves professional Superforecasters, GJO welcomes participation from the public. The “Open” in GJ Open not only signifies that it’s accessible to all but also draws a parallel to golf tournaments. Forecasting questions vary in their complexity, so there is no absolute score to indicate a “good” forecast. We use the median of participants’ scores as a benchmark, similar to par in golf, where lower scores indicate better performance.
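
For readers who want to see the golf analogy in numbers, here is a minimal sketch of how such a benchmark can be computed. It assumes Brier-style accuracy scoring (squared error between forecast probabilities and what actually happened, lower is better); GJ Open’s exact scoring rules may differ in detail, and the forecasts below are invented for illustration.

```python
from statistics import median

def brier_score(forecast_probs, outcome_index):
    """Brier (squared-error) score for one question; lower is better."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

# Hypothetical binary question where answer 0 is what actually occurred.
crowd_forecasts = [[0.9, 0.1], [0.6, 0.4], [0.3, 0.7]]
scores = [brier_score(f, outcome_index=0) for f in crowd_forecasts]
par = median(scores)  # the crowd median plays the role of "par"

for s in scores:
    verdict = "beats par" if s < par else "at or above par"
    print(f"score={s:.2f} ({verdict}, par={par:.2f})")
```

As in golf, the point of “par” is relative: a score only means something when compared with how the rest of the field did on the same question.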

A Note on Spelling
You may have noticed that “judgment” is spelled without an “e” on all our platforms. This is a consistent choice across GJP, GJI, and GJ Open, reflecting our preference for the parsimonious American spelling of the word.

Understanding Probabilistic Forecasts
[Image: sample forecast on FutureFirst™, 12 July 2024]

Our forecasts are not polls. They are aggregated probabilistic predictions about specific events. For instance, Superforecasters gave Joe Biden an 82% chance of winning the 2020 US presidential election. This means that if the election could be rerun 100 times under the same conditions, Biden would be expected to win about 82 of them.

A common misconception is interpreting a probabilistic forecast as “X% of Superforecasters say a particular outcome will happen.” In reality, each Superforecaster provides their own probabilistic forecast, and we aggregate these individual predictions to reach a collective forecast. Therefore, an 82% forecast does not mean 82% of Superforecasters believe a certain outcome will occur. It is an aggregated probability of the outcome (an 82% probability of it occurring and an 18% probability of a different outcome) based on all individual forecasts.
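
To make the distinction concrete, here is a minimal sketch in Python. The aggregation shown (a simple median) and the individual probabilities are purely illustrative assumptions; Good Judgment’s actual aggregation method is more sophisticated.

```python
from statistics import median

# Hypothetical individual forecasts of the same outcome, one per forecaster.
individual_forecasts = [0.85, 0.80, 0.75, 0.90, 0.82]

# A simple illustrative aggregate: the median of individual probabilities.
aggregate = median(individual_forecasts)

# Contrast with the mistaken "share of forecasters who say yes" reading.
share_above_half = sum(p > 0.5 for p in individual_forecasts) / len(individual_forecasts)

print(f"Aggregate probability of the outcome: {aggregate:.0%}")
print(f"Share of forecasters above 50%: {share_above_half:.0%}")
# Note: here every forecaster is above 50%, yet the aggregate is 82%.
# "82%" describes the outcome's probability, not a head count.
```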

Understanding Superforecasters’ Backgrounds

Good Judgment works with some 180 Superforecasters from around the world whose forecasting accuracy placed them in the top 1-2% of the more than 100,000 forecasters who took part in the GJP or qualified on GJ Open. Our Superforecasters come from a wide range of professional fields, including finance, IT, humanities, social sciences, engineering, and more. This allows them to approach forecasting questions in a well-rounded way, combining their exceptional forecasting skills with specialized knowledge in different areas.

Age and Geographic Diversity
Superforecasters range in age from their 20s to their 70s and hail from different parts of the world. This geographic and demographic diversity helps to ensure that our forecasts are informed by a broad spectrum of experiences and viewpoints.

The Wisdom of the Crowd
We emphasize the importance of the wisdom of the crowd. Our Superforecasters read different publications in various languages and bring diverse perspectives to the table. To borrow terminology from Tetlock’s training materials in the GJP, some are Intuitive Scientists, others are Intuitive Historians, while still others are Intuitive Data Scientists.

Collaborative Nature of Forecasting
Forecasting at GJI is a team effort. We focus on collective intelligence. It’s not about individual forecasting superheroes tackling challenges alone but about identifying people who bring unique strengths to the table as a team of Superforecasters.

Get actionable early insights on top-of-mind questions by subscribing to our forecasting monitoring tool, FutureFirst™!

Superforecaster Tips: Dealing with Confirmation Bias in Election Forecasting

As the 2024 US election approaches, forecasters are faced with the daunting task of finding signal amid a cacophony of partisan noise, personal biases, and volatile public opinion. One significant challenge is confirmation bias—the tendency to search for, interpret, and recall information in a way that confirms one’s preconceptions. In this blog post, we draw on an internal discussion among seasoned Superforecasters to explore practical strategies forecasters can use to mitigate confirmation bias in election forecasting.

Diversifying Information Sources

“Assign yourself to spend some time reading (reasonably reputable) news sources that disagree with your general perspective on the question.”

Superforecasters highlight the importance of consuming a balanced diet of news sources, including those that challenge one’s beliefs. This approach was systematized by Good Judgment Project (GJP) Superforecaster Doug Lorch, who wrote a program to randomize his news intake among a diverse set of sources.

“It certainly didn’t hurt,” recalls Terry Murray, CEO Emeritus of Good Judgment Inc and Project Manager for the GJP at UC Berkeley. “He was the top forecaster in the whole IARPA tournament that year.”
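
Lorch’s actual program is not public, but a minimal sketch of the idea might look like the following; the outlet names and groupings are placeholders, not an endorsement of any particular reading mix.

```python
import random

# Placeholder outlets grouped by rough editorial leaning (hypothetical names).
SOURCES = {
    "left-leaning": ["outlet_a", "outlet_b", "outlet_c"],
    "centrist": ["outlet_d", "outlet_e"],
    "right-leaning": ["outlet_f", "outlet_g", "outlet_h"],
}

def daily_reading_list(seed=None):
    """Pick one outlet per leaning, in random order, so no single
    perspective dominates the day's reading."""
    rng = random.Random(seed)
    picks = [rng.choice(outlets) for outlets in SOURCES.values()]
    rng.shuffle(picks)
    return picks

print(daily_reading_list())  # e.g., ['outlet_e', 'outlet_h', 'outlet_b']
```

The design choice matters: by delegating the selection to a random draw across the whole spectrum, the forecaster removes their own preferences from the decision of what to read that day.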

Engaging in Scenario Analysis and Premortems

“I try to run through various scenarios where [the expected winner] could end up losing.”

Superforecasters routinely consider alternative outcomes by rigorously testing their own assumptions and logic. This involves running through various scenarios where expected outcomes might not materialize and thinking critically about the conditions that would lead to different results.

Embracing Epistemic Humility

“One thing I know is that I don’t know much.”

Acknowledging the limits of one’s knowledge and being open to new information is another tip the Superforecasters offer. This strategy is crucial for preventing overconfidence and being receptive to counterarguments.

Red Teaming

“One of the most important duties for me, as a Red Team member, is not to convince a forecaster that they are wrong… Rather, it’s to test the confidence of the Superforecaster in their own forecast.”

Having a red team challenge forecasts helps forecasters re-evaluate the confidence in their arguments and consider why they might be wrong. Red teaming is standard practice in all of Good Judgment’s forecasting.

Leveraging Collective Wisdom

“Sometimes, it pays to listen to the articulated reason of an outlier.”

Some Superforecasters use the median forecast of their group as a benchmark, particularly when their individual estimates deviate significantly from the consensus. This approach can provide a reality check against one’s own extremes. It is important, however, to pay attention to outlier opinions as well, to resist conformity and groupthink.
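
As a toy illustration, such a reality check might look like the sketch below; the 0.15 deviation threshold is an arbitrary assumption chosen for the example, not a Good Judgment rule.

```python
from statistics import median

def outlier_check(my_prob, group_probs, threshold=0.15):
    """Flag a forecast that deviates from the group median by more
    than `threshold`; a prompt to re-examine, not to conform."""
    benchmark = median(group_probs)
    gap = abs(my_prob - benchmark)
    if gap > threshold:
        return (f"Outlier: you={my_prob:.0%}, median={benchmark:.0%}. "
                f"Re-examine your reasoning, but don't conform automatically.")
    return f"Within {threshold:.0%} of the group median ({benchmark:.0%})."

print(outlier_check(0.40, [0.70, 0.65, 0.75, 0.68]))
```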

As we dive into another election cycle, the discipline of forecasting reminds us that remaining actively open-minded is more crucial than ever. Combating confirmation bias in election forecasting is no small feat, given the complexity and the emotionally charged nature of politics. However, by diversifying information sources, engaging in premortems, practicing epistemic humility, employing red teaming, and referencing the collective wisdom of peers, forecasters can enhance the accuracy and reliability of their predictions. Good Judgment’s exclusive forecast monitoring tool FutureFirst™ offers daily forecast updates on election outcomes, trends, and many other topics, brought to you by professional Superforecasters.

Learn More about FutureFirst™!

Decoding SCOTUS: Navigating Media Bias in Supreme Court Forecasting

Ryan Adler, Superforecaster, GJ managing director, and leader of Good Judgment’s question team, shares tips on how to approach forecasting Supreme Court decisions.

May is a lovely time of year. There’s lots of daylight, newborn critters are all around, and Supreme Court junkies start to get their fix from now through the end of June. As someone who has watched the Court enthusiastically for more than a quarter of a century, I always look forward to decision season. However, in all those years, I’ve learned a painful lesson: the press is generally terrible at reporting on the Court. For Court junkies like me, it’s not that big a deal. I learned early in my studies that if I want to know what a case means, I need to dig into the weeds myself. But as someone who has also written dozens of SCOTUS forecasting questions, I know that many forecasters don’t have the benefit of having read thousands of pages of appellate decisions to inform their analysis. So, as is the case for anybody forecasting on a topic with which they aren’t intimately familiar, they lean on press reporting to fill in the gaps.

“Creating an emotional investment in the outcome of a case is poison for a forecaster.”

Alas, most journalists and talking heads intent on telling readers and viewers “how it is” end up just regurgitating heavily politicized schtick from one side of the aisle or the other. A glaring example is Trump v. United States, the presidential immunity case out of the DC Circuit Court of Appeals. For the vast majority of the press, it’s a question of what the Court will let happen to Trump in Jack Smith’s federal election interference case. For anyone who understands the Constitution and listened to oral arguments in the case, the Court knows it is grappling with something much bigger and more fundamental than Trump’s actions and statements on January 6th. Questions of presidential immunity go to the heart of the US separation of powers, perhaps the greatest innovation of the Founding Fathers and one that has proven key to good governance in modernity. Framing the case as democracy in peril (you can find such arguments from both Trump’s supporters and detractors) is, in this writer’s opinion, theater. This isn’t to say that the events of January 6th are nothing of concern, nor does it make light of the potential precedents to be set. But let’s be honest: if the party affiliation of the defendant in this case were flipped, who honestly thinks the arguments from the “legal experts” in the media would look the same? That doesn’t make these people liars, but it highlights that these “experts” are doing something very different from what we ask forecasters to do: figure out what will happen irrespective of what they feel should happen. Such “experts” placate and/or infuriate, which is how politics works.

What is to be done?

Creating an emotional investment in the outcome of a case is poison for a forecaster. We can debate all day about the political implications of a Court decision, and the nine robed ones don’t operate in a vacuum. That all notwithstanding, there is a stark difference between being mindful of the political factors that influence the conversation around the Court and acting as though those implications are the sole axis upon which a case will turn. It may be hard not to see anything and everything through a political lens, especially when most media outlets have given up pretending that their reporting isn’t first run through a political prism.

So, what is to be done? Just as James Madison argued that ambition must be made to counteract ambition, this writer suggests that cynicism must be made to counteract cynicism. Take the words of any legal “expert” or talking head with a large grain of salt. Being a lawyer doesn’t make you an expert on the Constitution; many practicing lawyers would probably admit that the last time they paid close attention to constitutional law was while prepping for a bar exam, and gaining the title of “contributor” doesn’t change that. If what you read or hear feels like exactly what you personally want to read or hear, be suspicious. That’s sound advice for anything in the media, but it’s especially crucial for navigating the flood of gibberish that inundates the airwaves and web when a high-profile case makes it to Washington.

Do you have what it takes to be a Superforecaster? Find out on GJ Open!