Superforecaster Profile: Robert de Neufville

GJ: You qualified as a Superforecaster in IARPA’s forecasting tournament in Season 3. Tell us a little about your journey to becoming a Superforecaster.

RdN: The first time I did anything like formal forecasting was on an early fantasy sports site called ProTrade, where you competed by trading football players like stocks. I won about $2000 over the first few months just by more accurately projecting the production of different players. After a few months, they changed the rules in a way that made it harder for me to win. Somewhere I still have a hat they gave me. When I heard about the Good Judgment Project in 2013—right before Season 3—I was confident I could do well. I qualified as a Super in 2014 at the end of my first year.

GJ: You have degrees in government and political science from Harvard and Berkeley. However, at Good Judgment Inc we also focus on many economic and financial questions. How do you approach forecasts that are outside of your field of expertise?

RdN: I probably know a lot more about political science than most people, but I’m sure my advisers would tell you I wasn’t a great political scientist or a great academic. I’m just temperamentally much more of a generalist than a specialist. I’m interested in more or less everything, so I already knew something about a lot of the subjects the questions covered. When I know less about a subject—like epidemiology—I really enjoy coming up to speed on it. I think teaching yourself a subject requires some of the same kind of judgment about what’s significant and credible that forecasting does.

GJ: On the question of whether Russia would invade Ukraine that we had on our Superforecasting platform in February, you were firmly in the camp that saw an invasion as more likely than not, and you said early on that, if it did happen, it would be a full-scale invasion. You also wrote, “I hate forecasting horrible things.” Thinking back to the Good Judgment Project and then your work for Good Judgment Inc, what has been the hardest question for you to forecast, emotionally? Which one has been most fun?

RdN: I wish I were as prescient as you make me sound! When I started formally forecasting the question in early February, it did seem likely to me that Russia would invade because it was essentially already initiating the process of invading. I thought Russia would go beyond the Donetsk and Luhansk oblasts, but I still didn’t think Russia would try a full-scale invasion. Like a lot of other forecasters, I didn’t take Putin’s public statements seriously enough and I didn’t think he would commit what seemed to me an obvious strategic mistake.

I don’t enjoy predicting a country is about to commit an atrocity. There’s a part of me that would have preferred not to know Russia was about to invade, much less be the bearer of that news. At one point during the Good Judgment Project, we were asked to predict whether Egyptian political prisoners would be executed. I think a lot of us felt ghoulish speculating about the deaths of specific human beings. But I think the hardest questions for me emotionally are the questions about US politics because I care a lot about democracy in the US and find it stressful to even think that it could be under threat.

I think I’d like it if I could forecast some really surprising piece of good news about renewable energy or something.

GJ: You’ve also worked for the Global Catastrophic Risk Institute and researched existential risks associated with AI and nuclear threats, among others. How does one forecast risks that are so out there that it would be nearly impossible to establish a base rate?

RdN: I think you have to rely a lot less on base rates and more on your own analysis of the facts. But there are things you can do to estimate the background risk of rare or unprecedented events. If you’re trying to forecast the chance of nuclear war, for example, you can try to break down each pathway to war into discrete steps. How frequently do false alarms occur? How often do nuclear-armed countries go to war with one another? You can look at the history of close calls and try to use your judgment about how likely similar events are to lead to war. Seth Baum, Tony Barrett, and I have a paper that looks at some of these issues.
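To make that kind of decomposition concrete, here is a minimal sketch in Python. The pathway, the step names, and all of the numbers are made-up illustrations, not estimates from Robert or from the Baum, Barrett, and de Neufville paper; the point is only how frequencies and conditional probabilities along one pathway multiply into a rough background rate.

```python
# Illustrative pathway decomposition for a rare event (all numbers are assumptions).
# Pathway: serious false alarm -> alarm is read as a real attack -> retaliatory launch ordered.

false_alarms_per_year = 0.5      # assumed rate of serious false alarms per year
p_misread_given_alarm = 0.05     # assumed chance an alarm is taken to be a real attack
p_launch_given_misread = 0.2     # assumed chance a misread alarm leads to a launch

# Multiplying the steps gives a rough annual probability for this single pathway.
p_war_via_false_alarm = (
    false_alarms_per_year * p_misread_given_alarm * p_launch_given_misread
)

print(f"Annual probability via this pathway: {p_war_via_false_alarm:.4f}")  # 0.0050
```

A fuller estimate would sum over other pathways (deliberate escalation, a conventional war going nuclear, and so on) and would attach wide uncertainty ranges to each input, since the historical record of close calls only loosely constrains them.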

GJ: Early one Saturday morning in January 2018, together with every other cell phone user across the state of Hawaii, you received an alert of a ballistic missile heading your way. Twelve minutes later—and that’s exactly the time some say it would take a North Korean missile to reach Hawaii—it was confirmed that this was a false alarm. You describe this incident on your blog. In those first minutes after receiving the initial alert on your phone, did you think that forecasting had failed you? Is forecasting even useful for life-and-death decisions?

RdN: We mostly don’t forecast things that would be that useful to me in my personal life. I did stop going to the gym about a month before we locked down here in Hawaii because I knew COVID was spreading. If I had lived in Ukraine, I think I would have thought about fleeing the country well before the invasion started. But forecasting the North Korean nuclear program didn’t really help when the ballistic missile alert came. My best guess was that North Korea couldn’t hit Honolulu with a ballistic missile, but I thought we should probably take the alert seriously. Still, I didn’t think an attack was likely enough to have made a realistic plan for what to do if one occurred. We live in a one-bedroom apartment in a wooden building. No one here has a concrete basement where we could shelter. We just put our clothes on and moved away from the windows. I don’t think that would have done us much good if a nuclear missile had hit downtown Honolulu.

GJ: Like many of your fellow Superforecasters, you have said elsewhere that forecasting takes a fair amount of time. Do you have a process that aids you? Is it different for making an initial forecast and for updating an existing forecast?

RdN: I often feel that if I had more time to devote to forecasting, I could do a better job. I often wish I had time to go through historical data more carefully. I also wish I had more time to scan the news and update my forecasts. My score is often a lot worse than it would be if I had time to update my forecast constantly. But the great thing about working with other Supers at Good Judgment is that there is a good chance someone else has done the research you haven’t had time to do. I think it’s important always to form your own opinion—otherwise you’re not adding any real value—but I find it very valuable to draw on others’ insights once you’ve had a chance to form your own initial ideas.

Robert (left) creates the NonProphets (Super)forecasting Podcast together with fellow Superforecasters Atief Heermance and Scott Eastman.

GJ: The NonProphets: (Super)forecasting Podcast that you create with fellow Superforecasters Scott Eastman and Atief Heermance is about to air its 100th episode. That’s quite a milestone! How did this project start? Is the 100th episode going to be extra special? 

RdN: We have to think of something to do for the 100th episode. We don’t have anything special planned right now. I think Atief deserves most of the credit for coming up with the idea to do a podcast. I hope our listeners enjoy our conversations, but I think a lot of the reason we keep doing it is that the three of us just really enjoy talking about politics and forecasting with each other.

GJ: One of the most popular posts on our blog, Insights, has been about Superforecasters’ reading preferences. What’s on your reading list right now? 

RdN: Two great books that I keep coming back to when I think about the conflict in Ukraine are Geoffrey Blainey’s The Causes of War and Thomas Schelling’s The Strategy of Conflict. My thinking about politics has also been heavily influenced by George Orwell’s essay “Politics and the English Language” and Hannah Arendt’s essay “Lying in Politics”. Ludwig Wittgenstein’s Philosophical Investigations has had a huge influence on how I think about both knowledge and philosophy. As for my reading list right now, I have two novels by the side of my bed that I’m trying to choose between: W.G. Sebald’s Austerlitz and James Baldwin’s Giovanni’s Room.

GJ: What advice would you give those getting started at forecasting and wanting to improve their skills?

RdN: I think it really helps to read how the best forecasters explain their forecasts. Examine your own ideas critically and look for holes in your logic. And then practice. You can improve every skill with practice.

GJ: Thank you, Robert. As always, great talking to you.

Intrigued?

Stop guessing. Start Superforecasting.

Schedule a consultation to learn how our FutureFirst monitoring tool, custom Superforecasts, and training services can help your organization make better decisions.


Keep up with Superforecasting news. Discover training opportunities. Make better decisions. Sign up for Good Judgment’s newsletter.
