Meet the winner of the Q3 2024 “Right!” said FRED Challenge

The winner of the Q3 2024 “Right!” said FRED Challenge, Julio Vieiro, is a retired geologist with two decades’ experience in the oil industry as well as an investor, photographer, and painter. Known on GJ Open as JAVL, in this interview he discusses his interest in forecasting, his diverse hobbies, and his tips for fellow forecasters. He lives in Argentina.

GJO: Could you please tell us about yourself and your background?

Hi. I was born and live in Argentina. Professionally, I’m a geologist, and I have also studied biology and engineering. I worked for almost two decades in the oil industry, and I’m currently retired. Additionally, I’ve been a photographer and painter for many years. Nowadays I dedicate my ever-scarce time to personal investments in the stock market, artistic activities, and a two-year program to become a professional sommelier. I enjoy mountains, the outdoors, sports, reading, and barbecues with my partner, my two adult children, and close friends. I’m interested in diverse topics.

GJO: How did you first become interested in forecasting? What brought you to GJ Open specifically?

During my professional development, I was always involved in the forecasting of technical parameters such as hydrocarbon saturation, mineralized thickness, porosity, pressure, etc. In this context, I was often curious why many highly trained professionals had an unconscious tendency to “fall in love” with their own ideas or models, or to expect results mainly according to their desires or what they thought someone expected, wrongly allocating resources or expectations. Frequently, people were more accurate evaluating their peers’ projects than their own.

In my case, Daniel Kahneman’s Thinking, Fast and Slow was very revealing. It allowed me to give shape to the intuition that humans have evolved not to develop accurate objective explanations of reality but to generate quick, plausible explanations in order to operate effectively in our original environment, which is not the current one. The way I see it, our minds often tend to take shortcuts that seem reasonable and/or comforting instead of doing the hard work of looking for evidence. This characteristic does not always help us understand complex situations, especially if we are emotionally involved.

Regarding GJ Open, during the pandemic I read Philip Tetlock’s Superforecasting and found it captivating. I decided to join GJ Open as a fun challenge to test in a non-subjective way my abilities to predict results of complex situations.

GJO: You had some tough competition in the Q3 2024 “Right!” said FRED Challenge. In your opinion, what helped you top the leaderboard?

I’m sure there’s always a dose of luck, some randomness. On the other hand, I try to pay attention to economic data. There is a great availability of information and projections, and it’s difficult to weigh which variables are really relevant and which have little or no impact. I also try to understand the probabilities of future events that could change the observed trends.

GJO: What types of forecasting questions do you enjoy the most? What topics would you like to see more of on the platform in 2025?

I’m more interested in some topics, such as economics, technology, space, sports. I tend to focus on what I enjoy most and what I’m interested in learning about. Evaluating the possible results helps understanding and stimulates the search for information. I prefer questions with data available, not just opinions. I also prefer questions where possible outcomes are expressed in ranges with multiple probabilities rather than yes/no or single probability answers.

I would like to have more questions related to the Latin American reality. In particular, my country is currently going through a complex and interesting economic and political process with still uncertain results.

GJO: What tips could you offer beginner forecasters on GJ Open?

I’m not sure what the best advice is, and I suppose there are many things that could be said. In any case, what I try to do is understand the question very well, not give an opinion on things that I am totally unaware of, look for reliable information and sources of data with frequent updates, and distinguish the relevant factors from the many that could influence a result.

What I find most important, however, is to find the point at which doubt or uncertainty makes us start to feel uncomfortable with the probabilities we assign. In my experience, people tend to feel too confident or secure. In many of these cases, something important could be missing. On the contrary, I often tend to overextend the alternatives to ensure I don’t err. I try, then, to force myself to rethink the problem in a different way, look more thoroughly for information, or review what I have, adjusting the answer until I feel unsure. It’s not usually easy. Logically, at some point there are cases that are mathematically defined or where the probabilities seem to me less than 1/1000. Another exercise that I find useful is to assign a higher likelihood to those outcomes that don’t match the one that I prefer.

GJO: Is there anything else you’d like to add?

It seems to me that the exercise of making forecasts accustoms us to confronting our ideas with reality and trains us to make decisions with probabilistic logic, which I consider very important for both professional and personal life. It is also a very fun activity, a game where one competes against others and against oneself. Many thanks to Good Judgment for the enjoyment.

See the latest forecasting challenges on GJ Open and try your hand at forecasting!

Interview with the winner of the “Right!” said FRED Challenge

In this interview, we sit down with the winner of the “Right!” said FRED Challenge for Q2 2024, Sigitas Keras. Known on GJ Open as sigis, Sigitas is an experienced quant and trader who decided to explore the world of forecasting after an impressive 25-year career in finance. With a PhD in mathematics and a natural curiosity about the world, he shares insights into the unique challenge he has taken on to forecast every question on GJO in 2024 and the strategies that helped him excel on the platform. Originally from Lithuania, Sigitas currently lives in Canada.

GJO: What is your background, and how did you first become interested in forecasting?

I was born in Lithuania and have a PhD in maths, but, like many others with a similar background, ended up in the finance industry. After almost 25 years as a quant and a trader, I recently retired, which freed up a lot of time for other things. I tried forecasting on GJO for the first time a couple of years ago. It seemed like an interesting challenge where I could combine analytical skills and general curiosity about the world.

GJO: How did you learn about GJ Open? How would you describe your experience on the platform so far?

I read Tetlock’s book Superforecasting, so likely that was the initial prompt, but to be honest I don’t remember the full details anymore. Rightly or wrongly, I am one of the few forecasters who decided to forecast every question in 2024. It was very enjoyable, and I feel I learnt a lot both about forecasting and about various topics, but I have to admit this is getting too difficult to maintain. I don’t think I’ll continue doing all questions next year, and most likely will just focus on a few challenges, but I still like to maintain a good mixture of various topics.

GJO: What was your approach to the “Right!” said FRED Challenge? What do you think helped you come out on top?

I like questions that have good supporting data. In that sense, the FRED challenge is perfect for me. Whenever there is good data available, I try to use some mathematical model. Having a background in the finance industry helps a bit with that, although I don’t think I use anything that requires more than FRED and other publicly available data and a Google spreadsheet. I also try to update my forecasts regularly, typically once a week. I think consistency is another important component of successful forecasting.

GJO: What topics would you consider of particular interest to forecast for 2025 and beyond?

I tend to forecast better when there is good data available for analysis. On the other hand, geopolitical questions are often much more challenging, so perhaps I will focus on improving there. My goal is to improve my score in the Superforecasting Workshops challenge!

GJO: Is there anything you would like to add that would be of particular interest to other forecasters on GJ Open?

I feel I am still very new to forecasting and to the community. One thing I hope is to learn more about other forecasters, their backgrounds, their approaches to forecasting. And if anyone has any questions for me, feel free to reach out.

Keeping It Civil – Promoting an Open-Minded Dialog on Good Judgment Open

A wise crowd encompasses diverse views.

Good Judgment Project research found that being an “actively open-minded thinker” is positively correlated with being an accurate forecaster. That’s no mystery. Exposure to views with which we disagree – even views that we find repugnant – can inform our understanding of the world in which we live. And the better we understand that world, the better we can project what will happen under various conditions.

On Good Judgment Open (GJO), our public forecasting platform, we strive to foster the candid exchange of opinions without personal attacks on others who express opposing views and without using profane language or offensive epithets. We do not censor comments simply because they express opinions with which we, individually or as a company, disagree.

We rely primarily on our forecasting community to let us know if any comments posted on GJO fall outside the reasonable bounds of civil discourse. Any forecaster can “flag” a comment, which triggers a review by our site administrators. We’re happy to report that flagged comments are rare; from time to time, however, we have deleted inflammatory remarks, especially ones that personally insult other forecasters, and have cautioned GJO forecasters to find ways to “disagree without being disagreeable.”

Can Forecasting Tournaments Reduce Political Polarization?

The relative rarity of nastiness on GJO may surprise those accustomed to the rough-and-tumble of the Twitterverse. But it’s little surprise to Good Judgment researchers. Our co-founder Barb Mellers launched a three-year, National-Science-Foundation-funded research program to investigate whether participation in forecasting tournaments could moderate political polarization. Specifically, Mellers and her co-authors Philip Tetlock and Hal Arkes wanted to explore:

How feasible is it to induce people to treat their political beliefs as testable probabilistic propositions open to revision in response to dissonant as well as consonant evidence and arguments?

Mellers, B., Cognition, https://doi.org/10.1016/j.cognition.2018.10.021.

They put this question to the test for two years in a forecasting tournament they dubbed the Foresight Project. This tournament went beyond prior Good Judgment Project research in that it included questions relating to controversies in US domestic politics and not just about geopolitical events elsewhere. That increased the likelihood that Foresight Project participants, who were predominantly US residents, would hold strong views related to the forecasting questions.

The Mellers et al. findings shed light on why the conversation on Good Judgment Open seems so civil compared to what we see elsewhere on social media.

Forecasting tournaments are not a panacea for what ails our political conversations. Even the most accurate forecasters – including some who earn the Superforecaster® credential – occasionally express their views in polarizing prose and endorse opinions that many consider to be immoral. In that respect, forecasting accuracy is like other forms of competence – in Phil Tetlock’s words, “[t]here is no divinely mandated link between morality and competence.”[1]

Nonetheless, we are optimistic that Good Judgment Open contributes to a more thoughtful public dialog and encourages our forecasters to listen carefully to points of view that they might otherwise dismiss. For example, we took great pleasure in hosting an “adversarial collaboration” challenge on the Iran nuclear deal, inspired by a New York Times op-ed co-written by Phil Tetlock and Peter Scoblic. We plan to engage GJO forecasters with more opportunities to test whether their views on controversial subjects lead to more or less accurate predictions.

As the 2020 election cycle intensifies, we forecast with near certainty (p>.99) that the public debate will grow even more heated and more personal. Together, let’s preserve Good Judgment Open as a place where facts and reasoned argument reign supreme. We have nothing to lose but our illusions.

[1] Tetlock, P., & D. Gardner (2015). Superforecasting: The Art and Science of Prediction. New York: Crown. P. 229.