Week 3: How do you form beliefs?

Please note that this week’s exercise requires you to begin right after your Week 2 discussion session ends.


This week, we will discuss the project of developing a clearer picture of the world and improving our thinking, both for ourselves and for our work. We’ll evaluate the argument for why this might be important, explore some reasons to be excited about the project, and consider some next steps.

The project of effective altruism tries to develop a clearer picture of the world in order to take actions that would improve it. Some examples of questions we can ask are:

  • How many animals are suffering right now?
  • Which country’s culture is most likely to spread to others over the next 50 years such that efforts there will be multiplied?
  • How much can I reduce the chance of existential catastrophe this century?

These are questions about the world and what happens within it - rather than the questions about values we discussed last week. Getting as close to the truth as possible is essential for figuring out how to actually have the impact we’re aiming for. If you need to decide in which country to run a corporate cage-free campaign, you need to ask questions like:

  • Which country has the most caged chickens?
  • Which country's companies are the most amenable to campaigning?
  • How much does it cost to run a campaign in this country?

However, these questions are hard. Not only are the questions themselves incredibly difficult, but people are also plagued by bias, predictably overconfident, and prone to groupthink. There are also people who are systematically underconfident and shy away from making guesses for fear of being wrong. Fortunately, some people have developed an impressive track record of predicting what will happen in the world. If they managed to overcome their biases and improve their thinking, maybe we can too.

Developing your thinking can also be incredibly exciting! The reading this week includes stories and examples from a few EAs who worked on developing this skill and found it both rewarding and incredibly valuable when doing good. We’ve also included some useful next steps and will take the time in the session to discuss the merits and failings of each suggestion, as well as the personal relevance for you.

We also include an exercise this week focussed on journaling and noticing our thought processes and biases. It’s often difficult to go from hearing a bias or mode of thinking described to noticing it within yourself, but we think this may be the most important step in improving your thinking. It’s hard to check yourself for these biases in the moment, so it is useful to practice noticing, reflecting, and talking about them with others.

Core Reading

Exercise (~80 mins)

One way to start building good habits when evaluating tough questions is to journal about your thought processes. Writing a thought down often makes it easier to notice modes of thinking you’d rather avoid and to pull out practices you’d like to encourage. In doing the journaling and prediction exercises below, the hope is that you’ll notice a few trends in your thinking that bias you one way or another, or a common mistake that you can correct for.

Journaling (>5 mins a day for the week)

Once a day, for the week leading up to your session, open up a document or notepad and write down your thoughts on whichever of these prompts seems the most interesting to you (this might change day to day):

  • Practice noticing confusion and uncertainty. What are some instances where you’ve been confused and uncertain today? What did that feel like?
  • Practice noticing overconfidence and underconfidence. What are some instances of this you had today? What did that feel like? How did you know you were over-/underconfident?
  • Practice asking yourself “How might I be wrong about X? How could I find out if I’m wrong?” where X could be something you've been thinking about today. Jot down your answer.
  • Practice asking yourself “How might [someone you respect] be wrong about Y? How could I find out if [so-and-so] is wrong?” Jot down your answer.
  • Practice asking yourself “What would make me change my mind about Z? What are some steps I could take to get info on those things?” Jot down your answer.

Making Predictions (30 mins)

An incredibly valuable way to test your ideas about the world is to use them to make predictions. We judge our models of physics, the economy, and politics based on how accurate their predictions are. We can do the same with our own personal ideas about the world. Throughout the week leading up to the session, try to write down any predictions you make, and maybe share them with your Fellowship cohort. If you and your cohort are interested, you could even make small bets on your predictions to test your confidence in them. Below are a few examples of predictions you could make, but feel free to try any predictions that pop into your head over the week.

  • I predict my project/essay/deadline today will take me 10 hours to complete.
    • I'm 90% sure of this and if it doesn't take 10 hours I think it will almost certainly take less than 15.
  • I predict a politician from my country will announce policy X this week.
  • I predict I will spend 2 hours on In-Depth Fellowship reading this week.
  • I predict I will spend <2 hours scrolling through TikTok this week (that would be too ambitious for me).
  • I predict that the talk I'm attending this week will have X attendees.
  • I predict that the chance I change my prioritisation in this cause area is X over the next 2 years.

Check out this post if you want some ideas about useful types of predictions.
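If you want to score your predictions at the end of the week, one common tool is the Brier score: the mean squared difference between your stated probabilities and what actually happened (lower is better). Here is a minimal sketch in Python; it is our own illustration, not part of the curriculum, and the example predictions are hypothetical:

```python
# Minimal prediction log with Brier scoring. Each entry is
# (description, stated probability, actual outcome as True/False).
predictions = [
    ("Essay takes no more than 10 hours",   0.90, True),
    ("Politician announces policy X",       0.60, False),
    ("I spend 2+ hours on the reading",     0.80, True),
]

def brier_score(entries):
    """Mean squared error between stated probabilities and 0/1 outcomes."""
    return sum((p - float(outcome)) ** 2 for _, p, outcome in entries) / len(entries)

score = brier_score(predictions)
print(f"Brier score over {len(predictions)} predictions: {score:.3f}")
```

A score near 0 means you were confident and correct; always guessing 50% scores 0.25, so consistently scoring above that suggests your probabilities are poorly calibrated.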

Further Reading

Here are some ways to learn more about overcoming bias:


  • Thinking, Fast and Slow (Amazon link) (Daniel Kahneman, 2012): This book summarises research into "System 1" and "System 2", the two modes of thinking we all have, whose quirks and specialities drastically affect our day-to-day decision making.
  • Superforecasting: The Art and Science of Prediction (Amazon link) (Philip Tetlock, 2015): In this book, Philip Tetlock describes the results of his landmark 12-year study showing that there are some people with real, provable foresight when making difficult predictions.
    • “Everything we do involves forecasts about how the future will unfold. Whether buying a new house or changing jobs, designing a new product or getting married, our decisions are governed by implicit predictions of how things are likely to turn out. The problem is, we're not very good at it.”
  • Rationality: A-Z (or “The Sequences”) is a series of blog posts by Eliezer Yudkowsky on human rationality and irrationality in cognitive science. It is an edited and reorganized version of posts published to LessWrong and Overcoming Bias between 2006 and 2009. This collection serves as a long-form introduction to formative ideas behind LessWrong, the Machine Intelligence Research Institute (MIRI), the Center for Applied Rationality (CFAR), and substantial parts of the effective altruist community. Each book also comes with an introduction by Rob Bensinger and a supplemental essay by Yudkowsky.

Blogs and Forums


  • Clearer Thinking: this website hosts free courses and tools that aim to bridge the gap between research on human behaviour and thinking, and people who could use that research to change their habits, improve their thinking, and achieve their goals.
  • Rationally Speaking | Official Podcast of New York City Skeptics: This biweekly podcast hosted by Julia Galef focuses on applying rationalist ideas to a wide range of interesting topics.