Building effective altruism

Priority ranking: to investigate
Related to: Readings and resources database (Cause Area)

Resources to check out


The argument for working on EA

  • Duda (2017) lays out the case compellingly:
    • Multiplier effect: building the EA community multiplies the impact of others (a slightly more nuanced lens is "being a multiplier")
    • Keeps your options open if the ideas of EA turn out to be wrong (note that I think this is also a reason to be careful about working on EA)
    • Neglected

The argument not to (also from Duda 2017)

Below each point I note the considerations that I think are most likely to affect whether you work on this
  • The ideas of EA are incorrect, or will be badly implemented (i.e. the community fails to put the ideas of EA into practice)
    • depends on how much you trust the EA community
    • depends on your model of how the EA community works
  • There is a specific problem that is much more pressing
    • depends on how significant you think longtermism is
  • There is a better opportunity for advocacy
    • depends on how sure you are that a given cause area is the best one
  • There is a better way to gain flexibility
    • depends on how much you think
      • gaining career capital early in your career (e.g. by working at a non-EA-aligned company) is valuable
      • working directly on other EA interventions gives you flexibility
  • Widespread promotion of the current ideas of effective altruism may be premature (ideas will change)
    • how much will ideas change?

What sub-tracks are there within working on EA?

  • Being a public intellectual
  • Being involved in operations
  • Running an EA group
  • Being involved in some large EA non-profit
  • Promoting EA on the side
  • Earning to give to EA orgs

Cause prioritization considerations

  • What causes is this cause more important than?
    • I think there is a strong case to be made that this is the most important cause area. The argument goes something like:
      • Most of the expected value of EA as a movement is in the future, as EA ideas mature
      • Thus, being a community builder who reduces existential risk to EA is very high impact, because even reducing EA's existential risk by 0.01% increases expected value a lot
  • What causes could be more important than this?
    • From a high-level cause-comparison perspective, other causes would be more important if you think that existential risk to EA is low, or that you aren't able to provide much benefit
    • From a personal career perspective, you might think that other causes give more career flexibility or are just higher impact
  • How does this interact with other causes?
    • Highly interrelated with other causes in the sense that:
      • Many causes are either
        • totally under the umbrella of EA (e.g. longtermism)
        • partly under the umbrella of EA (e.g. AI safety, global poverty)
          • This suggests that if EA has some extinction event (loosely defined: everyone comes to hate EA), this has a huge impact on these causes
    • Also interrelated on a day-to-day basis
      • community builders likely interact with people in different cause areas, help them make professional/social connections, etc.
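The expected-value argument above can be sketched numerically. This is only a back-of-envelope illustration of the structure of the argument; every number below is a hypothetical placeholder, not an estimate:

```python
# Back-of-envelope sketch of the expected-value argument for EA community
# building. All figures are hypothetical placeholders for illustration only.

ea_future_value = 1_000_000   # stand-in for the (large) value of EA ideas maturing
risk_reduction = 0.0001       # a community builder shaving 0.01% off EA's "extinction" risk

# Expected value gained = value at stake * reduction in probability of losing it
ev_gain = ea_future_value * risk_reduction
print(ev_gain)  # 100.0
```

The point is purely structural: when the value at stake is very large, even a tiny reduction in the probability of losing it carries a large expected value, which is why the 0.01% figure above can still matter.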

What this cause area most needs

  • This is hard to say, but in general:
    • Talent (rather than money)
    • Expertise in operations
    • People who can think about failure modes of EA
    • People who can reach out to others and build connections

Other notes