Scammers, filters, and an EA group

By Will Payne

This is a rough model of why I think culture is something group leaders should pay close attention to: in particular, how people enter the group, and how much we should worry about the effects of that.

The unpolished scam

Famously, some scam emails have terrible spelling and grammar. You might have seen something like the email below and wondered how anyone could possibly respond to it:

[Image: an example scam email with poor spelling and grammar]

The answer is that the world is big, the pool of people reached by the email is massive, and somewhere in it is someone vulnerable enough - or foolish enough - to fall for this first email. If someone falls for the first email, chances are they're gullible enough to fall for the whole scam. Scammers want their first email to be the least believable part of the scam, so that the non-believers and sceptics of the world ignore it, leaving the people they could plausibly take money from easy to pick out of the replies. It's better to be ignored by most, so long as the people who pay attention are the people you care about.

(Here is a better explanation of this scamming tactic)

The Evaporative Filter

A while ago, I read a rationalist article called Every Cause Wants to Be a Cult and an associated article Evaporative Cooling of Group Beliefs. The “evaporative cooling” analogy plays to my physics background, but the phenomenon isn't that different from the scammers and their unpolished emails.

The model is as follows: imagine a group of people who agree on an idea. Most are quite average, some are very sceptical, and some are radical. Whenever something happens to shake the group's foundations - some evidence against the ideology they support, a social scandal within the group, or the death of a thought leader - people start to lose faith. Who are the first to leave when that happens? Not the radicals, and not the average members (not straight away, at least); it's the sceptics. This leads to a group which is slightly more radical than it was before, and slightly more prone to crazy ideas which aren't in line with the initial group consensus. That scares off a few more of the most sceptical members, which makes the group more extreme, and so on, and so forth, until you have an unrecognisable group made up of the most radical people in the original society - plus a bunch of new, more zealous people they've picked up along the way.
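To make the feedback loop concrete, here is a toy simulation of the model (my own illustrative sketch, not code from either of the linked articles). Each member holds a made-up "radicalism" score; after every shock, the members furthest below the group average lose faith and leave, and because the exit threshold tracks the current average, each wave of departures drags the group further towards the extreme.

```python
import random
import statistics

# Toy simulation of "evaporative cooling" of group beliefs.
# This is an illustrative sketch, not code from the linked articles.
# Each member has a "radicalism" score; after each shock, members who sit
# well below the current group average lose faith and leave, which pulls
# the average upward and sets up the next wave of departures.

random.seed(0)

# A group of 1,000 members with views spread around a moderate mean.
group = [random.gauss(0.5, 0.15) for _ in range(1000)]

TOLERANCE = 0.1  # how far below the group average a member can sit before leaving


def apply_shock(members, tolerance=TOLERANCE):
    """Return the members who stay after a faith-shaking event."""
    mean = statistics.mean(members)
    return [m for m in members if m > mean - tolerance]


for shock in range(1, 6):
    group = apply_shock(group)
    print(f"after shock {shock}: {len(group)} members remain, "
          f"average radicalism {statistics.mean(group):.2f}")
```

Running this, the group shrinks and its average "radicalism" creeps upward with each shock, even though no individual member's views ever change; the people who would have pulled the average back down are exactly the ones who leave.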

The implication scares me. If the model is correct, groups tend towards their most off-putting beliefs, because anyone who might oppose those beliefs gets driven away by how off-putting they are. The culture of a group isn't static; it has momentum. Culture attracts people who agree with it and drives away people who don't, and the result is motion towards the extremes. The scariest part is that this wouldn't be a one-off problem. It could happen on every axis (maybe something happens which makes the radicals leave first, and you get an overly sceptical group which never generates new ideas). It seems like an incredibly natural thing: every cause wants to be a cult, and you need to work to make sure yours isn't. There is no static culture, and you need to keep correcting for shifts like the one described above in order to avoid cultishness.

Tying into EA groups

As a scammer, your first email does most of the work in selecting the people you'll spend time trying to fool. The first email might reach 1000 times as many people as actually respond. As an EA group member, the way you describe EA for the first time selects the people who will join your community. If you describe it in the way that persuaded you, then you'll only get people like you. If you misrepresent EA, then you'll get people who don't fully agree with the core claims of the movement. Maybe they'll leave later on, and you'll have wasted their time and your own. Maybe they'll stick around and start shifting the culture of the movement.

Culture has momentum, and extreme ideas attract extremists. As an EA group leader, if you misrepresent EA you not only attract people who don't agree with its claims; you might shift the culture in a way that attracts even more of them, and perhaps scare off the people who do agree with the core claims of the movement. Culture can change in other ways too: events that attract you might only attract people like you and put off people who aren't, which is especially bad because you then end up with a community that only wants to put on that kind of event. New ideas might scare off the most sceptical EAs and leave us in an echo chamber. Culture has momentum, and small changes in one direction can be harder to reverse than they were to create.

The general claim I'm making here is something like this: representing an idea is a lot harder, and a lot more important, than it might seem. Small errors can have a large effect, and one person's misunderstanding can affect a lot more than just their own beliefs. As EA community builders, we need to work hard to keep the culture in a good place, both by getting it there and by holding it there. We also affect the culture by being the ones who welcome people into the community, which means that, consciously or unconsciously, we are filtering the people who join. We should make sure to filter for people who agree with the core ideas - but not zealously - and actively make sure we aren't filtering only for people who think like us.

A few examples

Just to put this ramble into context, I thought I'd add a few examples:

  • When EA Oxford runs an introductory fellowship, the first thing we measure is not attendance but the number of participants who end up sticking around in the community. We don't want to push someone through an 8-week programme if they realise in the first week that EA isn't for them.
  • Recently, the EA Oxford committee decided to always actively display our pronouns in Zoom meetings (and we plan to do the same at physical events when they start up again). This is both a direct step towards making our community more welcoming and a clear signal of our intention to be welcoming to people who are gender nonconforming (and, implicitly, to everyone). Hopefully, this will put off people who don't agree with that goal from joining the group, and will attract people who might otherwise have worried that we weren't welcoming enough.
  • Original versions of this In-Depth Fellowship were far heavier on philosophy papers and heavily academic content. We worried this was scaring off people who weren't interested in pursuing a deeply academic career as a member of the EA community. To filter for a wider range of thinking styles and goals, we decided to reduce the amount of philosophy in the core weeks of the curriculum.