I want an ethnography of EA

post by Holly_Elmore · 2019-05-02T20:33:38.915Z · score: 66 (40 votes) · EA · GW · 33 comments

Commissioning an ethnography or routine anthropological observation of EA communities could be good for our epistemic hygiene. A lot of the big differences of opinion in EA today don't come down to empirical matters, but to differences in priors and values. It's difficult to get anywhere using logic and debate when the real difference between sides is, say, how realistic a catastrophe feels or whether you lean negative utilitarian. One productive way I see to move forward is identifying strong motives or forces that lead us to hold certain beliefs for reasons other than their truth.

With longtermism, EA is trending into areas where it's difficult to make short-term testable predictions that could expose motivated reasoning or bias (let alone unforeseen complications). Without many empirical checks available in EA's new hot topics, I don't know how to adjudicate between my biases and everyone else's. That's why I think it would be extremely interesting to see what a sociologist or ethnographer had to say about this topic and everything else we do.

What I'd want is a breakdown of the social dynamics and the role that beliefs play in them, from someone who won't get bogged down in the content of EA beliefs -- just a descriptive analysis of which beliefs and their associated behaviors are doing what in our system. In particular, I'd want to know which beliefs they viewed as serving important social functions, i.e. as having the greatest reason to persist without being true.

I have my issues with anthropology as a field, but I generally approve of the practices around ethnography. I think an ethnography of EA would be a valuable outside opinion that could offer unique access to difficult-to-debug areas of our thinking. Though not my top priority, such a document might also reveal promising areas for movement-building or previously unappreciated community vulnerabilities.

I'm just curious what people think about this idea for now. Please let me know if this has already been done (yay!). I haven't heard of an anthropological study of EA yet, and a quick Google search didn't produce anything relevant. Even if it has been done before, EA moves so fast that it could surely be done profitably again. While googling for this post, I learned that businesses commission ethnographies for the kinds of reasons I'm listing, so we might be able to hire someone like that without having to interest a researcher in us for their own work. An analysis of the social function of our beliefs is potentially such a good epistemic check that perhaps we should hire someone to do it on a regular basis, or try to get someone interested in us for a dissertation.

33 comments

Comments sorted by top scores.

comment by aarongertler · 2019-05-03T04:57:06.541Z · score: 31 (17 votes) · EA · GW

I think ethnography could be useful. But what I really want is for people to spend more time discussing why they make donations, prioritize certain causes above others, etc.

People write about this on the Forum all the time, but the number of people who post on the Forum is a tiny fraction of the number who donate a lot of money, want to work in a certain field, etc. I don't mind if people have lots of hand-wavey bits in their models (lord knows I do); I mostly want to see what kinds of reasons they think they have:

  • How many of us mostly make decisions by putting a lot of trust in EA organizations?
  • How many of us found that the decisions we were making already matched up with what EA organizations recommended?
  • How many of us do any kind of independent analysis of orgs we support, or even read what those orgs write about themselves?

...and so on. Invisible motives can be very powerful, but they don't have to be invisible. (Now that I've said this, I realize I should write up a "where I'm giving and why" post at the end of this year; thanks for the inspiration, Holly!)

comment by MichaelPlant · 2019-05-03T09:13:05.133Z · score: 12 (6 votes) · EA · GW

Can you spell out why you'd like to see that? As I read your comment I immediately thought 'I would also like to see this', and then realised I wasn't sure why self-reports of reasons would be useful.

comment by aarongertler · 2019-05-04T01:55:45.215Z · score: 6 (4 votes) · EA · GW

This could be a long essay, but here are the two points which most stand out to me:

1. I'd like a culture of more honesty/transparency in EA around, specifically, charitable giving; it's a huge part of the movement, but few people talk openly about their own giving decisions, which seems like it has a few different bad effects (for example, making it seem like direct work is a much bigger part of EA than it is, thus increasing the pressure on people to do direct work and feel like donating doesn't matter).

2. I want to learn from people who have spent time thinking about giving, even if those thought processes aren't completely clear or unbiased. I can't possibly follow all of the interesting charities that might appeal to EAs, so seeing where people give is often really informative for me.

(I work for CEA, but these views are my own.)

comment by Milan_Griffes · 2019-05-03T22:04:41.914Z · score: 3 (2 votes) · EA · GW

+1.

Seems like there are lots of incentive effects & cognitive biases that'd be activated when someone writes up a public-facing account of their prioritization & donation decisions.

comment by aarongertler · 2019-05-04T01:52:17.331Z · score: 3 (2 votes) · EA · GW

Well, the idea would be to try and write your way through those biases and incentives as best you can, on the premise that EA should have a culture where it's fine not to have all the numbers and to have a personal pull in certain directions, as long as you can recognize this. I'd guess that 90+% of Giving What We Can members don't have really distinct personal models for their donations, for example, and I'd be interested to hear how they choose instead.

comment by Milan_Griffes · 2019-05-04T03:28:41.353Z · score: 2 (1 votes) · EA · GW

> the idea would be to try and write your way through those biases and incentives as best you can

I think a crux here is that I'm bearish about the community being able to collectively write its way through this in a way that's positive on net.

It seems like you're more bullish about that.

(I agree that getting more truth-tracking info about why folks are making the decisions they make is a good goal. I think we have a tactical disagreement about how to surface truth-tracking information.)

comment by aarongertler · 2019-05-07T00:29:11.224Z · score: 4 (2 votes) · EA · GW

I think that if a lot of people tried to do this, few would fully succeed, and most would mostly fail, but that we'd all learn a lot in the process and get better at bias-free belief reporting over time.

The EA community has become unusually good at some forms of communication (e.g. our online discussions are more civil and helpful than those almost anywhere else), and I think that's partly a function of our ability to help each other improve through the use of group norms, even if no group member fully adheres to those norms.

comment by Milan_Griffes · 2019-05-07T18:43:50.589Z · score: 2 (1 votes) · EA · GW

> I think that if a lot of people tried to do this, few would fully succeed, and most would mostly fail, but that we'd all learn a lot in the process and get better at bias-free belief reporting over time.

Right. I'm modeling some subset of the failures as negative expected value, and it's not obvious to me that the positive impact of the successes would outweigh the impact of these failures.

> The EA community has become unusually good at some forms of communication (e.g. our online discussions are more civil and helpful than those almost anywhere else)

Totally agree. I don't understand why our communication norms are so good (compared to benchmarks).

Because I don't have a believable causal model of how this came to be, I have a Hayekian stance towards it – I'm reluctant to go twiddling with things that seem to be working well via processes I don't understand.

comment by aarongertler · 2019-05-07T21:26:20.768Z · score: 7 (2 votes) · EA · GW

> I'm reluctant to go twiddling with things that seem to be working well via processes I don't understand.

To me, one of the things that has "worked well" historically has been "people in EA writing about why they've made decisions in great detail". These posts tend to be heavily upvoted and have often been influential in setting the tone of discussion around a particular topic. I don't think people should be forced or pressured to write more of them, but I also don't see why more of them would turn the sign from positive to negative.

comment by Milan_Griffes · 2019-05-07T21:41:28.565Z · score: 4 (2 votes) · EA · GW

Ben Hoffman's latest feels tangentially relevant to our disagreement here.

comment by Milan_Griffes · 2019-05-07T21:36:01.139Z · score: 2 (1 votes) · EA · GW

> ... but I also don't see why more of them would turn the sign from positive to negative.

There's probably strong selection effects here.

People write up things / spotlight things that are straightforward to justify and/or make them look good.

People avoid things / downplay things that are opaque and/or unflattering.

(speculative) Perhaps more posts like this would increase the selection pressure, leading to a more distorted map of what's going on / more distance between the map and the territory.

comment by Holly_Elmore · 2019-05-04T20:33:12.195Z · score: 4 (3 votes) · EA · GW

What is this bear/bull distinction?

comment by Milan_Griffes · 2019-05-04T17:40:39.690Z · score: 2 (1 votes) · EA · GW

Zvi's recent post feels tangentially relevant to our disagreement here:

> This is a world where all one cares about is how one is evaluated, and lying and deceiving others is free as long as you’re not caught. You’ll get exactly what you incentivize.

comment by Julia_Wise · 2019-05-07T19:57:43.307Z · score: 6 (3 votes) · EA · GW

To the extent that ethnography is anonymized, I could imagine people speaking more freely than they do in blog posts, interviews where they're identified, etc.

comment by Holly_Elmore · 2019-05-03T18:07:47.093Z · score: 6 (5 votes) · EA · GW

I see this as something of a different question, i.e. "What portion of this disagreement is due to factors we can access through self-reflection and rationally discuss?" I would want the ethnography to get at things we're too embedded in to see.

comment by Nathan Young (nathan) · 2019-05-03T06:33:49.859Z · score: 4 (3 votes) · EA · GW

To what extent should we fund a counter-organisation to, say, 80,000 Hours to re-research its decisions - an independent watchdog, so to speak?

comment by aarongertler · 2019-05-03T21:18:57.512Z · score: 6 (4 votes) · EA · GW

The term "counter organization" sounds like a bad place to start. I think we currently live in a world where EA organizations are generally pretty transparent about their reasoning and open to being challenged in public, so I'm not sure what a specific "independent watchdog" might accomplish, but I'd be curious to see more details of a proposal in a Forum post!

(I work for CEA, but these views are my own.)

comment by Holly_Elmore · 2019-05-04T01:16:27.882Z · score: 1 (1 votes) · EA · GW

Good point. I originally interpreted the comment to mean just an independent take on 80k topics, and I'm super-supportive of that, but I agree with you that it shouldn't be adversarial.

comment by casebash · 2019-05-03T11:07:13.922Z · score: 1 (1 votes) · EA · GW

Maybe post this as a separate question?

comment by DavidNash · 2019-05-03T08:38:09.651Z · score: 22 (12 votes) · EA · GW

I remember a couple of people doing something slightly similar to this.

Dan Artus wrote a dissertation in 2018 - "An ethnographic exploration of ethics, empathy and data practises within the London Effective Altruist community"

Nick Philips wrote a thesis about the EA movement in 2015 - "Rational Faith: A Study of the Effective Altruism Movement"

comment by Milan_Griffes · 2019-05-03T22:05:23.747Z · score: 8 (6 votes) · EA · GW

Link for Artus' dissertation?

comment by Holly_Elmore · 2019-05-03T17:48:27.971Z · score: 2 (2 votes) · EA · GW

Cool! I've never heard of these so thank you very much.

comment by Peter_Hurford · 2019-05-03T16:12:22.979Z · score: 17 (7 votes) · EA · GW

I've been really struck by the number of times I've found myself and/or others surprised to see the EA community doing something that nearly all other communities do (e.g., infighting, unfairly excluding an outgroup, unfairly preferring something or someone high-status). I think better awareness of this could be valuable, and we may be able to learn a good deal more from the successes and failures of other communities.

comment by Holly_Elmore · 2019-05-03T17:59:32.970Z · score: 8 (6 votes) · EA · GW

I feel like growing up religious (and especially having lots of different Protestant sects in the family) gives me insight that a lot of people in EA who were raised secular don't have. I think the blind spot exists because we attribute those failure modes to irrational religion (like believing in the supernatural) rather than to the rational approach we're taking. Short of getting a specific study of EA, I think most EAs would benefit from learning about the history of social and especially religious movements to see how much we are like them.

comment by Khorton · 2019-05-03T18:55:53.097Z · score: 11 (6 votes) · EA · GW

Giles Fraser summed up EA London's atmosphere as 'an evangelical youth group' - not in a mean way - and I've frequently worried that we'll undergo something akin to a church split. The parallels are quite obvious if you're familiar.

comment by DavidNash · 2019-05-09T10:45:45.509Z · score: 13 (4 votes) · EA · GW

Here is an edited version of the dissertation mentioned earlier. Most of the content not related to EA London has been removed to make it more relevant.

An ethnographic exploration of ethics, empathy and data practises within the London Effective Altruist community

comment by Milan_Griffes · 2019-05-09T17:33:04.089Z · score: 8 (3 votes) · EA · GW

Put this up as a standalone post.

comment by Holly_Elmore · 2019-05-09T16:40:04.960Z · score: 3 (4 votes) · EA · GW

Haven't had a chance to read much, but it's already gold.

comment by Cullen_OKeefe · 2019-05-03T23:07:37.840Z · score: 5 (4 votes) · EA · GW

How, if at all, do you envision this differing from some of the portrayals of EAs in Strangers Drowning?

comment by Holly_Elmore · 2019-05-04T01:12:15.482Z · score: 5 (4 votes) · EA · GW

I was imagining it as more of a population study than case studies or biographies. More of a study of EA the movement than the stories of individuals involved in EA.

comment by Nathan Young (nathan) · 2019-05-06T13:34:59.193Z · score: 1 (1 votes) · EA · GW

Some suggestions on what we should do about ethnographic biases: https://forum.effectivealtruism.org/posts/hPJgG32orpttRNkBY/if-this-forum-ea-has-ethnographic-biases-here-are-some