Some non-EAs worry about EA's effect on mental health
post by Kaleem · 2022-08-04T00:12:25.136Z · EA · GW · 4 comments
Hi reader! This is the first post in a sequence I’m trying out, and I’m still figuring out how to present my findings and make these posts useful to you. Please let me know in the comments if you have any ideas or suggestions.
Epistemic Status: I am 100% sure people have told me what I’ve reported, but I am very unsure about their conclusions or the validity of their claims.
In my role as a community builder in Boston (and online) over the past year, I have heard the same theme/concern arise in roughly ten different conversations about the EA community with non-EAs. Unfortunately, none of them were willing to write it up themselves, but I felt their views were significant enough that I wanted to offer a very low-effort way for them to have those views shared, which they did think was important to do.
So what I’ve done here is (with permission) summarized and collated these concerns into a succinct couple of points which I hope will be taken into consideration alongside other points about EA community building and mental health within the EA community. I also hope that it might (even temporarily) cause us to pay closer attention to our own and our EA friends’ mental health.
For the sake of readability I am writing the first two sections in the voice of the collective of people who expressed these views - these are not my views (my views are the last two sections).
“EA as a Crutch
We don’t know many EAs (some of us know one, some of us know a couple, some of us have attended a house party or an event with EAs also in attendance), but those we do know seem to have poor mental health and don’t recognize it. One way we see this manifesting is that our EA friends seem to get more and more sucked into EA as their depression or anxiety gets worse - they spend more time with other EAs, or work longer stints, rather than seeking help. There seem to be two mechanisms at work:
- EAs tend to focus on really drastic and dire topics and, by working on them, we think our friends are able to ‘justify’ their depression or anxiety as being a rational response to what they’re thinking about all the time.
- We think our EA friends are using discussions with other EAs (or within the EA community) and with us as a way of trying to address some of these issues instead of seeking professional therapy, which we think they need.”
“How this relates to Selecting for “Rationalism”
We’ve come to think that EAs tend to lean in the direction of ‘emotionally inexpressive’, or ‘emotionally unaware’. We think this might be a trait which exists prior to people joining EA, rather than something happening to them as a result of joining EA. This seems important because of the way that rationalism seems to lead practitioners to try and engage with ‘heavy’ topics using logic and reason, rather than emotion, which one might then try to suppress in order to ‘become better’ at rational thinking.
We think the type of ‘emotional expressiveness’ we’re worried about is when someone is experiencing psychological pain or dealing with complex and uncomfortable emotions, but doesn’t feel comfortable or articulate enough to ask for help or support. It seems like this could lead to very bad outcomes: the types of causes EAs think about are likely to make mental health worse, and then these friends get trapped in a downward spiral.
Another thing we think EA might be selecting for is people who are bad at conversing with normal people/having regular conversations. Whilst this is not inherently a bad thing, it likely means that when a non-EA meets an EA, the EA is more likely to behave in a socially unpleasant way. That is probably not good for the public impression of EA and of what EAs are like - but, more worryingly, of what EA ‘does’ to the people in it.”
My Commentary on these views
I think it's plausible that these claims are true. It's interesting to think about what “EA” selects for - we already think it's likely that it selects for people who have some background in economics and philosophy, wealthier people, and younger people, but as far as I know we haven’t looked much into selection based on psychological or temperamental traits. The idea that EA selects for lower EQ or for pre-existing mental health issues is a strong, speculative claim, but it would be very interesting to investigate. It would be especially interesting if it were true that something about EA selects for (or causes) poor social interaction or low personability, as I think this is a limiting factor in current outreach/community building efforts on the ground.
Notes on the methodology of this post
This is my first time trying to report on what is essentially a single-blind focus group, which is not something I think I've seen on the forum before. That’s understandable, given how iffy and contentious most of the information and sourcing is, and how it's probably not epistemically sound.