Concern, and hope

post by willbradshaw · 2020-07-05T15:08:47.766Z · score: 114 (67 votes) · EA · GW · 27 comments

I am worried.

The last month or so has been very emotional for a lot of people in the community, culminating in the Slate Star Codex controversy of the past two weeks. On one side, we've had multiple posts talking about the risks of an incipient new Cultural Revolution; on the other, we've had someone accuse a widely-admired writer associated with the movement of abetting some pretty abhorrent worldviews. At least one prominent member of an EA org I know, someone I deeply respect, deleted their Forum account this week. I expect there are more I don't know about.

Both groups feel like they and their sacred values are under attack. Both groups are increasingly commenting anonymously or from throwaway accounts, and seeing their comments mass-downvoted and attacked. It's hard not to believe we're at risk of moving in a much more unpleasant direction.

I'm not going to pretend I don't have my own sympathies here. I've definitely been feeling a lot more tribal than usual lately, and it's impaired my judgement at a couple of points. But I think it's important to remember that we are all EAs here. We're here because we endorse, in one form or another, radical goodwill towards the rest of the world. I have never been among a group of people at once more dedicated to the wellbeing of others and the pursuit of the true. I admire you all so much.

Many people here feel their membership in EA is a natural outgrowth of their other beliefs. Those other beliefs can differ quite a lot from person to person. But I implore all of you to see the common good in each other. There are many people in EA who hold beliefs and political opinions significantly different from mine. But with very few exceptions they have proven among the most open, honest and charitable proponents of those views I've ever encountered. We can have the conversations we need to have to get through this.

The Forum is probably not the place to have those conversations. Too many people are too worried about their words being used against them to speak openly under their own names – an indictment of our broader culture if ever there was one. But you can reach out to each other! Schedule calls! Now is a bad time to not be able to have in-person conferences, but it's not impossible to make up the difference if we try.

(And on the Forum, please try to be charitable, even if your conversation partner is falling short of the standards you would set yourself. Strive to raise the tone of the conversation, not just to match it. I have sometimes failed in this recently.)

I'll start. If I say something on the Forum you disagree with, and you don't think it's productive to discuss it in comments, please feel free to reach out to me by private message, or schedule a call with me here.

Our epistemic norms are precious. So are our norms of compassion, justice, and universal goodwill. We need both to achieve the lofty goals we've set ourselves, and we need each other.

27 comments

Comments sorted by top scores.

comment by Wei_Dai · 2020-07-09T04:51:46.048Z · score: 38 (14 votes) · EA(p) · GW(p)

Re "Cultural Revolution" comparison, let me put it this way: I'm a naturalized citizen of the US who has lived here for 30+ years, and recently I've spent 20+ hours researching the political climate and immigration policies of other countries I could potentially move to. I've also refrained multiple times from making a public comment on a topic that I have an opinion on (including on this forum), because of potential consequences that I've come to fear may happen a few years or decades from now. (To be clear, I do not mean beatings, imprisonment, or being killed, except as unlikely tail risks, but more along the lines of public humiliation, forced confessions/apologies, career termination, and collective punishment of my family and associates.)

If there are better or equally valid historical analogies for thinking about what is happening and what it may lead to, I'm happy to hear them out. But if some people are just offended by the comparison, I can only say that I totally understand where they're coming from.

comment by Khorton · 2020-07-09T07:54:51.142Z · score: 12 (6 votes) · EA(p) · GW(p)

I basically think the cultural revolution, witch hunts, and people being denounced as heretics are all equally good (and equally bad) comparisons. All three are examples of top-down, peer-enforced violence against an outgroup whose members can be accused for no reason.

The main differences I see here are that this doesn't really seem top-down (neither the Republican party nor the church seems fond of cancel culture) and this has more to do with reputation/livelihood than physical harm. (I have more thoughts about why they're different but I'm self-censoring to be more convincing and because people are mean to me on the EA Forum e.g. when I suggest sexism exists in America.)

I suspect there are many other historical examples of people demonizing the outgroup as well.

comment by Habryka · 2020-07-09T08:02:34.173Z · score: 27 (8 votes) · EA(p) · GW(p)
witch hunts [...] top-down

The vast majority of witch hunts were not top-down as far as I remember from my cursory reading on this topic. They were usually driven by mobs and bottom-up social activity, with the church or other higher institutions usually trying to avoid getting involved with them.

comment by Khorton · 2020-07-09T08:03:59.953Z · score: 9 (10 votes) · EA(p) · GW(p)

Thanks Habryka. In that case, I take it back - witch hunts are a better analogy than the cultural revolution.

EDIT: I also prefer any analogy which emphasizes continuity. I don't think people being "cancelled" this week face circumstances particularly different from those Monica Lewinsky faced; I dislike analogies that suggest there's been a sudden change in how American society behaves.

comment by Wei_Dai · 2020-07-10T20:58:05.650Z · score: 18 (8 votes) · EA(p) · GW(p)

The witch hunts were sometimes endorsed/supported by the authorities, and other times not, just like the Red Guards:

Under Charlemagne, for example, Christians who practiced witchcraft were enslaved by the Church, while those who worshiped the Devil (Germanic gods) were killed outright.

By early 1967 Red Guard units were overthrowing existing party authorities in towns, cities, and entire provinces. These units soon began fighting among themselves, however, as various factions vied for power amidst each one’s claims that it was the true representative of Maoist thought. The Red Guards’ increasing factionalism and their total disruption of industrial production and of Chinese urban life caused the government in 1967–68 to urge the Red Guards to retire into the countryside. The Chinese military was called in to restore order throughout the country, and from this point the Red Guard movement gradually subsided.

I would say the most relevant difference between them is that witch hunts were more "organic", in other words they happened pretty much everywhere where people believed in the possibility of witches (which was pretty much everywhere period), whereas the Cultural Revolution was driven/enabled entirely by ideology indoctrinated by schools, universities, and mass media propaganda.

comment by Larks · 2020-07-07T02:20:18.304Z · score: 33 (14 votes) · EA(p) · GW(p)
On one side, we've had multiple posts talking about the risks of an incipient new Cultural Revolution; on the other, we've had someone accuse a widely-admired writer associated with the movement of abetting some pretty abhorrent worldviews.

I'm not sure what contrast you are trying to make here:

  • The first post argues that, while SJ cancellations are a problem, we should not fight back against them because it would be too expensive. The second post agrees that SJ cancellations are a problem that could become much worse, but argues we should try to do something about it.
  • The third post is an example of an attempted SJ cancellation, criticizing the community for being insufficiently zealous in condemning the outgroup. (It was downvoted into oblivion for being dishonest and nasty).

The first two are motivated by concern over the rise of bullying and its ability to intimidate people from communicating honestly about important issues, and discuss what we should do in response. The third article is... an example of this bad behaviour?

For the symmetry argument you want to make, it seems like you would need a right-wing version of the third post - like an article condemning the community for not doing enough to distance itself from communists and failing to constantly re-iterate its support for the police. Then it would make sense to point out that, despite the conflict, both sides were earnestly motivated by a desire to make the world a better place and avoid bad outcomes, and we should all remember this and respect each other.

But to my knowledge, no such article exists, partly because there are very few right-wing EAs. Rather, the conflict is between the core EA movement of largely centre-left people who endorse traditional enlightenment values of debate, empiricism and universalism, vs the rise of extreme-left 'woke' culture, which frequently rejects such ideals. Accusing the moderate left of being crypto-fascists is one of the standard rhetorical moves the far-left uses against the centre-left, and one they are very vulnerable to.


Note that I removed the link to the attack article because I think it is probably a violation of implicit forum norms to promote content with more than 100 net downvotes. If it hadn't been linked in this article I would not have come across it, which is probably desirable from the perspective of the moderators and the community.


Edit: the OP was edited between when I opened the page and starting writing this comment, and when I hit publish; at the request of the author I have updated the quote to reflect his edits, though I think this makes the comment a little harder to understand.

comment by willbradshaw · 2020-07-07T07:53:22.271Z · score: 25 (15 votes) · EA(p) · GW(p)

This comment does a good job of summarising the "classical liberal" position on this conflict, but makes no effort to imagine or engage with the views of more moderate pro-SJ EAs (of whom there are plenty), who might object strongly to cultural-revolution comparisons or be wary of SSC given the current controversy.

As I already said in response to Buck's comment:

I agree that post was very bad (I left a long comment explaining part of why I strong-downvoted it). But I think there's a version of that post – one phrased more moderately, which tries harder to be charitable to its opponents – that would get a lot more sympathy from the left of EA. (I expect I would still disagree with it quite strongly.)

As you say, there aren't many right-wing EAs. The key conflict I'm worried about is between centre/centre-left/libertarian-leaning EAs and left-wing/SJ-sympathetic EAs[1]. So suggesting I need to find a right-wing piece to make the comparison is missing the point.

(This comment also quotes an old version of my post, which has since been changed on the basis of feedback. I'm a bit confused about that, since some of the changes were made more than a day ago – I tried logging out and the updated version is still the one I see. Can you update your quote?)


  1. I also don't want conservative-leaning EAs to be driven from the movement, but that isn't the central thing I'm worried about here. ↩︎

comment by Buck · 2020-07-07T16:48:26.251Z · score: 11 (4 votes) · EA(p) · GW(p)

What current controversy are you saying might make moderate pro-SJ EAs more wary of SSC?

comment by Buck · 2020-07-05T18:11:41.303Z · score: 23 (14 votes) · EA(p) · GW(p)

Edit: the OP has removed the link I’m complaining about.

I think it's quite bad to link to that piece. The piece makes extremely aggressive accusations and presents very little evidence to back them up; it was extensively criticised in the comments. I think that piece isn't an example of people being legitimately concerned; it's an example of someone behaving extremely badly.

Another edit: I am 80% confident that the author of that piece is not actually a current member of the EA community, and I am more than 50% confident that the piece was written mostly with an intention of harming EA. This is a lot of why I think it's bad to link to it. I didn't say this in my initial comment, sorry.

comment by Max_Daniel · 2020-07-07T15:46:14.615Z · score: 39 (15 votes) · EA(p) · GW(p)

I don't have strong views on this, but I'm curious why you think linking to instances of bad behavior is bad. All the reasons I can think of don't seem to apply here - e.g. the link clearly isn't an endorsement, and it's not providing resources e.g. through increased ad revenues or increasing page rank.

By contrast, I found the link to the post useful because it's evidence about community health and people's reactions: the fact that someone wrote that post updated me toward being more worried (though I think I'm still much less worried than the OP, and for somewhat different reasons). And I don't think I could have made the same update without skimming the actual post. I.e. simply reading a brief description like "someone made a post saying X in a way I think was bad" wouldn't have been as epistemically useful.

I would guess this upside applies to most readers. So I'm wondering which countervailing downsides would recommend a policy of not linking to such posts.

comment by Buck · 2020-07-07T16:14:34.863Z · score: 6 (7 votes) · EA(p) · GW(p)

I have two complaints: linking to a post which I think was made in bad faith in an attempt to harm EA, and seeming to endorse it by using it as an example of a perspective that some EAs have.

I think you shouldn't update much on what EAs think based on that post, because I think it was probably written in an attempt to harm EA by starting flamewars.

EDIT: Also, I kind of think of that post as trying to start nasty rumors about someone; I think we should generally avoid signal boosting that type of thing.

comment by Max_Daniel · 2020-07-07T18:02:10.388Z · score: 31 (10 votes) · EA(p) · GW(p)

Thanks for explaining. This all makes some sense to me, but I still favor linking on balance.

(I don't think this depends on what the post tells us about "what EAs think". Whether the author of the post is an EA accurately stating their views, or a non-EA trying to harm EA, or whatever - in any case the post seems relevant for assessing how worried we should be about the impacts of certain discussions / social dynamics / political climate on the EA community.)

I do agree that it seems bad to signal boost that post indiscriminately. E.g. I think it would be bad to share without context on Facebook. But in a discussion on how worried we should be about certain social dynamics I think it's sufficiently important to look at examples of these dynamics.

EDIT: I do agree that the OP could have done more to avoid any suggestion of endorsement. (I thought there was no implied endorsement anyway, but based on your stated reaction and on a closer second reading I think there is room to make this even clearer.) Or perhaps it would have been best to explicitly raise the issue of whether that post was written with the intent to cause harm, and what this might imply for how worried we should be. Still, linking in the right way seems clearly better to me than not linking at all.

comment by willbradshaw · 2020-07-07T19:03:47.050Z · score: 30 (11 votes) · EA(p) · GW(p)

I'm still pretty sceptical that the post in question was written with a conscious intention to cause harm. In any case, I know of at least a couple of other EAs who have good-faith worries in that direction, so at worst it's exacerbating a problem that was already there, not creating a new one.

(Also worth noting that at this point we're probably Streisanding this dispute into irrelevance anyway.)

comment by willbradshaw · 2020-07-05T18:56:01.480Z · score: 26 (11 votes) · EA(p) · GW(p)

I agree that post was very bad (I left a long comment explaining part of why I strong-downvoted it). But I think there's a version of that post – one phrased more moderately, which tries harder to be charitable to its opponents – that would get a lot more sympathy from the left of EA. (I expect I would still disagree with it quite strongly.)

I think there's a reasonable policy one could advocate, something like "don't link to heavily-downvoted posts you disagree with, because doing so undermines the filtering function of the karma system". I'm not sure I agree with that in all cases; in this case, it would have been hard for me to write this post without referencing that one, I think the things I say here need saying, and I ran this post by several people I respect before publishing it.

I could probably be persuaded to change that part given some more voices/arguments in opposition, here or in private.

(It's also worth noting that I expect there are a number of people here who think comparisons of the current situation to the Cultural Revolution are quite bad, see e.g. here.)

comment by Buck · 2020-07-06T02:40:04.579Z · score: 26 (10 votes) · EA(p) · GW(p)

I think that both the Cultural Revolution comparisons and the complaints about Cultural Revolution comparisons are way less bad than that post.

comment by Khorton · 2020-07-05T20:20:51.277Z · score: 26 (25 votes) · EA(p) · GW(p)

I agree that comparisons to the Cultural Revolution are bad. As someone with family members who were alive during the Chinese Cultural Revolution (one of whom died because of it), I'm pretty unsympathetic to people saying cancel culture is the new cultural revolution.

comment by Buck · 2020-07-06T02:41:46.375Z · score: 34 (16 votes) · EA(p) · GW(p)

Many of the people making the comparisons are themselves personally connected to the Chinese Cultural Revolution, though. Eg the EA who I see posting the most about this (who I don't think would want to be named here) is Chinese.

comment by Khorton · 2020-07-06T08:53:23.360Z · score: 16 (9 votes) · EA(p) · GW(p)

Yes, I've spoken in depth with one. I don't believe he shouldn't be able to make the comparison, but we agreed the comparison has no predictive power and is one of many comparisons that could be made (eg you could probably just as easily compare the current situation to witch hunts, which is a more common analogy in Western circles).

We also agreed there are dissimilarities (eg in this situation in America there's no state backing of anyone being targeted; in fact, social justice protestors are much more likely to be injured or killed by the state than the people they oppose).

comment by willbradshaw · 2020-07-07T08:00:57.244Z · score: 4 (2 votes) · EA(p) · GW(p)

(I have now cut the link.)

comment by Buck · 2020-07-05T18:50:00.477Z · score: 21 (11 votes) · EA(p) · GW(p)
culminating in the Slate Star Codex controversy of the past two weeks

I don't think that the SSC kerfuffle is that related to the events that have caused people to worry about cultural revolutions. In particular, most of the complaints about the NYT plan haven't been related to the particular opinions Scott has written about.

comment by willbradshaw · 2020-07-05T19:04:13.659Z · score: 19 (8 votes) · EA(p) · GW(p)

"Culminating" might be the wrong word, I agree the triggering event was fairly independent.

But I do think people's reactions to the SSC kerfuffle were coloured by their beliefs about the previous controversy (and Scott's political beliefs), and that it contributed to the general feeling I'm trying to describe here.

comment by willbradshaw · 2020-07-13T07:50:57.393Z · score: 8 (5 votes) · EA(p) · GW(p)

So far the comments here have overwhelmingly been (various forms of) litigating the controversy I discuss in the OP. I think this is basically fine – disagreements have all been civil – but insofar as there is still interest I'd be keen to hear people's thoughts on a more meta level: what sorts of things could we do to help increase understanding and goodwill in the community over this issue?

comment by abrahamrowe · 2020-07-16T19:00:27.409Z · score: 19 (15 votes) · EA(p) · GW(p)

Thanks for making this post Will -

I'll admit that since the SSC stuff happened, I've been feeling a lot further from EA (not necessarily the core EA ideas, but associating with the community or labeling myself as an EA), and I felt genuinely a bit scared learning through the SSC stuff about ways in which the EA community overlaps with alt-right communities and ideas, etc. I don't know what to make of all of it, as the people I work with regularly in EA are wonderful people who care deeply about making the world better. But I feel wary and nervous about all this, and I've also been considering leaving the forum / FB groups just to have some space to process what my relationship with EA ought to be external to my work.

I see a ton of overlap between EA in concept and social justice. A lot of the dialogue in the social justice community focuses on people reflecting on their biases, and working to shift out of a lens on the world that introduces some kinds of biases. And, broadly, folks working on social justice issues are trying to make the world better. This all feels very aligned with EA approaches, even if the social justice community is working on different issues and is focused on different kinds of biases.

I've heard (though I don't know much about it) that EA outreach organizations stopped focusing on growth a few years ago and shifted toward quality in some sense. I wonder if doing that has locked in whatever norms were present in the community at the time, and has ended up unintentionally resulting in a fair amount of animosity toward ideas or approaches to argument that fall outside the community's standards of acceptability? I generally think that one of the best ways to improve this issue is to invest heavily in broadening the community, and part of that might require work to make the community more welcoming (and not actively threatening) to people who might not feel welcome here right now.

comment by willbradshaw · 2020-07-16T21:34:36.059Z · score: 9 (3 votes) · EA(p) · GW(p)

Thanks, Abraham. It's really valuable to get these perspectives, and it's helpful to get people discussing these issues under their real names where they feel they can. I agree that there is a lot of overlap between the impulses that lead people into EA and those that lead many people into SJ.

I'm too tired right now to respond to this in the depth and spirit it deserves – I'll try and do so tomorrow – so just wanted to flag that this is a positive and valuable contribution to the discussion. I hope any responses to it in the meantime are made in the same spirit.

comment by alexrjl · 2020-07-16T21:56:13.217Z · score: 5 (3 votes) · EA(p) · GW(p)

This post does a much better job than I could manage of explaining how I've felt recently. Thank you for writing it.

comment by John_Maxwell (John_Maxwell_IV) · 2020-07-20T05:37:51.168Z · score: 14 (6 votes) · EA(p) · GW(p)

Something I've been doing just a bit lately which seems to be working surprisingly well so far: If I see a polarizing discussion on EA Facebook, and someone writes a comment in a way which seems needlessly combative/confrontational to me, I add them as a friend and private message them trying to persuade them to rewrite their comment.

My general model here is that private 1-on-1 communication is much higher bandwidth, less ego-driven, and more amenable to the resolution of misunderstandings etc. However it's not nearly as scalable (in terms of the size of the audience reached) as a forum discussion is. But private 1-on-1 communication where you try to persuade someone to change their forum writing gets you the best of both worlds.

Another model is that combativeness tends to beget combativeness, so it's high-leverage to try & change the tone of the conversation as early as possible.

comment by Halffull · 2020-07-20T18:25:57.858Z · score: 5 (3 votes) · EA(p) · GW(p)

Here's Raymond Arnold on this strategy:

https://www.lesswrong.com/posts/LxrpCKQPbdpSsitBy/short-circuiting-demon-threads-working-example