↑ comment by Ben Pace · 2019-10-03T08:39:14.450Z
For some (most?) of these opinions, there isn't any social pressure not to air them. Indeed, as several people have already noted, some of these topics are already the subject of extensive public debate by people who like EA.
First: Many positions in the public discourse are still strongly silenced. To borrow an idea from Scott Alexander, the measure of how silenced a position is is not how many people talk publicly about it, but the ratio of people who talk publicly about it to the people who believe it. If a lot of people in a form say they believe something but are afraid to talk about it, that's a straightforward sign that they do feel silenced. I think you should indeed update towards believing that, when someone makes an argument for (to borrow some of your examples) negative utilitarianism, or human enhancement, or abortion, or mental health as a priority, several people are feeling grateful that the person is stepping out, and are watching with worry to see whether the person gets attacked/dismissed/laughed at. I'm pretty sure I have personally seen people lose points socially for almost every single example you listed, to varying degrees.
Second: Even for social and political movements, it's crucial to know what people actually believe but don't want to say publicly. The conservative right in the US of the last few decades would probably have liked to know that many people felt silenced about how much they liked gay marriage, given the very sudden swing in public opinion on that topic; they could then have chosen not to build major political infrastructure around the belief that their constituents would stand by that policy position. More recently, I think the progressive left of many countries in Europe, Australia and the US would appreciate knowing when people are secretly more supportive of right-wing policies, as there have been (IIRC) a series of elections and votes where the polls predicted a strong left-wing victory and in reality there was a slight right-wing victory.
Third: I think the public evidence of the quality of the character of people working on important EA projects is very strong and not easily overcome. You explain that it's important to you that folks at your org saw the poll and felt worried that EA contains lots of bad people, or people who believe unsavoury things, or something like that. My sense here is that there is a lot of strong, public evidence about the quality of the people working on EA problems, about the insights that many public figures in the community have, and about the integrity of many of its individuals and organisations.
- You can see how Holden Karnofsky went around being brutally honest yet rigorous in his analysis of charities in the global health space.
- You can see how Toby Ord and many others have committed to giving a substantial portion of their lifetime resources to altruistic causes instead of personal ones.
- You can see how Eliezer Yudkowsky and Nick Bostrom spent several decades of their lives attempting to lay out a coherent philosophy and argument that allowed people to identify a key under-explored problem for humanity.
- You can read the writings of Scott Alexander and see how carefully he thinks about ethics, morality and community.
- You can listen to the podcast of and read the public writings by Julia Galef and see how carefully she thinks about complex and controversial topics and the level of charity she gives to people on both sides of debates.
- You can read the extensive writing of The Unit Of Caring by Kelsey Piper and see how much she cares about both people and principles, and how she will spend a great deal of her time trying to help people figure out their personal and ethical problems.
- I could keep listing examples, but I hope the above gets my point across.
I am interested in being part of a network of people who build trust through costly (yet worthwhile) acts of ethics, integrity, and work on important problems, and I do not think the above public Form is a risk to the connections of that network.
Fourth: It's true that many social movements have been able to muster a lot of people and political power behind solving important problems, and that this required them to care a lot about PR and hold very tight constraints on what they can be publicly associated with (and thus what they're allowed to say publicly). I think, however, that these social movements are not capable of making scientific and conceptual progress on difficult high-level questions like cause prioritisation and the discovery of crucial considerations.
They're very inflexible; by this I don't merely mean that they're hard to control and can take on negative affect (e.g. new atheism is often considered aggressive or unkind), but that they often cannot course correct or change their minds (e.g. environmentalism on nuclear energy, I think) in a way that entirely prohibits intellectual progress. Like, I don't think you can get 'environmentalism, but for cause prioritisation' or 'feminism, but for crucial considerations'. I think the thing we actually want here is something much closer to 'science', or 'an intellectual movement'. And I think your points are much less applicable to a healthy scientific community.
I hope this helps to communicate where I'm coming from.
↑ comment by Halstead · 2019-10-03T09:57:19.131Z
Thanks for this, this is useful (upvoted).
1. I think we disagree on the empirical facts here. EA seems to me unusually open to considering rational arguments for unfashionable positions. People in my experience lose points for bad arguments, not for weird conclusions. I'd be very perplexed if someone were not willing to discuss whether or not utilitarianism is false (or whether remote working is bad etc) in front of EAs, and would think someone was overcome by irrational fear if they declined to do so. Michael Plant believes one of the allegedly taboo opinions here (mental health should be a priority) and is currently on a speaking tour of EA events across the Far East.
2. This is a good point and updates me towards the usefulness of the survey, but I wonder whether there is a better way to achieve this that doesn't carry such clear reputational risks for EA.
3. The issue is not whether my colleagues have sufficient publicly accessible reason to believe that EA is full of good people acting in good faith (which they do), but whether this survey weighs heavily or not in the evidence that they will actually consider — i.e. it might lead them not to consider the rest of the evidence that EA is mostly full of good people acting in good faith. I think there is a serious risk of that.
4. As mentioned elsewhere in the thread, I'm not saying that EA should embrace political level self-restraint. What I am saying is that there are sometimes reasons to self-censor holding forth on all of your opinions in public when you represent a community of people trying to achieve something important. The respondents to this poll implicitly agree with that given that they want to remain anonymous. For some of these statements, the reputational risk of airing them anonymously does not transfer from them to the EA movement as a whole. For other statements, the reputational risk does transfer from them to the community as a whole.
Do you think anyone in the community should ever self-censor for the sake of the reputation of the movement? Do you think scientists should ever self-censor their views?
↑ comment by Khorton · 2019-10-03T15:16:48.439Z
"People in my experience lose points for bad arguments, not for weird conclusions."
I just want to note that in my experience this only happens if you're challenging something that's mainstream in EA. If I tell an EA "I'm a utilitarian," that's fine. If I say, "I'm not a utilitarian," I need to provide arguments for why. That's scary, because I've never studied philosophy, and I'm often being stared down by a room full of people with philosophy degrees.
So basically, some of us are not smart enough to make good arguments for everything we believe - and we'll only lose social points for that if we mention that we have weird beliefs.
↑ comment by Ben Pace · 2019-10-03T18:24:11.802Z
I might have more to say later. On (1), I want to state that, to me, my position seems like the conservative one. If certain views are being politically silenced, my sense is that it's good for people to have the opportunity to state that. In the alternative, people are only allowed to do this if you already believe that they're subject to unfair political pressure. Looking over the list and thinking "Hm, about 100 people say they feel silenced or that their opinions feel taboo, but I think they're wrong about being silenced (or else I think that their opinions should be taboo!), so they shouldn't have this outlet to say that" seems like a strong case for a potential correlated failure. Like, I don't fully trust my own personal sense of which of the listed positions actually is and isn't taboo in this way, and would feel quite bad dictating who was allowed to anonymously say they felt politically pressured based on who I believed was being politically pressured.
↑ comment by Halstead · 2019-10-05T02:40:29.479Z
There are two issues here. The less important one is (1): are people's beliefs that many of these opinions are taboo rational? I think not, and have discussed the reasons why above.
The more important one is (2): this poll is a blunt instrument that encourages people to enter offensive opinions that threaten the reputation of the movement. If there were a way to do this with those opinions laundered out, then I wouldn't have a problem. This has been done in a very careless way, without due thought to the very obvious risks.
↑ comment by jacobjacob · 2020-02-04T10:23:18.202Z
"If there were a way to do this with those opinions laundered out, then I wouldn't have a problem."
I interpret you here as saying: "If you press the button of 'make people search for all their offensive and socially disapproved beliefs, and collect the responses in a single place', you will inevitably have a bad time. There are complex reasons lots of beliefs have evolved to be socially punished, and tearing down those fences might be really terrible. Even worse, there are externalities such that one person saying something crazy is going to negatively affect *everyone* in the community, and one must be very careful when setting up systems that create such externalities. Importantly, though, these costs aren't intrinsically tied up with the benefits of this poll — you *can* have good ways of dispelling bubbles and encouraging important whistle-blowing, without opening a Pandora's box of reputational hazards."
1) Curious if this seems right to you?
2) More importantly, I'm curious about what concrete versions of this you would be fine with, or support — for example, a version restricted to Forum users with >100 karma? Would that address your concerns? Is there anything else that would?
This is to a large extent "the most plausible version of something similar to what you're saying, that I understand from my own position", rather than "something I'm very confident you actually believe".