You have a set amount of "weirdness points". Spend them wisely.
post by Peter Wildeford (Peter_Hurford)
I've heard of the concept of "weirdness points" many times before, but after a bit of searching I can't find a definitive post describing the concept, so I've decided to make one. As a disclaimer, I don't think the evidence backing this post is all that strong and I am skeptical, but I do think it's strong enough to be worth considering, and I'm probably going to make some minor life changes based on it.
Chances are that if you're reading this post, you're probably a bit weird in some way.
No offense, of course. In fact, I actually mean it as a compliment. Weirdness is incredibly important. If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.
Many things we now take for granted as part of what makes our society great were once... weird.
Joseph Overton theorized that policy develops through six stages: unthinkable, then radical, then acceptable, then sensible, then popular, then actual policy. We can see this happening with many policies -- currently same-sex marriage is making its way from popular to actual policy, but not too long ago it was merely acceptable, and not too long before that it was pretty radical.
Some good ideas are currently in the radical range. Effective altruism itself is a collection of beliefs that typical people would consider pretty radical. Many people think donating 3% of their income is a lot, let alone the 10% that Giving What We Can asks for, or the 50%+ that some people in the community give.
And that's not all. Others suggest that everyone become vegetarian, advocate for open borders and/or a universal basic income, abolish gendered language, put more resources into mitigating existential risk, focus on research into Friendly AI, pursue cryonics and curing death, etc.
While many of these ideas might make the world a better place if made into policy, all of these ideas are pretty weird.
Weirdness, of course, is a drawback. People take weird opinions less seriously.
The absurdity heuristic is a real bias that people -- even you -- have. If an idea sounds weird to you, you're less likely to believe it, even if there's overwhelming evidence. And social proof matters -- if fewer people believe something, others will be less likely to believe it. Lastly, don't forget the halo effect -- if one part of you seems weird, the rest of you will seem weird too!
(Update: apparently this concept is, itself, already known to social psychology as idiosyncrasy credits. Thanks, Mr. Commenter!)
...But we can use this knowledge to our advantage. The halo effect can work in reverse -- if we're normal in many ways, our weird beliefs will seem more normal too. If we have a notion of weirdness as a kind of currency that we have a limited supply of, we can spend it wisely, without looking like a crank.
All of this leads to the following actionable principles:
Recognize you only have a few "weirdness points" to spend. Trying to convince all your friends to donate 50% of their income to MIRI, become vegan, get a cryonics plan, and demand open borders will be met with a lot of resistance. But -- I hypothesize -- if you pick one of these ideas and push it, you'll have a lot more success.
Spend your weirdness points effectively. Perhaps it's really important that people advocate for open borders. But, perhaps, getting people to donate to developing world health would overall do more good. In that case, I'd focus on moving donations to the developing world and leave open borders alone, even though it is really important. You should triage your weirdness effectively the same way you would triage your donations.
Clean up and look good. Lookism is a problem in society, and I wish people could look "weird" and still be socially acceptable. But if you're a guy wearing a dress in public, or a punk-rocker vegan advocate, recognize that you're spending your weirdness points fighting lookism, which leaves fewer weirdness points to spend promoting veganism or something else.
Advocate for more "normal" policies that are almost as good. Of course, allocating your "weirdness points" on a few issues doesn't mean you have to stop advocating for other important issues -- just consider being less weird about it. Perhaps universal basic income truly would be a very effective policy to help the poor in the United States. But reforming the earned income tax credit and relaxing zoning laws would also both do a lot to help the poor in the US, and such suggestions aren't weird.
Use the foot-in-door technique and the door-in-face technique. The foot-in-door technique involves starting with a small ask and gradually building up the ask, such as suggesting people donate a little bit effectively, and then gradually get them to take the Giving What We Can Pledge. The door-in-face technique involves making a big ask (e.g., join Giving What We Can) and then substituting it for a smaller ask, like the Life You Can Save pledge or Try Out Giving.
Reconsider effective altruism's clustering of beliefs. Right now, effective altruism is associated strongly with donating a lot of money and donating effectively, and less strongly with impactful career choice, veganism, and existential risk. Of course, I'm not saying that we should drop some of these memes completely. But maybe EA should disconnect a bit more and compartmentalize -- for example, leaving AI risk to MIRI and not talking about it much on, say, 80,000 Hours. And maybe instead of asking people to both give more AND give more effectively, we could focus more exclusively on asking people to donate what they already do more effectively.
Evaluate the above with more research. While I think the evidence base behind this is decent, it's not great and I haven't spent that much time developing it. I think we should look into this more with a review of the relevant literature and some careful, targeted, market research on the individual beliefs within effective altruism (how weird are they?) and how they should be connected or left disconnected. Maybe this has already been done some?
Also discussed on LessWrong and on the EA Facebook group.
comment by EmeryCooper (MarkCooper) ·
2021-01-21T00:22:01.706Z
Over time, I’ve become less convinced of the value of thinking explicitly about weirdness points for most individuals, and I’m concerned that for many people the concept can actually be pretty harmful. To a large extent, I’m referring less to this actual post, and more to weirdness points as a meme, which I think is somewhat less nuanced than the original post. So I might not be maximally charitable in my criticisms, since what I am criticising is the concept as it is often expressed, rather than as it was originally expressed.
My concerns are a combination of:
1. I think the model is somewhat flawed, especially in domains like hobbies and physical appearance (as opposed to things like policies and opinions), and that likeability is messier and more complicated.
2. Even if that weren’t true, I worry that, in practice, people’s attempts to make themselves less weird may in fact make them more weird.
3. Finally, regardless of 1 or 2, I think that the psychological costs of worrying about weirdness points can be pretty high relative to the potential benefits, at least for a significant subset of EAs. Additionally, to the extent that one puts credence in the model being correct, I think it can be hard to avoid these costs once one is familiar with the idea of weirdness points.
I’m going to focus primarily on point 3, since I think other people have already made points 1 and 2.
I’m going to talk about my personal experience for a bit, because I think it’s illustrative of one of the ways this can go wrong.
I encountered the concept of weirdness points as a fairly new EA. To give some context, I have always been a fairly weird person, to some extent for reasons largely outside of my control. I’ve also, for a long time, been afraid of being disliked and ostracised for this, and have to some extent internalised the notion of weirdness as Bad. At the point I joined EA, I was still pretty insecure about this, but derived reassurance from being able to say that my actual goals had nothing to do with being liked by people. Then suddenly BAM, this reassurance didn’t really work anymore. My being weird and people disliking me no longer just affected me, but, to some extent, the EA community at large. So suddenly every weird thing I did actively made the world worse! I don’t really endorse this line of reasoning, and I’m certainly not trying to suggest other people endorse it. But I did find it pretty hard to shake, because there might well be *some* grain of truth to it, and it was in some sense supported by some interpretations of weirdness points. In any case, I don’t think thinking along these lines was helpful to either myself or the world, since it mostly just worsened my confidence and wellbeing, and made me more afraid to do anything, which probably, if anything, made me less likeable.
While some aspects of this are likely particular to me, I do think the meme of weirdness points and related concepts may have similar detrimental effects for other EAs, especially those with a history of poor mental health and confidence issues, which seem to be disproportionately prevalent within the community. The idea that you have a *set* number of weirdness points also seems to me to be potentially particularly harmful (regardless of whether it is true), because this seems to imply that those who already have high baseline weirdness for factors entirely outside of their control, such as neurodivergence, or atypical appearance, have a lot fewer of their weirdness points left over before they do anything at all. I think this can lead to people feeling they have to put even more effort than other people into curating their image and have even less freedom to do weird things (when this curation may be more mentally taxing for this precise group of people). Or worse, that their very presence and visibility is *by itself* harming the EA community.
The concept of weirdness points clearly has some merit to it, especially for individuals going into policy or something very public-facing. The law of equal and opposite advice applies, and it probably is net helpful for a bunch of people. However, most people already worry about being perceived as weird for normal human reasons, and I think that adding the additional worry that being perceived as weird may cause actual moral harm can be psychologically damaging for a proportion of people, and hamper their efforts to do good.
On net, I’m not sure whether it is wise as a community to spread the meme of weirdness points to the extent that we have.
comment by RyanCarey ·
2014-11-28T12:40:40.969Z
I think this is a good post for starting conversation but it has also received a lot of substantial criticism here and on LessWrong. To sum up my favourites:
I think each of these is a substantial objection to the thesis of the article and taken together they are pretty damaging.
I think that they can be tied together to be even stronger though.
If you're trying to spread a radical intellectual idea, in the beginning you mostly want to target early adopters who have a lot of pulling power. Think of the Russell-Einstein Manifesto against nuclear proliferation, which preceded the antinuclear movement by a couple of decades, or of many other social movements whose intellectual forebears preceded them. So far, the most valuable contributors to ideas relating to effective altruism are people like Martin Rees, Jaan Tallinn, James Martin, and Elon Musk. To a substantial extent, they appraise ideas based on their content, rather than on the basis of who is presenting them. As publisher John Brockman puts it, "I'm not interested in the average person. Most people have never had an idea and wouldn't recognize one if they did." Or, as Peter Thiel asks: "What is something you believe that nearly no one agrees with you on?" For these people, holding some unorthodox ideas is a prerequisite for them to listen to you. So for people who are promoting a radical idea to others who are themselves radical thinkers, trying to be normal is just not going to cut ice.
Not everyone has to be Will MacAskill or Peter Singer. Those two are writing popular books about effective altruism and doing lots of TV interviews about these things. They practice talking to the masses so that others can focus on industrial and intellectual leaders, and yet others can focus on researching which charities are the most effective. The problem with this post is that it does not apply to other roles the way it applies to MacAskill's and Singer's; rather, the whole spectrum of jobs is important. Arguably, MacAskill and Singer's roles are less essential, because if we can't attract the general public, then we can go to high-net-worth individuals. However, if we can't figure out which charities are the most effective, then recruiting is no use at all.
So even if we convince ourselves that it's more important to do fundamental research and talk to intellectual leaders while avoiding public discourse, it's very hard to stave off our human need for social approval. The bottom line, though, is that people don't need a post to tell them to be more aware of social rejection. Most people couldn't forget about social rejection if they tried. In fact, some of the people who gain the most confidence and passion in making a big change to the world do so because they realise that society's big problems are not going to fix themselves. Take Nate Soares for one reasonably articulate example. This can come from a realisation that society is pretty selfish or crazy a lot of the time, and so some people are going to have to take on the responsibility of improving it. Instead, trying to get across the idea that society is weird might be a better approach to fostering a sense of heroic responsibility for fixing it.
So altogether, although I agree with the post broadly -- that it's valuable to be familiar and normal -- I think people already behave as though it is. Instead, we should pay more attention to the fact that effective altruism is quite a radical idea, that it already has some people who are focussed on acting as its public face, and that it might be better for people to focus on feeling responsible for improving the world, or on figuring out how, before becoming so alert to social considerations.
Replies from: Peter_Hurford, kdbscott
↑ comment by Peter Wildeford (Peter_Hurford) ·
2014-11-29T18:01:11.406Z
I think all those points are correct, but I view them more as expanding nuances rather than direct counterarguments. That is, one can re-construct a version of my thesis that remains agreeable -- it would just be restricted to the domain of advocacy (which I did intend but failed to state) and would admit nuanced ways in which such points can be "earned" in addition to being spent.
So far, the most valuable contributors to ideas relating to effective altruism are people like Martin Rees, Jaan Tallinn, James Martin, Elon Musk.
I think of the use case of this essay as the typical EA person trying to hold conversations with their friends about EA / rationalist / etc. topics. Presumably, this typical EA person does care to be an advocate, at least in part -- that is, all else being equal, this person would prefer their friends to adopt their EA / rationalist / etc. ideas.
For us typical EA people, we're constrained by a smaller budget of points in a way that Rees, Tallinn, etc. aren't, because they're already famous and have proven themselves with immense, verifiable accomplishments; they can then -- cue halo effect -- be openly weird and have people think "hmm, maybe they're right".
So I think the case where one wants to advocate comes up pretty often, and the advice here applies to that for non-famous people.
Replies from: RyanCarey
↑ comment by RyanCarey ·
2014-11-30T16:44:26.285Z
Hey Peter. I think it's good to point out a main agreement that we have. I agree that changing the way you dress and look can be a low-cost and useful way to improve your ability to do advocacy. The area where I think you're missing the mark is with regard to changing your beliefs in order to advocate. If in the process of making your ideas more palatable to others, you lose track of what's true, then your advocacy might end up unhelpful or harmful.
I think you're missing the point about why I mentioned VIPs. It was to argue that being direct and honestly truth-seeking might be a better way of attracting people who have a larger-than-average capacity to influence the world. I was arguing that attracting such people is likely to be more useful than trying to influence your friends. I'm not clear which of these, if either, you disagree with. I wasn't trying to say that we should be weird in order to copy VIPs, because I agree that they're in a different situation.
For most people, looking good when they introduce their friends to effective altruism is not a neglected problem. Lots of effective altruists, like any other people, can improve their social skills and tact, but it's rare for people not to be thinking about social approval at all. Arguably, excessive conformity of opinion is one of the world's big problems - for instance, people can collectively numb themselves to big problems in the world like poverty and existential risk.
I could duplicate arguments for wariness of social biases, but instead it seems better to just link Eliezer, who's been pretty lucid on this topic: Asch's Conformity Experiment, On Expressing Your Concerns.
↑ comment by kdbscott ·
2014-11-28T14:00:30.504Z
Nice summary Ryan.
Indeed, I think the biggest challenge in terms of spreading EA is what I call "extended responsibility." Many people have difficulty taking responsibility for their own lives, let alone their family or community. EA asks you to take responsibility for the whole world, and then carry that responsibility with you for your whole life. Holy crap.
After that, the next big ask is for rational methodologies. Even if people take responsibility for their kids, they will probably rely on intuition and availability heuristics.
So discussion around EA advocacy (which is what I believe to be the topic here) could be better focused around "how to move people towards extended responsibility and rational methodologies".
Of course, that could seem like a soft approach that doesn't immediately get donations to GiveDirectly. Some of the strategies I outlined in my other comment can be used in an instance where you'd like to hard sell.
Replies from: bshannon
↑ comment by bshannon ·
2014-11-28T15:57:44.551Z
Perhaps I'm not thinking this through or I'm simply being unambitious but I don't view effective altruism as asking you to take responsibility for the whole world. I certainly don't feel an enormous weight on my shoulders. I view it more as taking responsibility for the difference between what you would ordinarily do and what you could do if you maximised your impact, which does admittedly require consideration of the whole world.
If valid, maybe that can make effective altruism a little more palatable.
comment by John_Salvatier ·
2014-11-28T07:32:41.705Z
Clean up and look good
This is one of the big benefits I've found from learning to dress well. People generally seem much more positively disposed to the weird ideas I talk about now that I dress well and have good social skills. People interpret weird ideas from high status people much more favorably than they do from less high status people.
I think the lesson here is that it's useful to spend time learning to dress well, developing social skills, and otherwise becoming high status. I have some advice on learning how to dress well (mostly for men).
Replies from: Geuss, Peter_Hurford
↑ comment by Geuss ·
2015-02-04T02:44:54.590Z
Firstly, I think this is entirely contextual: certainly in academia, as in many other typically formal environments, one can only dress casually with a certain prior status. Those who dress down are those who don't need to impress, and thus openly signal that fact. Secondly, in many dissenting subcultures, how one dresses is an important part of that identity, i.e. for an EA to dress and behave modestly is to advertise, and indeed help enact, one's charitable duties. Thirdly, self-objectification is pretty inhuman to most people's sensibilities, and especially in the case of women, a pretty negative social pressure. It's also, obviously, extremely conservative ('look and behave like everyone else!' -- 'sexually instrumentalise yourself to get more money!').
I'm certainly not saying one should flatly disregard their appearance, just that the advice only holds contextually -- it often breaks down in dissenting subcultures -- and can be rather problematic.
comment by Bitton ·
2014-11-27T21:41:32.238Z
I like this way of thinking about weirdness, Peter. I've been saying for a while that EA is associated with a lot of weird ideas that are sure to turn off many ordinary people.
Another thing I'd recommend is remaining sympathetic to mildly and moderately important issues (e.g. fighting police brutality in the USA, supporting gay rights, containing ebola, the ALS ice bucket challenge) even when you see everybody around you overrating their importance relative to other issues that you consider far more important. Raining on everybody else's warm, fuzzy parade will make you "weird," and people will be less willing to hear about your alternative causes. I think the general strategy should be to care about EA issues in addition to mainstream issues, rather than supporting them as an alternative to mainstream issues.
Replies from: None, Peter_Hurford
↑ comment by [deleted] ·
2014-11-27T22:36:39.918Z
This is something discussed a lot in the animal rights movement. Animal rights, perhaps even more so than EA, is a "weird" movement. And, unfortunately, the animal rights movement has been severely damaged by activists who sacrificed other issues for the sake of their own (e.g. PETA and sexist advertising). Recently, some members of the animal rights community have taken an intersectional approach and aligned themselves with those other issues (police brutality, gay rights, etc.). I think this is a very wise approach.
↑ comment by Peter Wildeford (Peter_Hurford) ·
2014-11-27T21:45:43.338Z
That's an important and really interesting point. I don't know if the press coverage of EA eating other issues ("yo, ALS is unimportant, focus on me instead") has been net negative, but it's worth reconsidering and looking into.
comment by Owen_Cotton-Barratt ·
2014-11-27T22:29:13.759Z
Thanks for the write-up. This gets the sense of weirdness points as I understand them pretty well.
The one thing you say that seems wrong to me is in the title: the idea that you have a set amount of weirdness points. Rather, I think you can choose to spend more or fewer weirdness points, but the more you spend, the less people will pay attention to you.
Of course, it is a simplifying metaphor to say you have a set number (and you may often want to act this way, where that number sits at the right trade-off between being weird and taking important positions). But as weirdness points are themselves a simplifying metaphor, this extra layer may not be clear to readers meeting the idea for the first time.
Why it matters: sometimes it's right to shift the number of weirdness points you're spending. You don't want to lead people into thinking that whenever they become weird on some axis they need to rein in on another axis. Rather the important takeaway message is: weirdness is a cost, so like any other cost we should be aware of it, and it sometimes won't be worth paying.
Replies from: tomstocker, RyanCarey, John_Salvatier
↑ comment by tomstocker ·
2015-01-14T09:35:13.198Z
There are probably quite a few situations where weirdness can be a benefit. Celebrities sometimes seem to be loved and listened to for their weirdness. It gives confidence to radically minded people or others sceptical of conforming. It provides a living example of bucking the social trend and that being OK. It helps with people who might suspect you're dressing the part to manipulate them (which is better than doing it to conform, depending on the end goal, though some people prefer the conforming thing and some the weird thing -- trustability). But on the whole, yes: wear chinos and a shirt, try to speak with a deep voice, and control your facial expressions if you're a man. It'll help, especially when talking to people in conventional roles of influence.
↑ comment by RyanCarey ·
2014-11-28T11:36:26.557Z
Yes, it's true that weirdness is a cost, though it's also less interesting than Peter's analogy.
If I try to run with a slightly more balanced version of Peter's line, I'd say that everyone has a weirdness budget.
If you lead a major organisation, you want to keep that budget in surplus.
If you're a non-public-facing researcher, then going over your weirdness budget is okay, if your allies can pay it off later by clarifying and softening your arguments. On a plausible narrative, Nick Bostrom and Martin Rees have similarly increased the appeal of Eliezer Yudkowsky's arguments by throwing the weight of their academic prestige behind them.
comment by kdbscott ·
2014-11-28T09:57:25.899Z
I think it's important to highlight that this article is about weirdness in the context of advocacy.
While I enjoyed the message, I'm concerned about the negative approach. People (especially weird ones) tend to be afraid of social rejection, and setting up a framework for failure (spending too many weirdness points) instead of a framework for success (winning familiarity points) can create a culture of fear/guilt around one's identity. I believe this is why some EAs had a cautious reaction to this article.
I love connecting with people, and I've found the most effective way to do so is to be vulnerable, instead of conservative. This has psychological and philosophical underpinnings, which I don't have room to expound on here.
So I propose a different metric: familiarity points. People want to affirm you, just as they want to be affirmed. We are all (including EAs, I've discovered) fundamentally human in how we've figured out how the world works. If you can tap into the basic emotions and thoughts that have brought you to your position, in a way that is familiar to others, then you can share your perspective and gain familiarity points at the same time.
You can also gain easy familiarity points with something that a lot of people are comfortable with, like talking about sports or movies. These are cheap points and not very valuable, but you can leverage them if you dig a little deeper. If you've both seen Inception, great. But if you can both talk about idiosyncrasies in Christopher Nolan's directing over his past 5 movies, then you're getting more familiarity points. You've found something that you're both weird about, and this gains you FAR more traction than simply “not scaring them off”.
Once you have gained enough familiarity points, you can start introducing unfamiliar ideas, but from a much better grounding. “You know, I was thinking about how Nolan portrays the world's collapse in Interstellar, and I was wondering what existential risks are really a problem, so I started reading material from FHI.” This is the foot-in-the-door technique, but since you already have familiarity points, you've got a warm lead, instead of a cold one (to continue the sales lingo). Or, you can use the door-in-face technique: “speaking of Interstellar, I have some crazy ideas about AI development.” Because you have familiarity points, your interlocutor wants to hear more – they consider you a kindred spirit, and thus might have some great new information (you are crazy like me, I want to hear your crazy ideas).
Of course, it's also important to incorporate expectation-setting and have preexisting constructions for conversation entry and the “close”. But I'm getting too long-winded. Maybe I should write a post.
Replies from: Peter_Hurford, RyanCarey
↑ comment by Peter Wildeford (Peter_Hurford) ·
2014-11-29T17:52:48.155Z
I think this is a great idea. I'd love to see it expanded more in a post of its own.
X-risks via Interstellar is smart! Someone with one of those fancy public outlets (Will MacAskill? Hamilton Nolan? Peter Singer?) should get on that. :)
↑ comment by RyanCarey ·
2014-11-28T11:43:37.161Z
Good idea! I think you're right that whether or not they realise it, people are often moved to avoid weirdness because of their own insecurities, rather than because of impact. I think it would be great to write a post!
comment by frodewin ·
2014-11-27T21:43:08.297Z
I like your thoughts; however, I think you overlook one important aspect: people who are more weird take away weirdness points from less weird people. Meaning, one really radical advocate for an idea helps plenty of less radical advocates promote their position in society. People living vegan make vegetarians seem less weird than they would without vegans. People advocating total smoking bans everywhere make people asking for indoor smoking bans seem less weird. People demanding a universal basic income make people lobbying for minimum wages and social security seem less weird. And so on. So while a radical position might not come through, and while a person holding that position might not seem successful from a superficial point of view, the position and the person will actually do good by taking away the weirdness of less radical activists. I argue that this is actually a strong multiplier, much stronger than lowering one's own weirdness and advocating for the more socially acceptable positions and solutions. I also argue that this is the reason why policy develops through the aforementioned six stages. And I argue that the more maximally weird people there are, the faster this development will be. In that sense I can only encourage anyone to be as radical and weird as possible when it comes to altruistic ideas and activism.
Replies from: Peter_Hurford
comment by Kaj_Sotala ·
2014-11-28T10:22:56.517Z
I agree with the general gist of the post, but I would point out that different groups consider different things weird, and have differing opinions about which kinds of weirdness are a bad thing.
To use your "a guy wearing a dress in public" example - I do this occasionally, and gauging from the reactions I've seen so far, it seems to earn me points among the liberal, socially progressive crowd. My general opinions and values are such that this is the group that would already be the most likely to listen to me, while the people who are turned off by such a thing would be disinclined to listen to me anyway.
I would thus suggest not trying to limit your weirdness overall, but rather choosing a target audience and limiting only the kind of weirdness that this group would consider freakish or negative, while being less concerned about the kind of weirdness that your target audience considers positive. Weirdness that's considered positive by your target audience may even help your case.
comment by Dale ·
2014-11-30T03:39:20.749Z
I think this is good advice, but advice that people will find hard to accept. Every piece of weird behavior comes packaged with object-level, inside-view considerations in its favor, which generate convenient excuses for why we should exempt it from this general argument.
"Yes, of course I agree with this evidence in general, but it doesn't really apply to me. I don't need to [wear gender-appropriate clothing / have monogamous relationships / shave my beard / eat normal food / use normal pronouns], because I have really good inside-view reasons for why my weirdness is special."
I'm as guilty of this as anyone.
Worse, being asked to seem less weird feels like being asked to give up a part of one's personality. Especially for the sort of person EA tends to attract, this can seem like a much higher cost than simply donating money. Plus, you don't get to signal your virtue!
↑ comment by Peter Wildeford (Peter_Hurford) · 2014-11-30T18:27:52.722Z
Thanks, this is really insightful.
I remember an anecdote from a vegan activist -- I can't find the source -- that went something like this:
Are you ready to protest for the movement?
Crowd cheers "YEAH!"
Are you ready to get in people's faces for the movement?
Crowd cheers "YEAH!"
Are you ready to face the police and go to jail for the movement?
Crowd cheers "YEAH!"
Are you ready to put on a suit and look respectable for the movement?
Crowd mills around, looking down or at each other, and says nothing.
↑ comment by Dale · 2014-11-30T21:22:29.298Z
Yes! I actually spent a while looking for that quote while writing the comment, but couldn't find the source. I recalled the last line as being something like
Are you ready to put on a suit and work a 9 to 5 job for the movement?
but couldn't get it close enough for my google-fu to bridge the gap to the source.
↑ comment by yboris · 2014-12-09T22:22:11.353Z
I heard this too. A similar thought is expressed on pp. 30-31 of The Animal Activists' Handbook by Matt Ball (http://animaladvocacybook.com/). Quotes from the two pages:
"After re-evaluating his priorities and choosing a more mainstream appearance, he noticed that the quality of his conversations improved, and the respect of his listeners increased. Consequently, he became more effective an influencer of change" (p30).
"All of us who are working for things that might be seen as going against the status quo should make sure our appearance doesn't detract from our message. Our message is usually difficult enough for people to accept, without us putting up any additional barriers" (p31).
↑ comment by yboris · 2014-12-16T20:08:03.213Z
I found the reference! It might originally have come from another excellent book, Change of Heart by Nick Cooney, in the early part where he talks about self-identity (no page reference because I'm looking at an eBook version). This is entirely unimportant, but in that book the words are:
"Are you willing to cut your hair and put on a suit for the environment?"
comment by Ward (AshwinAcharya) · 2016-07-05T18:54:05.301Z
Hey! Just happened upon this article while searching for something else. Hope the necro isn't minded.
I wanted to point out that since this article was written -- and especially in the last year -- basic income, at least, has become a lot more mainstream. There's the (failed) Swiss referendum, and apparently Finland and YCombinator are both running basic income trials. (More locally, there's of course the GiveDirectly UBI trial as well.)
Anecdotally, these events seem to have been accompanied by many more people (in my particular bubble) being familiar with the idea. Empirically, the link below shows a graph of [number of articles mentioning basic income] per year in the New York Times. EDIT: in an April survey, "A majority of Europeans (58%) reported to have at least some familiarity with the concept of basic income, and 64% of Europeans said they would vote in favour of basic income." Not sure about the US at large, though.
Obviously it's debatable how well we could have foreseen this, but it might be worth thinking about a) to what degree we can predict(/affect) which "weird" idea will gain traction and b) to what extent (the possibility of) this sort of rapid increase in acceptability allows for some relaxation of the "weirdness points" framework.
NYT link. Note, too, the basic income "bubble" in the ~'70s.
Results from that April EU survey are summarized here.
comment by Dale · 2014-12-04T01:04:26.725Z
If the real radical finds that having long hair sets up psychological barriers to communication and organization, he cuts his hair.
-- Saul Alinsky, Rules for Radicals
comment by AndrewPearson · 2014-12-02T01:25:54.058Z
I would observe that certain positions are going to be rather more or less weird depending on the social context. An example from my personal experience: students are in general very pro-immigration, so talking about open borders doesn't cost me all that many weirdness points. On the other hand, many or most of the people I talk to on a regular basis are Christians, which means that talk of cryonics is right out and vegetarianism is often seen as contrary to certain (particularly New Testament) teachings. What this all means is that you can probably get away with a fair bit of what is commonly considered weirdness, provided you tailor it to the people closest to you and their expectations of you.
A second point I would make is that, if you have already advocated for a weird position, then even if it feels like this is costing you weirdness points which would be better spent elsewhere, it may well be a bad idea to abandon the positions you have previously advanced. If people see you as a person who advances silly ideas without much evidence and then later has to retract them, this will severely damage your credibility in holding any weird positions at all.
Thank you for writing this post. I could have done with reading it a couple of years back, and will aim to keep it in mind in the future.
comment by Dale · 2014-11-30T03:59:51.459Z
Weirdness is incredibly important. If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.
Also, I think you overstate your argument here. Yes, many good changes were once weird ideas. But that's also true of many bad ideas! All progress is change, but not all change is progress.
comment by Jack_LaSota · 2014-11-27T21:56:57.366Z
I think there are people where you can earn points in their books for being weird and able to convince them of some of the weird things. "This person gave me true beliefs that are otherwise hard to come by, I should listen to them more and see if I can find more rare true beliefs among the rare things they believe." I think these people are disproportionately able to become super-weird. And that super-weird EAs (including undercover super-weird EAs) provide most of the expected value of the EA movement (because of x-risk, donating large fractions of income, and earning to give in tech startups). It may be worth driving away quite a few people with low weirdness tolerance in order to attract more people with the potential to become super-weird.
↑ comment by cflexman · 2014-11-28T06:32:38.886Z
I also find that it's frequently most helpful to be only a little weird in public; once you have someone's confidence, you can start being significantly more weird with them, because they can't just write you off. It's most of the best of both worlds.
comment by [deleted] · 2015-02-04T02:00:48.549Z
It is like being an author: one always has to ask who the audience is. There are lots of policies, for example, that would help people but seem to be bad for business. When talking to government folks or business folks, show how those policies are actually good for business and lead with that.

So, take a random third-world country suffering from horrible government corruption. The normal approach is to denounce corruption and demand reforms. But corruption is bad for business: GDP grows more slowly the more corruption one has, and it costs companies money directly in bribes. What actually fights corruption is more democracy, more transparency, and more freedom of speech and assembly. What creates those is broad support from the business community for more democracy, more transparency, and more freedom of speech and assembly.

The people of England did not rise up and demand the Magna Carta; the Barons marched on London, took it by force of arms, and then demanded the Magna Carta. Power responds to power. Lead with what they want, follow with how they can get it, and in this case there's no need even to mention what you want.
↑ comment by Larks · 2015-02-05T04:31:49.884Z
What actually fights corruption is more democracy ... assembly
Do you have a citation for these two having a direct causal influence?
comment by stargirlprincess · 2015-01-20T23:13:41.449Z
An advantage of asking people to give 10% of their pre-tax income is that there is no coordination problem. Right now very few people give 10% but I do. And my money is doing good (with high probability). If I have a goal of helping people then it is highly rational for me to donate to certain charities.
On the other hand, unless you are unusually wealthy or famous, personal efforts to change politics will probably have no effect. If I dress more normally and tone down my political views, maybe I can convince a slightly higher percentage of my friends. But this is not going to translate into changes in the law. If every "weird" person changed their behavior, maybe more good laws could get passed. But given that I am a single person, why should I "normalize" myself if I am not going to get any tangible results?
Given my analysis, I do not suggest people try to act more normal. Unless you are in an unusual position, trying to change politics is a waste of time and energy. Politics is much less useful than doing things like helping out other first-world people directly. Be nicer to your friends and family. Don't spend energy or censor yourself in an attempt to change politics.