Comment by holly_elmore on Apology · 2019-03-25T21:19:59.690Z · score: 0 (3 votes) · EA · GW

By "not entirely separate," I meant something more like "the Brown accusations have put him under a level of scrutiny that makes future allegations more likely, more likely to be reflexively believed, and smaller incidents more damning, even if he weren't doing anything to provoke them." So I was referring more to whether the judges in the recent events were affected by knowledge of the Brown events, that kind of "not entirely separate." The events themselves, you're right, would have to be different instances.

What I thought was grasping at straws was your attempt at gotcha syllogistic reasoning.

Comment by holly_elmore on Apology · 2019-03-25T20:30:58.471Z · score: 2 (2 votes) · EA · GW

(I thought maybe Oli thought I knew him or something and that's why he said I was "better placed to continue the discussion.")

Comment by holly_elmore on Apology · 2019-03-25T18:32:05.776Z · score: 4 (7 votes) · EA · GW

I think you're really grasping at straws here. Is the point to depose Oli, or what? Surely you can't think you're going to get more information about what did or did not happen this way. There are many conceivable ways that the Brown allegations could color CEA's perception of the more recent allegations, making the different events not entirely separate.

Comment by holly_elmore on Apology · 2019-03-25T18:20:35.057Z · score: 5 (6 votes) · EA · GW

Just to be clear, I barely know Jacy. I've seen him many times at events, including when he came to Harvard on his book tour, but I don't believe I've ever had a private conversation with him. (Fwiw he never came close to being inappropriate with me or giving me a bad vibe.)

Comment by holly_elmore on Apology · 2019-03-25T17:18:15.559Z · score: 4 (8 votes) · EA · GW

I'm not sure what you mean by A, B, and C. Just to be clear, all I'm saying is that the only thing that this apology has ruled out is "Jacy vehemently denies any possibility of wrongdoing and would not cooperate with CEA's decision regarding him." Other than that, I feel it is compatible with most scenarios of his guilt/innocence and of his reaction to being accused.

Comment by holly_elmore on Apology · 2019-03-25T13:22:34.073Z · score: 4 (7 votes) · EA · GW

"I'm merely pointing out that this gives you zero Bayesian evidence to distinguish two very different kinds of situations."

This is all I was trying to point out, too. We know he's cooperating with CEA and accepting a reprimand. I think that's all this apology tells us.

Comment by holly_elmore on Apology · 2019-03-25T13:19:23.829Z · score: 12 (8 votes) · EA · GW

I think this apology sounds a lot like the template of a dignified apology that a lot of us have in our heads. Take as much responsibility as you can, don't shrink from the accusations or blame anyone else. He speaks several times of the restorative process, and part of that is offering apologies along these lines. There are many classes you can take and books you can read (I've read some), popular in Jacy's communities, on how to give these apologies. He may well have composed it alongside CEA. Why would you think it should sound emotional, like he wrote it the moment he learned of the reprimand?

It doesn't mean much, but my first reaction was that it seemed like he was overreacting and trying to rise above by taking a lot of responsibility. I really don't know, though. I think all of our speculation on the basis of a formal apology is unlikely to clarify anything.

Comment by holly_elmore on Apology · 2019-03-25T13:15:23.850Z · score: 4 (9 votes) · EA · GW

"Consider the implications for criminal law - does this imply that all people accused should submit guilty pleas merely because they have been accused?"

Good example, actually, because false confessions are a thing. The fact that someone would confess or apologize does not, on its own, entail guilt. You may not do it (or think you would), but false confessions happen because it's easy to imagine you did something wrong when people you trust or fear are telling you you did. I'm sure it doesn't help to be a scrupulous, ethical person steeped in social justice ideas about how we are naturally ignorant of the impact of our own actions.

I believe we should respect what responsibility he takes above. I'm not trying to say he didn't do something wrong (seems very possible as well) but I think trying to discern that from this formal apology is not really possible. Saying that you would never apologize like this if you were innocent just isn't real evidence, since many people have.

Comment by holly_elmore on Apology · 2019-03-25T12:42:35.622Z · score: 8 (8 votes) · EA · GW

I think we just don't know and we're probably not going to get any more blood out of this turnip.

Comment by holly_elmore on Apology · 2019-03-25T00:28:28.931Z · score: 15 (18 votes) · EA · GW

^That said, I think we should take Jacy at his word and not argue with any responsibility he takes. I'm not trying to exonerate him. I'm just saying expressing remorse at the possibility of unintentional wrongdoing is not evidence of guilt imo. You don't know until it happens, but I can see myself reacting this way if someone came at me with a serious accusation that made me feel like a bad person. [Edit: If I was unsure whether I'd done any wrongdoing,] I'd probably instantly want to betray myself rather than face people thinking I was guilty and unremorseful.

Comment by holly_elmore on Apology · 2019-03-25T00:20:37.636Z · score: 20 (18 votes) · EA · GW

I can see how a person accused might reflexively take responsibility and do what it takes to express willingness to change. I mean, that's what we're taught to do in enlightened communities (animal rights is among the most intense, especially after #ARmetoo). I don't see Jacy stepping back and soul-searching when told of accusations as clear evidence of his guilt. Especially since the belief that powerful people can unknowingly do immense harm to vulnerable individuals is so common in lefty culture (especially AR) these days. I think it's easy to gaslight yourself and think you actually might have done something seriously wrong without knowing.

Comment by holly_elmore on Apology · 2019-03-25T00:06:23.027Z · score: 11 (4 votes) · EA · GW

Kathy Forth mentioned getting someone banned from EAG.

Comment by holly_elmore on Suggestions for EA wedding vows? · 2019-03-22T16:02:05.708Z · score: 4 (4 votes) · EA · GW

Not exactly EA, but part of a scientific worldview: I had the end of the last paragraph of Origin read at my wedding.

Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

In fact, it's pretty un-EA to say that "higher" animals are "the most exalted object we are capable of conceiving," haha.


There's a lot of Zen stuff about using your intimate relationships as a supportive place to learn altruism which can then be applied to wider and wider circles. That seems pretty appropriate for a wedding. I don't have any links off the top of my head because I usually hear this kind of thing at dharma talks, but it's usually along the lines of someone asking a Zen master how to be a better person and getting the answer, "Every day when you wake up, think 'only for my wife, only for my wife.' When your wife's welfare is like your own, think 'only for my family'" and so on through the neighborhood, the community, the city, the country, the world. The localist hierarchy isn't EA, but the idea that you have to level up your compassion with the support and commitment of those you are close to brings EA themes together with marriage.

Comment by holly_elmore on EA jobs provide scarce non-monetary goods · 2019-03-21T02:44:44.355Z · score: 11 (6 votes) · EA · GW

Strong upvote. I think this is exactly it.

Comment by holly_elmore on Justice, meritocracy, and triage · 2019-03-20T18:59:54.871Z · score: 1 (1 votes) · EA · GW

Thanks for the cite :)

You're right, I generally think of Level 2 thinking as fighting the hypothetical. For the purposes of our philosophical games, it's really annoying when people can't answer the question and deal with fundamental tradeoffs. It's like fighting the setup to a math problem-- "Does Jane really have to divide up her apples?" They are refusing to engage with their values, which is the point. BUT, irl, it is pretty important not to get locked into a falsely narrow idea of what the situation is and leap to bite bullets. You aren't given the ironclad certainty of the hypo. If you're not sure this is a triage situation, then devote time to figuring that out.

The fear I was addressing in my Triage essay was that people get locked onto "finding another way" as their Level 1 answer. Because there are situations where a creative solution eliminates a hard choice, there must not be any hard choices! They won't take decisive triage action, because that's sub-theoretically-optimal, but they will let the worst outcomes come about (e.g., waiting so long that everyone dies) so long as they didn't have to get their hands dirty. I think the fear that people will rush to drastic action when there were alternatives is just as valid.

I do get a bit annoyed by the fear that we'll get so good at triage that reasoning developed under conditions of emergency and scarcity will get locked in. It's not just you. People seem really afraid of giving in to the logic of triage even if they understand it, like they'll lose some important moral or intellectual faculty if they do so. They especially fear adopting triage ideas if they won't always have to think that way. It's like they are worried about taking the utilitarianism red pill and not being able to unsee that way of thinking even after they know it's unnecessary. It would be interesting to study why this is.

Be that as it may, triage thinking is the best thing we have in emergency medical situations under conditions of scarcity, which still exist. Acknowledging tradeoffs and scarcity more broadly still seems pretty important to maximizing utility today as well. I don't think "we may have abundance one day, and then we wouldn't have to think about tradeoffs" is a reason not to employ triage and lose all those QALYs in the meantime. I also think it's very unlikely that triage/tradeoffs, if they were embraced where applicable today, would be much harder to unlearn in conditions of abundance than the deeper, instinctive scarcity thinking we'd have to deal with anyway.

Comment by holly_elmore on Effective Altruism and Meaning in Life · 2019-03-20T00:02:52.367Z · score: 18 (9 votes) · EA · GW

I really love this! The style makes it clear you are practicing what you preach about letting yourself play and take aesthetic delight :) I really relate to your journey thus far and it means a lot to have other people talk about it.

Comment by holly_elmore on [Link] A Modest Proposal: Eliminate Email · 2019-03-19T22:51:27.135Z · score: 2 (2 votes) · EA · GW

Autoreplies get out of hand really quick. When the autoreply bug goes through a work environment, pretty soon autoreplies are 60% of your inbox. Out of courtesy for others, I only use them for vacations.

I used to have my urgent email address in my signature so truly urgent emails could get my attention (push notification to phone). My advisor found the instructions and the implication that all emails are not important to be condescending, so I removed it. But I might reinstate it if my next position increases my email burden.

Comment by holly_elmore on [Link] A Modest Proposal: Eliminate Email · 2019-03-18T22:08:55.699Z · score: 4 (4 votes) · EA · GW

It doesn't solve all the problems of email, but, as a first step, why not simply have email-checking hours instead of office hours? Almost all the shittiness of email for me isn't about the medium but the "always on" expectation. In my current job (ABD PhD student), I can restrict email checking to 2 or 3 times a day, and I'm pretty happy with that.

Hell, if people just had the expectation that emails will take at least 24 hours to answer, I think we'd be way better off. People don't prepare their initial inquiries well because sending an email is so cheap. If they weren't expecting a back-and-forth to get at the real issue for the next day, then they might do a better job figuring out their actual question in the first place.

Comment by holly_elmore on The Evolution of Sentience as a Factor in the Cambrian Explosion: Setting up the Question · 2019-03-11T22:30:50.996Z · score: 11 (5 votes) · EA · GW

Nice job :) Studying evolutionary history to understand the possibilities for minds is probably my favorite EA genre.

One criticism: We don't know if sentience really imbues or is necessary for any of the traits you listed as associated with it. You sort of addressed the hard problem after the list, but if you're not positing some benefit to awareness itself then I think you might as well say the Cambrian Explosion was due to a predator-prey arms race that elaborated the nervous system of many animal taxa and promoted fossilizable protected exteriors. Might be more accurate to say that sentience was a byproduct of the Cambrian Explosion if it's those factors you see as beneficial and you're just noting their seeming association with sentience.

If you expand this, I think it would really help to clarify what it would mean for sentience to play an active role in spurring the diversification (e.g., if sentience somehow confers the ability to learn and sense) versus the diversification promoting the things required for sentience.

Comment by holly_elmore on Three Biases That Made Me Believe in AI Risk · 2019-02-14T21:22:40.846Z · score: 13 (12 votes) · EA · GW
[...] I noticed that most of my belief in AI risk was caused by biased thinking: self-aggrandizing motivated reasoning, misleading language, and anchoring on unjustified probability estimates.

Thank you so much for your reflection and honesty on this. Although I think concerns about the safe development of AI are very legitimate, I have long been concerned that the speculative, sci-fi nature of AI x-risks gives cover to a lot of bias. More cynically, I think grasping AI risk and thinking about it from a longtermist perspective is a great way to show off how smart and abstract you are while (theoretically) also having the most moral impact possible.

I just think identifying with x-risk and hyperastronomical estimates of utility/disutility is meeting a suspicious number of emotional and intellectual needs. If we could see the impact of our actions to mitigate AI risk today, motivated reasoning might not be such a problem. But longtermist issues are those where we really can't afford self-serving biases, because it won't necessarily show. I'm really glad to see someone speaking up about this, particularly from their own experience.

Comment by holly_elmore on [blog cross-post] The remembering self needs to get real about the experiencing self. · 2019-02-10T04:07:01.240Z · score: 2 (2 votes) · EA · GW

I guess it depends on how narrowly you define EA. I think of evaluating states of pleasure/suffering, affective forecasting, and decision-making as common EA topics. My argument is related to a hedonistic utilitarian argument against preference utilitarianism, but I don't often hear people taking on the shortcomings of the remembering self the way they do preferences. Usually the remembering self is held out as a superior perspective on life because it's out of the moment, whereas I argue it's just as selfish as the experiencing self. In fact, it's just another kind of experiencing self that wants different things.

[blog cross-post] The remembering self needs to get real about the experiencing self.

2019-02-08T18:21:18.746Z · score: 16 (8 votes)
Comment by holly_elmore on are values a potential obstacle in scaling the EA movement? · 2019-01-03T21:59:31.872Z · score: 9 (6 votes) · EA · GW

Others have said this, but you're getting at whether the movement should prioritize growth and easy assimilation or maintaining high fidelity to its values. So far most of the core favors the high-fidelity model. Personally, I agree, because EA won't be as effective, and could even be destructive, if EA as a movement is not anchored in its values. But we miss out on people who don't have that somewhat extreme, values-driven bent, which is a terrible loss for EA as a community.

Even at the level of organizing at Harvard, I feel torn between seeing our club's value as spreading some good values on campus (more watered down outreach) or incubating the next generation of high-power EAs (a few intense, targeted waves of outreach). I worry that we unintentionally select for a lot of baggage when we select the intense, highly values-driven people, and that the more the entire movement does that, the more blind we are to it.

Comment by holly_elmore on Cause profile: mental health · 2019-01-03T21:45:00.793Z · score: 24 (9 votes) · EA · GW

Dealing with my own mental health issues has convinced me of just how much unhappiness it causes and just how complicated it can be to address. It's not appealing like the ~$3,500 to AMF = 1 counterfactual life equation is. I think the feeling that there aren't good intervention options is the reason that mental health doesn't rank higher as a cause in EA, at least not for donating, rather than longtermism vs. presentism. I'm kind of presentist, and personally I think mental health is up there for most important cause, but I have just never been confident enough in a mental health charity to donate to it. (I'm checking out StrongMinds, though-- thanks for the suggestion.) I would donate to the CBT apps, if they were charities. They are the only intervention that is scalable and tractable enough in this space to really count for EA, imo. Or if someone started a campaign to add trace Lithium to the water, I would help. Other than that, I think we just need to develop more scalable interventions, which is not exactly tractable!

I would love to see EA take on the challenge of incubating mental health charities and elevate mental health as a cause. Thanks for your role in promoting it :)

Comment by holly_elmore on Who sets the read time estimates? · 2018-12-28T21:38:07.364Z · score: 3 (3 votes) · EA · GW

Do you know why 300 words per minute was chosen? I think I'm below that and I know I'm not a slow reader. I feel like estimates that help you decide how to spend your time should be a little more generous. (But maybe I *am* a slow reader, idk.)
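For reference, a read-time estimate like the one discussed here is typically just word count divided by an assumed reading speed. A minimal sketch of that calculation (the 300 wpm figure comes from the comment above; the function name and rounding behavior are my own assumptions, not necessarily what the forum does):

```python
import math

def read_time_minutes(text: str, wpm: int = 300) -> int:
    """Estimate reading time by dividing the word count by an assumed
    words-per-minute rate, rounding up so short posts show at least 1 minute."""
    words = len(text.split())
    return max(1, math.ceil(words / wpm))

# A 1,200-word post at the default 300 wpm reads as 4 minutes;
# a more generous 250 wpm assumption bumps the same post to 5.
post = "word " * 1200
print(read_time_minutes(post))       # → 4
print(read_time_minutes(post, 250))  # → 5
```

The choice of wpm dominates the estimate, which is why a slower-but-not-slow reader can consistently find the displayed times too optimistic.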

Who sets the read time estimates?

2018-12-28T16:41:26.014Z · score: 9 (4 votes)

[blog cross-post] On privacy

2018-12-28T15:41:29.448Z · score: 30 (26 votes)
Comment by holly_elmore on How Effective Altruists Can Be Welcoming To Conservatives · 2018-12-28T14:48:47.976Z · score: 8 (4 votes) · EA · GW

Of all the academic, activist, and Silicon Valley-type communities I belong to, EA is the most inclusive of (US) conservative ideas. It's not a very high compliment, but still. The strong free-market bent of EA takes most of its members away from mainstream liberal economic policies, e.g., being in favor of globalization (though this issue keeps switching sides). And people tend not to feel any shame about supporting a "conservative" policy if they arrived at it through reason and evidence.

What I do notice is contempt for the culture of American conservatism, beyond even equating it with racism and sexism. Aesthetic horror at the use of guns and big trucks, derision at the idea that anyone could believe Fundamentalist Christianity, considering suburban or rural family-centered life to be lame, condescendingly asserting that the majority of conservatives vote outside of their interests (read: because they are too dumb and driven by fear and hatred to see that we know what's best for them), everything to do with Trump...

I think the cultural stuff is a big blindspot in EA and the most significant way in which we lack needed diversity, but I'm very hopeful that with essays like this, EAs will be open to looking at conservatives differently. And I hope so, because, stripped of culture war baggage, we could use their perspective.

Comment by holly_elmore on [blog cross-post] We are in triage every second of every day · 2018-12-27T22:11:39.309Z · score: 1 (1 votes) · EA · GW

Thank you :)

Comment by holly_elmore on [blog cross-post] We are in triage every second of every day · 2018-12-13T18:58:40.005Z · score: 8 (7 votes) · EA · GW

Thanks :) Haha, yeah, when I hit the 5 post limit I realized maybe I shouldn't be treating this like an archive... It honestly didn't occur to me that the posts would spam people if I just got 'em up as quickly as possible! Still figuring out how the forum works, haha.

[blog cross-post] We are in triage every second of every day

2018-12-12T21:20:21.300Z · score: 37 (22 votes)

[blog cross-post] potential lost; substance gained

2018-12-12T21:13:20.689Z · score: 1 (1 votes)

[blog cross-post] Charity hacks

2018-12-12T20:56:43.697Z · score: 4 (2 votes)

[blog cross-post] More on narcissism

2018-12-12T20:42:17.656Z · score: 1 (1 votes)

[blog cross-post] So-called communal narcissists

2018-12-12T20:37:24.046Z · score: 9 (7 votes)