EA will likely get more attention soon

post by Julia_Wise · 2022-05-12T02:01:41.776Z · EA · GW · 29 comments

Contents

  Doing more proactive communications work
  Connecting EA projects with journalists
  FAQs
    If I see a new media piece about EA, who should I flag it to?
    Why don’t CEA or other EA orgs push back more publicly on misconceptions?
    Should I reach out to celebrities, HNWIs, etc about getting involved with EA?
    How should I respond to takes on EA that I disagree with?

As EA-aligned foundations and projects direct more money, EA ideas continue to gain traction, What We Owe the Future comes out, and so on, there's naturally going to be more attention on EA soon. That attention will likely range from enthusiasm to thoughtful criticism to . . . less thoughtful criticism. If you've been involved in EA for a while, this transition might be a bit disorienting.

I’m writing this post on behalf of some staff (at CEA, Forethought Foundation, and Open Philanthropy) who are working on communications for EA as a movement. We’re trying to prepare for increased attention, plan the best ways to communicate complex ideas succinctly, and increase the chance that EA will be portrayed accurately and thoughtfully.

Doing more proactive communications work

For the last several years, most EA organizations did little or no pursuit of media coverage. CEA’s advice on talking to journalists [EA · GW] was (and is) mostly cautionary. I think there have been good reasons for that — engaging with media is only worth doing if you’re going to do it well, and a lot of EA projects don’t have this as their top priority.  

While this may have made sense for each individual organization, as a result, we’ve missed out on opportunities to convey the good ideas and work coming from EA. There’s also confusion out there about what EA is even about. Ideally more people would have a clearer sense of what EA is, so they can agree or disagree with an accurate representation of EA and not with a misconception.

Several EA organizations are working together with a communications advising firm to answer questions like:

  • Who are key audiences we especially want to reach?
  • How do these audiences currently see EA?
  • What are the best ways to reach these audiences?
  • What EA ideas are especially important to convey?

Connecting EA projects with journalists

I’ve been writing to EA organizations and projects to see if they have recent success stories that journalists might be interested in covering. If I’ve missed your project and you’d like some help connecting with journalists who might want to cover your work, please do get in touch! media@centreforeffectivealtruism.org

As before, if a journalist reaches out to you, we suggest you look through our guide on responding to journalists.

FAQs

If I see a new media piece about EA, who should I flag it to?

Feel free to flag things to media@centreforeffectivealtruism.org and we'll talk with our advisors about whether some kind of response makes sense. We'll likely have heard about pieces in large-scale publications, but we might miss coverage of EA in publications in languages other than English, or in publications with a targeted readership that might be of interest (e.g. university student newspapers, professional sub-communities).

Why don’t CEA or other EA orgs push back more publicly on misconceptions?

The advice we've gotten so far is not to repeat misconceptions. You're unlikely to see an EA organization say "No, X isn't true; actually Y is true." Instead, they're more likely to say "Here's why Y is important."

Some criticisms will be unfair or uninformed. Typically we expect to respond by writing pieces explaining our own views rather than responding directly to critical pieces.

Should I reach out to celebrities, HNWIs, etc about getting involved with EA?

Probably not. There are existing projects doing this, and it's better for outreach to happen in a coordinated way than for lots of people to contact them individually. Simran Dhaliwal of Longview Philanthropy writes: "Please do reach out if you have a connection to an UHNW individual/family; we'd be more than happy to invest the many hours it takes to build a relationship and discuss EA concepts in-depth / assist with coordination." simran@longview.org

How should I respond to takes on EA that I disagree with?

Maybe not at all — it may not be worth fanning the flames. 

If you do respond, it helps to link to a source for the counter-point you want to make. That way, curious people who see your interaction can follow the source to learn more.

We like Nathan Young’s advice here [EA · GW].

29 comments

Comments sorted by top scores.

comment by rifish · 2022-05-13T08:32:10.742Z · EA(p) · GW(p)

I'm a journalist, and would second this as sound advice, especially the 'guide to responding to journalists'. It explains the pressures and incentives/deterrents we have to work with, without demonising the profession... which I was glad to see! 

A couple of things I would emphasise (in the spirit of mutual understanding!): 

It can help to look beyond the individual journalist to consider the audience we write for, and what our editors' demands might be higher up in the hierarchy. I know many good, thoughtful journalists who work for publications (eg politically partisan newspapers) where they have to present stories the way they do, because that's what their audience/editors demand... There's often so much about the article they, as the reporter, don't control after they file. (Early career journalists in particular have to make these trade-offs, which is worth bearing in mind.) 

Often I would suggest it could be helpful to think of yourself as a guide not a gatekeeper. An obvious point... but this space here [waves arms] is all available to journalists, along with much else in the EA world, via podcasts, public google docs etc. There are vast swathes of material that are already public and all quotable. The community is an unusually online one compared with other fields I report on. It's great! But the problem for a journalist is therefore not really that information is scarce and hard to come by, which you as a source could gatekeep - on the contrary, it's that so much is all already there, and there's far too much of it to digest before a deadline. It means a bad journalist could cherrypick; a hurried journalist could get only a fleeting impression. Generally, what we journalists need is a guide through it all - context, history, depth - so we can form a picture that is fair and accurate. 

With that in mind, I would always advocate for speaking with us face-to-face or via video, rather than emailing...it's just a more human way of connecting, more efficient and responsive, and frankly makes it a little harder for a journalist to ignore your guidance if you have given them your time and shown you are a real person rather than a quote-machine! If a journalist asks for answers to their questions in email, for me that's a sign that they don't have that much of an interest in engaging and learning. I admit I've done it myself sometimes when pressed for time, but it's not good practice. Also it's not serving the needs of audiences/publications because, unless a source is an unusually conversational writer, it leads to flat quotes that are less natural and engaging in tone. 

Last, just to return to the OP, I agree that far more attention is coming. In the past I have observed a sentiment that seems to assume that the EA world can stay under the radar by not engaging. There's perhaps been something to that, insofar as it avoids actively advertising, but I'd also say it's as much that relatively few journalists so far have had reason and motivation to look. I'd suggest that will change for a few reasons: there are many great positive and important stories to tell that interest wider audiences, as I've discovered myself, but increasingly also because journalists have a civic duty to write about concentrations of power and money.

comment by jacquesthibs (jaythibs) · 2022-05-12T03:37:53.775Z · EA(p) · GW(p)

One thing that may backfire with the slow rollout of talking to journalists is that people who mean to write about EA in bad faith will be the ones at the top of the search results. If you search something like “ea longtermism”, you might find bad faith articles [EA · GW] many of us are familiar with. I’m concerned we are setting ourselves up to give people unaware of EA a very bad faith introduction.

Note: when I say "bad faith" here, it may just be a matter of semantics; some people may understand the term differently than I do. I think I might not have the vocabulary to articulate what I mean by "bad faith." I actually agree with pretty much everything David has said in response to this comment.

Replies from: Dr. David Mathers, EA-Basti, Dr. David Mathers, Julia_Wise, howdoyousay?, dpiepgrass, Arepo
comment by Dr. David Mathers · 2022-05-12T11:05:35.785Z · EA(p) · GW(p)

In my view, Phil Torres' stuff, whilst not entirely fair, and quite nasty rhetorically, is far from the worst this could get. He actually is familiar with what some people within EA think in detail, reports that information fairly accurately, even if he misleads by omission somewhat*, and makes criticisms of controversial philosophical assumptions of some leading EAs that have some genuine bite, and might be endorsed by many moral philosophers. His stuff actually falls into the dangerous sweet spot where legitimate ideas, like 'is adding happy people actually good anyway?', get associated with less fair criticism ("Nick Beckstead did white supremacy when he briefly talked about different flow-through effects of saving lives in different places"), potentially biasing us against the legit stuff in a dangerous way.

But there could (again, in my view) easily be a wave of criticism coming from people who share Torres' political viewpoint and tendency towards heated rhetoric, but who, unlike him, haven't really taken the time to understand EA/longtermist/AI safety ideas in the first place. I've already seen one decently well-known anti-"tech" figure on Twitter retweet a tweet that in its entirety consisted of "long-termism is eugenics!". People should prepare emotionally (I have already mildly lost my temper on Twitter in a way I shouldn't have, but at least I'm not anyone important!) for keeping their cool in the face of criticism that is:
-Poorly argued
-Very rhetorically forceful
-Based on straightforward misunderstandings
-Full of infuriatingly confident statements of highly contestable philosophical and empirical assumptions
-Reliant on guilt-by-association tactics of an obviously unreasonable sort**: i.e. so-and-so once attended a conference with Peter Thiel, therefore they share [authoritarian view] with Thiel
-Aimed at motives, not just ideas
-Gendered in a way that will play directly to the personal insecurities of some male EAs

Alas, stuff can be all those things and also identify some genuine errors we're making. It's important we remain open to that, and also don't get too polarized politically by this kind of stuff ourselves. 

* (i.e. he leaves out reasons to be longtermist that don't depend on total utilitarianism or adding happy people being good, doesn't discuss why you might reject person-affecting population ethics etc.)

** I say "of an unreasonable sort" because in principle people's associations can be legitimately criticized if they have bad effects, just like anything else. 

Replies from: jaythibs, howdoyousay?
comment by jacquesthibs (jaythibs) · 2022-05-12T14:18:54.696Z · EA(p) · GW(p)

Great points, here’s my impression: 

Meta-point: I am not suggesting we do anything about this or that we start insulting people and losing our temper (my comment is not intended to be prescriptive). That would be bad and it is not the culture I want within EA. I do think it is, in general, the right call to avoid fanning the flames. However, my first comment is meant to point at something that is already happening: many people uninformed about EA are not being introduced in a fair and balanced way, and first impressions matter. And lastly, I did not mean to imply that Torres’ stuff was the worse we can expect. I am still reading Torres’ stuff with an open-mind to take away the good criticism (while keeping the entire context in consideration).

Regarding the articles: he writes in a way that makes it obvious he knows a lot about EA and was involved in the past, but he bends the truth as much as possible, so that the reader leaves with a misrepresentation of EA and of what EAs really believe and act on. Since this is a pattern in his writings, it's hard not to suspect he does it because it gives him plausible deniability: what he says is often not "wrong," but it is bent to the point that the reader ends up inferring things that are false.

To me, in the case of his latest article, you could come away with the impression that Bostrom and MacAskill (as well as the entirety of EA) think the whole world should stop spending any money on philanthropy that helps anyone in the present (and that if you do, it should only help those who are privileged). An uninformed reader can come away with the impression that EA doesn't actually care about human lives. The way he writes gives him credibility with the uninformed, because it's not an all-out attack where his intentions are obvious to the reader.

Whatever you want to call it, this does not seem good faith to me. I welcome criticism of EA and longtermism, but this is not criticism.

*This is a response to both of your comments.

comment by howdoyousay? · 2022-05-12T14:10:07.542Z · EA(p) · GW(p)

Thanks for this thoughtful challenge, and in particular for flagging what future provocations could look like, so we can prepare ourselves and keep our reactive, childish selves from coming to the fore.

 

In fact, I think I'll reflect on this list for a long time to ensure I continue not to respond on Twitter!

comment by Sebastian Schwiecker (EA-Basti) · 2022-05-12T10:56:56.421Z · EA(p) · GW(p)

Definitely the case in Germany. The top 3 Google results for "longtermism" are all very negative posts, 2 of them by some of Germany's biggest news magazines (ZEIT and Spiegel). As far as I know there is no positive content on longtermism in German.

comment by Dr. David Mathers · 2022-05-12T12:23:45.946Z · EA(p) · GW(p)

Also, I doubt Torres is writing in bad faith exactly. "Bad faith" to me has connotations of 'is saying stuff they know to be untrue', whereas with Torres I'm sure he believes what he's saying; he's just angry about it, and anger biases.

Replies from: ZachWeems, MikeJ
comment by ZachWeems · 2022-05-14T20:32:57.460Z · EA(p) · GW(p)

Agreed. 

My model is, he has a number of frustrations with EA. That on its own isn't a big deal. There are plenty of valid, invalid, and arguable gripes with various aspects of EA. 

But he also has a major bucket error where the concept of "far-right" is applied to a much bigger Category of bad stuff. Since some aspects of EA & longtermism seem to be X to him, and X goes in the Category, and stuff in the Category is far-right, EA must have far-right aspects. To inform people of the problem, he writes articles claiming they're far-right. 

If EAs say his claims are factually false, he thinks the respondents are fooling themselves. After all, they're ignoring his wider point that EA has stuff from the Category, in favor of the nitpicky technicalities of his examples. He may even think they're trying to motte & bailey people into thinking EA & longtermism can't possibly have X. To me, it sounds like his narrative is now that he's waging a PR battle against Bad Guys.

I'm not sure what the Category is, though. 

At first I thought it was an entirely emotional thing: stuff that makes him sufficiently angry, or a certain flavor of angry, or anything where he can't verbalize why it makes him angry, is assumed to be far-right. But I don't think that fits his actions. I don't expect many people can decide "this makes me mad, so it's full of white supremacy and other ills", run a years-long vendetta on that basis, and still have a nuanced conversation about which parts aren't bad.

Now I think X has a "shape": with time & motivation, in a safe environment, Torres could give a consistent definition of what X is and isn't. And with more of those, he could explain what it is & why he hates it without any references to far-right stuff. Maybe he could even do an ELI5 of why X goes in the same Category as far-right stuff in the first place. But there's not much chance of this actually happening, since it requires him being vulnerable with a mistrusted representative of the Bad Guys.

comment by MikeJ · 2022-05-12T17:35:56.115Z · EA(p) · GW(p)

Yes, I'm always unsure of what "bad faith" really means. I often see it cited as a main reason to engage or not engage with an argument. But I don't know why it should matter to me what a writer or journalist intends deep down. I would hope that "good faith" doesn't just mean being aligned on overall goals already.

To be more specific, I keep seeing references to hidden context behind Phil Torres's pieces. As someone who doesn't have the time to read through many cryptic old threads, it just makes me skeptical that the "bad faith" criticism is useful in deciding whether to discount an argument.

Replies from: casebash
comment by Chris Leong (casebash) · 2022-05-13T04:37:17.646Z · EA(p) · GW(p)

Have you ever had conversations where someone has misrepresented everything you've said or where they kept implying that you were a bad person every time you disagreed with them?

comment by Julia_Wise · 2022-05-12T16:42:26.519Z · EA(p) · GW(p)

I agree! This is part of what we’re trying to work on, by making good-quality pieces in favor of EA and longtermism easier to find.

comment by howdoyousay? · 2022-05-13T12:22:29.161Z · EA(p) · GW(p)

Equally, there's an argument for thanking and replying to critical pieces about the EA community that honestly engage with the subject matter. This post (now old) making criticisms of long-termism is a good example: https://medium.com/curious/against-strong-longtermism-a-response-to-greaves-and-macaskill-cb4bb9681982

I'm sure / really hope Will's new book does engage with the points made here. And if so, it provides the rebuttal to those who come across hit-pieces and take them at face value, or those who promulgate hit-pieces because of their own ideological drives.

comment by dpiepgrass · 2022-05-13T22:59:56.527Z · EA(p) · GW(p)

Yup, I saw somebody on Medium speaking favorably about a Phil Torres piece as a footnote of his article on Ukraine (I responded here). And earlier I responded to Alice Crary's piece. Right now the anti-EAs are often self-styled intellectual elites, but a chorus of bad faith could go mainstream at some point. (And then I hope you guys will see why I'm proposing an evidence clearinghouse [EA(p) · GW(p)], to help build a new and more efficient culture of good epistemics and better information... whether or not you think my idea would work as intended.)

comment by Arepo · 2022-05-18T14:06:08.683Z · EA(p) · GW(p)

I just posted a comment giving a couple of real-life anecdotes showing this effect.

comment by Nathan Young (nathan) · 2022-05-12T09:28:17.418Z · EA(p) · GW(p)

Thanks Julia, I think this is really well put. 

Relatedly, trauma circles.

I like trauma circles as a model for dealing with crises. When someone is in distress, you dump (complain, vent, etc.) away from them and comfort (listen, graciously help, etc.) towards them. In short, you use your and everyone else's closeness to the crisis to inform your response.

This is also the model I use if someone is angry at me on the internet. I do not want to dump towards them (the centre of the circle) but instead vent towards my friends. If I say anything to them I am first gracious and kind. 

This next point is tricky, but I think worth making. For me, public community spaces are "sideways" of me in this model: useful to dump into if necessary, but not ideal. When someone is rudely frustrated with EAs on Twitter, I generally avoid quote-tweeting into the forum or Twitter Community (a new feature for communities on Twitter), because then everyone feels attacked, not just me. This isn't an iron law; sometimes the rude criticism is still really instructive, but this is my general heuristic.

My heuristics, then:
- if the upset person is a friend, comfort them
- if the upset person is not a friend, comfort them or say nothing unless I feel very, very competent (I very rarely feel competent enough to challenge directly)
- if the upset person has written something mean, probably don't amplify it
- if I need to vent, vent away from the upset person
- venting in private is usually better than venting in public
- if I need to vent in public, it's better to talk about my feelings than give a play-by-play of the crisis

I may edit this post based on comments. 

comment by aogara (Aidan O'Gara) · 2022-05-12T15:06:54.924Z · EA(p) · GW(p)

This is a great point. As one example of growing mainstream coverage, here’s a POLITICO Magazine piece on Carrick Flynn’s Congressional campaign. It gives a detailed explanation of effective altruism and longtermism, and seems like a great introduction to the movement for somebody new. The author sounds like he might have collaborated with CEA, but if not, maybe someone should reach out?

https://www.politico.com/news/magazine/2022/05/12/carrick-flynn-save-world-congress-00031959

comment by Abby Hoskin (AbbyBabby) · 2022-05-12T02:19:38.311Z · EA(p) · GW(p)

Thanks, Julia! The  "Advice for responding to journalists" doc you link is really excellent. Everyone should read this before speaking to the media. https://docs.google.com/document/d/1GlVEKYdJU2LqE6tXPPay_2tBmJTQrsQxAO27ZaeKAQk/edit#heading=h.86t1p0fnb9uz

Some advice I would add: if a journalist asks to interview you, try to understand where they are in their research. 

Do they have a narrative that they are already committed to and they're just trying to get a juicy quote from you? If so, it might not make sense to talk to them since they might twist whatever you say to fit the story they have already written.

Alternatively, are they in information gathering mode and are honestly trying to understand a complex issue? If they have not written their story yet and you think you can give them information that will make their writing more accurate, then it makes more sense to do an interview. 
 

comment by Charles He · 2022-05-13T10:03:31.686Z · EA(p) · GW(p)

I’m assuming and hoping Julia Wise or the respective team here has strong and adequate staffing.

I’ve got this worrying mental picture of Wise carrying both the community health and international public relations team as a one woman show, like with a headset, three keyboards and seven monitors typing furiously.

Honestly, I also low key want there to be strong people working for Wise, so we can refer to the resulting apparatus with awesome names:

  • Department of The Wise
  • The Wise Empire
  • The Era of Wise EA
  • Wisely, EA succeeds
Replies from: Julia_Wise
comment by Julia_Wise · 2022-05-13T16:21:19.113Z · EA(p) · GW(p)

It's definitely a bigger job than I can do on my own! As I said, staff at several organizations plus a communications advising firm are working on this.

We're also keeping an eye out for possible hires who are familiar with both media/communications work and EA. If that sounds like you, feel free to let me know (julia.wise@centreforeffectivealtruism.org) and I can let you know if we have a job posting.

comment by freedomandutility · 2022-05-12T14:40:41.237Z · EA(p) · GW(p)

Is there much ongoing outreach to journalists about EA projects which are really good according to most ethical views?

Eg - about Alvea, updates on CE-incubated charities, updates on Givewell’s donations, about how EA-affiliated people advised the UK government on COVID, animal welfare wins, etc

Replies from: Julia_Wise
comment by Julia_Wise · 2022-05-12T16:43:41.671Z · EA(p) · GW(p)

That list is almost identical to what I started with, yes! :)

In several cases the projects are already working on their own media outreach, but we’ll be trying to help where we can (perhaps introducing them to journalists they weren’t already in touch with) and to help smaller projects that might not have a media plan yet.

Replies from: freedomandutility
comment by freedomandutility · 2022-05-12T21:23:25.781Z · EA(p) · GW(p)

Cool, good to hear!

comment by Arepo · 2022-05-18T14:04:44.051Z · EA(p) · GW(p)

For the last several years, most EA organizations did little or no pursuit of media coverage. CEA’s advice on talking to journalists [EA · GW] was (and is) mostly cautionary. I think there have been good reasons for that — engaging with media is only worth doing if you’re going to do it well, and a lot of EA projects don’t have this as their top priority.  

 

I think this policy has been noticeably harmful, tbh. If the supporters of something won't talk to the media, the net result seems to be that the media talk to that thing's detractors instead, and so you trade low-fidelity positive reporting for lower-fidelity condemnation.

Two real-life anecdotes to support this: 

  1. At the EA hotel, we turned away a journalist at the door who'd initially contacted me sounding very positive about the idea. He wrote a piece about it anyway, and instead of interviews with the guests, concluded with a perfunctory summary of the neighbours' very lukewarm views.
  2. At a public board games event, we were introducing ourselves while setting up for a 2-hour game, and I described my interest in EA as a way of making conversation. The only person at the table who recognised the name turned to me and said 'oh... that's the child molestation thing, right?' It turns out everything he knew about the movement was from a note published by Kathy Forth making various unsubstantiated accusations about the EA and rationalist movements without distinguishing between them. I felt morally committed to the game at that point, so... that was an uncomfortable couple of hours.
comment by Writer · 2022-05-12T10:17:24.828Z · EA(p) · GW(p)

Several EA organizations are working together with a communications advising firm to answer questions like

  • Who are key audiences we especially want to reach?
  • How do these audiences currently see EA?
  • What are the best ways to reach these audiences?
  • What EA ideas are especially important to convey?

I hope EA orgs end up sharing their new best guesses regarding these questions with the broader community, or at least reach out to smaller and newer organizations dedicated to outreach so that they can scale their outreach in a good direction and self-correct more easily.

comment by Charlie Dougherty · 2022-05-12T07:56:14.777Z · EA(p) · GW(p)

Hi,

Could you clarify the section about connecting projects with journalists? I'm not sure I entirely understand what you're looking for. Are there particular journalists you have connections with already? Is there a particular geography or topic you're thinking of?

Also, does this mean that CEA wants to coordinate and do outreach on behalf of all affiliated organizations and groups?

 

Thanks so much!

Charlie

Replies from: Julia_Wise
comment by Julia_Wise · 2022-05-12T16:43:12.902Z · EA(p) · GW(p)

Hi Charlie,

Yes, we have some connections with journalists already who have worked on EA-related pieces before or expressed interest in EA.

The types of projects we expect they might want to cover are those working on problems that non-EAs are also concerned about (like poverty, animal welfare, pandemic risk, and other catastrophic risks). We expect EA community-building projects to be less of a focus. We do expect stories about the amount of funding in EA, but we want to shed more light on the concrete work that the funding and community-building are actually for.

I don’t expect that CEA will do outreach on behalf of all EA-related organizations and groups, no. For example if EA Norway gets approached by a journalist wanting to write about the group or about EA in Norway, we’d be happy to help you decide whether to take the interview and prepare what you’d want to convey (as described in our guide to responding to journalists). But we don’t have capacity to do proactive outreach on behalf of community-building projects or organizations in EA.

If you (or anybody else) want to talk more about what this might mean for a project or organization you’re involved with, I’m happy to respond more specifically! julia.wise@centreforeffectivealtruism.org

comment by Nebulus · 2022-05-19T16:56:56.537Z · EA(p) · GW(p)

Thank you Julia for this well-written post! I had been considering writing something along these lines (because of the increase in EAs working in policy and under public scrutiny), and I am very, very glad that this is not only being taken seriously, but also actively being worked on.

comment by Peter S. Park · 2022-05-13T18:39:50.170Z · EA(p) · GW(p)

Thank you so much for this extremely important and helpful guide on EA messaging, Julia! I really appreciate it, and hope all EAs read it asap.

Social opinion dynamics seem to have the property where some action (or some inaction) can cause EA to move into a different equilibrium, with a potentially permanent increase or decrease in EA’s outreach and influence capacity. We should therefore tread carefully.

Unfortunately, social opinion dynamics are also extremely mysterious. Nobody knows precisely what action or inaction poses the risk of permanently closing some doors to additional outreach and influence. Part of the system is likely inherently unpredictable, but people are almost certainly not near the optimal level of knowledge about predicting such social opinion dynamics.

But perhaps EA movement-builders are already using and improving a cutting-edge model of social opinion dynamics!