Turning percentages back into people: personalizing quantification 2020-09-12T14:30:35.199Z · score: 24 (11 votes)
A message to community members, in light of global protests for racial justice 2020-06-08T22:19:33.517Z · score: 41 (28 votes)
sky's Shortform 2020-03-28T21:04:11.779Z · score: 3 (1 votes)
What to know before talking with journalists about EA 2019-09-04T19:59:21.578Z · score: 107 (48 votes)


Comment by sky on How have you become more (or less) engaged with EA in the last year? · 2020-09-12T23:23:17.528Z · score: 1 (1 votes) · EA · GW

Joey, could you say more what you mean by "concepts...that connect to impact"? I'm interested in examples you're thinking of. And whether you're looking for advances on those examples or new/different concepts?

Comment by sky on EricHerboso's Shortform · 2020-09-05T21:05:18.490Z · score: 10 (7 votes) · EA · GW

Quick meta comment: Thanks for explaining your downvote; I think that's helpful practice in general.

Comment by sky on sky's Shortform · 2020-09-05T20:48:57.903Z · score: 13 (5 votes) · EA · GW

Quick thoughts on turning percentages back into people

Occasionally, I experiment with different ways to grok probabilities and statistics for myself, starting from the basics. This also involves paying attention to my emotions and imagining how different explanations would work for different students (I'm often a mentor/workshop presenter for college students). If your brain is like mine, or you like seeing how other people's brains work, this may be of interest.

One trick that has worked well for me is turning %s back into people.

Example: I think my Project X can solve a problem for more people than it's currently doing. I have a survey (N=1200) which says I'm currently solving a problem for 1% of the people impacted by Issue X. I think I can definitely make that number go up. Also, I really want that number to go up; 1% seems so paltry.

I might start with: Ok, how likely do I think it is that 1% could go up to 5%, 10%, or 20%?

But I think this is the wrong question to start with for me. I want to inform my intuitions about what is likely or probable, but this all feels super hypothetical. I know I'm going to want to say 20%, because I have a bunch of ideas and 20% is still low! The %s here feel too fuzzy to ground me in reality.

Alternative: Turn 1% of 1200 back into 12 people

This is 12 people who say they are positively impacted by Project X.

This helps me remember that no one is a statistic. (A post which may have inspired this idea to begin with). So, yay, 12 people!

But going from 1% to 5% still sounds unambitious and unsatisfying. I like ambitious, tenacious, hopeful goals when it comes to people getting the solutions they're looking for. That's the whole point of the project, after all. Sometimes, I can physically feel the stress over this tension. I want this number to be 100%! I want the problem solved-solved, not kinda-solved.

At this point, maybe I could remind myself or a student that "shoulding at the universe" is a recipe for frustration. I love that concept, and sometimes it works. But often, that's just another way of shoulding at myself. The fact remains that I don't want to be less ambitious about solving problems that I know are real problems for real people.

I try the percents-to-people technique again:

  • Turn 5% of 1200 back into 60 people. Oh. That's 48 additional people. Also notice: it's only 60 people if we reach 48 additional people while losing none of the original 12.
  • Turn 10% back into 120 people. 108 additional people, while losing 0.
  • Turn 20% back into 240 people. 228 additional people, while losing 0.
  • So, an increase to 5% versus 20% is the difference between 48 and 228 additional people reached. I know about this program because I work on it, and I know how much goes into Project X right now to reach 12 people. I'm sure there are things we could do differently, but are they different enough to reach 228+ additional people?
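The arithmetic in the bullets above can be sketched in a few lines of Python (a minimal sketch; the 1200-respondent survey and the 1% baseline are the numbers from the running example):

```python
# Turn survey percentages back into counts of people.
# Running example: N = 1200 survey respondents, 1% currently helped.
N = 1200
current = round(N * 0.01)  # 12 people reached today

for target_pct in (0.05, 0.10, 0.20):
    target = round(N * target_pct)
    additional = target - current
    print(f"{target_pct:.0%} of {N} = {target} people "
          f"({additional} additional, while losing 0)")
```

Running this prints the same 48 / 108 / 228 deltas as the bullets above, which is the whole trick: the percentages stop being fuzzy once they are counts of people again.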

Now this feels different. It's humbling. But it piques my curiosity again instead of my frustration: how would we attempt that? Could we?

  • What else do I need to know, to figure out if 60 or 120 or 240 (...or 1000, or 10000) is anywhere within the realm of possibilities for me?
  • Do I have a clear idea about what my bottlenecks or mistakes are in the status quo, such that I think there are 48 more people to reach (while still reaching the 12)? What processes would need to change, and how much?
  • This immediately brings up the response, "That depends on how long I have." (Woot, now I've just grokked why it's useful to time-bound examples for comparison's sake). We could call it 1 year, or 3, or 10, etc. I personally think 1-3 years is usually easier to conceptualize and operationalize.
  • Also, whatever I do next, it's obviously going to take notable effort. I know I can only do so much work in a day. (I probably hate this truth the most. This is definitely where I remind myself not to should at the universe). Now I wonder, is this definitely the program where I want to focus my effort for a while? Why? What if there are problems upstream of this one that I could put my effort toward instead? ...aha, now my understanding of why people care about cause prioritization just got deeper and more personally intuitive. This is a topic for another post.

To return to percentages, here's one more example. Percentages can also feel daunting instead of unambitious:

  • Going from 12 to 60 people is a 400% increase. (Right? I haven't miscalculated something basic? Yes, that's right; thank you, online calculators). 400%! Is that madness?
  • Turn '400% increase' back into 4 additional people reached, for every 1 person reached now.
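As a sketch, the same conversion works in the other direction too (using the example's jump from 12 to 60 people):

```python
# Express a change in people reached as a percentage increase, and back again.
before, after = 12, 60

# (60 - 12) / 12 * 100 = 400: a "400% increase"
pct_increase = (after - before) / before * 100

# ...which is 4 additional people for every 1 person reached now.
per_person = (after - before) / before

print(f"{before} -> {after} is a {pct_increase:.0f}% increase: "
      f"{per_person:.0f} more people for every 1 reached today")
```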

That may still be daunting. But it may be easier to make estimates or compare my intuitions about different action plans this way.

If you (or your students) are like me, this is a useful approach. It gets me into the headspace of imagining creative possibilities to solve problem X, while still grounding myself within some concrete parameters rather than losing myself to shoulding.

Comment by sky on sky's Shortform · 2020-09-01T22:33:53.969Z · score: 9 (6 votes) · EA · GW

Webinar tomorrow: exploring solutions journalism [for EA writers]:

If EA journalists and writers are planning to cover EA topics, I think a solutions journalism angle will usually be the most natural fit.

The Solutions Journalism Network "train[s] and connect[s] journalists to cover what’s missing in today’s news: how people are responding to problems."

The Solutions Journalism Network is having a webinar tomorrow:

Solutions journalism

  • "Can be character-driven, but focuses in-depth on a response to a problem and how the response works in meaningful detail

  • Focuses on effectiveness, not good intentions, presenting available evidence of results
  • Discusses the limitations of the approach
  • Seeks to provide insight that others can use"

This is still a less common media angle. The quality of coverage will still vary a lot depending on one's research, editorial input, etc., but this is a better fit than many other angles one could take to cover topics of interest to you in EA.

More info on this type of journalism:

Comment by sky on EAGxVirtual Unconference (Saturday, June 20th 2020) · 2020-06-10T22:24:56.535Z · score: 1 (1 votes) · EA · GW

Definitely, I think for many people, the donations example works. And I like the firefighter example too, especially if someone has had first responder experience or has been in an emergency.

I'm curious what happens if one starts with a toy problem that arises from, or feels directly applicable to, a true conundrum in the listener's own daily life, to illustrate that prioritizing between pressing problems is something we are always doing, because we are finite beings who often have pressing problems! When I started learning about EA via donation examples, I made the error of categorizing EA as only useful for special cases, such as when someone has 'extra' resources to donate. So, GiveWell sounded like a useful source of 'the right answer' on a narrow problem like finding recommended charities, which gave me a limited view of what EA was for and didn't grab me much. I came to EA via GiveWell rather than reading any of the philosophy, which probably would have helped me better understand the basis for what they were doing :).

When I was faced with real life trade-offs that I really did not want to make but knew that I must, and someone walked me through an EA analysis of it, EA suddenly seemed much more legible and useful to me.

Have you seen your students pick up on the prioritization ideas right away, or find it useful to use EA analysis on problems in their own life?

Comment by sky on EAGxVirtual Unconference (Saturday, June 20th 2020) · 2020-06-10T02:23:55.705Z · score: 8 (8 votes) · EA · GW

I'm excited about this! I actually came here to see if someone had already covered this or if I should ☺️. I'd love to see a teacher walk through this.

Here's an idea I'd been curious to try out when talking or teaching about EA, but haven't yet. I'd be curious if you've tried it or want to (very happy to see someone else take the idea off my hands). I think we often skim over a key idea too fast: that we each have finite resources, and so does humanity. That's what makes prioritization, and the willingness to name the trade-offs we're going to make, such an important tool. I know I personally nodded along at the idea of finite resources at first, but it's easy to carry along with the S1 sense that there will be more X somewhere that could resolve the hard trade-offs we don't want to make. I wonder if starting the conversation there would work better for many people than, e.g., starting with cost-effectiveness. Common-sense examples, like having limited hours in the day or a finite family budget and needing to choose between things that are all really important to you but don't all fit, make sense to many people, and starting with this familiar building block could be a better foundation for understanding or attempting their own EA analysis.

Comment by sky on Call notes with Johns Hopkins CHS · 2020-05-21T11:13:29.551Z · score: 1 (3 votes) · EA · GW

I also found this helpful -- appreciate it

Comment by sky on Racial Demographics at Longtermist Organizations · 2020-05-18T14:26:55.948Z · score: 1 (1 votes) · EA · GW

Thanks for adding that resource, Anon.

Comment by sky on Racial Demographics at Longtermist Organizations · 2020-05-04T14:58:58.997Z · score: 33 (15 votes) · EA · GW

Thanks for doing this analysis! My project plans for 2020 (at CEA) include more efforts to analyze and address the impacts of diversity efforts in EA.

I'd be interested in being in touch with the author if they're open to it, and with others who have ideas, questions, relevant analysis, plans, concerns, etc.

I'm hopeful that EAs, like the author and commenters here, can thoughtfully identify or develop effective diversity efforts. I think we can take wise actions that avoid common pitfalls, so that EA is strong and flexible enough as a field to be a good "home base" for highly altruistic, highly analytical people from many backgrounds. I'm looking forward to continued collaboration with y'all, if you'd like to be in touch:

Comment by sky on What posts do you want someone to write? · 2020-03-29T13:18:27.270Z · score: 3 (2 votes) · EA · GW

Posts on how people came to their values, how much individuals find themselves optimizing for certain values, and how EA analysis is/isn't relevant. Bonus points for resources for talking about this with other people.

I'd like to have more "Intro to EA" convos that start with, "When I'm prioritizing values like [X, Y, Z], I've found EA really helpful. It'd be less relevant if I valued [ABC] instead, and it seems less relevant in those times when I prioritize other things. What do you value? How/When do you want to prioritize that? How would you explore that?"

I think personal stories here would be illustrative.

Comment by sky on sky's Shortform · 2020-03-28T21:04:12.017Z · score: 2 (2 votes) · EA · GW

Should reducing partisanship be a higher priority cause area (for me)?

I think political polarization in the US produces a whole heap of really bad societal/policy outcomes and makes otherwise good policy outcomes ~impossible. It has always seemed relatively important to me, because when things go wrong in the US, they often have global consequences. I haven't put that many of my actual resources here though because it's a draining cause to work on and didn't feel that tractable. I also suspected myself of motivated reasoning: I get deep joy from inter-group cooperation and am very distressed by inter-group conflict.

Then I read things like the thread below and feel like not paying more attention to this is foolish, like I've gone too far in the other direction and underweighted the importance of this barrier to global coordination. I imagine others have written about similar questions and I would be interested in more thoughts.

Comment by sky on After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation · 2020-03-24T13:01:21.399Z · score: 1 (1 votes) · EA · GW

Hi Aidan, I'm really late to this thread, but found it interesting. If you don't mind coming back in time, could you clarify this:

"I think part of what might be driving the difference of opinion here is that the type of EAs that need a 45 minute chat are not the type of EAs that 80k meets."

I imagine this is true for a lot of EA org staff. It sounded from Howie's comment like it's probably less true for coaches at 80K, though, compared to other EA org staff.

Howie's comment:

"We try to make sure that we talk to the people we think we’re best placed to help with coaching in other ways too, for example some of our advice and many of the connections we can make are particularly valuable for people who don’t already have lots of current links to other effective altruists."

I find the network-constrained hypothesis interesting and am keen to explore it, so I think clarifying our models here seems useful.

Comment by sky on EA Survey 2019 Series: Community Demographics & Characteristics · 2020-03-03T14:51:12.985Z · score: 3 (2 votes) · EA · GW

I find myself navigating to this page a lot recently, thanks for publishing!

Quick UX request: could you update this post with links to subsequent posts in the series? I'm often hunting around trying to find various pieces of data, and links would be much easier for navigation than searching by title.

Comment by sky on The EA Hotel is now the Centre for Enabling EA Learning & Research (CEEALAR) · 2020-01-29T20:15:51.423Z · score: 17 (13 votes) · EA · GW

I think it's worth noting that the acronym for the Athena Center for EA Study is ACES! :)

Comment by sky on What to know before talking with journalists about EA · 2019-12-08T22:40:24.096Z · score: 4 (3 votes) · EA · GW

FYI: I've updated this post to show that we now have an email address for requests for media help:

Comment by sky on What to know before talking with journalists about EA · 2019-12-04T20:40:50.196Z · score: 15 (6 votes) · EA · GW

Thanks for adding this, Jonas. I just added a brief blurb that I think is related to this. (See the section about required skills, where I've added a note about being personable but willing to be "awkward"). These are the kinds of tips I'd usually discuss and rehearse with someone in an interview practice session. I notice this post is more about how to evaluate a media opportunity and self-assess readiness, rather than what to do during an actual interview. The latter is something I talk more about with people when we're rehearsing for a specific interview.

When rehearsing mock interviews with people, I've noticed that the point you raise is one of the things that most trips people up though, which I think is understandable.

If someone asks you, "Some people have said butter is blue. Do you think that's true?", it's almost a knee-jerk response to answer "Really? No, I don't think butter is blue. I believe butter is white or yellow, because....". The problem is that our natural instinct here works against us. "EAs 'don't think butter is blue'" is a much weirder and more intriguing quote than, "EAs 'think butter is white or yellow.'"

It takes practice to get out of this habit and ensure that the words you say consist only of words you want to appear in the article, without giving fodder to competing/distracting/inaccurate messages. (You might still be misrepresented or misunderstood even then, but this is one strategy to lower that risk). The advice of interview coaches is just what you said, Jonas: that you should start right in describing your actual beliefs, and not repeat the question.

It can look something like this:

Q: Some people have said butter is blue. Do you think that's true?

[Take a breath, smile, omit the first part of the response that comes into your head. Say,..]

A: Actually, I think butter is white or yellow. [or]

A: Actually, I don't think that's within my area of expertise.

[Pause. Let it be awkward if needed, wait for a new question]. [or]

A: Hm, no; what I do think is true is...[(possibly unrelated) point that you want to give a good quote about in order to communicate with your readers/viewers].

The last approach can feel especially awkward, but can be very effective in avoiding clickbait quotes and providing content you actually want to be quoted.

Comment by sky on What posts you are planning on writing? · 2019-11-12T04:05:24.334Z · score: 2 (2 votes) · EA · GW

I would personally find this very useful!

Comment by sky on What to know before talking with journalists about EA · 2019-09-05T16:28:43.695Z · score: 2 (2 votes) · EA · GW

Links are fixed, thanks for flagging! We have different versions of our domain name we can use for our email addresses but I agree that can look confusing, so they're updated too.

Comment by sky on Four practices where EAs ought to course-correct · 2019-08-05T16:59:53.658Z · score: 2 (2 votes) · EA · GW

Thanks, Gordon; I've fixed the sharing permissions so that this document is public.

Comment by sky on Four practices where EAs ought to course-correct · 2019-07-31T21:07:09.037Z · score: 28 (11 votes) · EA · GW

[Note: I’m a staff member at CEA]

I have been thinking a lot about this exact issue lately and agree. I think that as EA is becoming more well-known in some circles, it’s a good time to consider if — at a community level — EA might benefit from courting positive press coverage. I appreciate the concern about this. I also think that for those of us without media training (myself included), erring on the side of caution is wise, so being media-shy by default makes sense.

I think that whether or not the community as a whole or EA orgs should be more proactive about media coverage is a good question that we should spend time thinking about. The balance of risks and rewards there is an open question.

At an individual level though, I feel like I’ve gotten a lot of clarity recently on best practices and can give a solid recommendation that aligns with Gordon’s advice here.

For the past several months, I’ve sought to get a better handle on the media landscape, and I’ve been speaking with journalists, media advisors, and PR-type folks. Most experts I’ve spoken to (including journalists and former journalists) converge on this advice: For any individual community member or professional (in any movement, organization, etc), it is very unwise to accept media engagements unless you’ve had media training and practice.

I’m now of the mind that interview skills are skills like any other, which need to be learned and practiced. Some of us may find them easier to pick up or more enjoyable than others, but very few of us should expect to be good at interviews without preparation. Training, practice, and feedback can help someone figure out their skills and comfort level, and then make informed decisions if and when media inquiries come up.

To add on to Gordon’s good advice for those interested, here is a quick summary of what I’ve learned about the knowledge and skills required for media engagements:

  • General understanding of a journalist’s role, an interviewee’s role, and journalistic ethics (what they typically will and will not do; what you can and cannot ask or expect when participating in a story)
  • An understanding of the story’s particular angle and where you do or don’t fit
  • Researching the piece and the journalist’s credibility in advance, so that you can…
    • evaluate and choose opportunities where your ideas are more likely to be understood or represented accurately versus opportunities where you’re more likely to be misrepresented; and
    • predict the kinds of questions you’re likely to be asked so that you can practice meaningful responses. (Even simple questions like “what is EA?” can be surprisingly hard to answer briefly and well).
  • Conveying key ideas in a clear, succinct way so that the most important things you want to say are more likely to be what is reported
    • This includes the tricky business of predicting the ways in which certain ideas might be misunderstood by a variety of audiences and practicing how to convey points in a way that avoids such misunderstandings
  • Clearly understanding the scope of your own expertise and only speaking about related issues, while referring questions outside your expertise to others

I think having more community members with media training could be useful, but I also think only some people will find it worth their time to do the significant amount of preparation required.

This feels very timely, because several of us at CEA have recently been working on updating our resources for media engagement. In our Advice for talking with journalists guide, we go into more depth about some of the advice we've received. I’d be happy to have people’s feedback on this resource!

Comment by sky on There are *a bajillion* jobs working on plant-based foods right now · 2019-07-18T04:13:35.294Z · score: 1 (1 votes) · EA · GW

I really like the broad range of skills presumably required for this list of jobs -- seems worth looking into further.