Increasing Demandingness in EA
post by Jeff Kaufman (Jeff_Kaufman)
In thinking about what it means to lead a good life, people often
struggle with the question of how much is enough: how much does our
morality demand of us? People have given a wide range of answers to
this question, but effective altruism has historically
used "giving 10%".
Yes, it's better if you donate a larger fraction, switch to a job
where you can earn more, or change careers to work directly on
important problems, but if you're giving 10% to effective
charity you're doing your share, you've met the bar to consider
yourself an EA, and we're happy to have you on board.
I say "historically", because it feels like this is changing; I
think EAs would generally still agree with my paragraph above, but
while in 2014 it would have been uncontroversial, now I think some
would disagree and others would have to think for a while.
EA started out as a funding-constrained movement. Whether you looked
at global poverty, existential risk, animal advocacy, or movement
building, many excellent people were working as volunteers or well
below what they could earn because there just wasn't the money to
offer competitive pay. Every year GiveWell's total room
for more funding was a multiple of their money moved. In this
environment, the importance of donations was clear.
EA has been pretty successful in raising money, however, and the
primary constraint has shifted from money to people. In 2015, 80k made
a strong case for focusing on what people can do directly, not
mediated by donations, and this case is even
stronger today. Personally, I've found this pretty convincing, though in 2017 I
decided to return to
earning to give because it still seemed like the best fit for me.
What this means, however, is that we are now trying to build a
different sort of movement than we were ten years ago. While people
who've dedicated their careers toward the most critical things
have made up the core of the movement all along, the ratio of impact
between their direct work and a typical pledger's donations has grown
enormously.
Imagine you have a group of people donating 10% to the typical mix of
EA causes. You are given the option to convince one of them to start
working on one of 80k's priority areas,
but in doing so N others will get discouraged and stop donating. This
is a bit of a false dilemma, since ideally these would not be in
conflict, but let's stick with this for a bit because I think it is
illustrative. In 2012 I would have put a pretty low number for N,
perhaps ~3, partly because we were low on money, but also because we
were starting a movement. In 2015 I would have put N at ~30: a factor
of 6 because of the difference between 10% and the most that people in
typical earning to give roles can generally donate (~60%) and a factor
of 5 because of the considerations in Why
you should focus more on talent gaps, not funding gaps. With the
large recent increases in EA-influenced spending I'd roughly put N at
~300, though I'd be interested in better estimates.
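The 2015 estimate above is just the product of the two factors. A minimal sketch of that arithmetic, using the rough figures stated in this paragraph (these are the post's back-of-the-envelope guesses, not measured values):

```python
# Rough 2015 estimate of N from the two factors stated above.
pledge_percent = 10   # the "10%" baseline donation
etg_percent = 60      # ~60%: most a typical earning-to-give role can donate
talent_gap_factor = 5 # rough multiplier from the talent-gaps considerations

n_2015 = (etg_percent / pledge_percent) * talent_gap_factor
print(n_2015)  # 30.0
```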
Unfortunately, a norm of "10% and you're doing your part" combines
very poorly with the reality of 100% of someone's career having ~300x
more impact than 10%. This makes EA feel much more demanding than
it used to: instead of saying "look at the impact you can have by
donating 10%", we're now generally saying "look at the impact you can
have by building your entire career around work on an important
problem".
(This has not applied evenly. People who were already planning to
make EA central to their career are generally experiencing EA as less
demanding: pay in EA organizations has gone up, there is less stress
around fundraising, and there is less of a focus on frugality or other
forms of personal sacrifice. In some cases these changes mean that if
someone does decide to shift their career it is less of a sacrifice
than it would've been, though that does depend on how the field you
enter is funded.)
While not everyone is motivated by a sense that they should be doing
their part (see: excited
vs. obligatory altruism), I do think this is a major motivation
for many people. Figuring out how to encourage people who would
thrive in an EA-motivated career to go in that direction without
discouraging and losing people for whom that would be too large a
sacrifice seems really important, and I don't see how to solve it.
Inspired by conversations with Alex Gordon-Brown, Denise Melchin, and others.
I expect people working in EA movement building have estimates of
(a) the value of a GWWC
pledge and (b) the value of a similar person going into an 80k
priority area, and N is essentially the ratio of these. I did a
small amount of looking, however, and didn't see public estimates. I
guessed ~$10k/y for (a) and ~$3M/y for (b), giving N=~300. Part of
why I have (b) this high is that I think it's now difficult to turn
additional money into good work on the most important areas. If you
would give a much higher number for (b), my guess is that you are
imagining someone much stronger than the typical person donating 10%.
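The footnote's arithmetic, sketched with the author's guessed figures (these are stated rough guesses, not measured values):

```python
# N from the footnote's guessed figures.
value_pledge_per_year = 10_000     # (a) ~$10k/y for a GWWC pledge
value_direct_per_year = 3_000_000  # (b) ~$3M/y for a similar person in an 80k priority area

N = value_direct_per_year / value_pledge_per_year
print(N)  # 300.0
```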
Comments sorted by top scores.
comment by lexande ·
2022-04-29T23:15:46.268Z
the primary constraint has shifted from money to people
This seems like an incorrect or at best misleading description of the situation. EA plausibly now has more money than it knows what to do with (at least if you want to do better than GiveDirectly) but it also has more people than it knows what to do with. Exactly what the primary constraint is now is hard to know confidently or summarise succinctly, but it's pretty clearly neither of those. (80k discusses some of the issues with a "people-constrained" framing here.) In general large-scale problems that can be solved by just throwing money or throwing people at them are the exception and not the rule.
For some cause areas the constraint is plausibly direct workers with some particular set of capabilities. But even most people who want to dedicate their careers to EA could not become effective e.g. AI safety researchers no matter how hard they tried. Indeed merely trying may be negative impact in the typical case due to opportunity cost of interviewers' time etc (even if EV-positive given the information the applicant has). One of the nice things about money is that it basically can't hurt, and indeed arguments about the overhead of managing volunteer/unspecialised labour were part of how we wound up with the donation focus in the first place.
I think there is a large fraction of the population for whom donating remains the most good they can do, focusing on whatever problems are still constrained by money (GiveDirectly if nothing else) because the other problems are constrained by capabilities or resources which they don't personally have or control. The shift from donation focus to direct work focus isn't just increasing demandingness for these people, it's telling them they can't meaningfully contribute at all. Of course inasmuch as it's true that a particular direct work job is more impactful than a very large amount of donations it's important to be open and honest about this so those who actually do have the required capabilities can make the right decisions and tradeoffs. But this is fundamentally in tension with building a functioning and supportive community, because people need to feel like their community won't abandon them if they turn out to be unable to get a direct work job (and this is especially true when a lot of the direct work in question is "hits-based" longshots where failure is the norm). I worry that even people who could potentially have extraordinarily high impact as direct workers might be put off by a community that doesn't seem like it would continue to value them if their direct work plans didn't pan out.
↑ comment by tylermaule ·
2022-04-30T00:01:30.493Z
I strongly agree with this comment, especially the last bit.
In line with the first two paragraphs, I think the primary constraint is plausibly founders [of orgs and mega-projects], rather than generically 'switching to direct work'.
↑ comment by lexande ·
2022-05-01T09:32:20.035Z
Maybe, though given the unilateralist's curse and other issues of the sort discussed by 80k here I think it might not be good for many people currently on the fence about whether to found EA orgs/megaprojects to do so. There might be a shortage of "good" orgs but that's not necessarily a problem you can solve by throwing founders at it.
It also often seems to me that orgs with the right focus already exist (and founding additional ones with the same focus would just duplicate effort) but are unable to scale up well, and so I suspect "management capacity" is a significant bottleneck for EA. But scaling up organizations is a fundamentally hard problem, and it's entirely normal for companies doing so to see huge decreases in efficiency (which if they're lucky are compensated for by economies of scale elsewhere).
comment by Michael Townsend (Michael_Townsend) ·
2022-04-29T03:03:02.610Z
I think this post does a great job of capturing something I've heard from quite a few people recently.
Especially for longtermist EAs, it seems direct work is substantially more valuable relative to donations than it was in the past, and I think your thought experiment about the number of GWWC pledges it'd make sense to trade for one person working on an 80k priority pathway is a reasonably clear way of illustrating that point.
But I think that this is a false dilemma (as you suggest it might be). This isn't just because I doubt that the pledge (or effective giving generally) and direct work are in tension, but because I think they're mutually supportive. Effective giving is a reasonably common way to enter the effective altruism community. Noticing that you can have an extraordinary impact with donations (which, even from a longtermist perspective, I still think you can) can inspire people to begin taking action to improve the world, and potentially continue on to working directly. I think historically it's been a pretty common first step, and though I anticipate more direct efforts to recruit highly engaged EAs to become relatively more prominent in future, I still expect the path from effective giving --> priority path career, to continue much more often than effective giving --> someone not taking a priority path.
I've heard a lot of conflicting views on whether the above is right; it seems quite a few people disagree with me, and think there's much more of a tension here than I do, and I'd be interested to hear why. (For disclosure, I work at GWWC and personally see getting more people into EA as one of the main ways GWWC can be impactful.)
I suppose the upshot of this, if I'm right, is that the norm that "10% and you're doing your part" can continue, and it's not so obvious that it's in tension with the fact that doing direct work may be many times more impactful. While it may be uncomfortable that there are significant differences in the impactfulness of members of the community, I think this is/was/always will be the case.
Another thing worth adding is that I think there's also room for multiple norms on what counts as "doing your part". For example, I think you should also be commended and feel like you've done your part if you apply to several priority paths, even if you don't get one / it doesn't work out for whatever reason. Maybe Holden's suggestion of trying to get kick-ass at something, while being on standby to use your skill for good, could be another.
By way of conclusion, I feel like what I've written above might seem dismissive of the general issue that EA has yet to figure out, given the new landscape, how to think about demandingness. But I really think there is something to work out here, and I'm grateful to this post for raising it quite explicitly as an issue.
↑ comment by ChanaMessinger ·
2022-04-30T08:21:05.849Z
"I still expect the path from effective giving --> priority path career, to continue much more often than effective giving --> someone not taking a priority path."
I parsed this as: over 50% of people who do effective giving or take the GWWC pledge or similar go on to (or you predict will go on to) do full-time impact work. Is that what was intended?
↑ comment by Rebecca (bec_hawk) ·
2022-04-30T19:33:16.924Z
I interpreted the arrows to be causal and not just temporal. So effective giving is more often going to cause people to work in a priority path than it will cause people to not work in a priority path where they otherwise would.
↑ comment by Michael Townsend (Michael_Townsend) ·
2022-05-01T22:30:42.470Z
What Bec Hawk said is right: my claim is that the number of people effective giving causes to go into direct work will be greater than the number of people it causes to not go into direct work (who otherwise would).
For what it's worth, I don't think >50% of people who take the GWWC pledge will go on to do direct work.
comment by mic (michaelchen) ·
2022-04-29T04:21:21.100Z
Is it possible to have a 10% version of pursuing a high-impact career? Instead of donating 10% of your income, you would donate a couple of hours a week to high-impact volunteering. I've listed a couple of opportunities here. In my opinion, many of these would count as a high-impact career if you did them full-time.
- Organizing a local EA group
- Or in-person/remote volunteering for a university EA group, to help with managing Airtable, handling operations, designing events, facilitating discussions, etc. Although I don't know that any local EA groups currently accept remote volunteers, from my experience with running EA at Georgia Tech, I know we'd really benefit from one!
- If you're quite knowledgeable about EA/longtermism and like talking to people about EA, being something like an EA Guides Program mentor could be a great option. One-on-one chats can be quite helpful for enabling people to develop better plans for making an impact throughout their life. I don't know whether the Global Challenges Project is looking for more mentors for its EA Guides Program at this time, but it would be valuable if it had greater capacity.
- Facilitating for EA programs that are constrained by the number of (good) facilitators. In Q1 2022, this included the AGI Safety Fundamentals technical alignment and governance tracks. (Edit: EA Virtual Programs is also constrained by the number of facilitators.)
- Signing up as a personal assistant for Pineapple Operations (assuming this is constrained by the number of PAs, though I have no idea whether it is)
- Phone banking for Carrick Flynn's campaign (though this opportunity is only available through May 17)
- Gaining experience that would be helpful for pursuing a high-impact career (e.g., by taking a MOOC on deep learning to test your fit for machine learning work for AI safety)
- Distilling AI safety articles
- Volunteering for Apart Research's AI safety or meta AI safety projects
- Volunteering for projects from Impact CoLabs, perhaps
- Running a workplace EA group, especially if you're able to foster discussion about working on pressing problems
Part-time volunteering might not provide as much of an opportunity to build unique skills, compared to working full-time on direct work, but I think it could still be pretty valuable depending on what you do.
In a way, sacrificing your time might be more demanding than sacrificing your excess income. But volunteering can help you feel more connected to the community and be more fulfilling than just donating money as an individual. It might not even be a sacrifice: for some opportunities, you could get paid, either directly (as in the case of Pineapple Operations) or through applying to the EA Infrastructure Fund or Long-Term Future Fund.
↑ comment by Mark Xu ·
2022-04-29T11:40:52.859Z
I expect 10 people donating 10% of their time to be less effective than 1 person using 100% of their time because you don't get to reap the benefits of learning for the 10% people. Example: if people work for 40 years, then 10 people donating 10% of their time gives you 10 years with 0 experience, 10 with 1 year, 10 with 2 years, and 10 with 3 years; however, if someone is doing EA work full-time, you get 1 year with 0 exp, 1 with 1, 1 with 2, etc. I expect 1 year with 20 years of experience to plausibly be as good/useful as 10 with 3 years of experience. Caveats to the simple model:
- labor-years might be more valuable during the present
- if you're volunteering for a thing that is similar to what you spend the other 90% of your time doing, then you still get better at the thing you're volunteering for
I make a similar argument here.
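This simple model can be sketched numerically. The value-of-experience function below (square root of accumulated years) is an assumed illustrative choice, not something from the comment; the exact ratio depends heavily on that assumption.

```python
import math

def career_value(person_years_by_experience):
    """Sum value over person-years, where a year's value grows
    concavely with the experience already accumulated (assumed sqrt)."""
    return sum(math.sqrt(exp + 1) for exp in person_years_by_experience)

# 10 people each giving 10% of a 40-year career: 4 person-years each,
# so ten person-years at each experience level 0..3.
part_time = [exp for exp in range(4) for _ in range(10)]

# 1 person working full-time for 40 years: one year at each level 0..39.
full_time = list(range(40))

print(career_value(part_time))
print(career_value(full_time))
```

Under this concave assumption the full-timer's 40 years come out worth nearly three times the forty part-time person-years; a steeper experience curve (as the "1 year with 20 years of experience" intuition suggests) would widen the gap further.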
Replies from: Linch, michaelchen
↑ comment by Linch ·
2022-04-29T16:03:51.655Z
I expect 10 people donating 10% of their time to be less effective than 1 person using 100% of their time because you don't get to reap the benefits of learning for the 10% people [emphasis mine]
"benefits of learning" doesn't feel like the only reason, or even the primary reason, why I expect full-time EA work to be much more impactful than part-time EA work, controlling for individual factors. To me, network/coordination costs seem much higher. E.g. it's very hard to manage a team of volunteer researchers or run an org where people volunteer 4h/week on average, and presumably less consistently.
↑ comment by mic (michaelchen) ·
2022-04-30T14:31:47.483Z
My bad, I meant to write "Part-time volunteering might not provide as much of an opportunity to build unique skills, compared to working full-time on direct work". Fixed.
↑ comment by martin_glusker ·
2022-04-30T14:51:26.184Z
I think in most cases, this doesn't look like using 10% of your time, but rather trading off an optimally effective career for a less effective one that improves along selfish dimensions such as salary, location, work/life balance, personal engagement, etc.
This picture is complicated by the fact that many of these characteristics are not independent of effectiveness, so it isn't clean. Personal fit for a career is a good example of this because it's both selfish and you'll be better at your job if you find a career with relatively better fit.
comment by Will Bradshaw (willbradshaw) ·
2022-04-29T14:07:06.340Z
(This has not applied evenly. People who were already planning to make EA central to their career are generally experiencing EA as less demanding: pay in EA organizations has gone up, there is less stress around fundraising, and there is less of a focus on frugality or other forms of personal sacrifice. In some cases these changes mean that if someone does decide to shift their career it is less of a sacrifice than it would've been, though that does depend on how the field you enter is funded.)
Thanks, I found this discussion of the ways in which EA is now more vs less demanding quite clarifying. I appreciate the point that for some people EA is much less demanding than it used to be, while for others it's much more so.
comment by Patrick Gruban (gruban) ·
2022-04-29T14:48:17.877Z
Thank you for this post that touches on the important point of demandingness. Personally, I can see it in two ways.
On a global level, giving 10% to effective causes is relatively rare. Giving What We Can has grown impressively, but still fewer than 1 in every 50,000 of the world's high-income population have taken the pledge. 10% is also well above average donation levels, which are below 2% of GDP. Even in the EA survey, only a third of respondents said they donate at least this amount. While some of the top areas in EA seem less funding-constrained, there is still much room for spending until, for example, GiveDirectly can't give away any more money. In that sense, I'm very grateful to anyone who is able and willing to commit to giving 10% or more of their income, and would not want to exclude them from seeing themselves as Effective Altruists. If we've funded everything that is equivalent to GiveDirectly's impact, or we have at least 50 million people donating 10+%, then I'd revisit this, but currently there is still enough to do.
On a personal level, the concept of demandingness has no limit. 10% is just a Schelling point, something that is easy to communicate for people new to the movement, a goal to be reached. Doing good better doesn't stop there and it doesn't stop at thinking about donations. I like the framing of excited altruism better, or altruism as a central purpose. Another framing could be that of aiming higher: continuously stretching for ways to have more impact while taking care of oneself. Each of these framings will have its supporters and I would encourage anyone to select the one that motivates them best. At the same time, the community and its support structure are very important to keep people healthy and motivated when they feel they are failing at their self-set goals.
comment by tylermaule ·
2022-04-29T23:19:08.454Z
Re the footnote: the only public estimate I've seen is $400k-$4M here, so you're in the same ballpark.
Personally I think $3M/y is too high, though I too would like to see more opinions and discussion on this topic.
comment by james++ ·
2022-04-29T22:24:28.806Z
Thank you, I needed to hear this stated clearly. The trend you point at closely tracks my own anxiety over time around having enough impact.
The relatively huge value of directing your whole career at EA is something that hasn't fully sunk in for me intuitively, and I expect the same for others who don't work at EA orgs.
comment by Vilfredo's Ghost (Bluefalcon) ·
2022-04-30T04:12:11.764Z
Nice little one-two punch here with you expressing a desire to increase demandingness and Julia telling ppl it's ok to leave EA. Was that planned?
↑ comment by Jeff Kaufman (Jeff_Kaufman) ·
2022-04-30T07:19:06.764Z
I'm not advocating increasing demandingness; I'm observing it. I'm not sure whether I think the increase is net good (encouraging people to do directly valuable things) or bad (too much pressure, losing people who would be happy contributing at 10% if that was more clearly valued).
And no, this post wasn't coordinated with Julia. I sent a draft to her (along with several other people) before publishing, but she didn't give any feedback (and may not have read it). In general I don't coordinate with Julia in writing about EA.