“EA” doesn’t have a talent gap. Different causes have different gaps.

post by katherinesavoie · 2018-05-20T22:07:53.761Z · EA · GW · Legacy · 18 comments

Contents

  Poverty
  Animal rights
  Artificial intelligence
  Poverty talent gap
  Poverty money gap
  Animal rights talent gap
  Animal rights money gap
  Artificial intelligence talent gap
  Artificial intelligence money gap
  Meta organizations talent gap
  Meta organizations money gap

There’s been a lot of discussion and disagreement over whether EA has a talent or a money gap. Some people have been saying there’s not that large of a funding gap anymore and that people should be using their talent directly instead. On the other hand, others have been saying that there definitely still is a funding gap. 

I think both parties are right, and the reason for the misunderstanding is that we have been referring to the entire EA movement instead of breaking it down by cause area. In this blog post I do so and demonstrate why we’re like the blind men touching different parts of the elephant, and how if we put all of it together, we’ll be able to make much better decisions.

- Poverty

- Animal rights

- Artificial intelligence

- Meta organizations (those that fall outside of the above areas)

I am not extremely confident in all of these numbers (particularly the size of the AI talent gap), but I am confident in the broader claim that the gaps differ between cause areas, and that we would all benefit from making that distinction in public discourse. I am happy to update these estimates as people make good arguments in the comments. Below I go into further detail on how I arrived at them.

Poverty talent gap

In my experience, poverty organizations generally hire outside of the EA movement for many roles. There are still small gaps for some poverty organizations hiring management and leadership roles from the EA pool (~4), and some gaps in operational talent (~2). Part of the gap also comes from the possibility of founding more effective poverty charities (~4), such as a tobacco taxation or conditional cash transfer charity, as has been done with Charity Science Health and Fortify Health.

Poverty money gap

The gap for money in poverty is huge, even when only looking at charities significantly stronger than GiveDirectly, whose own gap is very large and arguably virtually unlimited. The gap is close to $100 million after Good Ventures funds its portion. There is also reason to expect this gap to grow, given recent changes in Good Ventures’ funding plans and a strong group of incubation charities in GiveWell’s pipeline. The gap only grows if you think there are strong opportunities in poverty outside of GiveWell’s list. Assuming each person donates 50% of a $100,000 salary, it would easily take 1,720 people earning to give to fill this gap. And that is not even including new GiveWell-incubated or recommended charities!
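As a rough sketch of the arithmetic behind the 1,720 figure (the ~$86 million gap is my assumption, back-calculated from that figure; it also matches the summary number quoted in the comments below):

```python
# Back-of-the-envelope arithmetic for the earning-to-give estimate above.
# Assumed inputs: a funding gap of roughly $86 million, and each earner
# donating 50% of a $100,000 salary.
funding_gap = 86_000_000                   # approximate poverty funding gap, USD
salary = 100_000                           # assumed annual salary, USD
donation_rate = 0.5                        # fraction of salary donated
annual_donation = salary * donation_rate   # $50,000 per person per year

people_needed = funding_gap / annual_donation
print(int(people_needed))  # 1720
```

Note this treats the gap as a one-year figure; if the gap recurs annually, the same 1,720 people would need to keep earning to give each year.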

Animal rights talent gap

The talent gap for animal rights is very large. Many AR organizations are hiring and trying to grow as fast as possible. There is also considerable scope for entrepreneurship and founding new and effective animal rights organizations. The animal rights community as a whole is very small and the number of EAs in the movement is even more limited. 

Animal rights money gap

Historically, animal rights has been chronically hampered by insufficient funding across the movement. However, the entrance of Open Phil to the area has created a very different situation, and I now categorize the funding gaps as mixed. Funding is fairly centralized: Open Phil’s animal grantmaking and the AR Funds are run by the same person (Lewis), and together they control nearly 50% of all funding in AR. If you strongly agree with Lewis about the priorities in the area, I would say the funding gap is small. However, if you have very different views, then the funding gap could be seen as large.

Artificial intelligence talent gap

The talent gap for artificial intelligence is middling, with many organizations in the field in need of researchers, as well as some gaps at meta-organizations focused on meta-research. There are also significant gaps in operational talent to help support the structures of these organizations.

Artificial intelligence money gap

The money gap for AI organizations seems very small, with even large funders being turned away from many projects. Many organizations have very large amounts of funding, and given the recent surge in publicity, AI, much like animal rights, went from being chronically underfunded to well funded in almost all areas. Furthermore, due to the fairly wide spread of funders, even people with more unusual perspectives on AI will find it hard to identify good gaps.

Meta organizations talent gap

Importantly in this section, I mostly consider meta organizations that do not fall under another cause area. For example, ACE would fall under animal rights, not under meta. The talent gap for these organizations generally seems small, with some posted roles in leadership (~7), operations (~3), research (~3) and other general roles (~3) across organizations. There seems to be some scope for founding new charities as well (~4). 

Meta organizations money gap

Much like animal rights, there is a lot of centralization of funding, with a handful of funders controlling a very high percentage of the total. As in animal rights, one person both runs the EA Fund for meta organizations and is the lead investigator for Open Phil. Thus I think an EA’s perspective on the funding gaps will largely depend on how well their views align with Nick Beckstead’s. The gap can range from very small to moderate (low millions) depending on how broadly you define meta organizations.

Overall, as you can see, the talent and money gaps vary greatly depending on the cause. If you think poverty is the highest-impact area, earning to give is a very good choice. On the other hand, if you think animal rights is the best, figuring out how to best give your talents might be a better way forward. If you agree with Lewis, that is. Regardless of which cause you think is highest priority and what you think the gaps truly are, breaking them down by cause area will help everybody make better decisions.

 

18 comments

Comments sorted by top scores.

comment by Denise_Melchin · 2018-05-20T23:42:00.190Z · EA(p) · GW(p)

Thanks for trying to get a clearer handle on this issue by splitting it up by cause area.

One gripe I have with this debate is the focus on EA orgs. Effective Altruism is or should be about doing the most good. Organisations which are explicitly labelled Effective Altruist are only a small part of that. Claiming that EA is now more talent constrained than funding constrained implicitly refers to Effective Altruist orgs being more talent than funding constrained.

Whether 'doing the most good' in the world is more talent than funding constrained is much harder to prove but is the actually important question.

If we focus the debate on EA orgs and our general vision as a movement on orgs that are labelled EA, the EA Community runs the risk of overlooking efforts and opportunities which aren't branded EA.

Of course fixing global poverty takes more than ten people working on the problem. Filling the funding gap for GiveWell recommended charities won't be enough to fix it either. Using EA branded framing isn't special to you - but it can make us lose track of the bigger picture of all the problems that still need to be solved, and all the funding that is still needed for that.

If you want to focus on fixing global poverty, just because EA focuses on GW recommended charities doesn't mean EtG is the best approach - how about training to be a development economist instead? The world still needs more than ten additional ones of that. (Edit: But it is not obvious to me whether global poverty as a whole is more talent or funding constrained - you'd need to poll leading people who actually work in the field, e.g. leading development economists or development professors.)

comment by Robert_Wiblin · 2018-05-21T15:01:54.294Z · EA(p) · GW(p)

"Claiming that EA is now more talent constrained than funding constrained implicitly refers to Effective Altruist orgs being more talent than funding constrained."

It would be true if that were what was meant, but the speaker might also mean that 'anything which existing EA donors like Open Phil can be convinced to fund' will also be(come) talent constrained.

Inasmuch as there are lots of big EA donors willing to change where they give, activities that aren't branded as EA may still be latently talent constrained, if they can be identified.

The speaker might also think activities branded as EA are more effective than the alternatives, in which case the money/talent balance within those activities will be particularly important.

comment by MichaelPlant · 2018-05-21T10:12:54.411Z · EA(p) · GW(p)

One gripe I have with this debate is the focus on EA orgs

I think this is a bit unfair. I took the OP to be referring to the previous discussion of this by 80k, which was specifically about EA orgs.

comment by [deleted] · 2018-06-11T13:47:35.682Z · EA(p) · GW(p)

I had a similar reaction.

It was the choice of "Money gap - Large (~$86 million)" in the summary that got me. It just seems immediately odd that if you think that earning to give to some global poverty charities is on a par with other common EA career choices in terms of marginal impact (i.e. assuming you think "poverty" should be on the table at all for us), the size of this funding gap is the equivalent of ~$0.086 per person for the bottom billion. And in fact the linked post gives a funding gap of something more like $400 million for GiveWell's top charities alone (on top of expected funding from Good Ventures and donors who aren't influenced by GiveWell), with GiveDirectly able to absorb "over 100 million dollars". But it's not so odd if you think that the expected value of donating to GiveWell-recommended charities is several orders of magnitude greater than for the average global poverty charity. I'm aware that heavy-tailed distributions are probably at play here, but I'm very skeptical that GiveWell has found anywhere near the end of that tail (although I think they're the best we have).

Regardless of what the author meant, I think I see this kind of thinking in EA fairly regularly, and it's encouraged by giving the "neglectedness" criterion such prominence, perhaps unduly.

And yes, I also want to thank the author for encouraging people to think and talk about this in a more nuanced way.

comment by RandomEA · 2018-05-21T00:28:43.842Z · EA(p) · GW(p)

Here's what Lewis Bollard had to say about the talent vs. funding issue when asked about it on the 80,000 Hours podcast (in September 2017):

Robert Wiblin: My impression is that animal welfare organisations, at least the ones that I’m aware of that are associated with Effective Altruism, are often among the most funding constrained. They often feel like they’re most limited by access to money. Does this suggest that people who are concerned with animal welfare should be more inclined to do earning to give and, perhaps, rather than work in the area, instead make money and give it away?

Lewis Bollard: I don’t think so. I think that was true until two years ago, or until eighteen months ago when we started grantmaking in this field. I think the situation has dramatically improved in terms of funding, largely because of Open Phil entering this field, but also because there are a number of other very generous donors who’ve either entered the field or significantly increased their giving in the last two years.

Right now I think there is a bigger talent gap than financial gap for farm animal welfare groups. That’s not to say it will always be that way, and for someone whose aptitude or inclination is heavily toward earning to give, it could still well make sense. If someone has great quantitative skills and enjoys working at a hedge fund, then I would say earn to give. That could still be a really powerful path, and we will need more and more funders over time to continue scaling up the movement. But all things equal, I would encourage someone to focus more on the talent piece now, because I do think that things have really flipped in the last few years, and I’m pretty optimistic that the funding will continue to grow in this space for animal welfare.

Robert Wiblin: What makes you confident about that? You don’t expect to be fired in the next few years?

Lewis Bollard: First, I hope I won’t be fired, but I think there’s a deep commitment from the Open Philanthropy Project to continue strong funding in this space, to continue funding on at least the level we’re funding currently and hopefully more.

I’ve also just seen a number of new large-ish funders coming online. Just in the last two years I would say the number of funders giving more than two hundred thousand dollars a year has doubled, and I’ve started to see real interest from some other major potential funders.

I think it’s natural that, as this issue has gained public prominence, a lot of potential donors, people who have great wealth, have realised that this is something important and something where they can make a great difference.

comment by jayquigley · 2018-05-23T17:14:57.973Z · EA(p) · GW(p)

For the animal advocacy space, my anecdata suggest that the talent gap is in large part a product of funding constraints. Most animal charities pay rather poorly, even compared to other nonprofits.

comment by Benjamin_Todd · 2018-05-22T03:28:59.391Z · EA(p) · GW(p)

Yes, each cause has different relative needs.

It's also more precise, and often clearer, to talk about particular types of talent rather than "talent" as a whole: e.g. the AI safety space is highly constrained by people with deep expertise in machine learning, while global poverty isn't.

However, when we say "the landscape seems more talent constrained than funding constrained", what we typically mean is that, given our view of cause priorities, EA-aligned people can generally have a greater impact through direct work than earning to give, and I still think that's the case.

comment by ishaan · 2019-05-22T03:08:53.348Z · EA(p) · GW(p)

In 2015 you (Benjamin) wrote a post which, if I'm reading it right, aspires to answer the same question, but is in very direct contradiction with the conclusions of your (Katherine's) post regarding which causes are relatively talent constrained. I would be interested in hearing about the sources of this disagreement from both of you (Assuming it is a disagreement, and not just the fact that time has passed and things have changed, or an issue of metrics or semantics)

here is the relevant excerpt

https://80000hours.org/2015/11/why-you-should-focus-more-on-talent-gaps-not-funding-gaps/

...Most of the causes the effective altruism community supports are more talent constrained than funding constrained. For example (in all of the following, I’ve already taken account of replaceability): 1) International development... 2) Building the effective altruism community and priorities research... 3) AI safety research...
...The main exception to this – a cause supported by the community that seems more funding constrained than talent constrained – is ending factory farming. Jon Bockman of Animal Charity Evaluators, told me that vegan advocacy charities have lots of enthusiastic volunteers but not enough funds to hire them, meaning that funding is the greater bottleneck (unless you have the potential to be a leader and innovator in the movement). So, the more weight you put on this cause, the more funding constrained you’ll see the community. But the situation could reverse if you think developing meat substitutes is the best approach, because that could be pursued by for-profit companies or within academia.

It sounds like both of you (Katherine and Benjamin) agree that AI is "talent constrained". Pretty straightforward, it's hard to find sufficiently talented people with the specialized skills necessary.

It sounds like the two of you diverge on global poverty, for reasons that make sense to me.

Katherine's analysis, as I understand it, is straightforwardly looking at what GiveWell says the current global poverty funding gap is, which means that impact via talent basically relies on doing more good with the existing money, performing better than what is currently out there. (And how was your talent gap estimated? Is it just a count of the currently open positions on the EA job board?)

Benjamin's analysis, as I understand it, is that EA's growing financial influence means that more money is going to come in pretty soon, and also that effective altruists are pretty good at redirecting outside funds to their causes (so, if you build good talent infrastructure and rigorously demonstrate impact and a funding gap, funding will come)

Is this a correct summary of your respective arguments? I understand how two people might come to different conclusions here, given the differing methods of estimating and depending on what they thought about EA's ability to increase funding over time and close well demonstrated funding gaps.

(As an aside, Benjamin's post and accompanying documents made some predictions about the next few years - can anyone link me to a retrospective regarding how those predictions have born out?)

It sounds like you diverge on animal rights, for reasons I would like to understand

Benjamin, it sounds like you / Jon Bockman are saying that ending factory farming is exceptional among popular EA causes in having more talent than it can hire and being in sore need of funding.

Whereas Katherine, it sounds like you're saying that animal rights is particularly in need of talent relative to all the other cause areas you've mentioned here.

These seem like pretty diametrically opposed claims. Is this a real disagreement, or have I misread? I'm not actually sure what the source of this disagreement is, other than Katherine and Jon having different intuitions, or bird's-eye views of different parts of the landscape? Has Jon written more on this topic? If it's just a matter of two people's intuitions, it doesn't leave much room for evaluating either claim. (I get the sense that Katherine's claim isn't based on intuition, but on the fact that EA animal organizations are currently expanding, which increases the estimated number of open job postings available in the near future. Is that correct?)

(Motivation: I'm reading this post now as part of the CE incubation program's reading list, and felt surprised because the conclusions conflicted with my intuitions, some of which I think were originally formed by reading Benjamin's posts a few years ago. As the program aims to set me on a path which will potentially help me cause redirection of funding, redirection of talent, create room for more talent, and/or create room for more funding within global poverty or animal issues, the answers to these questions may be of practical value to me.)

I'd be happy if either of you could weigh in on this / explain the nature and sources of disagreement (if there is in fact a disagreement) a bit more!

(PS - can I tag two people to be notified by a comment? Or are people notified about everything that occurs within their threads?)

comment by ishaan · 2019-05-22T03:44:12.913Z · EA(p) · GW(p)

What are your thoughts on this? https://80000hours.org/2015/11/why-you-should-focus-more-on-talent-gaps-not-funding-gaps/

In particular

> a cause supported by the community that seems more funding constrained than talent constrained – is ending factory farming. Jon Bockman of Animal Charity Evaluators, told me that vegan advocacy charities have lots of enthusiastic volunteers but not enough funds to hire them, meaning that funding is the greater bottleneck (unless you have the potential to be a leader and innovator in the movement).

Please see also my reply to Benjamin Todd's comment for a longer version of this question, which I wanted to address to both of you, but I don't think this forum has user tagging functionality.

comment by Robert_Wiblin · 2018-05-21T14:55:42.091Z · EA(p) · GW(p)

Like you, at 80,000 Hours we view the relative impact of money vs talent to be specific to particular problems and potentially particular approaches too.

First you need to look for what activities you think are most impactful, and then see what your money can generate vs your time.

comment by RandomEA · 2018-05-21T19:26:51.750Z · EA(p) · GW(p)

First you need to look for what activities you think are most impactful, and then see what your money can generate vs your time.

This statement could be interpreted as suggesting that people should use a two-step process: first, choose a problem based on how pressing it is and then second, decide how to contribute to solving that problem.* That two-step approach would be a bad idea because some people may be able to make a greater impact working on a less pressing problem if they are especially effective at addressing that problem. Because of this, information about how pressing different problems are relative to each other should not be used to choose a single problem; instead, it should be used as background information when comparing careers across problems.

*I doubt that's what you actually meant since you wrote the linked article that discusses personal fit. But I figured some people might be unfamiliar with that article, so I thought it'd be worthwhile to note the issue.

comment by Robert_Wiblin · 2018-05-21T22:46:46.617Z · EA(p) · GW(p)

Yes - the reason you need to look at a bunch of activities rather than just one activity, is that your personal fit, both in general, and between earning vs direct work, could materially reorder them.

comment by Lukas_Gloor · 2018-05-21T11:03:54.279Z · EA(p) · GW(p)

Talent gap - Middle (~50 people)

If the AI safety/alignment community is altogether around 50 people, that's a large relative gap. Depending on how you count, it might be bigger than 50 people, but the talent gap seems large in relative terms either way. :)

comment by Prabhat Soni · 2020-08-04T03:26:03.089Z · EA(p) · GW(p)

Thanks for this post, it was very insightful. Do you have any ideas on the talent/funding gap scenario for other EA cause areas like global priorities research (I believe this doesn't come under meta EA), biosecurity, nuclear security, improving institutional decision making, etc?

comment by [deleted] · 2018-06-11T12:59:14.017Z · EA(p) · GW(p)

The funding is fairly centralized between Open Phil and the AR Funds being run by the same person (Lewis), which controls nearly 50% of all funding in AR.

If this is true, I just want to take a moment to celebrate that the EA movement has more or less doubled animal rights funding globally. That's awesome!

comment by Denkenberger · 2018-05-23T16:14:09.786Z · EA(p) · GW(p)

This is very helpful. I would note that the Global Catastrophic Risk Institute does AI and is funding constrained. Of course it also does other X risk work, but I think it would be good to broaden your category to include this or have a separate category.