Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-18T16:52:04.394Z · score: 3 (2 votes) · EA · GW

There have been lots of great comments about the EA labor market, thanks to everyone who has been engaging in this discussion! I’m going to be away from internet service for about a week, but once I’m back I’ll respond to discussion that’s happened in the interim. Thanks!

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-17T19:45:31.132Z · score: 3 (2 votes) · EA · GW

Thanks Howie. Good catch on the rent vs. own distinction, agree renting is the right reference point for junior hires. Seems like for the highest paying EA orgs, there may not be a wage gap for junior roles relative to comparable nonprofits (which I find pleasantly surprising), though presumably there’s still a large gap relative to the private sector.

Big picture, I still think EA salaries (adjusted for cost of living) are low enough that there will be talent shortages, especially for more senior roles. It doesn’t help that many EA jobs, and a very disproportionate number of the highest paying ones, are located in extremely high cost of living locations. Even if SF is “only” 20% more expensive than DC (and for senior roles I'd argue for a higher adjustment), DC is an expensive city too.

Anecdotally, I moved from SF ~5 years ago and cost of living was a major factor (though not the only one). My wife and I estimate that a large share of our Bay Area friends have moved elsewhere or plan to soon, and cost of living is almost always a huge factor. I heard a while back that the cost of renting a U-Haul was something like 3x higher to go from SF to Salt Lake City(?) than the reverse trip, because way more people want to move out of the Bay than want to move into it. Against that backdrop, I don’t think paying what comparable non-profits do is going to be sufficient to attract the talent pool we want. There are a lot of important EA orgs in the Bay, and I'd like them to be able to hire out of the pool of people who want to buy a house and aren't independently wealthy.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T22:24:13.815Z · score: 3 (2 votes) · EA · GW

FYI, I noticed that page also had some outdated info on TLYCS that only went through 2016. You can find updated numbers/charts in TLYCS's 2018 annual report.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T18:33:32.285Z · score: 1 (3 votes) · EA · GW
Using one industry I personally happen to know well as a comparison, I think entry level salaries for research analysts at these organisations tend to be equal to or higher than salaries for economics research assistants at places like the Federal Reserve or top think tanks in DC.

Does this account for cost-of-living differences? It costs ~75% more to live in SF than DC…

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T18:32:32.492Z · score: 3 (2 votes) · EA · GW

Hi Howie,

Thanks for clearing that up and updating the website!

At the most competitive EA orgs, entry level salaries in high-cost-of-living areas now typically range from ~$50k to ~$80k. The most competitive positions at those orgs typically pay at the high end of that range. That said, pay may vary outside of that range for specific positions and at other EA organisations. I'll update the page to clarify later today.

My sense is there’s quite a big gap between pay at a handful of “the most competitive EA orgs” and other EA orgs, and that there’s quite a lot of variation across orgs, causes, geographies, etc. Does 80K have a good handle on the size of these differences, and/or would it be open to getting more information via the annual talent survey as suggested in OP? (I’m glad to see 80K is open to adding questions to this survey, but as I mentioned elsewhere I think there are serious problems with the new question 80K has proposed.)

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T16:14:02.193Z · score: 2 (1 votes) · EA · GW

Well put Gregory, you nicely captured a lot of concerns I have about the "pay by (reported) necessity" model.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T01:35:52.911Z · score: 7 (4 votes) · EA · GW

As readers of the OP can probably guess, I think 80K is dramatically overstating the earning power of working at an EA organization by ranking it 3/5 stars.

Starting salaries in the US tend to be about $50k and in the UK salaries start at around £25k. We think that these salaries are roughly what you need to be happy, but compared with jobs in the private sector, it is harder to build up a savings buffer to withstand financial difficulties, and to make career transitions which require retraining. It can also be more difficult for people with children. That said, the organisations are often willing to pay more to get the right staff, especially if you have specific skills, like web engineering. Many are also happy to pay more to people who have dependents or student debts. At more senior levels, salaries range from $60,000 – $180,000 in the US.

This analysis doesn’t capture the fact that EA jobs (especially the best paying ones) are often in very high cost of living locations. In the Bay Area $50k to start doesn’t go very far, and $180k isn’t much of an upside for someone senior (especially if they have kids, debt, etc).

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-16T01:34:18.175Z · score: 4 (2 votes) · EA · GW
[Raising wages across the board] would cause all kinds of problems. It would worsen the already latent center/periphery divide in EA by increasing inequality, it would make it harder for new organisations to compete, it would reduce the net amount of people that we can employ, etc etc.
But I could be wrong, and I sense that some of my thoughts might be ideologically tainted. If you feel the urge to point me at some econ 101, please do.

I think you’re right that these problems would occur if a handful of orgs with the most money started raising salaries across the board in the current environment. But a commenter on FB summed up my econ 101 read on this perfectly (to reiterate, I'm not an economist): “If the community can't afford market rates maybe it's time to start admitting that the community is funding constrained.”

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-15T18:37:54.170Z · score: 5 (3 votes) · EA · GW

I'm arguing for point A.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-15T14:30:44.329Z · score: 7 (3 votes) · EA · GW
I'm not arguing against the idea that some people exist that should be given the $150k that is needed to unlock their talents. I'm arguing that this group of people might be very small, and concentrated in your bubble.
I think that's the crux of the argument. If a majority of senior people needed $150k to get by, I'd agree that that should be the wage you offer. If these people make up just 1% of the population (which seems true to me), offering $150k to everyone else is just going to cause a lot of subtle cultural damage.

Very well put. Agree this is the crux of our disagreement; my intuition is that there’s a much larger pool of people who would be enticed by the higher pay.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-15T14:28:17.338Z · score: 7 (3 votes) · EA · GW

Using NYC as an (admittedly US-centric and high cost of living) example, the average cost of private school is ~$18k/year, and many of the good ones are around $50k. So if you think of a couple that wants to have a couple of kids, doesn’t want to send them to a bad (possibly dangerous) public school, and would like to put those kids through college, it’s unlikely those people would even consider non-profit work unless they had unusual circumstances that would allow them to do so (e.g. one partner with particularly high earning power, a trust fund, etc.)

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T17:49:23.395Z · score: 4 (2 votes) · EA · GW

Great post, strong upvote.

Comment by jon_behar on EA Still Needs an Updated and Representative Introductory Guidebook · 2019-05-14T17:45:33.381Z · score: 9 (4 votes) · EA · GW

To confirm: TLYCS the organization is playing a critical role in the book project; without the organization there absolutely wouldn’t be an updated version. The org has been essential every step of the way (including working to purchase all the necessary rights since ~2014). There’s a ton of work involved and Peter is doing a lot of it, but we’re trying to take as much off his plate as possible including pretty much everything on the promotion and distribution side. There are a lot of skills needed to pull this off, and this model plays to everyone’s comparative advantage: Peter is great at thinking and writing, while the org is better suited to set the distribution strategy (Charlie Bresler, TLYCS’s executive director, ran the marketing department for a large company with an iconic ad campaign).

Our hope is that a lot of people and organizations throughout the EA community will be able to use the book as a way to have more impact, such as GiveWell distributing the book to their donor base, groups/individuals sharing the book with people first learning about EA, making the book available for download at effectivealtruism.org, EA Global, etc. And of course our recommended charities and other effective nonprofits are mentioned throughout the book, with links embedded in the ebook version to make it easier to convert.

This seems like a good time to mention: TLYCS is fundraising for this project, and you can make an earmarked donation here. There’s more background on the book project in TLYCS’s recently released Annual Report.

The EA Meta Fund has made a $10k grant for this project which we’re extremely grateful for, but this barely makes a dent in the barebones budget, let alone what we think we ought to invest in this project. We’d love to see other donors from the EA community get involved as well.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T17:10:49.036Z · score: 3 (2 votes) · EA · GW

Related question: what are ways that EA organizations can foster strong cultures that don’t involve low salaries?

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T17:09:58.770Z · score: 3 (2 votes) · EA · GW

Thanks for these thoughtful comments Gregory!

there's an adverse selection worry: low salaries may filter for dedication, but also lower-performers without better 'exit options'.

Hadn’t thought about this before, but agree it’s worrisome. Great point!

A lot has been written on trying to explain why EA orgs (including ones with a lot of resources) say they struggle to find the right people, whilst a lot of EA people say they really struggle to find work for an EA org. What I think may explain this mismatch is that the EA community can 'supply' lots of generally able and motivated people, whilst EA org demand skews more to those with particular specialised skills. Thus jobs looking for able generalists have lots of applicants, yet the 'person spec' for other desired positions has few or zero appointable candidates.

Agree with this explanation, and I think both demographics and low salaries contribute. One might frame the problem as: EA needs diverse skillsets, but the EA community is not diverse enough to have all those skillsets and the low pay filters out mission aligned people from outside the community.

Re: the relative glut of generalists, I came across an amazing stat when researching this post. In 2017 and 2018, surveys asked orgs to list up to 6 skills the EA community needs more of. Across both years, not a single respondent said EA as a whole needs more "People extremely enthusiastic about effective altruism" or more "People extremely enthusiastic about working on x-risk."

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T16:46:08.616Z · score: 6 (2 votes) · EA · GW

Thanks for flagging this, please let me know if this clarifies:

There are a bunch of smart people in the EA community for whom “being paid 60% market rate is basically the same as being paid 90% market rate.” Let’s call them Cohort 1.

But there are also a lot of people for whom it makes a huge difference whether they’re paid 60% or 90% of market; for them it might make all the difference in whether they can even consider the job. Let’s call them Cohort 2. In some cases, people in Cohort 2 look like people in Cohort 1 but with more student loans, poorer parents, or other differences that have nothing to do with ability or mission alignment. In other cases, people are in Cohort 2 for reasons that make them systematically different in important ways. For example, I’d expect Cohort 2 to have more people with high earning power than Cohort 1.

To oversimplify, I think a lot of EAs believe that organizations should hire people from Cohort 1 because that means they can get smart, cheap, mission-aligned people. I agree that gets you smart, cheap, mission-aligned people, but think that hiring from Cohort 2 (or a Cohort 1.5 that’s somewhere in between) might still be a better strategy. In other words, we should be asking questions like “does paying 60% or 90% of market lead to more impact over the long-term?”

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T15:06:37.511Z · score: 2 (1 votes) · EA · GW

Question: if all you knew about a bank was that the CEO made $75k a year, would that knowledge make you more or less likely to invest in that bank (from a purely financial perspective)? That would make me way less likely to invest.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T15:03:40.602Z · score: 2 (1 votes) · EA · GW

Here’s a simple example: imagine that you, or someone you were responsible for taking care of, had medical expenses of $100k/year. In that case, $75k wouldn’t even let you break even, you’d still be taking on lots of debt.

Other examples: you have debt, you have kids (and/or other relatives you’re financially responsible for), you live in a high cost of living location, or various other factors that have no relation to someone’s suitability for a job.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T14:58:22.871Z · score: 3 (2 votes) · EA · GW
In preventing wage dissatisfaction, I think it's better to look at perceived counterfactuals. This can come from being used to a certain wage, or a certain counterfactual wage being very obvious to you. Or it can come from your peers making a certain wage.
You seem to assume something like "people don't like to accept a wage that is lower than they can get". I suggest replacing that with "people don't like to accept a wage that is lower than they feel they can get".
I know some people that are deliberately keeping their income frozen at 15k so they won't get used to more. They reason that if they did, not only would they be psychologically attached to that wage, to a lesser extent so would their peers. In some sense they are keeping up a healthy cultural environment where it's possible to make little and still be satisfied.

Agree looking at perceived counterfactuals can be a helpful distinction.

I don’t see freezing incomes at 15k as a sustainable or scalable solution, at least in the context of harnessing resources to work on the world’s largest problems. But I think this brings up an interesting point about loss aversion and path dependency. I’d argue (and I think you’re doing the same) that people are much more likely to freeze their income at 15k at the start of their careers, but much less likely to do so after they’ve already started earning more and would need to cut back to that level.

Using @dgjpalmer’s experience as an example, I’d guess most of their Ivy League colleagues started off in the nonprofit industry rather than transitioning to it after some time in the private sector. And this dynamic introduces biases like shortages of skills that people pick up in the private sector.

I've heard of some organisations that don't have a fixed wage for a job, but a maximum. They ask their applicants "how much would you need to be satisfied", and that's how much they get. I'd expect that this practice, combined with a culture that doesn't overly discuss income or flaunt wealth, would be the best way to keep everyone satisfied, compete with industry, and still keep the average wage low.

This sounds very difficult to execute well over time, and my guess is that a lot of resentment would emerge. And it doesn’t solve the selection bias problems, discussed more here.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T14:56:14.667Z · score: 17 (5 votes) · EA · GW

I find this highly problematic. Candidates who need money more (e.g. those with dependents) will assume a non-profit job won’t pay enough in the first place, and won’t even apply.

It’s also worth noting that we live in a historical context where discouraging employees from disclosing how much they make has been a strategy to suppress wages, often discriminatorily. (See here for why Open Cages has taken the opposite approach and embraced salary transparency.)

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T14:55:17.606Z · score: 4 (3 votes) · EA · GW

Belatedly noting an obvious conflict of interest: I work in the EA space and would stand to benefit if the ecosystem paid more.

Comment by jon_behar on Recap - why do some organisations say their recent hires are worth so much? (Link) · 2019-05-14T13:46:20.532Z · score: 11 (5 votes) · EA · GW

In addition to asking EA organizations more questions about their talent wants/needs, it’d be nice to get more information about their funding gaps. I suggest asking organizations how much they’d like to raise over 1-, 3-, and 5-year timeframes, and also asking them to rate how difficult they found their last fundraising round.

Comment by jon_behar on Recap - why do some organisations say their recent hires are worth so much? (Link) · 2019-05-14T13:44:09.297Z · score: 10 (2 votes) · EA · GW

Great to hear you’re considering new questions for the talent survey! I’ve suggested some specific questions to help get a better understanding of how much EA organizations are paying.

As 80K puts it, “Skill bottlenecks are a matter of degree” and these degrees can vary significantly depending on the specific skills in question. But we have little hard data to quantify skill gaps across various areas.
I suggest adding new questions to the next talent survey of EA organizations to help capture some of these nuances.[9] These questions will hopefully make it easier to answer questions like: Are EA organizations talent constrained? If so, which sorts of organizations and which sorts of talent? How large are these constraints? What can be done about them?
New questions (which should continue past surveys’ practice of asking the same questions about both a junior and senior hire and anonymizing organizational responses due to the sensitivity of the information involved):
● Generally speaking, how easy/difficult do you currently find it to fill roles at your organization? (Scale of 1 = Very difficult to 5 = Very easy)
● What do you pay current employees relative to what they could earn on the open market, including jobs in the for-profit sector? (Multiple choice: More; about the same; 0-10% less; 11-20% less, etc)
● What do you pay current employees relative to what they could otherwise earn in the nonprofit sector (including all nonprofits not just EA organizations)? (Multiple choice: More; about the same; 0-10% less; 11-20% less, etc)
● What do you pay current employees relative to what they could otherwise earn at another EA organization? (Multiple choice: More; about the same; 0-10% less; 11-20% less, etc)
● How much do you plan to offer future hires relative to current employees in similar roles? (Multiple choice: More; Less; About the same.) If someone answers “more”: Do you plan to increase salaries for existing employees? What factors into this decision?[10]
● Would any of the following steps be helpful in closing your organization’s talent gaps? (Rate the following options on a scale of 1 = Not at all helpful to 5 = Extremely helpful):
○ Increasing salaries
○ Investing in recruiting
○ Increased publicity of position
○ Better access to interested candidates
○ Other (please describe)
This information is relatively easy to collect and interpret, corresponds to a real-world decision organizations make (how much to pay), is grounded in observable data (market wages), and is comparable across roles, organizations, geographies, and causes.[11] To mitigate the cost of data collection, I suggest abandoning survey questions about non-traditional labor metrics that seem to be producing noisy data and to generally be causing confusion.[12] If we collect compensation data and respondents indicate which types of roles they’re thinking about as junior and senior (which will vary across organizations), we’ll have a rich and actionable picture of how EA compensation stacks up to the competition for various types of skills.

Comment by jon_behar on Recap - why do some organisations say their recent hires are worth so much? (Link) · 2019-05-14T13:40:17.607Z · score: 10 (3 votes) · EA · GW

Thanks for clarifying your thinking! It’s great to see you being responsive to community feedback in this way; for instance, the discussion about low acceptance rates should help give job seekers a more accurate picture of the landscape.

I’ve got a few thoughts to share in the spirit of offering more constructive criticism that will hopefully drive further improvements.

1. After a lot of time spent by 80K and the EA community trying to interpret the results of this question, we’re at a place where:

we do not think the precise numbers are a reliable answer to decision-relevant questions for job seekers, funders, or potential employers. We think it’s likely that mistakes are driving up these estimates. Even ignoring the high probability of mistakes, the implications of the data depend heavily on exactly what is driving the results. We are very uncertain about the magnitude of various considerations, so we recommend against leaning on these numbers when making career decisions.

So why not just ditch this question in favor of something more helpful, instead of continuing to pour more resources into it? If you look at things from a cost-benefit perspective, this question clearly has a high cost due to all the clarification it’s required. Is the benefit high enough to warrant that? Even if you worked out all the kinks, knew all the respondents were thinking about the question the same way, and invested the time to produce an answer they trusted, what do we really learn by asking: “For a typical recent Senior/Junior hire, how much financial compensation would you need to receive today, to make you indifferent about that person having to stop working for you or anyone for the next 3 years?” Which parties would take which different actions on the margins because of that information?

2. I suspect one of the reasons why people are having trouble interpreting this question is because it doesn’t correspond to a real world decision people have to make. Confusing hypothetical questions are likely to produce non-actionable results. I suggest thinking carefully about exactly what you’re trying to capture (the value of an employee? The urgency and/or difficulty of hiring?) and looking at whether you can measure that dynamic via a clearly labeled Likert scale (e.g. a 1-7 scale rating with values like “employee is transformative in allowing us to do important new things”, “employee lets us do what we currently do, but a tiny bit better”, “it would be very easy to replace an employee of this type”, etc.) I’ve posted some specific suggestions in a separate comment.

And relatedly, I’d recommend as strongly as possible not using this question (or any question involving genies). I think it’s very likely to lead to the same sorts of interpretation problems.

Below is a very rough draft of one version of the question we are considering asking. We hope it would be more decision-relevant than the question used above, but we haven’t yet had time to pilot it or vet it for any issues:
Imagine that sometime in the next year you are about to hire your next junior (senior) hire. A genie appears and offers you the following choice. You can have one of the following:
1. The genie will create a person and applicant for the job from thin air. They will be as much more productive (in % terms) than the next best applicant in the pool, as your last junior (senior) hire appeared to be at the point when you were evaluating whether to hire them. This person will live out the rest of their life like any other staff member, and may well go on to do other useful work outside of your organisation later on. You should consider the benefits of that for the world as well.
2. The genie will distribute $X among whichever organisations or people you nominate – which can include you and your organisation – to be used to improve the world as much as possible. Consider all the benefits for the world this would generate.
At what value of X would you be indifferent between these two options?

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-14T00:06:22.443Z · score: 4 (2 votes) · EA · GW

Thanks Ben!

It makes sense to me that labor supply would be relatively inelastic below market rates. But there are two (at least) nuances we should be aware of.

1) The degree of inelasticity should depend on where pay sits relative to market. If we assume candidates view money as offering decreasing marginal returns, that suggests raising pay from 60% to 70% of market will be more beneficial than raising it from 70% to 80% of market. But to understand what strategies will be effective, we need to collect data to understand where EA orgs are starting from.

2) Some people can afford to work for 60% of market; some people can’t, even if they believe in the mission. As Khorton notes, that kind of pay will systematically exclude “the people who needed to care for their aging parents or pay off massive student loans or had other significant financial constraints.” I think the EA community has gotten caught up in the observation that “there are a lot of smart people willing to work for way below market” and lost track of the question of “what’s the right way to structure EA compensation to maximize impact?”

Comment by jon_behar on EA Still Needs an Updated and Representative Introductory Guidebook · 2019-05-13T22:14:16.977Z · score: 18 (7 votes) · EA · GW

For global poverty, there’ll be a great new option released later this year: an updated 10th anniversary edition of The Life You Can Save coming out in Q4. There will be updated numbers and examples, two new forewords, and increased emphasis on specific calls to action meant for a broad audience (e.g. initially asking people to make a recurring donation vs. a substantial pledge).

The price is also right, as we’ll be able to distribute free copies of the e-book (which will have links so people can take action more easily) and audiobook. The audiobook will have chapters read by celebrity narrators; this isn’t the time or place to list people involved in the project, but they’ll be a great credibility boost.

A lot of EA origin stories start with the first version of TLYCS. We’re about to have a chance to distribute a new and improved version to a much wider audience, and we hope the EA community will help spread it far and wide.

(I work for TLYCS the nonprofit, which is producing and promoting TLYCS the book.)

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-13T17:48:51.265Z · score: 2 (1 votes) · EA · GW

Terrific anecdata, thanks for sharing! Great illustrations of how intentional compensation/HR policies can help organizations access broader pools of talent.

The EA survey showed the community as a whole tilts heavily male as you say, but I have no idea what the gender split would look like if you looked only at people who work at EA orgs (or at senior people in EA orgs). Would be fascinating to do a survey of EA employees to get a sense of demographics, skills, opportunity costs, how they found the job, etc. In a Facebook discussion about this post someone proposed looking for “a Head of Compensation and People Analytics for the EA community”, and this is the sort of data they could collect and use to inform specific policy suggestions.

Comment by jon_behar on Cross-post: Think twice before talking about ‘talent gaps’ – clarifying nine misconceptions, by 80,000 Hours. · 2019-05-13T15:43:56.347Z · score: 5 (3 votes) · EA · GW

I think this definition of talent constraints (found in clarification 1 of OP) lends itself to ambiguity and confusion:

“An organization is talent constrained when, for someone who could take (a reasonably important) job at that organization, they would typically contribute more to that organization by taking the job than earning to give… While we think this framing can sometimes be useful, it also has some problems. For example, this definition seems less useful when an organization’s best potential hires don’t have very high earning potential and wouldn’t be very good funders.”

In a post proposing we look at the EA labor market using traditional models, I suggest an alternative definition:

An organization is talent constrained when it doesn’t have (and/or can’t hire reasonably easily) the people it needs despite offering competitive compensation.

I argue that while this definition...

“may not be perfect[1], but it has the advantages of being simple and applicable across organizations, causes, roles, and geographies. Perhaps most importantly, it’s an intuitive definition: if someone hears the term “talent constrained” it will likely conjure up images of an organization that doesn’t have the talent it needs, rather than an organization where (some) potential supporters would be more valuable as employees than donors.”

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-12T20:21:17.485Z · score: 2 (1 votes) · EA · GW

I think some EA organizations will have a good sense of how their pay stacks up, while others won’t have a good reference. One of the benefits of starting to collect info is that these latter organizations will be able to make more informed decisions.

Granular salary data would be terrific as you note, but I'm a bit concerned over how time consuming that could be for organizations to provide. It’ll also be important to supplement any data from EA employers with survey data from EA job seekers too; I doubt we’ll get a clear picture from one source alone.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-12T18:09:45.193Z · score: 2 (1 votes) · EA · GW

To help operationalize the discussion about compensation strategy, I’d love to know what people think about this simple scenario I posed:

Imagine an EA organization is trying to make an important hire with skills that are highly desired by the for-profit sector…[and] good candidates could earn $75,000/year elsewhere in the nonprofit sector, and $150,000 in the for-profit sector.

What’s your intuition on what that org should be willing to pay? Mine is discussed in footnote 13, not copying here to avoid biasing discussion.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-12T18:07:35.885Z · score: 2 (1 votes) · EA · GW

Thanks David!

Re: “structurally suppressed”, any thoughts on alternative phrasing that’d better capture the wage gap dynamic? “systematically lower?” Happy to make an edit…

I’m fully in agreement that lower salaries screen for more cause-motivated people. I just think we can find better screens that correlate less with things that we don’t care about, like privilege.

I agree that higher salaries are a turnoff to donors, and I suspect even EA donors. Various forms of overhead aversion are strong even among people who totally get that it’s rational to trade money for impact. I once heard a talk by a researcher who studies overhead aversion (Ayelet or Uri Gneezy I believe) where they opened by saying they intellectually understand that a nonprofit CEO might be more productive if they fly first class so they're rested for a big meeting, but they’d be pissed if they donated to that charity and then walked by that CEO on their way to coach. I think that’s just human nature (and suggests a particular need for EA donors to look past that mindset).

Agree that the right compensation strategy probably involves looking for a middle ground. I've started a separate thread to operationalize that discussion.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-12T15:20:26.595Z · score: 5 (3 votes) · EA · GW

This is a very important point. EA already skews high-privilege for a lot of reasons (e.g. founders and network effects) which introduces various biases. Even the process of applying for EA jobs favors high privilege people. So salary norms that reinforce this will compound an existing problem.

The problem isn’t that EA orgs can’t get talented people. It’s that they get the same kind of talented people, and they systematically miss out on other types of talented people that they also need. We need complementary skills (and perspectives, networks, etc.), and we’re only getting some of what we need. Finding people who are talented but not privileged is a good way (but not the only way) to close the gap.

Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-12T15:19:32.763Z · score: 7 (4 votes) · EA · GW

dgjpalmer, thanks so much for sharing your thoughts and experiences!

-I largely agree with your description of the different forms of non-monetary compensation non-profit employees receive, like different forms of status.

The only departments where wage dissatisfaction was common were those where employees both worked long hours and had a skill-set that was easily transferable to the private sector. Although I'm not sure if employee turnover was necessarily higher than in other departments, those who left often ended up in higher-paying, more conventionally prestigious jobs (e.g. a Big-X consultancy or accountancy firm). Accounting, which furthermore lacks opportunity for tangible direct impact, did seem to have the greatest turnover and wage dissatisfaction.

This data point on the relative satisfaction across roles is very helpful and relevant. Your experience is totally consistent with economic theory, which would predict highest dissatisfaction in roles where there’s a high opportunity cost (people could earn lots more in the private sector) and where the work doesn’t generate much of a “warm glow” (like accounting vs. direct service). This is why I think it’s important to get a handle on what sorts of roles this applies to in the EA ecosystem and how big those opportunity costs are for the best candidates.

-I didn’t mean to suggest that nonprofits can’t attract talented people (though it sounds like your organization might have been exceptional in this area). Like you, I’ve been fortunate to work with some amazing people who have worked at steep discounts to what they could otherwise earn because they believed in a charitable mission, some of whom have volunteered their time fully. My argument is that nonprofits (especially EA nonprofits) are only able to attract a narrow type of talented person, and that this narrowness inhibits their effectiveness. For instance, as Khorton observes, low salaries weed out people who aren’t mission aligned, but they also weed out people without a lot of privilege. I’ll discuss this more in a response to her comment.

Comment by jon_behar on Many EA orgs say they place a lot of financial value on their previous hire. What does that mean, if anything? And why aren't they hiring faster? · 2019-05-10T18:42:43.840Z · score: 2 (1 votes) · EA · GW

I think this survey question is too hard to interpret to provide actionable information, and I’ve argued we should replace it with some alternative questions:

As 80K puts it, “Skill bottlenecks are a matter of degree” and these degrees can vary significantly depending on the specific skills in question. But we have little hard data to quantify skill gaps across various areas.
I suggest adding new questions to the next talent survey of EA organizations to help capture some of these nuances.[9] These questions will hopefully make it easier to answer questions like: Are EA organizations talent constrained? If so, which sorts of organizations and which sorts of talent? How large are these constraints? What can be done about them?
New questions (which should continue past surveys’ practice of asking the same questions about both a junior and a senior hire, and of anonymizing organizational responses due to the sensitivity of the information involved):
● Generally speaking, how easy/difficult do you currently find it to fill roles at your organization? (Scale of 1 = Very difficult to 5 = Very easy)
● What do you pay current employees relative to what they could earn on the open market, including jobs in the for-profit sector? (Multiple choice: More; about the same; 0-10% less; 11-20% less; etc.)
● What do you pay current employees relative to what they could otherwise earn in the nonprofit sector (including all nonprofits, not just EA organizations)? (Multiple choice: More; about the same; 0-10% less; 11-20% less; etc.)
● What do you pay current employees relative to what they could otherwise earn at another EA organization? (Multiple choice: More; about the same; 0-10% less; 11-20% less; etc.)
● How much do you plan to offer future hires relative to current employees in similar roles? (Multiple choice: More; Less; About the same.) If someone answers “more”: Do you plan to increase salaries for existing employees? What factors into this decision?[10]
● Would any of the following steps be helpful in closing your organization’s talent gaps? (Rate the following options on a scale of 1 = Not at all helpful to 5 = Extremely helpful):
○ Increasing salaries
○ Investing in recruiting
○ Increased publicity of position
○ Better access to interested candidates
○ Other (please describe)
This information is relatively easy to collect and interpret, corresponds to a real-world decision organizations make (how much to pay), is grounded in observable data (market wages), and is comparable across roles, organizations, geographies, and causes.[11] To mitigate the cost of data collection, I suggest abandoning survey questions about non-traditional labor metrics that seem to be producing noisy data and generally causing confusion.[12] If we collect compensation data and respondents indicate which types of roles they’re thinking about as junior and senior (which will vary across organizations), we’ll have a rich and actionable picture of how EA compensation stacks up to the competition for various types of skills.
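To illustrate how cheaply this data could be analyzed once collected, here is a minimal sketch of tabulating the proposed multiple-choice responses across anonymized organizations. The bucket labels and sample responses are hypothetical placeholders, not taken from any actual survey:

```python
from collections import Counter

# Hypothetical answer buckets for "pay relative to the open market"
BUCKETS = ["More", "About the same", "0-10% less",
           "11-20% less", "21-30% less", ">30% less"]

def tabulate(responses):
    """Count anonymized org responses per bucket, preserving bucket order."""
    counts = Counter(responses)
    return {bucket: counts.get(bucket, 0) for bucket in BUCKETS}

# Example: six anonymized org responses for senior roles
sample = ["11-20% less", "About the same", ">30% less",
          "11-20% less", "0-10% less", "11-20% less"]
print(tabulate(sample))
```

A distribution like this, broken out by junior vs. senior roles, would directly answer "how big is the wage gap, and for whom?" without requiring orgs to disclose exact salaries.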
Comment by jon_behar on A Framework for Thinking about the EA Labor Market · 2019-05-09T19:44:44.512Z · score: 6 (2 votes) · EA · GW

Specific question for threading purposes: what do people think about changing the EA organization talent survey to include questions on how their pay compares to what candidates could otherwise earn at other nonprofits or in the for-profit sector? List of proposed questions and changes here.

Comment by jon_behar on Complex value & situational awareness · 2019-05-09T18:25:57.393Z · score: 7 (2 votes) · EA · GW

A few thoughts on this since I'm used as an example:

1. Very much agree with Holly (strong upvote) that having a main gig is critical (essential?) for situational awareness. In my case, having run Giving Games over the years, it’d be really weird for me not to have picked up some situational awareness along the way. I’ve had countless conversations with different EAs (there are hundreds of contacts in the GG CRM, which isn’t close to comprehensive), so I’ve met a lot of people and gotten a sense of how they think. I also get a sense of how they perform on the narrow task of planning and executing a GG, in an absolute sense and relative to other people/groups. My mental model would look enormously different if I didn’t have all this context.

2. Related to 1, I think it’s been valuable that my role naturally provided vetting opportunities that help me weight information (especially if I see a pattern of behavior). This suggests that if EA is vetting constrained, it’ll be less situationally aware. And for the EA community to become situationally aware, that vetting needs to be public. My personal vetting anecdata doesn’t help other people improve their mental models, GiveWell’s research does.

3. To the extent I’m good at situational awareness, a lot of it has to do with learned skills. “Keep your world-model up to date with both social reality & objective, physical reality” was a huge part of the work I used to do in finance. I spent years doing that specific kind of work, got trained by smart people, and trained other people (which helps you learn something deeper).

4. Milan, I think you’re probably reading too much into the situational awareness/strategic advisor relationship, as strategic advisor can cover a lot of different ground.

Comment by jon_behar on EA jobs provide scarce non-monetary goods · 2019-05-09T15:38:04.701Z · score: 3 (2 votes) · EA · GW

Totally agree that EA jobs provide scarce non-monetary goods, and hat tip for looking at this issue through a standard economics lens.

I’ve argued that compensation norms based on offsetting low salaries with high non-monetary pay are problematic in part because they create unwanted biases in which sorts of candidates they attract. If you pay people with money, they can use that however they want. If you pay them in e.g. flexible hours or social status, there’ll be variability in how much people value that and you’ll disproportionately attract candidates who value it a lot. For example, I argue experienced candidates will likely prefer monetary compensation relative to inexperienced candidates for several reasons:

Low salaries make it relatively harder to find experienced candidates than inexperienced ones because of several factors that shift the labor supply curves on a relative basis:
Earning power in alternative jobs. Experienced candidates generally have better paying alternatives than their inexperienced counterparts (i.e. they have higher opportunity costs). Experienced candidates will often be sacrificing hundreds of thousands of dollars or more to work for an EA organization; that would be very rare for junior candidates.
Barriers to entry: Experienced candidates are more likely to have dependents and mortgages, and for them working at an EA organization is more likely to involve a psychologically difficult large pay cut. Inexperienced candidates, on the other hand, may see working in EA as the path of least resistance.
Non-monetary compensation. As Milan Griffes has argued, EA jobs provide scarce non-monetary goods like “social status, life-orientation, a sense of having near-maximal impact, and being part of a value-aligned, elite tribe.” Younger candidates likely perceive more value from these particular factors. (More experienced candidates, by contrast, would likely perceive relatively more value on non-monetary compensation in the form of flexible working hours or paid parental leave).

A Framework for Thinking about the EA Labor Market

2019-05-08T19:33:06.076Z · score: 81 (30 votes)

The Life You Can Save's 2018 Annual Report

2019-04-24T20:47:08.714Z · score: 23 (10 votes)
Comment by jon_behar on EA is vetting-constrained · 2019-04-19T16:59:35.134Z · score: 9 (3 votes) · EA · GW

Re: my comment about smaller projects being undervetted, I should note the level of detail provided in the last grant report from the Long Term Future EA Fund looks like a substantial step forward, “raising the bar on the amount of detail given in grant explanations.”

Comment by jon_behar on EA is vetting-constrained · 2019-04-19T16:54:37.753Z · score: 7 (2 votes) · EA · GW

Another data point suggesting a vetting bottleneck is Open Phil’s recent shift in how they’re funding EA meta/community organizations, including those who work on long-termist causes. This was motivated in part by “high uncertainty about how to set the right grant amounts for these organizations and our sense that we aren’t providing the level of accountability, oversight and vetting that we ideally would like to. We believe that individual donors (particularly to these organizations) sometimes seem to think our investigations into the organizations in question have been deeper than is actually the case.” (emphasis added/shifted)

In other words, the funder with the most incentives, capabilities, and resources to vet these organizations (which I’d guess are abnormally hard to vet) doesn’t think it’s doing enough vetting, and is worried other donors are also under-vetting (based on erroneous assumptions). And it’s not just small projects that are under-vetted; the problem seems much broader.

Comment by jon_behar on The Importance of Truth-Oriented Discussions in EA · 2019-03-16T15:25:50.809Z · score: 8 (4 votes) · EA · GW

I intentionally avoided commenting on the OP’s broader claims as I’m squarely in the “Nobody's going to solve the question of social justice here” camp (per @Aidan O’Gara). I only meant to comment on the narrow issue of EA London’s gender-related attendance dynamics, to try and defuse speculation by pointing people to relevant data that’s available. In retrospect, I probably should have just commented on the thread about women being less likely to return to EA London meetups instead of this one, but here we are.

I think the quotes from the surveys offer important insights, and that it’d be bizarre to try to understand how EA London’s events are perceived without them. I didn’t claim they offer a definitive explanation (just one that’s more informed than pure intuition), and I certainly didn’t argue we should start restricting discussions on lots of important topics.

Actually, one of my biggest takeaways from the survey quotes is that there’s low-hanging fruit available, opportunities to make EA more inclusive and better at seeking truth at the same time. The cost/benefit profile of (for example) an icebreaker at a retreat is extremely attractive. It makes people feel more welcome, it builds the sort of trust that makes it easier to have conversations on controversial topics, and it makes those conversations better by inviting a broader range of perspectives. Even if you hate icebreakers (like I do), based on the survey data they seem like a really good idea for EA retreats and similar events.

Comment by jon_behar on The Importance of Truth-Oriented Discussions in EA · 2019-03-16T15:24:27.790Z · score: 7 (3 votes) · EA · GW

The comment about how the gender imbalance “led to different topics to be discussed” might (or might not) reflect alienating conversations, but I agree with your general point that the survey quotes are more about the "vibe". I think the quotes suggest that simple things like running icebreakers and saying hi to people (whether or not they are women and/or queer) can be really valuable.

Comment by jon_behar on The Importance of Truth-Oriented Discussions in EA · 2019-03-14T14:46:06.591Z · score: 30 (16 votes) · EA · GW
The authors of Making Discussions Inclusive theorise that alienating discussions are the reason why women were less likely than men to return to meetings of EA London, despite being equally likely to attend in the first place. We note that such a conclusion would depend on an exceptionally high quantity of alienating discussions, and is prima facie incompatible with the generally high rating for welcomingness reported in the EA survey. We note that there are several possible other theories… The claim is not that any of these theories are necessarily correct, just that it would be premature to assume that the main cause of the gender gap is the kinds of alienating conversations discussed in Making Discussions Inclusive.

EA London recently published a 2018 Impact Report with a whole appendix on diversity issues, which discusses this issue directly and strongly suggests alienating conversations/behavior are a very real issue. Key excerpts (emphasis added):

I’ve included an appendix on our Holiday/EA Unconference to capture some of the negative feedback we received. To my memory, this is the event we received the worst feedback on and the feedback is often related to diversity and inclusion, which seems like the most prominent theme in the negative feedback we both received and sought throughout the year.
From 31/8 to 3/9, EA London held a Holiday/EA Unconference at the EA Hotel for 26 guests (plus 3 organisers). Some guests had also attended one/both of the retreats we held at the hotel in the previous week (Life Review Weekend and Careers Week).
Guests were asked to complete a feedback form on the final evening which included the question “Was there anything that made you feel uncomfortable or unwelcome during the event? (no need to write an answer if no)”. Of the 17 people who completed the form, 6 answered this question, and the 4 answers that mention gender follow:
· “On the Friday evening as people began to arrive I felt that as the members of the group changed, the vibe changed and I felt that it was too 'male Oxbridge graduate' (which I seem to find harder to connect with).”
· “It was quite a lot more male than the careers week, which was very noticeable and led to different topics to be discussed. I felt having a more diverse EA crowd in the careers week was more welcoming and relaxed (even as a corduroy wearing straight white man!) and it would be worth considering how to attract diverse representation.” (Compare to this person’s response to the question “Anything else you want to say?” on the last day of Careers Week: “I think this was one of the most diverse EA events I have been to in terms of having a lot of people with different backgrounds, ideologies (and more women!). I enjoyed this and it made the event feel welcoming.”)
· “Yes, I believe that insufficient time was taken on the code of conduct, and in a situation where someone made a somewhat sexist comment I wasn't really sure what to do, because I don't think we discussed what was/wasn't acceptable conversation and what to do in a situation where someone felt uncomfortable.”
· “We might've needed peaceful and open icebreakers (shy people in mind!!) or a welcome session with some notes on mental health and inclusiveness to create a more welcoming atmosphere. Many arrived without doing proper hellos, and the first night people cliqued up without much mingling, and this was very different from the previous 2 retreats. Also, some women and queer men felt excluded by the way that the suddenly majority-male crowd naturally behaved (pushed aside, talked over, not said hello to etc.)”

This narrative is also consistent with the EA Survey data on welcomingness, which found women rated EA as less welcoming than men to a statistically significant degree. (The high rating for welcomingness across the whole EA Survey seems much less relevant, as those results will by definition largely reflect the beliefs of demographics with the highest representation.)

A few other notes:

· I’d guess the vast majority of behavior that’s perceived as unwelcoming wasn’t intended as unwelcoming.

· I doubt women are the only instance where an underrepresented EA demographic feels unwelcome. For example, I have a strong prior that conservatives wouldn’t feel very welcome interacting with the EA community (center left + left outnumbers center right + right by ~17x) and that this is problematic.

· Tip of the hat to EA London for writing up their experiences so others can learn from them. I wish the next group running an EA retreat had access to a consolidated resource with synthesized lessons from other groups’ experience, and practical examples of how (not) to promote an inclusive, truth-seeking culture; to the best of my knowledge this doesn’t exist.

· I wholeheartedly agree with @Aidan O’Gara’s call to operationalize discussions of this nature as much as possible. Simply distinguishing between “issues that are relevant to EA” and “good issues to discuss at an intro to EA event” would go a long way toward helping people not talk past each other.

Comment by jon_behar on SHIC Will Suspend Outreach Operations · 2019-03-08T18:20:21.001Z · score: 24 (13 votes) · EA · GW

Others have said this but it bears repeating: thank you for writing this up! This sort of detailed post-mortem is a resource the whole community can learn from. Kudos!

Comment by jon_behar on What to do with people? · 2019-03-07T15:21:42.289Z · score: 11 (7 votes) · EA · GW

GiveWell also recently announced they are doubling the size of their research team, which will presumably uncover even more giving opportunities that can absorb a lot of funding.

Comment by jon_behar on What to do with people? · 2019-03-06T17:07:57.113Z · score: 14 (7 votes) · EA · GW

Agree this is scalable, as long as people aren’t purely trying to maximize income/giving capacity which I don’t think is sustainable. (I’ve done quantitative finance while passionate about that work, and I’ve done it when I wasn’t passionate about it; the former is WAY easier). I’d love to see more early career EAs pursue work that they’re interested in and donate effectively while building skills, networks, etc.

Comment by jon_behar on After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation · 2019-03-04T17:40:44.182Z · score: 1 (1 votes) · EA · GW

Thanks, helpful to know about!

Comment by jon_behar on After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation · 2019-02-28T18:19:59.157Z · score: 5 (4 votes) · EA · GW

Thanks Michelle, great to learn about this resource, for some reason I’d thought it was only volunteer stuff. Will start posting jobs there going forward and hope other employers will too.

I would still like to see something broader exist as well… That’s the resource I’d want if I were an EA job seeker, since it’d let each candidate use their own perspective on what counts.

Comment by jon_behar on After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation · 2019-02-28T16:15:05.076Z · score: 8 (6 votes) · EA · GW

Thanks to OP and commenters for sharing their experiences! Very helpful!

Based on this feedback, it seems like it would be valuable for 80K to (significantly?) cut back the degree to which they curate their job board, and/or for someone to create a list of all open jobs that could be broadly defined as EA. Either (or both) of these steps should help to address the issue of too many candidates chasing too few jobs.

Comment by jon_behar on Can the EA community copy Teach for America? (Looking for Task Y) · 2019-02-26T16:48:54.498Z · score: 6 (5 votes) · EA · GW
Why isn't "Earning to give", or even just "donate effectively" sufficient to have the large positive effects "Task Y" could have?

I see “donate effectively” as “Task Y”, and would love to see that get wider acceptance. To get around concerns that people might “set and forget” their giving at a low level, I think messaging around effectiveness should include the idea of improving one’s giving over time. For instance, people can take a “personal best” approach to giving and try to give better (give more, give more effectively, do more research, etc.) each year.

My sense is this would go a long way in reducing some of the elitism concerns @John_Maxwell_IV mentioned. And rather than reducing option value, I think it would give EA a lot more flexibility and robustness if it could draw on a large pool of people with diverse skills who were sympathetic to core EA ideas. For instance, it’d be a lot easier to close the “operations gap” if there were a lot of EA sympathetic people with strong ops experience, and the same will be true of the next talent gap that comes along (my guess is that a “management gap” is the next natural progression).

Comment by jon_behar on One for the World: update after 6 months of our first staff member · 2019-02-18T20:29:13.811Z · score: 9 (4 votes) · EA · GW
One percent seems low for an initial pledge, given that the "average American" donates ~2% of income

FWIW, I don’t think this is a great reference point. The 2015 Money for Good study found a median gift of ~0.4% of income in their sample (which overweighted high-income households), and 1% giving would be something like top-quintile giving. So getting young people to (initially) donate 1% to effective causes seems like an excellent win.

A new, lower risk way to teach effective giving

2018-12-20T00:47:29.047Z · score: 7 (2 votes)

Narrative but Not Philosophical Argument Motivates Giving to Charity

2018-11-26T18:18:38.587Z · score: 13 (5 votes)

A Research Framework to Improve Real-World Giving Behavior

2018-10-04T18:25:56.012Z · score: 7 (3 votes)

The Giving Game Project's 2017 Annual Report

2018-06-05T20:58:50.122Z · score: 6 (6 votes)

The Life You Can Save's 2017 Annual Report and 2018 Strategic Plan

2018-05-03T19:53:48.663Z · score: 10 (10 votes)

The Giving Game Project's Vision and Strategic Plan

2017-05-23T23:21:42.879Z · score: 4 (4 votes)

Are Giving Games a better way to teach philanthropy?

2017-05-13T00:36:41.371Z · score: 4 (4 votes)

The Life You Can Save's 2016 Annual Report

2017-04-26T22:46:55.707Z · score: 8 (8 votes)

A Request for Funding from The Giving Game Project

2016-08-01T18:17:05.548Z · score: 14 (14 votes)

The Giving Game Project's Annual Report

2016-07-20T18:07:47.070Z · score: 12 (12 votes)

Wish Peter Singer a happy 70th birthday!

2016-06-20T21:03:15.599Z · score: 1 (7 votes)

The Life You Can Save's 2015 Year in Review

2016-02-12T23:22:45.302Z · score: 5 (5 votes)