EA Funds has appointed new fund managers

post by Jonas Vollmer, SamDeere · 2021-03-23T10:55:29.535Z · EA · GW · 33 comments


  Rationale for appointing new fund managers
  Our process
  Appointment decisions
  Guest managers
  Current fund managers
    Animal Welfare Fund
      Kieran Greig (chairperson)
      Alexandria Beck
      Lewis Bollard
      Marcus A. Davis
      Mikaela Saccoccio
      Karolina Sarek
    Long-Term Future Fund
      Asya Bergal (chairperson)
      Adam Gleave
      Oliver Habryka
      Further information and plans:
    EA Infrastructure Fund
      Max Daniel
      Michelle Hutchinson
      Buck Shlegeris


Rationale for appointing new fund managers

EA Funds has historically appointed fund managers in a somewhat ad hoc way, without a defined term of service. In 2020, we began consulting with existing fund managers to make the appointment process more clearly defined.

We decided that fund managers should serve for a defined period of time (currently two years), after which they can reapply for another term. New fund managers start with a trial period, so that we can assess their skills and how well they work with their colleagues.

Considerations that we took into account included:

Our process

The AWF, EAIF, and LTFF operate on a committee model. They all underwent a similar process for appointments, and all references to appointments in this post should be taken to be covering these three funds. By contrast, the Global Health and Development Fund will continue to be headed by GiveWell’s CEO Elie Hassenfeld and managed by GiveWell staff (James Snowden in particular).

In December 2020, we opened applications for the positions of fund manager and fund chairperson at each of the relevant funds. We asked existing fund managers, trustees, and other advisors to nominate potential candidates, and circulated the job description and application form to the nominees. Nominees were also given the opportunity to suggest additional candidates from their own networks.

We decided to run a private process (rather than a public one) for the following reasons:

We received 66 applications from new candidates, and some fund managers chose to reapply as well. Candidates were asked to complete our work tests or submit a relevant existing work sample (e.g., from a previous application to Open Philanthropy). These work tests included a grant evaluation, a document outlining their active grantmaking ideas, and (for chairperson applicants only) a document explaining their strategic outlook for the fund. Work samples were blinded and graded by two or more reviewers in most cases.

We then attempted to resolve any remaining uncertainties through a combination of interviews with candidates, formal and informal references, and internal discussion.

Appointment decisions

Our decisions were largely informed by the quality of the work samples and prior relevant experience. We also took into account the overall composition of each team, aiming to ensure that a range of viewpoints and backgrounds was represented. 

When making the final appointment decisions, we faced some tricky judgment calls:

The appointment decisions were made by a committee composed of some CEA trustees and the EA Funds Executive Director: Claire Zabel, Owen Cotton-Barratt, Nick Beckstead, and Jonas Vollmer. Committee members recused themselves from some discussions and decisions in accordance with our conflict of interest policy. We appointed the new fund managers for two-year terms, with the first grantmaking cycle in March–April 2021 serving as a probationary period.

Guest managers

We decided to experiment with appointing guest fund managers, who will serve for one or more grant rounds. We see the following benefits to this:

However, there are also a few downsides:

Our process for appointing guest managers is very similar to the process for permanent fund managers outlined above. The March–April 2021 grantmaking cycle will help us decide to what extent we will continue our guest manager program in the future.

Current fund managers

Animal Welfare Fund

Kieran Greig (chairperson)

Kieran Greig is the Director of Research for Farmed Animal Funders, a group of large donors who each give over $250,000 annually to end factory farming.

He previously worked as a researcher at Animal Charity Evaluators, and prior to that was a co-founder of Charity Entrepreneurship and Charity Science Health. He has written about topics like improving the welfare of farmed fish and supporting plant-based alternatives to animal products. He has a B.Sc. from Monash University and a master's degree from La Trobe University.

Alexandria Beck

Alexandria is the Director of the Open Wing Alliance (OWA) at The Humane League. She oversees a global coalition of more than 70 organizations working to end the abuse of chickens worldwide, with a focus on institutional campaigns to ban battery cages for egg-laying hens. Since 2017, Alexandria has led the coalition's growing grant program and overseen the distribution of over $3.7 million to support OWA groups' corporate cage-free and broiler welfare campaigns.

Lewis Bollard

Lewis Bollard is the Program Officer for farm animal welfare at Open Philanthropy, where he leads the organization's Farm Animal Welfare strategy. Before joining Open Philanthropy, he worked as Policy Advisor & International Liaison to the CEO at The Humane Society of the United States (HSUS). Prior to that, he was a litigation fellow at HSUS, a law student, and an associate consultant at Bain & Company. He has a B.A. in Social Studies from Harvard University and a J.D. from Yale Law School.

Marcus A. Davis

Marcus A. Davis is the Co-Executive Director and co-founder of Rethink Priorities, a think tank dedicated to figuring out the best ways to make the world a better place. He leads their research and strategy, focusing on animals. Prior to that he was a co-founder of Charity Entrepreneurship and Charity Science Health.

Mikaela Saccoccio

Mikaela Saccoccio is the Executive Director of Farmed Animal Funders (FAF), a donor learning community whose members donate $250,000 or more annually to charitable initiatives fighting factory farming. She brings together 35 foundations and high-net-worth individuals to build connections, foster collaboration, and amplify impact. Previously, Mikaela worked at The Humane League to end the abuse of animals raised for food.

Karolina Sarek

Karolina is co-founder and Director of Research at Charity Entrepreneurship. There, she creates a research agenda and processes and leads the research team, aiming to find and compare the most evidence-based, cost-effective, and neglected interventions in multiple cause areas. She also serves as a board member and advisor for various EA nonprofits and think tanks, such as Fish Welfare Initiative, WANBAM, and Legal Priorities Project.

Before Charity Entrepreneurship, she co-founded an organization to improve the impact of nonprofits and social enterprises; worked on measurement and evaluation; and was a researcher for IBM and the Jagiellonian University (JU). At the age of 22, she became a university teaching fellow, lecturing at JU’s Faculty of Mathematics and Computer Science.

Long-Term Future Fund

Asya Bergal (chairperson)

Asya Bergal works as a researcher at AI Impacts and occasionally writes for the AI Alignment Newsletter. Previously, she worked as a trader and software engineer for Alameda Research, as a Fall Research Analyst at Open Philanthropy, and as a research fellow at the Centre for the Governance of AI at the Future of Humanity Institute (FHI). She has a BA in Computer Science from MIT.

Adam Gleave

Adam Gleave is a PhD candidate at UC Berkeley, working on technical AI safety with the Center for Human-Compatible AI (CHAI). His research focuses on improving the evaluation of deep reinforcement learning systems. Adam previously worked as a quantitative trader. He has been researching and donating to effective charities since 2014, including making several grants as part of the 2017 donor lottery.

Adam studied Computer Science at the University of Cambridge, where he led the 80,000 Hours: Cambridge group. 

Oliver Habryka

Oliver Habryka is the current project lead for LessWrong.com, where he tries to build infrastructure for making intellectual progress on global catastrophic risks, cause prioritization, and the art of rationality. He used to work at the Centre for Effective Altruism US as strategic director, ran the EA Global conferences for 2015 and 2016, and is an instructor for the Center for Applied Rationality. He has generally been involved with community organizing for the Effective Altruism and Rationality communities in a large variety of ways. He studied Computer Science and Mathematics at UC Berkeley, and his primary interests are centered around understanding how to develop communities and systems that can make scalable progress on difficult philosophical and scientific problems.

Advisors: Nick Beckstead, Nicole Ross, Matt Wage.

Further information and plans:

EA Infrastructure Fund

Max Daniel

Max is a Project Manager for the Research Scholars Programme at Oxford University's Future of Humanity Institute (FHI). Previously, he was a Senior Research Scholar at FHI, where his research focused on macrostrategy and AI governance. Max holds a master’s in mathematics with a minor in philosophy from Heidelberg University. Before joining FHI, he led the research wing of the Effective Altruism Foundation (now the Center on Long-Term Risk), where he still serves as a board member.

Michelle Hutchinson

Michelle is Assistant Director of the One-on-one team at 80,000 Hours. Before that, she set up the Global Priorities Institute, did the operational set-up of the Centre for Effective Altruism, and ran Giving What We Can. She has a PhD in global priorities research from Oxford University. 

Buck Shlegeris

Buck splits his time between EA movement building and research related to the impact of emerging technologies. He worked at the Machine Intelligence Research Institute between 2017 and 2020 on alignment research, AI safety movement building, and recruiting. He previously worked as a software engineer, occasionally working on EA-related projects such as a computational model of theories of consciousness as a contractor for Open Philanthropy. Buck has a degree in computer science from the Australian National University.

Advisors: Luke Ding, Nick Beckstead, Nicole Ross, Matt Wage.

Further information and plans:


Comments sorted by top scores.

comment by SomeonesBurnerAcount · 2021-03-23T20:11:32.860Z · EA(p) · GW(p)

It seems like everyone affiliated with the EA Infrastructure Fund is also strongly affiliated with longtermism. I admire that you are going to use guest managers to add more worldview diversity, but insofar as the Infrastructure Fund is funding a lot of the community building efforts for effective altruism writ large, should we worry about cause neutrality here?

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-23T20:38:52.865Z · EA(p) · GW(p)

I agree that greater representation of different viewpoints on the EA Infrastructure Fund seems useful. We aim to add more permanent neartermist fund managers (not just guest managers). Quoting from above:

  • We’ve struggled to round out the EAIF with a balance of neartermism- and longtermism-focused grantmakers because we received a much larger number of strong applications from longtermist candidates. To better reflect the distribution of views in the EA community, we would like to add more neartermists to the EAIF and have proactively approached some additional candidates. That said, we plan to appoint candidates mostly based on their performance in our hiring process rather than their philosophical views.

Does that answer your question? Please let me know if you had already seen that paragraph and thought it didn't address your concern.

EDIT: Also note that Ben Kuhn is serving as a guest manager this round. He works on neartermist issues.

(Terminological nitpick: It seems this is not an issue of "cause neutrality" but one of representation of different viewpoints. See here – the current fund managers are all cause-impartial; neartermist fund managers wouldn't be cause-agnostic either; and the fund supports cause-general and cause-divergent work either way.)

comment by Peter Wildeford (Peter_Hurford) · 2021-03-23T21:15:33.347Z · EA(p) · GW(p)

Is the Global Health and Development Fund still going to be just Elie for the foreseeable future? (Not that there's anything wrong with that.)

Replies from: Linch, Jonas Vollmer
comment by Linch · 2021-03-23T23:11:40.046Z · EA(p) · GW(p)

I disagree. I think there's something wrong with that, inasmuch as global health donors can already defer directly to GiveWell, and Elie's public views on within-cause prioritization do not appear to be obviously different from GiveWell's.

Replies from: Jonas Vollmer, IanDavidMoss
comment by Jonas Vollmer · 2021-03-24T10:33:28.685Z · EA(p) · GW(p)

I don't think that's right. As you can see by comparing the payout reports of the Maximum Impact Fund with those of the Global Health and Development Fund, these two funds serve different purposes:

  • The Maximum Impact Fund grants exclusively to GiveWell top charities implementing proven interventions
  • The Global Health and Development Fund makes grants for technical assistance that helps scale evidence-backed global health and development interventions, but isn't itself RCT-backed

So the Global Health and Development Fund is more speculative and serves a different purpose than the Maximum Impact Fund. At least for now, I'm happy that both options exist.

I'm personally also interested in setting up a third option in the global health and development space: a hits-based global health and development fund. This fund could support things like developing-world public health regulation, research on neglected developing-world diseases, policy advocacy for cost-effective interventions, growth diagnostics research, etc. We could set up this option in collaboration with GiveWell (which has been doing a lot of work in the area) or independently.

Replies from: Neel Nanda
comment by Neel Nanda · 2021-03-24T21:33:35.716Z · EA(p) · GW(p)

Huh, I find this surprising. I'd thought the Global Health and Development Fund was already intended to focus on hits-based giving in global health. Can you elaborate a bit more on what the middle ground being hit here is, by the current fund?

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-25T08:58:34.273Z · EA(p) · GW(p)

Here's my attempt at a characterization of the distinction (people should feel free to correct me if they think I'm wrong):

The GHDF made several grants to IDinsight and IPA for RCT research, which could help with the development of new proven interventions (for Covid prevention in particular). It also made grants to Instiglio for technical design of results-based financing, to J-PAL's Innovation in Government Initiative to promote evidence-based policies in developing countries, and to Fortify Health, a new potential GiveWell top charity.

These grants are all related to proven, evidence-based interventions, and help scale or promote that approach. The only exception is the grant to One For The World, which is more like an EA meta intervention. EDIT: There's also the Centre for Pesticide Suicide Prevention grant, some info here.

In contrast, one could try to do things that are even more hits-based. An example of a past EA success in this area is helping reallocate £2.5 billion in DFID resources towards research funding for neglected tropical diseases (as I understand it, there were some specific reasons to believe that EAs actually had a large impact on that budget change that weren't discussed publicly). Of course, some of that research was in the form of RCTs, but I guess a lot of it was more fundamental. The key motivating factor is more like "NTD research still is a very important, neglected, and tractable area" rather than "we want to scale proven interventions". The indirectness of the policy advocacy route makes it more hits-based as well; there are no RCTs on whether that kind of policy advocacy works.

Or take the idea of developing-world public health regulation, e.g. tobacco taxation. This is not an RCT-backed intervention in a narrow sense, but nonetheless estimated to be extremely cost-effective based on some back-of-the-envelope calculations.

Another example might be ballot initiatives, though these are less scalable.

It's not clear that any of this requires an additional fund. Perhaps the GHDF can simply do more of both.

Replies from: CatherineHollander
comment by CatherineHollander · 2021-03-26T22:48:13.635Z · EA(p) · GW(p)

Thanks for raising these questions! I work at GiveWell, and we're planning to update the EA Global Health and Development Fund page to make the distinction between it and the Maximum Impact Fund clearer—we think we can do better to explain the difference.

Here's a quick summary:

  • The Maximum Impact Fund is granted regularly to the highest-value funding opportunities we see among GiveWell's recommended charities. This is a great option for donors who want to support GiveWell's top charities and are open to their funding being used wherever it can do the most good among them.

    You can read more about the Maximum Impact Fund, and see our past allocations, here.
  • The Effective Altruism Global Health and Development Fund has a broader remit, including programs that are new and about which little is known, as well as policy advocacy, public health regulation, technical assistance, and direct delivery programs, including GiveWell's recommended charities.  

    There's no minimum requirement for evidence for grantees' work. The Fund Manager (Elie) takes an expected value approach to calculating the potential impact of grants from the Global Health and Development Fund. He'll recommend higher-risk grants when they are more effective (in expectation) than GiveWell top charities. He'll fund GiveWell top charities if no such higher-expected-value grant opportunity exists.

    The list of past grants shows the breadth of programs that have received Global Health and Development Fund support. Here's a grant for an RCT on the effects of mask-wearing on COVID-19, a grant for a policy advocacy organization, and a grant to GiveWell top charities.

    This is a great option for donors who are open to taking calculated risks when the expected value appears higher than GiveWell's top charities, but are happy to support GiveWell's top charities if not. 
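The expected-value rule described above can be sketched as a toy comparison. This is only an illustration of the decision rule as stated in this thread, with entirely hypothetical numbers; it is not a model GiveWell actually uses.

```python
# Hypothetical sketch of the expected-value rule described above:
# recommend a higher-risk grant only if it beats GiveWell top charities
# in expectation; otherwise grant to top charities. Numbers are made up.

def expected_value(outcomes):
    """outcomes: list of (probability, impact-per-dollar) pairs."""
    return sum(p * v for p, v in outcomes)

top_charity = expected_value([(1.0, 1.0)])            # proven, low-variance
risky_grant = expected_value([(0.2, 8.0), (0.8, 0.0)])  # high-risk, high-upside

choice = "risky grant" if risky_grant > top_charity else "top charities"
print(top_charity, risky_grant, choice)  # the risky grant wins in expectation here
```

The point of the rule is that a grant with an 80% chance of achieving nothing can still be preferred, so long as its upside is large enough to pull its expected value above the proven baseline.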

We don't expect to change the overall portfolio of opportunities that we pursue in response to the allocation of donations between these two funds. In other words, as with most giving, there is fungibility. For example, if there are insufficient funds in the Global Health and Development Fund to support a high-leverage opportunity we want to fund, we would expect to seek funding for that opportunity from Open Philanthropy, with whom we work closely, or another donor. 

I hope that helps clarify!

comment by IanDavidMoss · 2021-03-23T23:51:09.928Z · EA(p) · GW(p)

I also found that confusing, for what it's worth.

comment by Jonas Vollmer · 2021-03-24T12:39:49.644Z · EA(p) · GW(p)

It's Elie + other GiveWell staff, most notably James Snowden.

comment by BrianTan · 2021-03-23T23:52:29.735Z · EA(p) · GW(p)

I think it was sensible that EA Funds did a private appointment process rather than a public application process for this round of new fund managers. 

But for future guest fund managers, would you consider having public applications for that? Quite a few people might be interested and a fit for such a role, but not closely connected with anyone who might nominate them to apply for the role. Maybe even just an expression of interest form would be something EA Funds could have, for people to signal they want to either be a guest fund manager or a permanent one.

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-24T12:50:07.674Z · EA(p) · GW(p)

Right now, I don't think this would be better than the alternative because:

  • To make the form useful, we'd have to ask quite a few questions, and people would have to spend a correspondingly large amount of time filling it in. This is already a big issue for the permanent fund managers, but the cost-benefit ratio looks even worse for guest managers. E.g., if 200 people fill in the form and spend 1 hour each, that's a 200-hour time cost for EA community members. Out of those people, we might only appoint ~5 as guest managers, each spending only ~50 hours on the fund. It's likely not worth spending 200 hours of EAs' time (plus time of the chairperson, EA Funds team, and CEA operations) on optimizing how we spend ~250 hours of guest managers' time.
  • I expect a lot of people would fill it in, so it would be a lot of work to sort through the replies. I think it would be better to spend the additional time getting more recommendations and reaching out to them, because it will involve fewer wasted applications, for the reasons outlined in the "Our process" section above:
  • Fund managers control large and fast-growing pools of money and have access to sensitive data. We want to feel confident in their judgment and trustworthiness based on their track record. We felt that it would be considerably harder to sufficiently vet these qualities in candidates who weren’t familiar to us or informal advisors.
  • We believe that fund managers with a large network in the relevant areas do substantially better. Given that we weren’t expanding into new areas, we thought we would likely learn about the strongest and most well-networked candidates through our extended networks. This seems to have been borne out in practice: applicants who were recommended directly to us on average performed substantially better on our anonymized work tests than applicants who were recommended by other candidates or community members.
  • Public hiring processes tend to be more effortful, both for us and the applicants. By reaching out to people who we thought might be a good fit, we hopefully saved a lot of time for ourselves and others.
  • The guest manager model is generally quite effortful already, and making it even more effortful might tip the balance from "overall worth it" to "overall not worth it". However, if it turns out to work very well and we find a way to make it more efficient, I think it'll be worth reconsidering a public application form.
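The back-of-the-envelope comparison in the first bullet above can be written out explicitly. This is a sketch using the illustrative figures from the comment, assuming each of the ~5 guest managers contributes ~50 hours:

```python
# Rough time-cost comparison for a public expression-of-interest form,
# using the illustrative figures from the comment above (not real data).
applicants = 200           # people who might fill in the form
hours_per_application = 1  # time each applicant spends on it
appointed = 5              # guest managers ultimately appointed
hours_per_manager = 50     # grantmaking time each contributes

community_cost = applicants * hours_per_application  # hours spent applying
manager_output = appointed * hours_per_manager       # hours of grantmaking gained

# The community would spend nearly as many hours applying (200) as the
# appointed guest managers would spend on grantmaking itself (250).
print(community_cost, manager_output, community_cost / manager_output)
```

On these numbers, roughly 0.8 hours of applicant time is spent per hour of guest-manager grantmaking, before counting evaluation time from the chairperson and EA Funds team, which is the core of the argument against a public form.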


That said, if you think there are significant benefits of having such a form that I'm overlooking, I'm keen to hear them and open to changing my mind and potentially implementing your suggestion.

Replies from: MichaelA, BrianTan
comment by MichaelA · 2021-03-25T04:36:38.326Z · EA(p) · GW(p)

FWIW, currently, I feel an intuition that the ratio of EA org network-based/private hiring rounds to EA org public hiring rounds is slightly too high.[1] But: 

  • I'm not sure how much I trust that intuition
  • I haven't thought it through in detail
  • The intuition isn't specific to EA Funds
  • I think you'd already be aware of the main considerations pushing in favour of my intuition
  • The reasoning you give sounds pretty valid

In any case, I definitely want to express appreciation for how it seems that, in these comment threads, you are consistently attempting to transparently explain your view and the reasoning behind it and clearly indicate that you'd welcome pushback. That seems really good for the EA Funds staying healthy and gradually improving over time.

[1] I feel that it may be worth noting that I formed this view after starting to be reached out to as part of a few network-based hiring rounds (or similar). 

comment by BrianTan · 2021-03-25T11:23:43.910Z · EA(p) · GW(p)

I guess as a personal example, I would be somewhat interested to apply to be a guest fund manager for the EA Infrastructure Fund. I'm not sure who within the EA Funds network would directly recommend or refer me to apply, hence an expression of interest form is the main way I can signal that I am interested and potentially a good fit for this role.

I think if you opened up public applications for the role, I would like to think I would be within the top 25-50% of applicants (I'm very uncertain here though, since I don't know who I'm applying against. Also, I'm likely to just be in the average candidate range, i.e. the 50% mark). But I think there's a small chance, maybe 5-10%, that I would be as good as the guest fund managers you would be willing to hire that apply through your private application process. And I think a 5-10% chance is good enough for me to spend up to 1 hour applying for the role.

I also want to be able to know what EA Funds looks for to gauge someone's grantmaking skill/ability, and going through an application process with questions about this would help me learn about my own grantmaking skills/abilities/fit too.

Also, an expression of interest form should hopefully only take 10 minutes to fill out, and you can contact the top 10-30 candidates (i.e. the people with the most reputable backgrounds) who fill it out to go through a longer, 1-hour application form.

I think my background is somewhat reputable enough within the EA community, but I don't know who in the EA Funds network would recommend me to apply to be a fund manager. And maybe there's ~10 other people in similar situations as me who exist, a couple of which could be a good fit to be a guest fund manager, if only they were recommended/referred, or if they could signal their interest through a short form!

Overall though, I understand EA Funds' viewpoint, and would understand if you continue to keep using a private application process. I guess someone who wanted to apply to be a guest fund manager or permanent fund manager could literally just email you though to signal that they are interested, so that they could be invited to apply. But having an expression of interest form could streamline that for you, and increase the chance that someone who's a good fit but isn't connected would still signal to EA Funds that they are interested.

Replies from: Jonas Vollmer, Maxdalton
comment by Jonas Vollmer · 2021-03-25T21:17:00.228Z · EA(p) · GW(p)

Thanks, that makes sense! Before we can do this, we'd probably need a more reliable and faster way to screen through applications. If we develop that, it could be a good idea.

I think our current process would definitely include candidates that are as well-networked in the EA community as you are. I personally was aware of some of your work, and some of our informal advisors have talked to you before I think.

Replies from: BrianTan
comment by BrianTan · 2021-03-26T00:42:02.486Z · EA(p) · GW(p)

Got it, thanks for explaining!

comment by MaxDalton (Maxdalton) · 2021-03-25T11:34:16.023Z · EA(p) · GW(p)

By the way, EA Funds ran this application process, and EA Funds now operates independently of CEA.

Replies from: BrianTan
comment by BrianTan · 2021-03-25T11:45:20.036Z · EA(p) · GW(p)

Thanks for clarifying! I edited my comment to say EA Funds instead of CEA now.

comment by Jonas Vollmer · 2021-07-06T16:59:44.499Z · EA(p) · GW(p)

I am very excited to announce that we have appointed Max Daniel as the chairperson at the EA Infrastructure Fund. We have been impressed with the high quality of his grant evaluations, public communications, and proactive thinking on the EAIF's future strategy. I look forward to having Max in this new role!

comment by BrianTan · 2021-03-23T14:24:43.630Z · EA(p) · GW(p)

Thanks for this post and congrats to the new fund managers! I wish the outgoing fund managers well too.

One quick question: Are all the fund manager roles still on a volunteer basis?

Also, I just saw the new EA Funds homepage. It looks good, and the copy seems well-written and well thought out to me. Kudos!

Replies from: abergal
comment by abergal · 2021-03-23T19:56:32.211Z · EA(p) · GW(p)

Fund managers can now opt to be compensated as contractors, at a rate of $40 / hour.

comment by Peter Wildeford (Peter_Hurford) · 2021-03-23T19:19:48.923Z · EA(p) · GW(p)

Why the secrecy around the identity of the guest managers?

Replies from: Habryka
comment by Habryka · 2021-03-23T19:25:55.571Z · EA(p) · GW(p)

I... don't know either. I think I can tell you who the guest managers are, at least I don't think anybody told me to keep it secret, but I will wait 24 hours for someone to object before I post it here.

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-23T20:46:27.457Z · EA(p) · GW(p)

Current guest managers:

  • EA Infrastructure Fund: Ben Kuhn.
  • Long-Term Future Fund: Daniel Eth, Ozzie Gooen, Evan Hubinger.

I've also edited the post to include that information.

There's no secrecy here. Their names will appear in the public payout reports in May. If there's a lot of interest from the community in always learning two months earlier who the guest managers are, we could consider having a public list somewhere. I thought the two-month delay probably wouldn't bother anyone and it's a bit of a hassle to keep the list continually updated (especially if the list of guest managers changes multiple times, as was the case for this round).

comment by MichaelA · 2021-03-25T04:07:33.020Z · EA(p) · GW(p)

We’re also experimenting with a new system of guest fund managers, allowing people who might be a good fit to provide input to the fund for a single grant round. We hope that this will give more people in the community an opportunity to improve their judgment, reasoning, and grantmaking skills, add additional viewpoint diversity to the grant evaluation process, and build a bank of strong candidates to potentially appoint as regular fund managers when we need more capacity.

Just want to quickly say that this seems like a really good idea, and I'm glad EA Funds are doing it.

comment by Cienna · 2021-03-23T14:17:37.480Z · EA(p) · GW(p)

Congratulations to y’all on your new roles and responsibilities!!!

comment by MichaelA · 2021-03-25T04:09:22.927Z · EA(p) · GW(p)

He has been researching and donating to effective charities since 2014, including making several grants as part of the 2017 donor lottery.

Some readers might be interested in checking out Adam's report about what he ended up donating his lottery "winnings" to and why. (See also Relative Impact of the First 10 EA Forum Prize Winners, which provides some commentary on the value of donations and the writeup.)

comment by ofer · 2021-03-30T09:35:34.111Z · EA(p) · GW(p)

Committee members recused themselves from some discussions and decisions in accordance with our conflict of interest policy.

Is that policy public?

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-30T13:42:31.154Z · EA(p) · GW(p)

Not yet, but I hope to publish it soon. (Sometime this year, ideally within the next few weeks.)

Replies from: ofer
comment by ofer · 2022-01-10T16:50:35.068Z · EA(p) · GW(p)

Hi Jonas,

Is there still an intention to make the EA Funds Conflict of Interest policy public?

Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2022-01-13T11:34:40.489Z · EA(p) · GW(p)

Yes, we still do have that intention. We're currently thinly staffed, so I think it'll still take a while for us to publish a polished policy. For now, here's the current beta version of our internal Conflict of Interest policy:

Conflict of interest policy

We are still working on the conflict of interest policy. For now, please stick to the following:

  • Please follow these two high-level principles:
    • 1. Avoid perceived or actual conflicts of interest, as this can impair decision-making and permanently damage donor trust.
    • 2. Ensure all relevant information still gets shared. The conflict of interest policy shouldn’t significantly hamper our ability to exchange information or evaluate grants.
  • Any relationship that could cause significantly biased judgment (or the perception of that) constitutes a potential conflict of interest, e.g. romantic/sexual relationships, close work relationships, close friendships, or living together.
  • If you think you have a potential conflict of interest, disclose the conflict of interest to the chair of the fund. If you are the chair, disclose it to the ED. There is no requirement to specify the type of relationship (e.g., whether it’s a romantic relationship or friendship), but it can be useful to give a rough indication of how large the concern is in the situation at hand.
  • The default suggestion is that you recuse yourself from discussing the grant and voting on it. You should also unshare yourself from the evaluation document (without looking at its contents) or ask someone to do so, so the other fund managers can speak their mind freely. (If you are doing so as the chairperson, you should ask someone else to oversee the grant at a high level, e.g., the chairperson of another fund, or the ED.) For unusually significant conflicts of interest (e.g., best friend or partner), the voting takes place in the evaluation document instead of the Airtable. The other fund managers can still ask you for information or for your opinion on the grant, and you can still offer to give input (just like an informal reference/advisor).
  • If the above means we can’t evaluate a grant, we will consider forwarding the application to another high-quality grantmaker if possible. If delegating to such a grantmaker is difficult, and this policy would hamper the EA community’s ability to make a good decision, we prefer an evaluation with conflict of interest over none (or one that’s significantly worse). However, the chair and the EA Funds ED should carefully discuss such a case and consider taking additional measures before moving ahead.
  • There is no requirement to disclose conflicts of interest or recusals in our public grant payout reports, but you may choose to disclose if you find it useful.
  • The chairperson and/or ED take appropriate actions if the policy is violated, such as recommending greater caution in mild cases, issuing a warning in moderately bad cases, or removing the fund manager from their position and returning grant funding in severe cases.
  • Example 1: Someone applies for a grant related to community building, and one of the fund managers is close friends with one of the grantees. The fund manager recuses themselves, but still reads the “Application for Sharing” document that we usually send to references. They share their thoughts with the primary investigator, who adds their input to the evaluation document and ensures they're taken into account appropriately by the other fund managers.
  • Example 2: Someone applies for a technical AI safety research grant, and they’re housemates with the only person who is knowledgeable about the particular research area, and other grantmakers in this area are either too busy or don’t have great judgment (in our view). In that case, we would carefully discuss the conflict of interest, potentially involve further people (e.g., ED or the trustees), and then potentially make the grant anyway.
comment by MichaelA · 2021-03-25T04:27:39.153Z · EA(p) · GW(p)

The current fund managers predominantly have expertise in AI and macrostrategy. For grant evaluations related to other existential risks or longtermist causes, we plan to continue to get external advice.

I'm glad both that you explicitly acknowledge this potential limitation of the LTFF, and that you have that plan in place for addressing it. 

One of the handful of things I've previously felt a bit uncomfortable/uncertain about regarding the LTFF was that it (a) seemed to mostly have AI-focused fund managers and mostly(?) give to AI-related things, yet (b) presented itself as interested in a broader range of longtermist issues and didn't make it clear that it would focus much more on AI than on other things. 

I didn't see this as a major flaw, since: 

  • I do think AI should receive more longtermist attention than any single other topic (though not >50% of all longtermist attention)
  • The LTFF did also give to some other things
  • The LTFF did report all its payouts and at least snippets of its reasoning for each decision

But the situation still seemed not quite ideal. 

I still don't feel sure that the LTFF is doing everything it should on this front, but it now seems likelier that that's the case, or close to it.

We may also appoint fund managers who are experts in those areas, but given that the number of applications in other individual categories is relatively small, we tentatively prefer appointing fund managers with a generalist longtermist background.

  • That sounds reasonable to me.
  • I think I'd personally see it as ideal if the LTFF always had at least one member who focuses at least 50% of their efforts on longtermist priorities other than AI (e.g. biorisk, nuclear risk, forecasting, improving policy, global governance).
  • It seems a bit of a shame that that isn't currently the case.
    • I'm not saying EA Funds made bad hiring decisions; there are of course other considerations at play when deciding about specific candidates.
    • But I think Ozzie Gooen and maybe Daniel Eth would count, so having them as guest managers - as well as drawing on other people as advisors where relevant - seems to help on this front.
  • I don't think Oliver Habryka would count for what I have in mind, even though I assume he spends less than 50% of his time on AI.
      • Though I'm not at all saying he shouldn't be on the committee, and I've very much appreciated his detailed writeups about LTFF decisions.
  • And in any case, it doesn't seem very important to me that there's at least one person who focuses at least 50% of their efforts on a single, specific longtermist priority other than AI (as opposed to a grab bag of longtermist priorities other than AI). So "tentatively prefer[ring] appointing fund managers with a generalist longtermist background", rather than ones with expertise in a specific non-AI area, seems fine to me. 
Replies from: Jonas Vollmer
comment by Jonas Vollmer · 2021-03-25T08:31:40.996Z · EA(p) · GW(p)

Thanks, I personally agree with these points, and I think this is a useful input for our internal discussion.