EA Infrastructure Fund: May–August 2021 grant recommendations

post by Max_Daniel, Buck, Michelle_Hutchinson, Chi, MichaelA · 2021-12-24T10:42:08.969Z · EA · GW · 19 comments

This is a link post for https://funds.effectivealtruism.org/funds/payouts/may-august-2021-ea-infrastructure-fund-grants


  Grant Recipients
    Grants evaluated by Michael Aird
    Grants evaluated by Max Daniel
    Grants evaluated by Michelle Hutchinson
    Grants evaluated by Buck Shlegeris
    Grants evaluated by Chi Nguyen


Between May and August 2021, the EA Infrastructure Fund recommended the following grants. These include both grants from our Q2 grant cycle and grants from the rolling applications process [EA · GW] we’ve adopted since. (Earlier grants are covered in the report on our Q1 grant cycle [EA · GW].) This post covers grant applications received before August 3rd (with the exception of one such grant that was decided later and will be included in the next report), and, in three cases, follow-up grants to grants that were awarded before the end of August.

Note that we’re providing less detailed payout reports than last round. This is because we’re constrained by fund manager time, and so more detailed payout reports trade off directly with our grantmaking ability. We feel that it is valuable for donors and applicants to understand the fund managers’ thinking, but that after the detailed payout reports from last round [EA · GW] and our AMA [EA · GW] earlier this year, there are diminishing returns from continuing to publish in-depth payout reports.

(Note that us being constrained by fund manager time does not mean that ‘EA is primarily vetting constrained.’ We think that receiving a larger number of applications proposing highly valuable projects could increase the overall impact of the EAIF’s grants much more than fund managers spending more time on evaluating current applications. In fact, we suspect marginal fund manager time would be best spent on active grantmaking, i.e., soliciting more such applications. In this sense, the bigger constraint seems to be the supply of high-impact project ideas, project founders, and/or matching projects to founders.)

Strategic updates include:

The EAIF has functionally spent down its funds: that is, the total grant funding we’d like to award for existing applications exceeds our available funds. You can help to close this funding gap by donating to the EAIF. However, donors who care about whether or not their donations would funge with Open Philanthropy grants to the EAIF may want to wait until we announce the outcome of the EAIF's full funding application with Open Philanthropy.

Despite the funding gap, applications to the EAIF remain open. We are confident that we can acquire additional funding, and in the meantime can refer time-sensitive grants to private funders.


Grant Recipients

Grants evaluated by Michael Aird

(For most of these grants, I (Michael) also made multiple forecasts that informed the grant decisions. I plan to look back on these forecasts later to help myself reflect on and improve my intuitions and reasoning. But I won’t report these forecasts below because many would be hard to interpret without much more context on the grant, some are sensitive, and we’re generally opting for fairly brief payout reports this round.)

Grants evaluated by Max Daniel

Grants evaluated by Michelle Hutchinson

Grants evaluated by Buck Shlegeris

Grants evaluated by Chi Nguyen


Comments sorted by top scores.

comment by Ozzie Gooen (oagr) · 2021-12-25T20:59:06.719Z · EA(p) · GW(p)

Really happy to see this! 

I wanted to leave my very-quick takes, in case they might be useful.
(Do flag if they're more annoying than beneficial, and feel free to ignore)


  • I'm impressed by the number of grants and the writeup.
  • It seems like EA community building is growing a lot and that's exciting.
  • I wonder if some of these grants/writeups were worth the time. The managers are pretty senior, and some of these grants are tiny (<$4k). At the very least, some of the writing seems a bit excessive. (It's valuable, for sure, but the opportunity cost is also high. This is clearly a nitpick.) 
  • Related to the above, I'm a bit curious whether some people might be undervaluing their work or undervaluing the benefits of more funding. Some of these numbers seem pretty low. That said, the fund would need more money to raise payments across the board. (I'd like to see this, but it's interesting that other funders aren't quickly coming in with that money)
  • I think it would have been a bit neater, from a funder perspective, if the longtermist/animals/welfare-specific parts would have been funded instead by those respective funds. I feel pretty mixed about having them here, because I'd expect it to make donations less promising for donors of any of the three preferences/beliefs.
  • The fact that it took 4 months from the end of the decision period (August) to now (December 25) is of course a natural thing to work to improve upon. I imagine it would have been fine if this would have been released in parts, so that more could have been released sooner. Also, I would have probably traded off less detail for having it sooner. 

On some specific grants:

Metaculus
Seems pretty safe and I look forward to the results. One small point is that I think Metaculus might be a C-corp startup; if this is the case, it could be neat if future donations came with equity. (Mainly a concern if donations increase and the investment is general-purpose.)

Effective Thesis
This surprised me. I like the idea of Effective Thesis, and haven't been keeping track of it closely. I'm curious how 3 FTE will be spent here. Mainly, I'm curious about just how effective this sort of intervention is, particularly at a larger scale.

Ewelina Tur
Therapy sessions seem like a really safe bet. There are lots of mental health issues in our community (and in all the other communities I know of). I feel a bit weird about subsidizing services, as opposed to just giving more money to the people who might need the services, but subsidization does have some advantages, especially early on.

Rachel Shu

So we decided to provide $25,000 for initial work on the series, in hopes that that would allow Rachel and Larissa to make enough progress to give us a better sense of what impacts the full project would have, helping us to thereby decide whether the remainder of the work should be funded.

I like seeing negotiation via the grantmaking process like this, as opposed to just saying yes/no. Trying to push for more small-scale projects at first seems pretty good.

Effective Institutions Project

we view the area of ‘improving institutions’ as requiring stronger strategic foundations before it can productively absorb significant amounts of resources

I think this is broadly reasonable, but I am curious on what work would really move the needle here. If it's the case that such work would reveal that this area is underfunded, then it would be very high-priority for someone to do this work, if it were obvious what exactly to do. (I'm biased, as I work in the area and would also like for this work to exist)


Replies from: Khorton, oagr, MichaelStJules
comment by Kirsten (Khorton) · 2021-12-26T11:31:12.482Z · EA(p) · GW(p)

I agree with the comment on small grants. I'd be happy to see policies to save grantmakers time, such as:

(a) the EA funds doesn't make grants of less than $5k, or

(b) Grants of less than $5k are only approved for specific categories and it's a yes/no thing where the write-up just lists "grants in category X"

I worry about the opportunity costs of grantmakers spending time evaluating very small grants. I'd like to encourage people who are thinking of applying for e.g. just web hosting to think through the next few steps they'll need for their charitable initiative (maybe they'll need web hosting, then a lawyer, then an accountant) and apply for a few next steps at once!

It's much better in my view if we can convince people to apply for a few steps on their charitable project rather than reapplying to the fund, with extra work for themselves and the grantmaker and significant time lost, every few months.

Replies from: MichaelA
comment by MichaelA · 2021-12-27T09:23:40.599Z · EA(p) · GW(p)

Here are some excerpts from the EAIF application page which might be of interest in this context:

Grant sizes are typically between $5,000 and $200,000, but can be as low as $1,000 and higher than $500,000. EA Funds can make grants to individuals, non-profit organizations, academic institutions, and other entities. You do not need to be based in the US or the UK to apply for a grant. If you are unsure whether you are eligible for a grant, please simply apply.


Please aim to submit as few applications as possible. E.g., new projects should apply every 2–6 months, established organizations once a year, unless there is a significant reason for submitting multiple applications. Please think ahead about possible further expenses and consider including contingency in your budget.

I think the second of those paragraphs is in line with you suggesting that people should apply for funding for a few steps at once, rather than just for the first step. And I think that's indeed often the best move.

That said, I'd personally prefer that people still feel that it's fine to apply for grants of just $1k-5k, if they think that's the best move in their case. This is because:

  • That can be the appropriate amount for some projects
    • Sometimes that just really is all someone needs for something. 
    • Or sometimes they really should start a project with a pilot / initial steps that will only cost roughly 1k-5k, and should only get funding for further work after that pilot / those initial steps are completed or partially completed. 
  • Small projects are still often fairly impactful
    • I think that, for approved grants, net positive impact will be positively correlated with grant size, but I'd guess the correlation is moderate rather than strong. Some small projects have outsized impact relative to their cost. 
  • It seems important that there's some mechanism for funding small things, and EA Funds seems like currently one of the best mechanisms for that
    • There's also mechanisms like being friends with pretty well off EAs who know your skills and plans well and are willing to donate to your work, but that will miss many people & projects that should get funded
    • Part of why this seems important is as a stepping stone towards more ambitious work; often a lot of the value of the small projects is giving someone a chance to test & build fit for some kind of work that they could then do more of later. If the initial steps weren't funded, the whole journey might not happen (so to speak). 
  • Small projects are often not very time-consuming to evaluate
    • There's a (I'd say) moderate correlation between grant size applied for and time spent on evaluating the grant. I'd guess we generally move fewer dollars per hour when looking at small grants than at big grants, so small grants are in some sense less efficient as a use of fund managers' time, but only moderately so. 

(These are just my personal quickly written views, and I acknowledge that many of those statements are quasi-quantitative yet vague and not based on systematically looking at data - hopefully it's useful anyway.)

Replies from: oagr
comment by Ozzie Gooen (oagr) · 2021-12-27T18:22:43.582Z · EA(p) · GW(p)

Good points. I think I agree that being able to offer grants in between $1k-$5k seems pretty useful. If they get to be a pain, I imagine there will be ways to lessen the marginal costs. 

comment by Ozzie Gooen (oagr) · 2021-12-25T21:00:19.126Z · EA(p) · GW(p)

I think I'd like to see more "quick takes" by more people on things like this. It's fine if they're rough, but it helps provide both a sanity check and a survey of what community members think.

comment by MichaelStJules · 2021-12-25T23:52:34.376Z · EA(p) · GW(p)

I think it would have been a bit neater, from a funder perspective, if the longtermist/animals/welfare-specific parts would have been funded instead by those respective funds. I feel pretty mixed about having them here, because I'd expect it to make donations less promising for donors of any of the three preferences/beliefs.


+1, although I can see some as pretty borderline, e.g. a seminar or course on longtermism or another cause is definitely still community building, and can bring in more community builders who might do broader EA community building. Brian Tan, Shen Javier, and AJ Sunglao ($11,000) is cause-specific (mental health), but doesn't really fit in the other funds (not that you've suggested they don't fit here). Funding work that supports multiple groups or unaffiliated individuals within an area that falls entirely under the scope of a single fund seems borderline, too.

comment by mic (michaelchen) · 2021-12-24T23:01:48.281Z · EA(p) · GW(p)

Anonymous ($3,900): This person works half-time on EA community building and their laptop broke; this funding allowed them to buy a new laptop and an external monitor. Due to the urgency in this case, a fund manager talked with the grantee and ascertained their productivity needs, and then we made an ad hoc decision. If we receive a larger volume of similar applications in the future, we will put together a more systematic policy for when we fund productivity-increasing equipment and for how to determine the size of such grants.

I trust that this grant was worthwhile, but from this description, it's not clear to me why buying a new laptop and external monitor requires $3,900 (e.g., maybe it was for a gaming laptop to run machine learning experiments). For future grant descriptions, it might be worthwhile to explain the cost when it seems surprisingly high.

Replies from: Habryka, MichaelA
comment by Habryka · 2021-12-25T04:22:43.833Z · EA(p) · GW(p)

Huh, I am surprised about this. My current guess is that anyone in a job that makes more than $80k a year (or produces >$80k of value) should spend at least $1-2k on their laptop (my general recommendation is to just get the maxxed out latest Macbook, maybe skipping the graphics card if they aren't planning to do anything graphics related). And then with two external monitors it seems very reasonable to go up to ~$4k. 

People use their laptop for 8+ hours a day. It seems very likely that it's worth spending a lot of money making that be better. 

Replies from: MichaelStJules
comment by MichaelStJules · 2021-12-25T08:26:36.962Z · EA(p) · GW(p)

I agree about the laptop price, but I think external monitors should only cost about $200 each, from a quick Google search. Seems like it should have been <$3K total.

Replies from: oagr
comment by Ozzie Gooen (oagr) · 2021-12-25T19:55:47.473Z · EA(p) · GW(p)

This is an incredibly minor point, but I normally recommend monitors in the $300 to $500 range. And for some cases $1k to $2k is fine. (For bright displays, extra high resolution, any sort of media work, or widescreen.) I find the $200 ones to be pretty low-brightness / low-resolution / small.

comment by MichaelA · 2021-12-26T08:45:42.874Z · EA(p) · GW(p)

(I'm one of the guest fund managers for this round, but this is my personal opinion only)

For future grant descriptions, it might be worthwhile to explain the cost when it seems surprisingly high.

I think this is a reasonable idea to float, but I'd disagree that it should be encouraged more than it is at the moment, especially for very small grants like that $3.9k one. Reasons include:

  • Fund managers' time has a quite high opportunity cost, so our default should be to not add more detail to these reports unless that seems very useful
  • If a given small grant turned out to cost 2 times as much as it should've, that'd hardly at all affect the cost-effectiveness of the relevant EA Fund. So it doesn't seem like very useful info for donors deciding whether to donate to the fund, nor a very useful thing to hold fund managers accountable on in hopes that that leads to them making better decisions.
    • One might think "Yes, this one grant was small, but if there were many such grants then it could be useful to check the value-for-money of all of them or just a random subsample of them." But even if there'd been 10 grants like this one, that'd still just be 39k out of the 1.7m paid out in this round, i.e. ~2%. So knowing whether those grants could've been smaller without reducing impact would still hardly affect estimates of the overall cost-effectiveness of EAIF in this round. 
    • Meanwhile, readers should probably be very uncertain about how much positive or negative impact came from lots of the grants, and I'd say that's where most of the action is in evaluating how good EAIF is as a donation opportunity.
  • I think readers should be far more uncertain about the impact of the purchase of the laptop & monitors than about whether that level of spending was necessary for achieving that impact. So if more detail was worth providing, I'd probably suggest starting with more detail on the former. (Like what will this person be doing, why does Buck think it'd be useful, and why does Buck in general think better equipment is valuable for increasing impact.)
  • I think one of the main sources of value for these payout reports is insights into what fund managers (and the people they consult with) see as valuable projects, why they think those projects are valuable, what their models of various cause areas and projects and such are, and how they think. I think cost breakdowns wouldn't help much with that.
    • Here are some things I'd be more excited about from this perspective (though they're probably still not worth the time):
      • Payout reports more clearly signalling how our (i.e., fund managers) excitement levels about different grants varied
      • Making clear our biggest reservation(s) about each grant
      • Giving Fermi estimates or forecasts of the impacts (positive or negative) of the grant
      • Spending longer writing out background thinking on a topic that a given grant connected to
        • E.g., me writing in more detail about my views of the paths to impact for Metaculus and for forecasting
  • Fund managers often don't know exactly what the cost breakdown will be, and I'm confident that that's a good thing.
    • (To be fair, if we switched to showing cost breakdowns in payout reports, we could just leave things as approximations.)
    • Reasons why this situation is good:
      • It's good if grantees have flexibility to make adjustments to precisely how they use the money
      • It's good if fund managers make decisions relatively quickly (in terms of calendar time and in terms of hours spent), and things like checking what specific models of laptop and external monitor would be chosen and how much they'd cost seems like it won't improve decisions enough to be worth the time
      • It's bad if grantees apply for precisely what their best guess of the required amount is, such that there's a (say) 25% chance they later realise that it would've been better to apply for more and the fund manager would've approved that, but now they have to either just use some of their personal savings, apply again, or spend less than would be ideal
        • So it's often best for people to apply for a bit more than their best guess of what they need
        • Applying again is less good than just having a larger original grant because it takes up some additional time from the grantee, the fund manager(s), and the ops people who process the grant, and because it creates another delay before the grantee gets the decision & money
  • I think many people who are in the EA community and are doing or are on track to do high value "direct work" are being too frugal with their money and insufficiently conscious of the value of their time. In other words, many people should be spending more to allow themselves to be more productive during their productive hours (e.g., by having a better computer or paying for books/software that would be useful) or to allow themselves to have more productive hours (e.g., spending less time searching for deals or free options, perhaps having a cleaner). I'd worry that encouraging cost breakdowns for small grants like this $3.9k one is the kind of thing that could exacerbate these problems.
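The share-of-round arithmetic in the comment above can be checked directly. (This is just a sketch of the numbers already stated there: the $3.9k grant, the hypothetical 10 similar grants, the ~$1.7m round total, and a hypothetical factor-of-two overspend.)

```python
# Quick check of the small-grants arithmetic from the comment above.
small_grant = 3_900        # USD: the laptop/monitor grant in question
n_small = 10               # hypothetical count of similar small grants
round_total = 1_700_000    # USD: approximate total paid out this round

share = n_small * small_grant / round_total
print(f"Share of round: {share:.1%}")   # ~2.3%

# Even if every such grant had cost twice what was needed, the excess
# spending would be only half that share of the round's total.
excess_if_double = share / 2
print(f"Potential excess: {excess_if_double:.1%}")  # ~1.1%
```

So even under the pessimistic assumption, the possible waste is on the order of 1% of the round, which is the comment's point about where donor attention is best spent.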
comment by Neel Nanda · 2021-12-24T15:08:38.617Z · EA(p) · GW(p)

This seems like a really exciting set of grants! It's great to see EAIF scaling up so rapidly.

Replies from: jared_m
comment by jared_m · 2021-12-26T13:46:57.003Z · EA(p) · GW(p)

Agreed. Chiming in that the microCOVID Project's calculator and work has been invaluable to our family since 2020. I don't know Rachel, Larissa, or others involved in the project - but they're in our personal pantheon of pandemic heroes. We lost one family member in NY to COVID. It's easy to imagine we and others would have experienced more loss, absent their work.

Could not pass up this opportunity to thank them publicly, and note how excited we are to watch any videos they produce as a result of this funding.

comment by henrith · 2021-12-25T12:45:02.458Z · EA(p) · GW(p)

Some feedback: This is a fun and interesting way to learn about things going on in the EA community so I appreciate you posting it to the forum.

To me the description lengths work well for this kind of post, as I trust I can find more information about most of the projects if/when I go look for it, and about specific decisions if you keep doing AMAs.

comment by AdamGleave · 2021-12-30T23:58:56.007Z · EA(p) · GW(p)

That being said, we might increase our funding threshold if we learn that few grants have been large successes, or if more funders are entering the space.

My intuition is that more funders entering the space should lower your bar for funding, as it'd imply there's generally more money in this space going after the same set of opportunities. I'm curious what the reasoning behind this is, e.g. unilateralist curse considerations?

Replies from: MichaelA
comment by MichaelA · 2021-12-31T08:33:29.587Z · EA(p) · GW(p)

My guess is that it's mostly that more funders being in the space increases the chance that good things get funded even if EAIF doesn't fund them, thus reducing the cost of false negatives (i.e., EAIF rejecting things that in reality should've been funded), thus reducing the cost of raising the bar. (But that's just a guess.)

comment by MichaelStJules · 2021-12-26T05:09:14.485Z · EA(p) · GW(p)
  • Due to an increase in the number of high-quality applications, we believe that grants from Open Philanthropy will be crucial for making sure that all sufficiently impactful projects in the EA infrastructure space can be funded. However, if grantseekers received funding from both Open Philanthropy and the EAIF, this could result in a total grant larger than deliberately chosen by Open Philanthropy. It could also duplicate effort between the funders, and grants to larger organizations tend to be outside our wheelhouse.
    • On the other hand, grantees with large budgets would ideally be supported by multiple funders, each contributing roughly ‘their fair share’.
  • Our tentative policy for responding to this challenge is to adopt a heuristic: by default, the EAIF will not fund organizations that are Open Philanthropy grantees and that plan to apply for renewed funding from Open Philanthropy in the future.
    • However, we will consider exceptions: if your organization is an Open Philanthropy grantee, please explain in your EAIF application why your funding request can’t be covered by past or future Open Philanthropy grants. Valid reasons can include unanticipated and time-sensitive opportunities that require a small-to-medium grant with a fast turnaround, or funding requests restricted to a different purpose than the activities supported by Open Philanthropy.

It seems like it would be better to decide this more on an individual basis (beyond the exceptions), depending on the exact reasons why Open Phil didn't fund them further, which you could ask them for (assuming this doesn't take too much of everyone's time). Besides only wanting to contribute 'their fair share' (donor coordination), they may also want to reduce (direct) dependence on Open Phil and have others vet these opportunities semi-independently. The organizations for which those were the only reasons Open Phil didn't fund them more are plausibly the best ones to donate marginal funds to even after Open Phil grants, and ruling them out could mean individual donors can do better by donating to them than to the EAIF. Of course, Open Phil also might not be aware of many of EAIF's grantees at all (or able to donate to them for various reasons), or Open Phil could make wrong decisions to not fund EAIF grantees it was aware of, so EAIF could therefore beat Open Phil by funding them.


For individuals supported by Open Philanthropy, e.g. through their Technology Policy Fellowship [EA · GW], AI Fellowship, or Early-Career Funding [EA · GW], there is no change. They continue to remain eligible for EAIF funding as before.

Is the different treatment here primarily because "grants to larger organizations tend to be outside our wheelhouse"? It seems like Open Phil should be less hesitant to fully fund these, because

  1. small EA donors are less likely to notice these opportunities,
  2. there's no public support test to maintain charitable status (it's not a charity at all), and
  3. leaving room for more funding here so that others have to re-vet many small opportunities is less efficient than having others re-vet fewer large opportunities.
Replies from: Davidmanheim, MichaelStJules
comment by Davidmanheim · 2021-12-26T06:57:39.927Z · EA(p) · GW(p)

I think that coordination costs are high, and this is a reasonable way to shortcut the problem by publicly telling Open Phil not to rely on EAIF to top up grants. If that should be changed for some reason, Open Phil can tell EAIF directly.

comment by MichaelStJules · 2021-12-26T09:45:27.966Z · EA(p) · GW(p)

I'd guess that individual donors planning to support the EAIF should top up Open Phil grantees, either with or instead of donating to the EAIF.

Based on the argument here [EA · GW] by Ben Todd, I think individual donors who donate to Open Phil grantees do at least as well as both Open Phil and the EAIF, if all of the following assumptions hold:

  1. Open Phil's judgement is sufficiently good,
  2. The reason Open Phil doesn't fund EAIF grantees directly is because Open Phil believes them to be less cost-effective on the margin than its own direct grantees, so, at least,
    1. not just due to legal reasons,
    2. not just due to falling outside the scope of all of their grantmaking areas,
    3. not just because Open Phil tends to avoid small grants of the size the EAIF makes, and
    4. not just because Open Phil was unaware of them, and
    5. not due to any combination of the above.
  3. Open Phil's grant sizes are sufficiently sensitive in the right way to how much funding its grantees get.

Basically, under these assumptions, Open Phil grantees are either most cost-effective on the margin and in expectation, or Open Phil responds by granting less to those grantees and spending that funding on the next best marginal opportunities, which may include the EAIF itself. If Open Phil thought another dollar to EAIF would have been better than the last dollar to one of its other grantees, it would have granted that dollar to the EAIF instead, and vice versa. The "vice versa", that EAIF is at least as good on the margin as Open Phil's last dollar to its other grantees, seems to contradict the conjunction of 1 (Open Phil's judgement) and 2 (Open Phil believes EAIF grantees are less cost-effective than Open Phil grantees).

So, under these same assumptions, individual donors who donate to the EAIF risk doing worse than Open Phil, since the EAIF has ruled out contributing to opportunities that are better in expectation than their actual EAIF grantees.


All of the above assumptions seem plausibly wrong. That being said, Open Phil could in principle avoid subpoints 2.1, 2.2 and 2.3 by just deferring more of its grantmaking to the EAIF.

Note that this doesn't require the EAIF managers to have worse judgement than Open Phil. The point is that Open Phil starts with the most promising opportunities it can, and the EAIF does not. You could think of Open Phil and the EAIF as one organization, and you can do better by topping up the best opportunities for which they left room for other donors than by adding to the marginal opportunities.
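The funging logic in this thread can be sketched as a toy model. (All numbers and the adjustment behaviour here are illustrative assumptions of mine, not claims about how Open Phil actually behaves.)

```python
# Toy model of donation funging: if the large funder adjusts its grant in
# response to individual donations, a gift to its grantee effectively funds
# the large funder's next-best marginal opportunity instead.

def donation_destination(individual_gift, funder_target, funder_adjusts):
    """Return (extra dollars reaching the grantee,
               dollars freed for the large funder's next-best use)."""
    if funder_adjusts:
        displaced = min(individual_gift, funder_target)
        return individual_gift - displaced, displaced
    return individual_gift, 0

# Hypothetical numbers: a $10k individual gift to a grantee the large
# funder planned to support with $50k.
print(donation_destination(10_000, 50_000, funder_adjusts=True))   # (0, 10000)
print(donation_destination(10_000, 50_000, funder_adjusts=False))  # (10000, 0)
```

Under full adjustment, the individual donor's marginal impact equals the large funder's marginal opportunity, which is the sense in which topping up its grantees "does at least as well" under the assumptions listed above.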