Comments

Comment by morganlawless on Long-Term Future Fund: April 2019 grant recommendations · 2019-04-10T17:51:36.841Z · EA · GW

Thanks for the response. I don’t have the time to draft a reply this week but I’ll get back to you next week.

Comment by morganlawless on Long-Term Future Fund: April 2019 grant recommendations · 2019-04-08T21:05:06.720Z · EA · GW

Hello, first of all, thank you for engaging with my critique. I would like to clarify a few points in your summary of my claims.

  1. Ideally, yes. In the absence of externally transparent evidence, there should be strong explicit reasoning in favor of the grant.

  2. I think that there is no evidence that using $28k to purchase copies of HPMOR is the most cost-effective way to encourage Math Olympiad participants to work on the long-term future or engage with the existing community. I am not claiming it will be ineffective, simply that there is little reason to believe it will be more effective than other resources, whether in an absolute sense or in cost-effectiveness terms.

  3. I'm not sure about this, but this was the impression the forum post gave me. If this is not the case, then, as I said, this grant displaces some other $28k in funding. What will that other $28k go to?

  4. Not necessarily that risky funds shouldn't be recommended as the go-to option, although that would be one way of resolving the issue. My main problem is that it is not made abundantly clear that the Funds often make risky grants, so there is a lack of transparency for an EA newcomer. While this applies particularly to the Long-Term Future Fund, since evidence about the long-term future is inherently harder to obtain, it applies to all the other funds as well.

Comment by morganlawless on Long-Term Future Fund: April 2019 grant recommendations · 2019-04-08T20:36:18.083Z · EA · GW

Mr. Habryka,

I do not believe the $28,000 grant to buy copies of HPMOR meets the evidential standard demanded by effective altruism. “Effective altruism is about answering one simple question: how can we use our resources to help others the most? Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on.” With all due respect, it seems to me that this grant feels right but lacks evidence and careful analysis.

The Effective Altruism Funds are "for maximizing the effectiveness of your donations" according to the homepage. This grant's claim that buying copies of HPMOR is among the most effective ways to donate $28,000 by way of improving the long-term future rightly demands a high standard of evidence.

You make two principal arguments to justify the grant. First, the books will encourage the Math Olympiad winners to join the EA community. Second, the books will teach them important reasoning skills.

If the goal is to encourage Math Olympiad winners to join the Effective Altruism community, why are they being given a book that has little explicitly to do with Effective Altruism? _The Life You Can Save_, _Doing Good Better_, and _80,000 Hours_ are three books much more relevant to Effective Altruism than _Harry Potter and the Methods of Rationality_. Furthermore, they are much cheaper than the $43 per copy of HPMOR. Even if one argues that HPMOR is more effective at encouraging Effective Altruism (a claim I doubt, and one substantiated nowhere), one must also show that the difference in effectiveness justifies the difference in cost between a copy of HPMOR and any of the other books I mentioned. It is quite possible that sending the Math Olympiad winners a link to Peter Singer’s TED Talk, “The why and how of effective altruism”, would encourage effective altruism more than HPMOR. It is also free!
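
To put the cost comparison in rough numbers, here is a back-of-the-envelope sketch (the $15 price for an alternative paperback is my assumption, not a figure from the grant write-up):

$$
\frac{\$28{,}000}{\$43 \text{ per copy}} \approx 651 \text{ copies}, \qquad 651 \times \$15 \approx \$9{,}765
$$

On those assumed numbers, reaching the same roughly 650 recipients with one of the cheaper books would leave something like $18,000 for other outreach.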

If the goal is to teach Math Olympiad winners important reasoning skills, then I question this goal. They just won the Math Olympiad. If any group of people already had well-developed logic and reasoning skills, it would be them. I don’t doubt that they already have a strong grasp of Bayes’ rule.

I also want to point out that, if I understand the situation correctly, EA Russia made oral agreements to give out copies of the book before securing funding, which is deeply unsettling. Why are promises being made before funding is secured? This is not how a well-run organization or movement operates. If EA Russia did have funding to buy the books and this grant is displacing that funding, then what will EA Russia spend the original $28,000 on? This information is necessary to evaluate the effectiveness of this grant and should not be absent.

I have no idea who Mikhail Yagudin is, so I have no reason to suspect anything untoward, but the fact that you do not know him or his team compounds this grant’s problems, as you yourself acknowledge.

I understand that the EA Funds are thought of as vehicles for funding higher-risk and more uncertain causes. In the words of James Snowden and Elie Hassenfeld, “some donors give to this fund because they want to signal support for GiveWell making grants which are more difficult to justify and rely on more subjective judgment calls, but have the potential for greater impact than our top charities.” They were referring to GiveWell and the Global Health and Development Fund, but I think you would agree that this appetite for riskier donations applies to the other funds as well, including the Long-Term Future Fund.

However, higher risk and uncertainty do not mean no evidentiary standards at all. On the contrary, when there is no empirical evidence to draw on, uncertain grants such as this one should be accompanied by an abundance of strong intuitive reasoning. For the reasons I gave in the prior paragraphs, the reasoning outlined in the forum post does not meet that standard in my view.

More broadly, I think this grant would hurt the EA community. Returning to the quote I began with: “Effective altruism is about answering one simple question: how can we use our resources to help others the most? Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on.” If I were a newcomer to the EA community and saw this grant and its associated rationale, I would be utterly disenchanted with the entire movement. I would rightly doubt that this is among the most effective ways to spend $28,000 to improve the long-term future, and I would notice the absence of “evidence and careful analysis”. If effective altruism does not demand greater rigor than other charities, there is no reason for a newcomer to join the effective altruism movement.

So what should be done?

  1. This grant should be directed elsewhere. EA Russia can find other funding to honor its oral promise, a promise that should not have been made without funding already secured.

  2. EA Funds cannot be both a vehicle for riskier donations and the go-to recommendation for effective donations, as is stated in the Introduction to Effective Altruism. This undermines transparency about what a newcomer should expect when donating. That is not the fault of this grant alone, but the grant is emblematic of the broader problem. I also want to reiterate that, even when judged as a riskier donation, this grant does not meet the evidentiary standard.

Sincerely, Morgan Lawless