Prioritization when size matters: Value of information

post by jh · 2022-01-07T05:16:50.324Z · EA · GW · No comments
Recently, I outlined a new framework [EA · GW] and model [EA · GW] that go beyond marginal, ITN-style thinking by accounting for the optimal size of funding. This framework was originally developed to help major donors and impact investors with their prioritization decisions. That said, the associated ideas may be useful to any donor. This post highlights one of these ideas.
The idea is that you should choose opportunities for further investigation based both on your current assessment of marginal effectiveness and on how much that assessment might change with further investigation. The reason: if an opportunity turns out to be great, you can put a lot of funding into it. That is, there is 'value of information'.
I think this is intuitive. It also follows from other heuristics, like 'explore and exploit'. Thus, many people in the EA community may already be acting in alignment with this idea.
However, I think there has been so much emphasis on marginal, ITN-style thinking in the EA community that it is worth highlighting this idea. 'Value of information' doesn't have to be a separate consideration or heuristic - it arises simply when you take the first step towards non-marginal thinking by also considering funding size.
To make this clear, suppose you are trying to choose which high-level opportunities to prioritize (e.g. cause areas, or even charities within an area). You have an initial assessment, $\hat{v}$, of the marginal impact per dollar, $v$, of an opportunity, perhaps from ITN-style assessments or direct cost-effectiveness analyses. You also expect that if you conduct further analyses then your assessment of $v$ will update with zero average expected change and standard deviation $\sigma$.

If you only prioritize based on $\hat{v}$, then there is zero value in doing further analysis, because the expected value of $v$ after doing that work is still $\hat{v}$.

However, all else equal, you will want to put more funding into opportunities with higher $v$. Conceptually, think of this as the optimal funding size depending linearly on $v$. Then the total impact you expect to generate with an opportunity will depend on the product of marginal impact per dollar and funding size, which is proportional to $v^2$.

So, you should prioritize opportunities based on their expected value after further analysis, $E[v^2] = \hat{v}^2 + \sigma^2$ (by the standard decomposition $E[v^2] = E[v]^2 + \mathrm{Var}(v)$). The value of information is $\sigma^2$.

This can result in some opportunities with low $\hat{v}$ but high $\sigma$ being top priorities.
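To see this numerically, here is a minimal Python sketch using NumPy. The two opportunities and their $(\hat{v}, \sigma)$ values are hypothetical, and the updates are assumed to be zero-mean and normally distributed: A has the higher current assessment, while B is more uncertain. Under the quadratic-impact model above, the simulated mean of $v^2$ matches $\hat{v}^2 + \sigma^2$, and B comes out ahead despite its lower $\hat{v}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical opportunities: (current assessment v_hat, update std sigma).
# A has a higher current marginal-impact estimate; B is more uncertain.
opportunities = {"A": (1.0, 0.2), "B": (0.8, 0.9)}

for name, (v_hat, sigma) in opportunities.items():
    # Simulate the post-analysis assessment: a zero-mean normal update around v_hat.
    v = v_hat + sigma * rng.normal(size=n)
    # With optimal funding size proportional to v, total impact scales as v^2.
    simulated = np.mean(v**2)
    analytic = v_hat**2 + sigma**2
    print(f"{name}: E[v^2] ~ {simulated:.3f} (analytic {analytic:.3f}), "
          f"value of information = {sigma**2:.2f}")

# Analytically: A gives 1.0^2 + 0.2^2 = 1.04, B gives 0.8^2 + 0.9^2 = 1.45,
# so the more uncertain opportunity B is the better target for investigation.
```

Note that B wins here entirely because of the $\sigma^2$ term; if you ranked by $\hat{v}$ alone, A would look better.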
One potential example of this is (or was) climate change. Over the past several years, I would say that climate has gone from being a relatively neglected cause area in EA (e.g. see this post [EA · GW]) to being an important part of the EA funding landscape.

The big strike against climate in a pure ITN-style framing is that it isn't neglected: almost everyone is aware of the issue and many funders are putting money towards it. However, it is such a big, varied topic that it seems reasonable to expect a high $\sigma$ from additional research. Given this, and with the benefit of hindsight, it seems correct that many EAs looked into climate (and continue to do so).
What other examples can you think of?