Blueprint for billionaires?

post by alsilverback · 2022-05-11T16:14:44.511Z · EA · GW · 4 comments

This is a question post.

Does EA or another cause prioritization org have a very rough, non-nuanced breakdown of how to spend, say, €100 billion for maximal effect?

Assume a modern-world context, and that the philanthropist has no philosophical, cause-area, timeframe, or other preferences; just optimize for impact.


answer by alexrjl · 2022-05-17T13:43:44.743Z · EA(p) · GW(p)

There are a few organisations that work with high-net-worth individuals to deploy their money, and my guess is that anyone with this kind of capital would be able to speak to all of them fairly easily. Some of them might be interesting for you to check out.

If it's actually 100B, though, that's bigger than the two biggest EA-adjacent foundations that currently exist, so talking to either of them would be sensible.

comment by Linch · 2022-05-17T15:54:50.139Z · EA(p) · GW(p)

Agreed. I think the best move as a new mega-billionaire in EA is to set up your own grantmaking apparatus to figure out where you sit in the funding landscape, so you can deploy funds judiciously. The second-best move would be to work closely with Open Phil and the Future Fund and learn from them.

comment by alsilverback · 2022-05-17T15:28:10.850Z · EA(p) · GW(p)

I'm aware of organizations and initiatives that bridge capital and impact. What I was looking for, however, is whether an oversimplified, rough impact portfolio exists. I stress 'oversimplified' and 'rough' because any such portfolio will depend on the donor's nature and preferences (the government of a poor country, a climate-activist billionaire, a tech company's CSR arm, etc.), but perhaps someone somewhere has made such a division with at least a hint of quantitative diligence and the breadth to take all the main thematic categories into consideration.


For example, if someone has 100 billion (or 100k), how much, very roughly, makes sense to give to long-term causes vs. alleviating current suffering, or to AI alignment vs. policy lobbying, etc.?


Any answer is probably wrong, but it helps in getting closer to the optimum.
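One way to make such a rough split concrete is to express it as a weighted portfolio over cause areas. This is a minimal sketch only: the categories and percentages below are made-up placeholders for illustration, not a recommendation from anyone in this thread.

```python
# Purely illustrative "impact portfolio" with hypothetical weights.
# The cause categories and percentages are placeholders, not recommendations.
budget = 100_000_000_000  # EUR 100B

portfolio = {
    "global health and development": 0.40,
    "long-term / existential risk reduction": 0.30,
    "AI alignment": 0.15,
    "policy lobbying": 0.10,
    "animal welfare": 0.05,
}

# Sanity check: the weights must sum to 1 so the whole budget is allocated.
assert abs(sum(portfolio.values()) - 1.0) < 1e-9

for cause, weight in portfolio.items():
    print(f"{cause}: {weight * budget:,.0f} EUR")
```

Even a toy breakdown like this makes the trade-offs explicit: changing one weight forces another down, which is the kind of quantitative diligence the question is asking whether anyone has attempted at scale.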

answer by Henry Howard · 2022-05-11T18:11:09.224Z · EA(p) · GW(p)

No, but we should aim to, because there are many billions in government and private funds that could be redirected to more effective causes.


Comments sorted by top scores.