Help me understand this expected value calculation
post by AndreaSR · 2021-10-14T06:23:33.750Z · EA · GW · 8 comments
This is a question post.
I'm looking at one of Bostrom's papers (Existential Risk Prevention as Global Priority, p. 19). He includes this expected value calculation which I just can't make sense of:
"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] a mere 1 per cent chance of being correct, we find that the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives."
When I try to repeat his calculation, I reason as follows. "One billionth of one billionth of one percentage point" is 10^-9 × 10^-9 × 10^-2 = 10^-20, and a 1% credence in 10^52 lives gives 10^50 expected lives. Reducing the risk of losing those 10^50 expected lives by 10^-20 is the same as increasing the probability of getting them by 10^-20, so the expected value of the risk reduction should be 10^50 × 10^-20 = 10^30 lives. However, Bostrom's figure – "a hundred billion times as much as a billion human lives" – works out to 10^11 × 10^9 = 10^20 lives. It's a fairly trivial calculation, so I assume there's something obvious I've overlooked. Can you help me see what I'm missing?
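To make sure I'm not making a slip somewhere, here is my arithmetic written out in code (this assumes my reading of "one billionth of one billionth of one percentage point" as 10^-20 is the intended one):

```python
# My attempt to reproduce the numbers in Bostrom's passage.

# Lower bound on future lives, discounted by the 1% credence:
expected_lives = 10**52 // 100                        # 10^50

# "One billionth of one billionth of one percentage point":
# 10^-9 * 10^-9 * 10^-2 = 10^-20, so divide by 10^20.
risk_reduction_inverse = 10**9 * 10**9 * 10**2        # 10^20

# Expected value of that risk reduction, in lives:
my_result = expected_lives // risk_reduction_inverse  # 10^30

# Bostrom's stated figure: "a hundred billion times as much
# as a billion human lives":
bostrom_figure = 10**11 * 10**9                       # 10^20

print(my_result == 10**30)       # True
print(bostrom_figure == 10**20)  # True
```

So the two figures really do differ by a factor of 10^10 on my reading, which is why I suspect I'm misinterpreting one of the quantities rather than miscalculating.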