Help me understand this expected value calculation

post by AndreaSR · 2021-10-14T06:23:33.750Z · 8 comments

This is a question post.

Hi there!

I'm looking at one of Bostrom's papers (Existential Risk Prevention as Global Priority, p. 19). He includes this expected value calculation which I just can't make sense of:

"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] a mere 1 per cent chance of being correct, we find that the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives."

When trying to repeat his calculation, I reason as follows: 10^52 lives at a 1% chance gives 10^50 expected lives, and one billionth of one billionth of one percentage point is 10^-9 * 10^-9 * 10^-2 = 10^-20. Reducing the risk of losing 10^50 expected lives by 10^-20 is the same as increasing the probability of getting 10^50 by 10^-20, so the expected value of the change should be 10^50 * 10^-20 = 10^30 lives. However, his stated figure of a hundred billion times a billion lives is only 10^2 * 10^9 * 10^9 = 10^20 lives. It's a fairly trivial calculation, so I assume there's something obvious I've overlooked. Can you help me see what I'm missing?
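To make it concrete, here's a minimal Python sketch of the arithmetic, assuming the short-scale billion (10^9); exact integer fractions so there's no floating-point noise:

```python
from fractions import Fraction

billion = 10**9                  # short-scale billion
percent = Fraction(1, 100)

lives = 10**52                   # Bostrom's lower bound on future lives
credence = percent               # "a mere 1 per cent chance of being correct"
risk_reduction = Fraction(1, billion) * Fraction(1, billion) * percent  # 10^-20

print(lives * credence * risk_reduction)   # 10**30: what I get
print(100 * billion * billion)             # 10**20: "a hundred billion times ... a billion lives"
```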

Answers

answer by mic (michaelchen) · 2021-10-14T11:41:56.599Z

Your calculation looks correct to me. (WolframAlpha confirms "10^52 * 1% * 1 billionth * 1 billionth * 1%" is 10^30.) It seems that Nick Bostrom is underestimating the expected value by a factor of 10^10.
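To spell out the size of the gap between the two figures:

```python
print(10**30 // 10**20)   # 10000000000, i.e. a factor of ten billion
```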

comment by JP Addison (jpaddison) · 2021-10-14T11:54:52.384Z

A minor factor of ten billion 😉

comment by Linch · 2021-10-14T17:24:17.189Z

A mere order of magnitude of an order of magnitude!

comment by AndreaSR · 2021-10-15T17:11:02.594Z

Thanks for your reply. I'm glad my calculation doesn't seem way off. Still, it feels like too obvious a mistake not to have been caught, if it indeed were a mistake...

8 comments


comment by Linch · 2021-10-14T17:23:42.828Z

Not an excuse, but maybe Bostrom was using the old British definition of "billion" (10^12), rather than the American and modern British definition (10^9)?

comment by Larks · 2021-10-14T21:44:13.190Z

Even then it seems off?

"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] (+52) a mere 1 per cent chance (-2) of being correct, we find that the expected value of reducing existential risk by a mere one billionth (-12) of one billionth (-12) of one percentage point (-2) is worth (=) a hundred (+2) billion (+12) times as much as a billion (+12) human lives."

52 - 2 - 12 - 12 - 2 = 24 != 26 = 2 + 12 + 12
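Or, tallying the same exponents in a quick Python sketch under that long-scale reading (billion = 10^12):

```python
# Long-scale reading: one billion = 10**12, so each "billionth" contributes -12.
lhs = 52 - 2 - 12 - 12 - 2   # 10^52 lives, 1% credence, billionth x billionth x percentage point
rhs = 2 + 12 + 12            # "a hundred billion times as much as a billion human lives"
print(lhs, rhs)              # 24 26: still two orders of magnitude apart
```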

comment by Linch · 2021-10-14T22:34:11.943Z

Sure, but what's 2 OOMs between friends?

comment by AndreaSR · 2021-10-15T17:11:56.765Z

Yeah, I've had the same thought. But as far as I can tell, it still doesn't add up, so I figured there must be something else going on. Thanks for your reply, though.