peterbarnett's Shortform

post by peterbarnett · 2021-03-08T23:03:29.429Z · EA · GW · 4 comments


Comments sorted by top scores.

comment by peterbarnett · 2021-03-08T23:03:29.788Z · EA(p) · GW(p)

Flipping the Repugnant Conclusion

Imagine a world populated by many, many (trillions of) people. These people's lives aren't purely full of joy; they contain a lot of misery as well. But each person thinks their life is worth living. Their lives might be a bit boring, or they might be full of huge ups and downs, but on the whole they are net-positive.

From this view it seems really strange to think it would be good for every person in this world to die/not exist/never have existed in order to allow a very small number of privileged people to live spectacular lives. It seems bad to stop many people from living lives they mostly enjoy in order to allow the flourishing of the few.

I think this hypothetical is a decent intuition pump for why the Repugnant Conclusion isn't actually repugnant. But I do think it might be a little bit dishonest or manipulative. It frames the situation in terms of fairness and equality; we sympathize with the many slightly happy people who are being denied the right to exist, and think of the few extremely happy people as a privileged elite. It also takes advantage of status quo bias; by beginning with the many slightly happy people, it seems worse to then 'remove' them.

Replies from: MHarris
comment by MHarris · 2021-03-11T14:54:58.259Z · EA(p) · GW(p)

I've always thought the Repugnant Conclusion was mostly status quo bias, anyway, combined with the difficulty of imagining what such a future would actually be like.

I think the Utility Monster is a similar issue. Maybe it would be possible to create something with a much richer experience set than humans, which should be valued more highly. But any such being would actually be pretty awesome, so we shouldn't resent giving it a greater share of resources.

Replies from: Daniel_Eth
comment by Daniel_Eth · 2021-03-22T21:08:19.326Z · EA(p) · GW(p)

Humans seem like (plausible) utility monsters compared to ants, and many religious people have a conception of God that would make Him a utility monster ("maybe you don't like prayer and following all these rules, but you can't even conceive of the - 'joy' doesn't even do it justice - how much grander it is to God when we follow these rules than even the best experiences in our whole lives!"). Anti-utility-monster sentiments seem to largely come from a place where someone imagines a human who is pretty happy by human standards, thinks the words "orders of magnitude happier than what any human feels", and then fails to notice that their intuition doesn't track the words "orders of magnitude".

comment by James Smith · 2021-03-11T14:10:01.415Z · EA(p) · GW(p)

I like this perspective. I've never really understood why people find the repugnant conclusion repugnant!