## Posts

What reason is there NOT to accept Pascal's Wager? 2022-08-04T14:29:29.155Z

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-07T12:37:10.431Z · EA · GW

The only point I was making was that not all versions of God are equally likely, so the possible utilities of heaven and hell don't cancel. I don't know what the most likely form of God is, but it sounds like we both agree that not all of them are equally likely.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-07T12:30:54.396Z · EA · GW

I clarified in my edit at the top of my post what I mean by "accept Pascal's Wager". To repeat: I see it as accepting the idea that the way to do the most (expected) good is to prevent as many people as possible from going to hell, and to cause as many as possible to go to heaven, regardless of how likely it is that heaven/hell exists (as long as the probability is non-zero).

As for what this entails, I have no idea. For now I'm just trying to decide whether or not to pursue this aim. How I would actually do that comes later, if I choose to accept.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-07T12:21:29.773Z · EA · GW

Oh wait, sorry, I got confused with a totally different comment that did add an extra assumption. My bad...

As for the actual comment this thread is about: expected value theory can be derived from the axioms of VNM-rationality (which I admittedly know nothing about), whereas proposition 3 isn't really based on anything as far as I'm aware; it's just a vague axiom unto itself. I feel we should refrain from using intuitions as much as possible except when forced to at the most fundamental level of logic: like how we don't just assume 1+1=2, but reduce it to a more fundamental set of assumptions, the ZFC axioms.

In summary, propositions 1 and 3 are mutually exclusive, and I think 1 should be accepted more readily due to it being founded in a more fundamental level of assumptions.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-06T13:05:48.060Z · EA · GW

I suppose I could see reason to make this assumption, given that you could get used to the luxuries of heaven and they would start to be less pleasurable. However, this doesn't really eliminate the problem: there's still the possibility that the assumption is incorrect, meaning the probability of an infinite payoff is still non-zero and the wager still stands.
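The arithmetic behind this point can be sketched in a few lines (a toy illustration of my own; the probability values are made-up, not anyone's actual credences):

```python
import math

# Toy model of the wager's arithmetic: any non-zero credence in an
# infinite payoff makes the expected value infinite, which swamps
# every finite alternative. All numbers below are hypothetical.
p_heaven = 1e-9               # made-up tiny credence that heaven exists
payoff_heaven = math.inf      # infinite reward if it does
payoff_mundane = 1_000_000.0  # a large but finite worldly payoff

ev_wager = p_heaven * payoff_heaven  # any positive multiple of inf is inf
ev_mundane = payoff_mundane          # finite, however large

print(ev_wager > ev_mundane)  # True, no matter how small p_heaven is
```

Shrinking `p_heaven` to any positive value leaves `ev_wager` infinite, which is exactly why the wager "still stands" as long as the probability is non-zero.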

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-06T13:01:42.333Z · EA · GW

That's true, but I think we need to make the fewest intuition-based assumptions possible. Yitz's suggestion adds an extra assumption ON TOP of expected value theory, so I would need a reason to add that assumption.

Oops, I got mixed up and that response related to a totally different comment. See my reply below for my actual response.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T17:41:17.731Z · EA · GW

This argument is one that makes intuitive sense, and of course I am no exception to that intuition. However, intuition is not the path to truth; logic is. Unless you can provide a logic-founded reason why an almost-certain loss with a minuscule chance of a huge win is worse than an unlikely loss with a probable win, I can't accept the argument.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T17:28:35.025Z · EA · GW

OMG this is EXACTLY the kind of reply I was looking for, and more. Thank you so much!

Since I'm pretty new to philosophy, I believe what you say even though I don't understand it. You have given me a ton of invaluable starting points from which I can now begin learning how to answer these kinds of questions myself.

You can be fairly confident that your comment will end up triggering a major (and probably inevitable) turning point in my philosophical journey, and therefore my life, since it sounds like utilitarianism in the form I have always followed is flawed and will need to be revised or even scrapped entirely.

Once again, thanks so much!

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T16:52:38.012Z · EA · GW

Please could you elaborate on the relevance to Pascal's Wager? I don't see who is "out to get you" in Pascal's Wager.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T16:42:00.136Z · EA · GW

I see what you're saying but you'd need to provide me with a reason to accept your axiom.

Since I'm a moral realist, you'd have to convince me that it is likely to be true, rather than simply that it is convenient.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T16:32:16.802Z · EA · GW

Well, the only existing evidence for the nature of a God, given that one exists, is the beliefs billions of people have held over thousands of years. This evidence suggests (however weakly) that God is as they think it is. In the absence of any other evidence, this means it is more likely that God is as they think than anything else.

(Especially so when you consider how many people have believed these things and over how much time; surely it's reasonable to consider the possibility that they are right. [I think I might be talking about "epistemic humility", but I'm not familiar with the terminology.])

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-05T15:35:14.717Z · EA · GW

While I still disagree that the decision is non-binary, you do bring up a possibility I hadn't thought of: NO ACTION could be the BEST ACTION, if practicing the wrong religion makes you more likely to go to hell and less likely to go to heaven.

Although now that I think about it, that wouldn't imply no action; rather, it would imply that you should encourage atheism, encourage behaviour generally agreed upon across religions, and possibly convert people from one religion to a more likely one.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T17:05:32.441Z · EA · GW

The way I see it, the wager IS binary, but the choice is "act as though heaven/hell exists: yes or no". If you answer "yes", then of course there are multiple ways to proceed from that point, but that doesn't mean the wager itself isn't binary.

If I decide to accept the wager, the next step will be a WHOLE other thing and definitely not binary.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T16:58:17.496Z · EA · GW

I mean accepting that the way to do the most good (in terms of expected value) is to prevent as many people as possible from going to hell and cause as many as possible to go to heaven.

As for what this would entail, I have no idea because I'm pretty uninformed when it comes to religion.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T16:53:23.334Z · EA · GW

Obviously pieces of the Bible can be used to justify any viewpoint, but I think it's at least worth mentioning this one verse that points directly against the Christian God being evidentialist:

John 20:29
Jesus said to him, "Because you have seen me, you have believed. Blessed are those who have not seen, and have believed."

I see this as saying that doubting your faith by needing evidence is less noble than having full trust in your faith by not requiring evidence. In other words, true faith doesn't need evidence.

I found this quote when someone pointed it out displayed at the front of a church, and regardless of its relevance to this conversation, I think it's a fascinating verse, especially since it was considered important enough for this church to place in large writing at its entrance.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T14:59:30.782Z · EA · GW

Huh, that's an interesting position that I wish I could agree with, but I just can't see why beliefs billions of people have had for thousands of years would be less likely to be true than a God who does in fact exist but is totally different from what everyone thought and instead rewards... reason?

Do you think you could elaborate on why this Evidentialist God seems more likely to you?

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T14:50:31.175Z · EA · GW

Ahh yes your last paragraph is a good point that I hadn't considered. It doesn't convince me that I should reject the wager, but it does mean that I shouldn't take extreme actions that go against most people's moral beliefs in pursuit of these types of wagers.

Comment by Khai on What reason is there NOT to accept Pascal's Wager? · 2022-08-04T14:38:56.074Z · EA · GW

:( this is not the answer I was hoping for... (I don't believe in heaven or hell so the prospect of accepting the wager is a bit depressing)

Thanks a lot though for the response and the really helpful link!

Comment by Khai on EA Survey 2020: Cause Prioritization · 2022-06-05T08:25:41.777Z · EA · GW

It would be great to see stats on how many people identify as longtermist, shorttermist, or neither.

This seems to be a major divide in cause prioritisation, and questions are often raised about how much of the community is longtermist, so this information seems like it would be very valuable.

Comment by Khai on Tiny Probabilities of Vast Utilities: Defusing the Initial Worry and Steelmanning the Problem · 2022-04-24T07:26:24.944Z · EA · GW

Huh, it's concerning that you say you see standard utilitarianism as wrong, because I have no idea what to believe if not utilitarianism.

Do you know where I can find out more about the "undefined" issue? This is pretty much the most important thing for me to understand, since my conclusion will fundamentally determine my behaviour for the rest of my life, yet I can't find any information except for your posts.

Thanks so much for your response and posts. They've been hugely helpful to me.

Comment by Khai on Tiny Probabilities of Vast Utilities: Defusing the Initial Worry and Steelmanning the Problem · 2022-04-21T17:41:06.762Z · EA · GW

I am not very experienced in philosophy but I have a question.

You present a problem that needs solving: the funnel-shaped action profiles lead to undefined expected utility. You say that this means we must adjust our reasoning so that we avoid that conclusion.

But why do you assume that this cannot simply be the correct conclusion from utilitarianism? Can we not say that we have taken the principal axioms of utilitarianism and, through correct logical steps, deduced from those axiomatic truths that expected utility is undefined for all our decisions?
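The "undefined expected utility" situation can be illustrated with a St. Petersburg-style toy example (my own sketch, not taken from the post):

```python
# St. Petersburg-style toy example (my own sketch) of how an expected
# utility can fail to be finite, or fail to be defined at all.

def partial_ev_all_positive(n_terms):
    # Payoff 2**n with probability 2**-n: every term contributes
    # exactly 1, so the partial sums grow without bound (EV = +inf).
    return sum((2 ** -n) * (2 ** n) for n in range(1, n_terms + 1))

def partial_ev_mixed_signs(n_terms):
    # Payoff (-2)**n with probability 2**-n: the terms are -1, +1,
    # -1, ... so the partial sums oscillate forever and the expected
    # utility is genuinely undefined, not merely infinite.
    return sum((2 ** -n) * ((-2) ** n) for n in range(1, n_terms + 1))

print(partial_ev_all_positive(10))  # 10.0 (adds 1 per term; no limit)
print(partial_ev_mixed_signs(9))    # -1.0
print(partial_ev_mixed_signs(10))   # 0.0 (the sums never settle)
```

The mixed-sign case is the troubling one: the answer depends on the order in which you add the terms, so there is no fact of the matter about the expected utility.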

To me, the next step after reaching this point would not be to change my reasoning (which requires assuming that the logical steps applied were incorrect, no?) but rather to reject the axioms of utilitarianism, since they have rendered themselves ethically useless.

I have a fundamental ethical reasoning that I would guess is pretty common here. It is this: given what we know about our deterministic (and maybe probabilistic) universe, there is nothing to suggest the existence of such things as good/bad or right/wrong choices, so we come to the conclusion that nothing matters. However, this is obviously useless, and if nothing matters anyway then we might as well live by a kind of "next best" ethical philosophy that does provide us with right/wrong choices, just in case of the minuscule chance that it is indeed correct.

However, you seem to have suggested that utilitarianism just takes you back to the "nothing matters" situation, which would mean we have to go to the "next next best" ethical philosophy.

Hmm, I just realised your post has fundamentally changed every ethical decision of my life...

I would greatly appreciate it if anyone answered my question, not only the OP. Thanks!