Practical ethics given moral uncertainty

post by William_MacAskill · 2012-01-31T05:00:01.000Z · EA · GW · Legacy · 5 comments

Practical ethics aims to offer advice to decision-makers embedded in the real world. In order to make the advice practical, it typically takes empirical uncertainty into account. For example, we don’t currently know exactly to what extent the earth’s temperature will rise, if we are to continue to emit CO2 at the rate we have been emitting so far. The temperature rise might be small, in which case the consequences would not be dire. Or the temperature rise might be very great, in which case the consequences could be catastrophic. To what extent we ought to mitigate our CO2 emissions depends crucially on this factual question. But it’s of course not true that we are unable to offer any practical advice in absence of knowledge concerning this factual question. It’s just that our advice will concern what one ought to do in light of uncertainty about the facts.

But if practical ethics should take empirical uncertainty into account, surely it should take moral uncertainty into account as well. In many situations, we don’t know all the moral facts. I think it is fair to say, for example, that we don’t currently know exactly how to weigh the interests of future generations against the interests of current generations. But this issue is just as relevant to the question of how one ought to act in response to climate change as is the issue of expected temperature rise. If the ethics of climate change offers advice about how best to act given empirical uncertainty concerning global temperature rise, it should also offer advice about how best to act, given uncertainty concerning the value of future generations.

Cases such as the above aren’t rare. Given the existence of widespread disagreement within ethics, and given the difficulty of the subject-matter, we would be overconfident if we were to claim to be 100% certain in our favoured moral view, especially when it comes to the difficult issues that ethicists often discuss.

So we need to have an account of how one ought to act under moral uncertainty. The standard account of making decisions under uncertainty is that you ought to maximise expected value: consider every hypothesis in which you have some degree of belief, work out the probability of each hypothesis, work out how much value would be at stake if that hypothesis were true, and then trade off the probability of a hypothesis being true against how much would be at stake if it were true. One implication of maximising expected value is that sometimes one should refrain from a course of action, not on the basis that it will probably be a bad thing to do, but rather because there is a reasonable chance that it will be a bad thing to do, and, if it is a bad thing to do, then it’s really bad. So, for example, you ought not to speed round blind corners: the reason isn’t that you are likely to run someone over if you do so. Rather, it’s that there’s some chance that you will – and it would be seriously bad if you did.
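The expected-value rule described above can be sketched in a few lines of code. This is a minimal illustration of the blind-corner example, with entirely made-up probabilities and values (none of these numbers appear in the post); the point is only the structure of the calculation, in which a low-probability, high-stakes harm dominates.

```python
# Expected-value rule: for each action, weigh each hypothesis's probability
# against the value at stake if it is true, then pick the action whose
# probability-weighted total is highest. All numbers are illustrative.

def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs for one action."""
    return sum(p * v for p, v in outcomes)

# Speeding round a blind corner: probably fine, but a small chance
# of a catastrophic outcome.
speed = [(0.99, 1.0),      # probably fine: a minute saved
         (0.01, -1000.0)]  # small chance of running someone over
slow = [(1.0, 0.0)]        # forgo the minute saved; no risk

actions = {"speed": expected_value(speed), "slow": expected_value(slow)}
best = max(actions, key=actions.get)
print(best)  # "slow": the small chance of serious harm dominates
```

Even though the bad outcome has only a 1% chance, its stakes are large enough that the expected value of speeding is negative, so the rule recommends slowing down.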

With this on board, let’s think about the practical implications of maximising expected value under moral uncertainty. It seems that the implications are pretty clear in a number of cases. Here are a few.

1. One might think it more likely than not that it’s not wrong to kill animals for food. But one shouldn’t be certain that it’s not wrong. And, if it is wrong, then it’s seriously wrong – in the same ballpark as murder. So, in killing an animal, one risks performing a major moral wrong, without any correspondingly great potential moral upside. This would be morally reckless. So one ought not to kill animals for food.

2. One might think it more likely than not that it’s not wrong to have an abortion, for reasons of convenience. But one shouldn’t be certain that it’s not wrong. And, if it is wrong, then it’s seriously wrong – in the same ballpark as murder. So, in having an abortion for convenience, one risks performing a major moral wrong, without any correspondingly great potential moral upside. This would be morally reckless. So one ought not to have an abortion for reasons of convenience.

3. One might think it more likely than not that it’s not wrong to spend money on luxuries, rather than giving it to fight extreme poverty. But one shouldn’t be certain that it’s not wrong. And, if it is wrong, then it’s seriously wrong – in the same ballpark as walking past a child drowning in a shallow pond. So, in spending money on luxuries, one risks performing a major moral wrong, without any correspondingly great potential moral upside. This would be morally reckless. So one ought not to spend money on luxuries rather than giving that money to fight poverty.
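Each of the three cases above has the same asymmetric structure, which can be made explicit with a short sketch. The credences and stakes below are illustrative placeholders, not figures from the post: they show how even a majority credence that an act is permissible can be swamped by a modest credence that it is seriously wrong.

```python
# Case 1 (killing animals for food), with made-up numbers: a 0.7 credence
# that the act is permissible with a small benefit, against a 0.3 credence
# that it is a major wrong. Abstaining forgoes the benefit but risks nothing.

def expected_moral_value(outcomes):
    """outcomes: list of (credence, moral value) pairs for one action."""
    return sum(p * v for p, v in outcomes)

kill = [(0.7, 1.0),      # probably permissible: small culinary benefit
        (0.3, -100.0)]   # possibly a major wrong, "in the ballpark of murder"
abstain = [(1.0, 0.0)]   # no benefit, but no risk of the major wrong

print(expected_moral_value(kill) < expected_moral_value(abstain))  # True
```

On these numbers, abstaining wins by a wide margin, mirroring the "morally reckless" verdict in the text: the potential moral downside is huge while the potential upside is small.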

5 comments


comment by Vaidehi Agarwalla (vaidehi_agarwalla) · 2021-01-07T12:38:08.450Z

A number of links in this article are broken - would it be possible to fix them? Specifically, the links from 1. and 2., "in the same ballpark as murder".

comment by EdoArad (edoarad) · 2021-01-08T10:59:47.004Z

The second is here (paywalled), and I am not sure what the first was. If you find it, I can use moderator privileges and edit the post to fix the links.

comment by Tristan Cook · 2021-05-30T09:10:08.882Z

From 1., "the same ballpark as murder": the Internet Archive has it saved here.
The link in 3, "in the same ballpark as walking past a child drowning in a shallow pond", is also dead, but is in the Internet Archive here.

Edit: the link in 2 is also archived here.

comment by Prabhat Soni · 2020-10-30T07:45:49.199Z

Thanks! After so long I finally understood moral uncertainty :P

comment by Hedgie · 2022-04-11T11:23:22.687Z

This blog, written in 2012, is outdated and inaccurate with regard to its statements on climate change and provides dangerous fodder to the arguments of climate denialists or even those who just wish to equivocate and obscure the issues. It should therefore be edited or deleted.

It says, “For example, we don’t currently know exactly to what extent the earth’s temperature will rise, if we are to continue to emit CO2 at the rate we have been emitting so far.” This is categorically not true. Because we do know to what extent the earth’s temperature will rise if we continue on our current (April 2022) trajectory of CO2 emissions: 2.7°C.

See: World On Course For 2.7°C Temperature Rise By 2100 – Even If All Current Climate Commitments Are Met - Health Policy Watch

The blog post goes on to say, “The temperature rise might be small, in which case the consequences would not be dire. Or the temperature rise might be very great, in which case the consequences could be catastrophic.” Again, for a given amount of CO2, the rise in temperature is relatively predictable. Uncertainty only kicks in with regard to positive feedback loops - e.g. when melting of the permafrost causes methane to be released, which increases the temperature, which causes more methane to be emitted, then permafrost to melt further etc. But there is no uncertainty regarding the link between emissions and temperature rise per se.

Next, the blog says, “To what extent we ought to mitigate our CO2 emissions depends crucially on this factual question. But it’s of course not true that we are unable to offer any practical advice in absence of knowledge concerning this factual question. It’s just that our advice will concern what one ought to do in light of uncertainty about the facts.”

Again, no. These statements are wildly fuzzy and inaccurate. There is no uncertainty about this aspect of the science and facts of climate change. For any given amount of CO2 emissions, we can predict, with a relatively high degree of accuracy, what temperature rise will ensue and what the planet will then look like. Yet climate denialists can point to this blog post by a well-known philosopher and say: hey, look people, this guy at the University of Oxford, no less, is in doubt about the climate change science; he says the facts are uncertain.

Philosophers should beware of straying into fields in which they have no clear understanding, and regarding which they cannot communicate in a scrupulously clear manner. The consequences could be downright dangerous.

https://report.ipcc.ch/ar6wg2/pdf/IPCC_AR6_WGII_SummaryForPolicymakers.pdf