Questions for Will MacAskill's fireside chat in EAGxAPAC this weekend

post by Bridget_Williams · 2020-11-19T12:16:54.485Z · EA · GW · 1 comments


Hi everyone, I’m hosting a 25-minute fireside chat with Will MacAskill in the EAGxAPAC conference happening this weekend, and I’d love to hear your suggestions on what questions I should ask him. 

The fireside chat is happening at 08:30 AM UTC on Saturday 21st November. 

Please post questions at least one hour before the event for them to be considered.

Thanks in advance!

1 comment

Comments sorted by top scores.

comment by RandomEA · 2020-11-23T03:22:16.306Z · EA(p) · GW(p)

It looks like I'm too late, but here's something I've been wanting to ask.

In your paper "The Definition of Effective Altruism," you distinguish effective altruism from utilitarianism on various grounds, including that:

  • EA does not claim that a person must sacrifice their personal interests (e.g. having children) when doing so would bring about greater good; and
  • EA does not claim that a person must violate non-consequentialist constraints in the rare situations when doing so might bring about greater good.

For me, this points to a broader principle: EA does not require a person to sacrifice anything "morally major" to bring about greater good. This would imply that an EA can choose to prioritize things like a duty to contribute their fair share, a duty to family members, and a duty to rescue those they are uniquely positioned to rescue over bringing about greater good.

However, in a 2015 debate, you argued (scenario; response) that a person alone in a burning building should choose to rescue a Picasso painting (assuming they can keep it) over a child since the money from selling the painting could be used to save thousands of children. 

Do you think effective altruism necessarily entails that position, or were you just speaking to what is morally better?