What values would EA want to promote?

post by James_Banks · 2020-07-09T06:35:10.156Z · EA · GW · No comments

This is a question post.


Problem areas beyond 80,000 Hours' current priorities [EA · GW] mentions "Broadly promoting positive values".


I have some questions:

What are the values that are needed to further EA's interests?

Where (in which cultures or areas of culture at large) are they deficient, or where might they become deficient in the future?

Problem areas... [EA · GW] mentions "altruism" and "concern for other sentient beings". Maybe those are the two that EA is most essentially concerned with. If so, what are the support values needed for maximizing those values?

Answers

answer by Wei_Dai · 2020-07-09T22:02:33.619Z · EA(p) · GW(p)

If so, what are the support values needed for maximizing those values?

I think a healthy dose of moral uncertainty (and normative uncertainty in general) is really important to have, because it seems pretty easy for any ethical/social movement to become fanatical or to incur a radical element, and end up doing damage to itself, its members [LW · GW], or society at large. ("The road to hell is paved with good intentions" and all that.)

A large part of what I found attractive about EA is that its leaders emphasize normative uncertainty so much in their writings (starting with Nick Bostrom back in 2009), but perhaps it's not "proselytized" as much as it should be day-to-day.

answer by G Gordon Worley III · 2020-07-09T16:27:34.831Z · EA(p) · GW(p)

At its heart, EA seems to naturally tend to promote a few things:

  • a larger moral circle is better than a smaller one
  • considered reasoning ("rationality") is better than doing things for other reasons alone
  • efficiency in generating outcomes is better than being less efficient, even if it means being less appealing at an emotional level

I don't know whether any of these are what EA should promote, and I'm not sure anyone can unilaterally decide what is normative for EA, so instead I offer these as the norms I think EA is currently promoting in fact, regardless of what anyone thinks EA should be promoting.

answer by kewlcats · 2020-07-11T18:05:02.613Z · EA(p) · GW(p)

Not exactly answering your question, but I think EA has really good communication norms--such as steelmanning your opponent, focusing on empiricism, open discussion, double crux, no personal attacks, etc.

I do think broader society could benefit significantly in discussing thorny topics (e.g. politics) if it adopted these communication norms.

answer by EdoArad · 2020-07-09T10:04:47.158Z · EA(p) · GW(p)

This is an interesting question. 

One possible value is something like intrinsically valuing Truth or Better Reasoning [EA · GW]. Perhaps also something like Productivity/Maximisation. The rationality community is perhaps a good example of promoting such values (explicitly here). 

It feels somewhat double-edged to promote instrumental values; it can cause all kinds of trouble if they are misinterpreted or promoted too successfully.

What do you think are the important values? 

comment by James_Banks · 2020-07-09T21:37:11.404Z · EA(p) · GW(p)

I'm basically an outsider to EA, but "from afar", I would guess that some of the values of EA are 1) against politicization, 2) for working and building rather than fighting and exposing ("exposing" being "saying the unhealthy truth for truth's sake", I guess), 3) for knowing and self-improvement (your point), 4) concern for effectiveness (Gordon's point). And of course, the value of altruism.

These seem like they are relatively safe to promote (unless I'm missing something).

Altruism is composed of 1) other-orientation / a relative lack of self-focus (curiosity is an intellectual version of this), 2) something like optimism, 3) openness to evidence (you could define "hope" as a certain combination of 2 and 3), 4) personal connection with reality (maybe a sense of moral obligation, a connection with other beings' subjective states, or a taste for a better world), 5) inclination to work, 6...) probably others. So if you value altruism, you have to value whatever subvalues it has.

These also seem fairly safe to promote.

Altruism is supported by 1) "some kind of ambition is good", 2) "humility is good but trying to maximize humility is bad" (being so humble you don't have any confidence in your knowledge prevents action), 3) "courage is good but not foolhardiness", 4) "will is good, if it stays in touch with reality", 5) "being 'real' is good" (following through on promises, really having intentions), 6) "personal sufficiency is good" (you have enough or are enough to dare reach into someone else's reality), 7...) probably others.

These are riskier. I think one thing to remember is that ideas are things in people's minds, and that culture is really embodied in people, not in words. A lot of culture is in interpersonal contact, which forms the context for ideas. So ideally, if you promote values, you shouldn't just say things, but should instruct people (or be in relationship with people) such that they really understand what you're saying. (Advice I've seen on this forum.)

Genes become phenotype through epigenetics, and concepts become emotions, attitudes, and behaviors through the "epiconceptual". The epiconceptual could be the cultural background that informs how people hear a message (like "yes, this is the moral truth, but we don't actually expect people to live up to the moral truth"), or it could be the subcultural background from a relationship or community that makes it make sense. The practices and expectations of culture / subculture.

So values are not promoted just by communicators, but also by community-builders, and good communities help make risky but productive words safe to spread.

answer by HaukeHillebrandt · 2020-07-10T13:39:47.050Z · EA(p) · GW(p)

My interpretation of this was that promoting robustly good values (e.g. "violence is bad") at scale could be an effective intervention.

For instance, these are values that the UK government tries to promote:

"Champion democracy, human rights and the rule of law, and address global challenges, including through campaigns on preventing sexual violence in conflict, reducing modern slavery and promoting female education. Promote human and environmental security through London Illegal Wildlife Trade Conference. Deepen relationships between states and people, including through the Commonwealth Summit."

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/827788/FCOAnnualReport201819.pdf#page=32

comment by James_Banks · 2020-07-10T18:16:54.113Z · EA(p) · GW(p)

A few free ideas occasioned by this:

1. The fact that this is a government paper makes me think of "people coming together to write a mission statement." To an extent, values are agreed upon by society, and it's good to bear that in mind: working with widespread values instead of against them, accepting that values are to some extent socially constructed (or, if they aren't, that the crowd could be objectively right and you wrong), and adjusting to what's popular instead of spending a lot of energy trying to change things.

2. My first reaction when reading the "Champion democracy,..." list is "everybody knows about those things... boring", but if you want to do good, you shouldn't be dissuaded by the "unsexiness" of a value or pursuit. That could be a supporting value to the practice of altruism.

No comments
