Is value drift net-positive, net-negative, or neither?
post by MarisaJurczyk
This is a question post.
I've been asked variations of this question a few times recently, as I'm studying value drift for my undergraduate thesis, so I thought I would seek out others' thoughts on this.
I suppose part of this depends on how we define value drift. I've seen value drift defined as broadly as changes in values (from the Global Optimum podcast) and as narrowly as becoming less motivated to do altruistic things over time (from Joey Savoie's forum post [EA · GW]). While the latter seems almost certainly net-negative, how the former plays out is a little less clear to me.
This leads me to wonder if there might be different kinds of value drift that may be varying degrees of good or bad.
answer by Larks
There are many different things that can cause us to change what we value. Some seem like change-processes that our current values would endorse:
- I thought a lot about the issue, decided that two of my values were in conflict, and chose to prioritize one over the other.
Some seem like random processes we should resist:
- I got hit in the head, and the damage caused me to change my personality and values.
Some seem actively adversarial:
- Years of propaganda wore me down and caused me to love Dear Leader.
- I subconsciously realized that opinion X was high status, and found it expedient to adopt this as well.
In general I think people's opinions on the issue depend on how common they think these different cases are. I am generally quite pessimistic here; I think the first case is quite rare, and most cases that appear to be of this form are really examples of the third or fourth case. This makes me pessimistic about the long-term future, and I am interested in what we can do to reduce the influence of the last three cases.
answer by Darius_Meissner
How bad (or possibly good) value drift and lifestyle drift are will depend on your definition of the phenomenon, as you acknowledge yourself. The way I conceptualise them in the EA Forum article I wrote on the topic ('Concrete Ways to Reduce Risks of Value Drift [EA · GW]') makes them strongly net-negative. In the post I (briefly) make the case that reducing risks of value drift and lifestyle drift may be an altruistic top priority.
Here's how I think about the topic:
I use the terms value drift and lifestyle drift in a broad sense to mean internal or external changes that would lead you to lose most of the expected altruistic value of your life. Value drift is internal; it describes changes to your value system or motivation. Lifestyle drift is external; the term captures changes in your life circumstances leading to difficulties implementing your values. Internally, value drift could occur by ceasing to see helping others as one of your life’s priorities (losing the ‘A’ in EA), or losing the motivation to work on the highest-priority cause areas or interventions (losing the ‘E’ in EA). Externally, lifestyle drift could occur (as described in Joey's post [EA · GW]) by giving up a substantial fraction of your effectively altruistic resources for non-effectively altruistic purposes, thus reducing your capacity to do good. Concretely, this could involve deciding to spend a lot of money on buying a (larger) house, having a (fancier) wedding, traveling around the world (more frequently or expensively), etc. Quoting myself:
Of course, changing your cause area or intervention to something that is equally or more effective within the EA framework does not count as value drift. Note that even if your future self were to decide to leave the EA community, as long as you still see ‘helping others effectively’ as one of your top-priorities in life it might not constitute value drift. (...)
Most of the potential value of EAs lies in the mid- to long-term, when more and more people in the community take up highly effective career paths and build their professional expertise to reach their ‘peak productivity’ (likely in their 40s). If value drift is common, then many of the people currently active in the community will cease to be interested in doing the most good long before they reach this point. This is why, speaking for myself, losing my altruistic motivation in the future would equal a small moral tragedy to my present self. I think that as EAs we can reasonably have a preference for our future selves not to abandon our fundamental commitment to altruism or effectiveness.
↑ comment by ishi · 2019-05-06
I have experienced both value drift and lifestyle drift (mostly the latter), and this has happened in the last three years, which is also roughly when I first heard of EA. My values are still close to what they have always been. I come from a tradition of environmentalism/nature/voluntary simplicity, social justice, tolerance, anti-authoritarian activism, 'scientific humanism' or 'ethical culture', and (math-based) scientific rationality, along with musical interests.
While I was living a lifestyle based on those values, I was fairly happy. But I got into some personal and work-related environments where I had to compromise on many of my values: I was surrounded by people who variously had no interest in rationality, nature, or science, didn't like the kind of music I enjoyed, were sometimes intolerant, and were even 'authoritarian', though these people considered themselves what are derogatorily called 'SJWs': people who are 'never wrong' and always saving the world. When I was around scientists, they not uncommonly had limited 'altruistic' inclinations; they were much more concerned with their careers and status than with the world around them, so their focus (as the term is used in the EA framework) was on themselves, and they viewed that as the best way to better the world.
Also, some say that from a biological/scientific view altruism technically doesn't exist: you can't help others if you can't help yourself. (Though of course many philanthropists like Carnegie and Mellon have helped others; it's a time-dependent process. Some historians have even argued that British colonialism was a good thing.)
I wrote this because one of my interests is diffusion-drift equations (used in theoretical biology, physics, etc.), though I'm not an expert in them. I was sort of hoping to combine them with the EA framework, but the languages and dialects are so different that I'm thinking it's a waste of time. Just as in other environments where I've been in conflict with my values, the people I've encountered in EA, personally or online, don't value that approach. So I'm having more value drifts: stop trying to associate with people who don't share your values; be less tolerant. And for me, maybe stop valuing math reasoning and rationality of the kind humans do so much, because the world is fundamentally irrational.
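For readers unfamiliar with the diffusion-drift equations the comment alludes to: these describe quantities that change through a deterministic drift term plus random diffusion (noise). As a minimal illustrative sketch (not anything from the comment itself; the function name and parameter values are hypothetical), a one-dimensional process dX = μ·dt + σ·dW can be simulated with the Euler–Maruyama scheme:

```python
import random

def simulate_drift_diffusion(x0, mu, sigma, dt, steps, seed=0):
    """Euler-Maruyama simulation of dX = mu*dt + sigma*dW.

    mu is the deterministic drift rate, sigma scales the random
    diffusion, and dW is a Gaussian increment with variance dt.
    Returns the full sample path as a list of floats.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        dW = rng.gauss(0.0, dt ** 0.5)  # Brownian increment
        x += mu * dt + sigma * dW
        path.append(x)
    return path

# One sample path over T = steps * dt = 10 time units:
# the endpoint clusters around x0 + mu*T = 5, spread by sigma.
path = simulate_drift_diffusion(x0=0.0, mu=0.5, sigma=0.2, dt=0.01, steps=1000)
```

In the loose analogy suggested by the comment, the drift term would stand for systematic pressures on someone's values (e.g. the adversarial processes Larks describes) and the diffusion term for random life events.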
answer by kbog
Value drift towards the right values (i.e. Effective Altruism) is good, value drift away from them is bad. Value drift among EAs is likely to be a bad thing due to regression to the mean. We can imagine better values within EA, but there's no reason to expect value drift to go in the right direction. Of course we can identify better values and promote them among EAs, but that seems notably distinct from value drift.
On the other hand, we can imagine people with bad values who should regress to the mean, and would encourage value drift there.
↑ comment by Mati_Roy · 2020-09-03
"good" and "bad" are usually use to make a value-judgement; so saying "better values" is a confusion. it's *state of affairs* that are good/bad *according* to values.Replies from: kbog
↑ comment by kbog · 2020-10-22
"Value drift towards the right values" = transition from our current state of affairs, to a state of affairs where more people have values which are closer/farther from ours.
answer by Mati_Roy
It's a convergent instrumental goal to preserve one's values. If you change your goals / values, you will generally achieve / fulfill them less.
Value-drifting someone else might be positive for you, at least if you only consider the first-order consequences, but it generally seems pretty unvirtuous and uncooperative to me. A world where value-drifting people is socially acceptable is probably worse than a world where it's not.
↑ comment by Khorton · 2020-09-04
The current world is pretty accepting of value drift if you drift towards something good.
Everyone who joins EA has drifted away from another set of values!
↑ comment by Mati_Roy · 2020-09-04
if by "something good" you mean "something altruistic", then yes I agree. it's good for someone when others become altruistic towards them.
Comments
comment by RomeoStevens · 2019-05-06
My sense is that there are definitely divergent and convergent value drifts. That is to say, some people seem to become less coherent over time and others more coherent.