Against value drift

post by toonalfrink · 2019-10-29T20:04:23.510Z · score: 10 (25 votes) · EA · GW · 10 comments

The idea of “value drift” strikes me as based on the naive assumption that people have altruistic values.

I don’t think that’s how people work. I think people follow their local incentives, and a line can always be traced from their actions to some kind of personal benefit.

This is not a cynical idea. It is a tremendously hopeful idea. It all adds up to normality. The fact that we see people act altruistically all the time means that it is possible to align selfish interests with the “good”, and that our environments are already shaped in such a way. It suggests that good outcomes are simply a matter of creating and upholding the right incentive structures. Heaven will follow by itself.

So given that people follow their selfish incentives, why do they drift away from the EA community?

Here’s another idea: motivation is always relative. People aren’t propelled towards something in proportion to its objective value. They’re motivated in proportion to how bad the next best alternative is. If your plan B becomes better, your plan A is suddenly not as interesting anymore, even if its “objective value” is still the same. People might try to explain their behavior by lamenting that they’ve changed. Maybe they just got better options.

How does this relate to value drift? The naive model might be that you have some variables in your head, each of which assigns some numeric value to a virtuous and lofty good, like “global health” and “the long-term future” and “freedom of speech” and whatnot.

I’d like to propose that the model is more like this: the variables are there, but they don’t point to virtuous and lofty goods. They point to things about you. Power. Survival. Prestige. Odds of procreation. The values are highly stable, and motivation only really changes as the environment does.

And there’s absolutely positively nothing bad about this whatsoever. It all adds up to normality. In fact these “degenerate” values even add up to something as noble as EA. Greed is good, as long as it’s properly channeled.

Value drift isn’t some kind of unexplainable shift in attitude. It’s a shift in perceived incentives. Correctly perceived or not.

One might have gotten involved in EA on the heuristic that maximizing impact will lead to the highest prestige. They might have learned in EA that this heuristic doesn’t always work. Maybe they weren’t praised enough. Maybe they found that there was too much competition to be noticed. Maybe they found that the world doesn’t necessarily reward good intentions, so they dropped them.

Maybe they left because there wasn’t much more to learn. Maybe they left because they felt threatened by weird political ideas. Maybe they felt censored. Maybe they found a different social environment that was much more rewarding. Maybe they felt like the establishment of EA wasn’t taking them seriously.

I don’t have a simple answer, but the current concept of “value drift” is very much a pet peeve of mine. We have to account for people’s selfish sides. As long as we’re in denial about that, we won’t get as much out of their altruistic sides.

So I propose we call it incentive drift instead.

10 comments

comment by Pablo_Stafforini · 2019-10-30T01:51:41.810Z · score: 36 (18 votes) · EA · GW

What kind of evidence would cause you to abandon the view that people always act selfishly?

comment by G Gordon Worley III (gworley3) · 2019-10-30T17:37:20.103Z · score: 14 (7 votes) · EA · GW

I agree that there is something very confused about worries of value drift. I tried to write something up about it before, although that didn't land so well. Let's try again.

I keep noticing something confused when people worry about value drift, because to me it seems they are worried they might learn more, decide they were wrong, and now want something different. That seems good to me: if you don't update and change in the face of new information, you're less alive and agenty and more dead and static. People will often phrase this, though, as a worry that their life will change and they won't, for example, want to be as altruistic anymore, because they are pulled away by other things. But to me this is a kind of confused clinging to what is now, and an expectation that it will forever be so. If you truly, deeply care about altruism, you'll keep picking it in every moment, up until the world changes enough that you don't.

Talking in terms of incentives, I think, helps make this clearer, in that people may want to oppose the world changing in ways that will make it less likely to continue into a future they like. I think it's even more general, though, and we should be worried about something like "world state listing", where the world fails to become more filled with what we desire and starts to change at random rather than as a result of our efforts. In this light, worry about value drift is a short-sighted way of noticing that one doesn't want the world state to list.

comment by tessa · 2019-11-07T21:37:07.662Z · score: 9 (4 votes) · EA · GW

"It seems they are worried they might learn more, decide they were wrong, and now want something different... If you truly, deeply care about altruism, you'll keep picking it in every moment, up until the world changes enough that you don't."

I don't object to learning more and realizing that I value different things, but there are a lot of other reasons I might end up with different priorities or values. Some of those are not exactly epistemically virtuous.

As a concrete example, I worry that living in the SF bay area is making me care less about extreme wealth disparities. I witness them so regularly that it's hard for me to feel the same flare of frustration that I once did. This change has felt like a gradual hedonic adaptation, rather than a thoughtful shifting of my beliefs; the phrase "value drift" fits that experience well.

One solution here is, of course, not to use my emotional responses as a guide to my values (cf. Against Moral Intuitions), but emotions are a very useful decision-making shortcut, and I'd prefer not to take on the cognitive overhead of suppressing them.

comment by G Gordon Worley III (gworley3) · 2019-11-11T19:10:24.658Z · score: 1 (1 votes) · EA · GW

"As a concrete example, I worry that living in the SF bay area is making me care less about extreme wealth disparities. I witness them so regularly that it's hard for me to feel the same flare of frustration that I once did. This change has felt like a gradual hedonic adaptation, rather than a thoughtful shifting of my beliefs; the phrase 'value drift' fits that experience well."

This seems to me better captured by saying that the conditions of the world are different, in ways that make you respond in a manner you wouldn't have endorsed before those conditions changed. That doesn't mean your values changed; rather, the conditions to which you are responding changed, such that your values are differently expressed. I suspect your values themselves didn't change, because you say you are worried about this change in behavior you've observed in yourself, and if your values had really changed you wouldn't be worried.

comment by some_arts_student · 2019-10-31T03:57:36.821Z · score: 4 (3 votes) · EA · GW

Thanks Gordon; this nicely puts into words something I've thought about. If a person changes their views as the result of research, introspection, or new information, this could appear, from the perspective of people still holding their former views, to be value drift. Even someone's process of becoming altruistic could appear this way to people who hold their former values.

comment by SamuelKnoche · 2019-11-01T07:20:23.698Z · score: 10 (3 votes) · EA · GW

The idea that people always act selfishly is probably a bit extreme. But this post points out something very important: considering selfish incentives is crucial when thinking about how EA can grow and become more sustainable.

Just a few selfish incentives that I see operating within EA: the forum cash prize, reputation gains within the EA community for donating to effective charities, reputation gains for working for an EA org, being part of a community...

The point here is not that these are bad, but that we should acknowledge that they are selfish incentives, and think about how best to design them so that they align with EA's goals.

To answer technicalities' comment, I'm pretty sure that even the people he lists did what they did at least in part out of hope for future recognition, or because they sought the approval of just one or two people in their immediate vicinity. Of course, their main motivation was just to do good, but saying they were driven by no selfish incentive whatsoever would be denying human nature.

Also, book recommendation: The Elephant in the Brain.

comment by technicalities · 2019-11-05T12:34:29.319Z · score: 2 (2 votes) · EA · GW

Sure, I agree that most people's actions have a streak of self-interest, and that posterity could serve as that motive even in cases of sacrificing your life. I took the OP to be making a stronger claim: that it is simply wrong to say that people have altruistic values as well as selfish ones.

There's just something off about saying that these altruistic actions were caused by selfish/social incentives, when the strongest such incentives (ostracism, or even the death penalty) pointed against doing them.

comment by technicalities · 2019-10-30T19:05:32.213Z · score: 10 (6 votes) · EA · GW

How does this reduction account for the many historical examples of people who defied their local social incentives, with little hope of gain and sometimes at the cost of their own destruction? (Off the top of my head: Ignaz Semmelweis, Irena Sendler, Sophie Scholl.)

We can always invent sufficiently strange post-hoc preferences to "explain" any behaviour. But what do you gain in exchange for denying the seemingly simpler hypothesis, "they had terminal values independent of their wellbeing"?

(Limiting this to atheists, since religious martyrs are explained well by incentives.)

comment by Will Kirkpatrick · 2019-10-30T01:21:58.848Z · score: 5 (5 votes) · EA · GW

Perhaps incentive drift is more accurate, but it certainly seems to rob the individual of their agency. I know I am a product of the circumstances I was raised in; however, that does not mean I can pass blame onto those around me when I choose something wrong.

Perhaps the choice between the two terms is a difference between instrumentally rational and value-rational framings: a value rationalist would prefer the term "incentive drift" because it more accurately describes the reality of this "drift", while an instrumental rationalist would prefer the term "value drift" because it is more likely to result in individuals taking precautions, and therefore in a better long-term outcome for EA as a whole.

As I am an instrumental rationalist, I believe that sticking with the term "value drift" would place the emphasis on the individual in the circumstances where it matters. We could then use the term "incentive drift" to refer to the overall effect that different features of our community have on its members. (Thus enabling us to retain the benefits of its use in describing effects on the community.)

For example, the lack of "praise", as you refer to it in your link, is something that has pushed many individuals away from effective altruism and rationality in general. To use the new term: it causes incentive drift away from EA.

Value drift is a much more individual term to me. The major fear here is no longer contributing to those things that I previously considered valuable. This might be a result of incentive drift, but it is my values that have changed.

Regardless of whether my thoughts are accurate, thank you for taking the time to post today. These are the kinds of posts that keep me coming back to the EA community and I appreciate the time and effort that went into it.

comment by MichaelStJules · 2019-10-31T03:18:10.207Z · score: 2 (2 votes) · EA · GW

I think it's plausible that changing incentives and "better" options coming along explain a lot of the drift. However, rather than "Power. Survival. Prestige. Odds of procreation.", I think the drivers will be less selfish things, like family, or just things people end up finding more interesting; maybe they'll just get bored with EA.

However, I think you underestimate how deeply motivated many people are to help others for their own sake, out of a sense of duty or compassion. Sure, this probably isn't most people, and maybe not even most EAs, although I wouldn't be surprised if it were.

https://slatestarcodex.com/2017/08/16/fear-and-loathing-at-effective-altruism-global-2017/

https://www.theguardian.com/world/2015/sep/22/extreme-altruism-should-you-care-for-strangers-as-much-as-family

http://bostonreview.net/books-ideas-mccoy-family-center-ethics-society-stanford-university/lives-moral-saints

https://forum.effectivealtruism.org/posts/4gKqaGdDLtxm6NKnZ/figuring-good-out-january

https://forum.effectivealtruism.org/posts/FA794RppcqrNcEgTC/why-are-you-here-an-origin-stories-thread