Can my self-worth compare to my instrumental value?

post by C Tilli · 2020-10-11T07:32:35.995Z · score: 82 (46 votes) · EA · GW · 21 comments

A personal reflection on how my experience of EA resembles my experience of religious faith, in that both provide a sense of purpose and belonging — but with EA I miss the assurance of my own intrinsic value, and that absence can make it difficult to maintain a stable sense of self-worth.

Note: I realize that my experience of religion and faith is probably different from that of a lot of other people. My aim is not to get into a discussion of what religion does right or wrong, especially since I am no longer religious.


I grew up with a close connection to my local church and was rather religious until my mid-late teenage years. I am now in my thirties and have been involved with the EA movement for a couple of years. To me, there are similarities between how I remember relating to faith and church and how I now relate to the EA philosophy and movement.

For me, both provide (provided) a strong sense of purpose and belonging. There is a feeling that I matter as an individual and that I can have an important mission in life, that I can even be some kind of heroine. For both, there is also a supportive community (of course not always for everyone, but my experience has been mainly positive in both cases) that shares my values and understands and supports how this sense of mission affects many of my important life decisions. This is something that I find very valuable.

However, in comparison to what my faith and church used to offer me, there is something lacking in the case of EA. I miss the assurance that I as a person have an intrinsic value, in addition to my instrumental value as a potential world-saviour. With faith, you are constantly reminded that God loves you, that God created you just as you are and that you are therefore, in a sense, flawless. There is a path for everyone, and you are always seen and loved in the most important way. This can be a very comforting message, and I feel it has a function to cushion the tough demands that come with the world-saving mission. The instrumental value you have through your mission to do good is in a way balanced by the assurance that no matter what, you also have infinite intrinsic value.

With EA, I don’t find any corresponding comforting thought or philosophy to rest in. If I am a well-off, capable person in the rich world, the QALYs I could create or save for others are likely to be much more than the QALYs I can live through myself. This seems to say that my value is mostly made up of my instrumental value, and that my individual wellbeing is less important compared to what I could achieve for others.

I believe that if community members perceive their value as primarily instrumental, this might damage their (our) mental well-being, in particular by putting many people at risk of burnout. The idea that most of the impact is achieved by a few, very impactful people could also make those who perceive themselves as having potential for high impact particularly vulnerable, since the gap between their intrinsic value or self-worth and their instrumental value would seem even wider.

If the value of our work (the QALYs we can save) is orders of magnitude greater than the value of ourselves (the QALYs we can live), what does that mean? Can we justify self-care, other than as a means to improve ourselves to perform better? Is it possible then to build a stable sense of self-worth that is not contingent on performance?

I have read several previous posts on EAs struggling with feelings of not achieving enough (In praise of unhistoric heroism [EA · GW], Doing good is as good as it ever was [EA · GW], Burnout and self-care [EA · GW]), and to me this seems closely related to what I’m trying to address here.

I’m not sure what can be done about this on a community level. As an individual, I believe it will be important for me to find a way to maintain a stable sense of self-worth, while still staying intellectually honest with myself and committed to the EA ideals. If there are others who have also thought about or struggled with this, I would greatly appreciate your input.

21 comments

Comments sorted by top scores.

comment by willbradshaw · 2020-10-11T10:15:28.275Z · score: 53 (27 votes) · EA(p) · GW(p)

This definitely resonates with me, and is something I've been thinking about a lot lately, as I wrestle with my feelings around recreational activities and free time. I'm not sure if what follows is exactly an answer to your question, but here's where I'm at in thinking about this problem.

I think one thing it's very important to keep in mind is that, in utilitarianism (or any kind of welfarist consequentialism) your subjective wellbeing is of fundamental intrinsic value. Your happiness is deeply good, and your suffering is deeply bad, regardless of whatever other consequences your actions have in the world. That means that however much good you do in the world, it is better if you are happy as you do it.

Now, the problem, as your post makes clear, is that everyone else's subjective wellbeing is also profoundly valuable, in a way that is commensurate with your wellbeing and can be traded off against it. And, since your actions can affect the wellbeing of many other people, that indirect value can outweigh the direct value of your own wellbeing. This is the fundamental demandingness of consequentialist morality that so many people struggle with. Still, I find it helpful to remember that the same reasoning that makes other people so valuable also makes me valuable, in a deep and fundamental and moral way.

Turning to instrumental value, I have two things to say. The first is about instrumental value in general, and the second is about the specific instrumental value of self-kindness.

The first thing I want to say is that almost everything I value I value instrumentally, and that fact does not make the value of those things less real, or less important. I care a great deal about freedom and civil liberties and democracy, and would pay high costs to protect those things, even though I only value them instrumentally, as ways to create more happiness and less suffering. I hate racism and speciesism and sickness and ageing, not because they are intrinsically bad in themselves, but because they are the source of so much suffering and foregone happiness. For some reason, we tend to view other things' instrumental value as deeply important, and our own instrumental value as a kind of half-real consolation prize. I think this is a tragic error.

Secondly, with regard to our own instrumental value, most people tend to significantly underestimate just how instrumentally valuable their mental health is. In my experience, when people think and talk about the instrumental value of their own wellbeing, they seem to have in mind some kind of relaxation reserve that it's important to keep full in order to avoid burnout. I think something like this is probably true, but I also think that there's much deeper and broader instrumental value in being kind to yourself.

My ideas here aren't fully developed, but I think there's something toxic about too much self-abnegation: it whittles away at one's self-esteem and courage and enthusiasm and instinctive kindness toward others. At least for me, self-denial and guilt push me towards a timid and satisficing mindset, where I do what is required to not feel bad about myself and don't envision or reach out for higher achievements. It also makes me less instinctively kind to others, which has a lot of compounding bad effects on my impact, and also makes it harder for me to see and embrace new and different opportunities for doing good.

I'm still thinking through this shift in how I think about the instrumental value of my own wellbeing, but I think it has some pretty important consequences. Compared to the reserve-of-wellbeing model, it seems to militate in favour of being more generous to myself with my free time, less focused on self-optimisation insofar as that feels burdensome, and more focused on self-motivation through rewards rather than threats of self-punishment. How exactly this kind of thinking cashes out into lifestyle choices probably varies a lot from person to person; my main goal here is to illustrate how one's conception of one's instrumental value should be broader and deeper than just "if I don't relax sometimes I'll burn out".

In summary:

  • The same thing that makes it important to work for the wellbeing of others also makes you deeply and intrinsically valuable – to me, to others here, and hopefully also to yourself.
  • The instrumental value of your wellbeing is also deeply important, not merely some kind of second prize. Think about how you think about other things that you value a lot instrumentally, and compare how you think about your own instrumental value: are they the same?
  • The variety and scale of the effects of your wellbeing on your impact are probably greater than you think: your wellbeing isn't just instrumentally valuable, it's very very instrumentally valuable, in all kinds of hard-to-quantify ways.
  • Even if, at some point in the future, your wellbeing no longer has much instrumental value, you will still be just as intrinsically valuable as you are now: which is to say, very. The thing that makes you value the other sentient beings whose wellbeing you strive for will still apply to you: as long as you exist, you are important.
comment by willbradshaw · 2020-10-11T15:36:53.698Z · score: 19 (13 votes) · EA(p) · GW(p)

Something I didn't say in my big comment above: I'm really happy the people in this thread are approaching this with the goal of "still staying intellectually honest with" ourselves. I think there's a lot of seductive but misleading thinking in this space, and that there's a strong urge to latch onto the first framing we find that makes us feel better in the face of these issues. I'm happy to see people approach this problem in the same truth-first mindset they apply to doing good in the world.

comment by willbradshaw · 2020-10-12T17:16:07.503Z · score: 10 (5 votes) · EA(p) · GW(p)

On this point...there are a few arguments made in other comments here that I don't find very persuasive, but am avoiding arguing against for fear of seeming disagreeable or causing distress to people with fragile self-worth. What are people's thoughts about norms around arguing in these kinds of situations – or even raising the question in the first place?

EDIT: From my side, if there's an argument that I'm making that someone thinks is shaky, I'd rather they told me so – privately or publicly, as they prefer.

comment by C Tilli · 2020-10-14T06:54:15.474Z · score: 4 (4 votes) · EA(p) · GW(p)

I can obviously only speak for myself, but for me just having this kind of conversation is in itself very comforting since it shows that there are more people who think about this (i.e. it's not just "me being stupid"). Disagreement doesn't seem threatening as long as the tone is respectful and kind. In a way, I think it rather becomes easier to treat my own thoughts more lightly when I see that there are many different ways that people think about it.

comment by konrad · 2020-10-17T12:50:29.482Z · score: 2 (2 votes) · EA(p) · GW(p)

I think we can assume that people on this forum seek truth and personal growth. Of course, this is challenging for all of us from time to time.

I think having a norm of speaking truthfully and not withholding information is important for community health. Each one of us has to assume the responsibility of knowing our own boundaries and pushing them within reasonable bounds, as few others can be expected to know ourselves well enough. Combined with the fact that in this case people have consciously decided to *opt in* to the discussion by posting a comment, I would think it overly cautious to refrain from replying.

There surely are edge cases that are more precarious and deserve tailored thought but I think this isn't one.

If you know somebody well enough to think they are pushing their boundaries in unsustainable ways, I would reach out to them and mention exactly that thought in a personal message. Add some advice on how to engage with the community and its norms sustainably, link to posts like this showing that we all struggle with similar problems, and then people can also work through possible problems regarding "not feeling good enough".

Personally, I'd rather be forced to live in reality than be protected because people worry I might not be able to come to grips with it. One important reason for which I like the EA community is that it feels like we all have consented to hear the truth, even if it might be uncomfortable and imply labour.

comment by FCCC · 2020-10-11T17:06:27.375Z · score: 3 (3 votes) · EA(p) · GW(p)

It happens in philosophy sometimes too: "Saving your wife over 10 strangers is morally required because..." Can't we just say that we aren't moral angels? It's not hypocritical to say the best thing to do is save the 10 strangers, and then not do it (unless you also claim to be morally perfect). Same thing here. You can treat yourself well even if it's not the best moral thing to do. You can value non-moral things.

comment by willbradshaw · 2020-10-11T17:36:52.716Z · score: 7 (4 votes) · EA(p) · GW(p)

This feels...not wrong, exactly, but also not what I was driving at with this comment. At least, I think I probably disagree with your conception of morality.

comment by C Tilli · 2020-10-12T07:37:40.219Z · score: 6 (5 votes) · EA(p) · GW(p)

Thanks a lot for this comment. I feel like I need to read it over again and think more about it, so I don't have a detailed or clever response, but I really appreciate it. The comparison to other things that have mainly or only instrumental value, and how much we actually value those things, was also a new and useful perspective for me.

comment by SamMumm · 2020-10-11T13:42:12.437Z · score: 4 (3 votes) · EA(p) · GW(p)

Thank you so much, C Tilli, for putting this into words and posting it. I have similar thoughts, but I could never articulate them so clearly.

Like you, I had various connections to Christians and the church when I was younger. I am no longer religious, but I miss the comforting feeling of self-worth and of being loved, no matter what, that came with the beliefs that were held in my community.

Like you, I have not yet been able to find a similar comfort in the EA movement, and it challenges my perceived self-worth. And thank you, willbradshaw, for your answer.

I do totally understand the worth of instrumental value, but it is still not as reassuring for me as I wish it were.

Do I just have to accept that feeling? Is it some kind of "price" to pay when you stop believing in things that were designed to comfort people (and probably also to establish power over them, but I'll put that aside for the moment) and instead seek out a fact-based worldview? Or is it more a matter of getting used to it, slowly shifting your views and perspectives, and – after some time – getting the same comfort from the beliefs you expressed above?

comment by willbradshaw · 2020-10-11T15:36:35.026Z · score: 9 (3 votes) · EA(p) · GW(p)

I think all the different framings you suggest are at least partly true.

I think this is one of the fundamental challenges of EA, and is going to take a lot of different people thinking hard about it to really come to grips with as a community. I think it will always be a challenge – EA is fundamentally about (altruistic) ambition, and ambition is always going to be in some degree of tension with the need for comfort, even if it simultaneously provides a great deal of meaning.

As you say, I'm not sure EA will ever be as comforting as religion – it's optimising for very different things. But over time I hope we will generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish.

comment by Ramiro · 2020-10-14T03:05:02.521Z · score: 8 (5 votes) · EA(p) · GW(p)

First, of course, thanks, C Tilli, for the post, and thanks willbradshaw for these comments.
This pierced my mind:

As you say, I'm not sure EA will ever be as comforting as religion – it's optimising for very different things. But over time I hope we will generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish.

I think my background is the opposite of C Tilli's: I have been an atheist for many years (and still am – well, maybe more of an agnostic, since we might be in a simulation...), but since I found out about EA, I think I have become a little more understanding towards not only the need for comfort, but also the idea of valuing something that goes way beyond one's own personal value and social circle, which is what religious people seek (on the other hand, I have also become a little suspicious of some cult-like traits we might be tempted to mimic).

I am sort of surprised we wrote so much, so far, without talking about death and mortality. I know I have intrinsic value, but it's fragile and perishable (cryonics aside); and yet, the set of things I can value extends way beyond my perishable self – actually, my own self-worth depends a little bit on that (as Scheffler argues, it'd be hard not to be nihilistic if we knew humanity was going to end right after us), and there's no necessary upper bound on what I can value. I reckon that, as much as I fear humanity falling into the precipice, I feel joy in thinking it may continue for eons, and that I may play a role, contribute, and add my own personal experience to this narrative.

I guess that's the 'trick' played by religion that might be missing here: religion 'grants' me some sort of intrinsic value through some metaphysical cosmic privilege (or the love of God) – and this provides us some comfort. But then, without it, all that is left, despite being enjoyable and worthy, is perishable – transient love, fading joy, endured pain, limited virtue, pleasure... Like Dworkin (who considered this to be a religious conviction – though a non-theistic one), we can say that a life well-lived is an achievement in itself, and stands for itself even after we die, like a work of art – but art itself will be meaningless when humanity is gone. Maybe altruism is just another way to trick (the fear of) death: when one realizes that "All those moments will be lost in time, like tears in rain. Time to die," one might see it not as realizing some external value, but as an important part of one's own self-worth. (If Blade Runner is too melodramatic, one can use the bureaucrat in Ikiru as an example of this reasoning.)

comment by Robert_Wiblin · 2020-10-13T10:58:10.722Z · score: 29 (19 votes) · EA(p) · GW(p)

For whatever reason, people who place substantial intrinsic value on themselves seem to be more successful and have a larger social impact in the long term. It appears to be better for mental health, risk-taking, and confidence, among other things.

You're also almost always better placed than anyone else to provide the things you need — e.g. sleep, recreation, fun, friends, healthy behaviours — so it's each person's comparative advantage to put extra effort into looking out for themselves. I don't know why, but doing that is more motivating if it feels like it has intrinsic and not just instrumental value.

Even the most self-effacing among us have a part of their mind that is selfish and cares about their welfare more than the welfare of strangers.

Folks who currently neglect their wellbeing and intrinsic value to a dangerous extent can start by fostering ways of thinking that endorse and build up that selfishness.

comment by Ramiro · 2020-10-14T00:12:07.556Z · score: 9 (4 votes) · EA(p) · GW(p)

For whatever reason, people who place substantial intrinsic value on themselves seem to be more successful and have a larger social impact in the long term. It appears to be better for mental health, risk-taking, and confidence, among other things.

I think this is still an instrumental reason for someone to place "substantial intrinsic value on themselves." Though I have no problem with that, I thought what C Tilli complained about was precisely this: that for EAs, all self-concern is for the sake of the greater good, even when it is rephrased as a psychological need for a small amount of self-indulgence.
Second, I'd say that people who are "more successful and have a larger social impact in the long term" tend to be "people who place substantial intrinsic value on themselves," but that's just selection dynamics: if you have a large impact, then you (likely) place substantial intrinsic value on yourself. Even if this implies that you're more likely to succeed if you place substantial intrinsic value on yourself (if only people who do that can succeed), it says nothing about failure – confident people fail all the time, and the worst kind of failure seems to be reserved for those who place substantial value on themselves and end up succeeding with the wrong values.

But I wonder if our sample of “successful people” is not too biased towards those who get the spotlight. Petrov didn’t seem to place a lot of value on himself, and Arkhipov is often described as exceptionally humble; no one strives to be an unsung hero.

comment by C Tilli · 2020-10-14T06:42:55.721Z · score: 3 (3 votes) · EA(p) · GW(p)

Actually, my concerns are more practical, along the lines of Robert's comment: that this kind of thinking could be bad for mental health and, indeed, long-term productivity and impact. If the perception of self-worth didn't seem important for mental health, I would not care much about it.

But it would be a sad scenario if we look back in 50 years and see that the EA movement has led to a lot of capable, ambitious people burning out because we (inadvertently) encouraged (or failed to counteract) destructive thought patterns.

I don't think there is a simple solution, but I think Will Bradshaw is on to something in his comment about the need to "generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish."

comment by Larks · 2020-10-12T00:11:17.252Z · score: 6 (6 votes) · EA(p) · GW(p)

I wonder to what extent this springs from the fact that most pastors do not expect most of their congregants to achieve great things. Presumably if you are a successful missionary who converts multiple people, your instrumental value significantly exceeds your intrinsic value, so I wonder if they have the same feelings. An extreme case would be someone like Moses, whose intrinsic value presumably paled into insignificance compared to his instrumental value as a saviour of the Israelites and as the one who passed on the Word of God.

In any case, I think there is a strong case to be made for spending resources on yourself for non-instrumental reasons. Even if you don't think you matter more than anyone else, you definitely don't matter less than them! And you have a unique advantage in spending resources to generate your own welfare: an intimate understanding of your own circumstances and preferences. When we give to help others, it can be very difficult to figure out what they want and how to best achieve that. In contrast, I know very well which things I have been fixated on!

comment by C Tilli · 2020-10-12T07:52:18.687Z · score: 7 (4 votes) · EA(p) · GW(p)

Interesting thought. I'm not sure if what I had was the mainstream understanding of Christianity, but I didn't experience that there was this kind of conflict in the same way. I'd think that the intrinsic value of being created and loved by God was not really something that could pale in comparison to anything. But I don't know, and maybe it's not very important.

I think there is a difference between justifying spending resources on our own wellbeing and being able to feel valuable independent of performance. Feeling valuable is of course related to feeling like we deserve to have resources spent on us, but I don't think it's exactly the same.

comment by HaukeHillebrandt · 2020-10-14T12:12:52.845Z · score: 4 (2 votes) · EA(p) · GW(p)

The idea that most of the impact is achieved by a few, very impactful people could also make the people who perceive themselves as having potential for high impact particularly vulnerable, since the gap between their intrinsic value or self-worth and their instrumental value would seem even wider.

Not sure if relevant to what you're saying, but there's this very interesting paper that shows:

Suppose that all people in the world are allocated only two characteristics over which they have (almost) no control: country of residence and income distribution within that country. Assume further that there is no migration. We show that more than one-half of variability in income of world population classified according to their household per capita in 1% income groups (by country) is accounted for by these two characteristics. The role of effort or luck cannot play a large role in explaining the global distribution of income.

This has obvious implications for how much people can realistically earn to give, but it also suggests that other forms of impact, like social impact, might be mostly outside people's control. This is a good reason not to be too hard on yourself for not achieving more, and not to compare yourself to people like Bill Gates.

This blog post "Why not give 90%?" also seems relevant. 

comment by FCCC · 2020-10-11T14:59:01.338Z · score: 3 (3 votes) · EA(p) · GW(p)

I think you're conflating moral value with value in general. People value their pets, but this has nothing to do with the pet's instrumental moral value.

So a relevant question is "Are you allowed to trade off moral value for non-moral value?" To me, morality ranks (probability distributions of) timelines by moral preference. Morally better is morally better, but nothing is required of you. There's no "demandingness". I don't buy into the notions of "morally permissible" or "morally required": These lines in the sand seem like sociological observations (e.g. whether people are morally repulsed by certain actions in the current time and place) rather than normative truths. 

I do think having more focus on moral value is beneficial, not just because it's moral, but because it endures. If you help a lot of people, that's something you'll value until you die. Whereas if I put a bunch of my time into playing chess, maybe I'll consider that a waste of time at some point in the future. There are other things, like enjoying relationships with your family, that also aren't anywhere close to the most moral thing you could be doing, but that you'll probably continue to value.

You're allowed to value things that aren't about serving the world.

comment by Jakob_J · 2020-10-11T10:42:05.054Z · score: 3 (3 votes) · EA(p) · GW(p)

Great post! I've also experienced similar things during my time with EA. I think there are several ways to approach the issue of self-worth:

  1. It's important to realize that EA is not the same as utilitarianism and therefore does not suffer from the problem of demandingness (this is also discussed in the latest 80K podcast with Benjamin Todd). EA does not prescribe how much of our resources we should share, only that the ones we do share should be distributed in an effective way.
  2. Unfortunately there is a tendency in EA to undervalue "small" contributions (i.e. those made by care workers, nurses, GPs etc.). I think we need to realize that every contribution to the common good is good, no matter how small. I don't think that someone who saves less than one life in expectation should feel any worse than someone who saves thousands or millions of lives. In any case, I wouldn't go around telling people that they should feel worthless if they are not working on something super important for humanity (if that were the case, we'd need to reach more than 99% of humans on earth to tell them that they are worthless). This is clearly an absurd position, so why should we be telling it to ourselves?
comment by C Tilli · 2020-10-12T09:52:47.062Z · score: 3 (3 votes) · EA(p) · GW(p)

I think I mostly agree with this, and I'd also like to clarify that I don't think this problem originates from EA or from my contact with EA. It is not that I feel that "EA" demands too much of me, rather that when I focus a lot on impact potential it becomes (even more) difficult to separate self-worth from performance.

Different versions of contingent self-worth (contingent self-esteem, performance-contingent self-esteem - there are a lot of similar concepts and I am not completely sure about which terms to use, but basically the concept that how much we like and value ourselves is connected strongly to our ability to perform) seem to be a problem for a lot of people outside of EA, that also relates to the risk for burn-out.

My thinking is that there are people with this issue in EA, possibly more than in the general population, and that even though it does not come from EA philosophy there is some relation between these types of self-worth issues and a focus on instrumental value. I'm not arguing that this is "right" or useful, I think it'd be a lot better if we could all have a strong and stable sense of non-contingent self-worth.

comment by ld25 (lucyduncan) · 2020-10-20T16:00:56.258Z · score: 1 (1 votes) · EA(p) · GW(p)

Thanks for the post, C Tilli. I often feel, like you, that I don't deserve to listen to my own needs as my life is so much better than the lives of the majority of people alive today. When I am feeling down about this, my partner sometimes reminds me that we don't have the capacity to completely 'overcome our biology'; we will always care about ourselves and our loved ones more than we care about other people. Whilst you may have acknowledged that you have an obligation to care about far-away strangers, you won't ever be able to make yourself care about them as much as you care about yourself. I think that it's okay to acknowledge this, and not expect yourself to be a robot who completely disregards their own desires to attend to the needs of the world.

No-one will ever be a perfect utilitarian. Even if you could achieve it for one day, you would probably be so exhausted afterwards that you would spend days recovering and not doing anything useful with your time. 

I recently had a bout of increased scrupulosity regarding altruism, and I found that returning to this post [EA · GW] helped. I hope you feel better soon, and please feel free to send me a message if you would like to talk about this.