What Makes Outreach to Progressives Hard

post by Cullen_OKeefe · 2021-03-14T00:32:57.180Z · EA · GW · 76 comments

Contents

  My Background (Or, Why I am Qualified to Talk About This)
  Reasons Progressives May Not Like EA
    Legacy of Paternalistic International Aid
    The Oppression Worldview
    Econ Aversion
    Diversity, Equity, and Inclusion (DEI) Issues
    Incompatibility Between Intersectionality and Prioritization
    Systemic Change and a Preference for State Action
    Ordinal Speciesism
    Population Ethics
  Guesses at How To Improve Messaging to Progressives
    Intervene Early
    Develop and Highlight Community Feedback Mechanisms
      A Digression on GiveDirectly
    Use the Right Words/Framings
    Improve DEI
    Bring Policy In Earlier
    Build Alliances
  Things We Shouldn't Do: Reduce Intellectual Rigor

This post summarizes some of my conclusions on things that can make EA outreach to progressives hard, as well as some tentative recommendations on techniques for making such outreach easier.

To be clear, this post does not argue or assume that outreach to progressives is harder than outreach to other political ideologies.[1] Rather, the point of this post is to highlight identifiable, recurring memes/thought patterns that cause progressives to reject or remain skeptical of EA.

My Background (Or, Why I am Qualified to Talk About This)

Nothing in here is based on systematic empirical analysis. It should therefore be treated as highly uncertain. My analysis here draws on two sources:

  1. Reflecting on my personal journey as someone who transitioned from a very social-justice-y worldview to a more EA-aligned one (and therefore understands the former well), who is still solidly left-of-center, and who still retains contacts in the social justice (SJ) world; and
  2. My largely failed attempts as former head of Harvard Law School Effective Altruism to get progressive law students to make very modest giving commitments to GiveWell charities.

Given that the above all took place in America, this post is most relevant to American political dynamics (especially at elite universities), and may very well be inapplicable elsewhere.[2]

Readers may worry that I am being a bit uncharitable here. However, I am not trying to present the best progressive objections to EA (so as to discover the truth), but rather the most common ones (so as to persuade people better). In other words, this post is about marketing and communications, not intellectual criticisms. Since I think many of the common progressive objections to EA are bad, I will attempt to explain them in (what I take to be) their modal or undifferentiated form, not steelman them.

Relatedly, when I say "progressives" throughout the rest of this post, I am mainly referring to the type of progressive who is skeptical of EA, not all progressives. There are many amazing progressive EAs who do not see these two ideologies as being in conflict at all. And many non-EA progressives will believe few of these things. Nevertheless, I do think I am pointing to a real set of memes that are common—but definitely not universal—among the American progressive left as of 2021. This is sufficient for understanding the messaging challenges facing EAs within progressive institutions.

Reasons Progressives May Not Like EA

Legacy of Paternalistic International Aid

Many progressives have a strong prior against international aid, especially private international aid. Progressives are steeped in—and react to—stories of paternalistic international aid,[3] much in the way that EAs are steeped in stories of ineffective aid (e.g., Playpumps [EA · GW]).

Interestingly, EAs and progressives will often (in fact, almost always) agree on what types of aid are objectionable. However, we tend to take very different lessons away from this.

EAs will generally take away the lesson that we have to be super careful about which interventions to fund, because funding the wrong intervention can be ineffective or actively harmful. We put the interests of our intended beneficiaries first by demanding that charities demonstrably advance their beneficiaries' interests as cost-effectively as possible.

Progressives tend to take a very different lesson from this. They tend to see this legacy as objectionable due to the very nature of the relationship between aid donors and recipients. Roughly, they may believe that the power differential between wealthy donors from the Global North and aid recipients in developing countries makes unobjectionable foreign aid either impossible or, at the very least, extremely difficult. They may therefore prefer aid frameworks in which parties approach each other more as equals[4] or in which there is high-context transfer of feedback from recipients to donors. Of course, these heuristics will tend to privilege interventions within existing communities and be harder to deploy internationally—hence progressives' skepticism of foreign aid. The fact that this in effect entirely cuts off the world's poorest people from aid counts for very little in the progressive worldview, probably as a result of the act-omission distinction: the bad to be avoided is paternalistic international aid, and simply abstaining from international aid is an easy way to do that.

The Oppression Worldview

Modern progressivism focuses a lot on oppression, which may be defined (from their perspective) as social systems that cause some groups to receive preferential treatment or disparate rewards over other, equally worthy groups.

For reasons that elude my comprehension, many progressives do not seem to conceptualize the current assortment of economic and legal policies that cause some countries to be ~100x richer than others as a relevant form of oppression. If they do, they are unlikely to give it as high a priority as, e.g., within-country racial disparities or within-country economic inequality.

A full analysis of why, exactly, global poverty is often not treated as a leading form of injustice by many progressives (as evidenced by the comparatively few progressive resources that go towards it) seems very valuable, and I cannot yet provide it. But I do feel confident in saying that to many progressives, global poverty is apparently a non-central example of oppression, or a lower-priority one.[5]

Econ Aversion

Many progressives are skeptical of the tools of modern economics, believing them (inaccurately, in my view) to play a central role in legitimating domestic income inequality and other maladies. This is probably due to the domestic political tendency of the right to emphasize the value of markets and economic growth more than the left (which tends to focus more on economic equality). Thus, progressives may tend to have a negative reaction to EAs relying on economic concepts and tools, including things like cost-benefit analyses, marginal thinking, and QALYs. They may also distrust interventions that leverage market forces or promote economic growth as such. They may tend to believe, despite evidence from economic history, that extreme poverty is solely the result of past injustices, which may have implications for how we ought to understand our moral obligations to the global poor. They are also very hesitant to accept that global poverty is much worse than domestic poverty in extent and severity, which leads to a larger focus on the latter.

Diversity, Equity, and Inclusion (DEI) Issues

Progressives see shared identity as very important to understanding and advocating for the interests of a group. If a group claims to be advocating for some group X, but lacks a member of X in its leadership, this will make progressives very suspicious. Specifically, when EAs purport to advocate for members of the global poor, but our leadership lacks people from the world's poorest countries, they are immediately skeptical that we actually do have their best interests in mind, or can effectively advocate for them. This and the Legacy of Paternalistic International Aid (see above) reinforce each other.

Incompatibility Between Intersectionality and Prioritization

Intersectionality is one of the dominant frameworks on the progressive left for understanding and advocating for social change. The academic and popular uses of intersectionality differ, but the slogan "[t]here is no such thing as a single-issue struggle, because we do not live single-issue lives"[6] captures much of how this is currently understood and used in progressive spaces.

Intersectionality thus implies a strong anti-prioritization framework—or at least a hesitancy to engage in prioritization. Intersectionality implies that narrow prioritization (e.g., an AIDS charity prioritizing education and condom distribution over ART) is pro tanto objectionable insofar as it fails to consider, and allocate equal resources to, the differing needs of all members of a population.

Systemic Change and a Preference for State Action

Seasoned EAs will no doubt be aware that our critics on the left are some of the biggest proponents of the systemic change objection to EA. Progressives seem more likely to believe that major problems can or should only be solved through dramatic restructuring of society, in ways that EAs may be skeptical of by default for a variety of reasons. And when both agree on the need for some form of systemic change, they may often disagree on what that should look like.

Ordinal Speciesism

Some people on the progressive left (especially in the US, it seems) are averse to animal advocacy due to what I will call "ordinal speciesism": the belief that prioritizing animal welfare over human welfare is objectionable. Consider the following quotes from this article (which I just selected arbitrarily because it seemed pretty representative of views I see):

White vegans’ priority is the top layer of veganism–animal exploitation, but they ignore the socio-economic impact that comes from the movement becoming more popularized. Some white vegans even go as far to compare historical genocides that have affected BIPOC to the workings of the meat and dairy industries. . . . Veganism can only be about the liberation of animals when it also stops the oppression of people.

The idea that the two can be meaningfully analyzed separately, and that it may be appropriate to prioritize animal welfare over human welfare, is anathema to this worldview, apparently.

Population Ethics

EAs tend to reject person-affecting views of population ethics. This, however, has uncomfortable implications for some hot-button issues on the left, like reproductive rights and environmental ethics.

Guesses at How To Improve Messaging to Progressives

I am not a messaging expert, and have not had any overwhelming success at getting progressives more interested in EA. With that said, here are some of my guesses at what a more progressive-friendly approach to EA messaging could look like. Of course, this does not consider important tradeoffs, such as the potential for alienating other audiences. This will therefore be most useful to people whose primary audience is progressives.

Intervene Early

I would consider progressive-friendly messaging from the outset of any public-facing communications, not as a band-aid to be deployed in response to criticisms from the left. First impressions are really important, and so starting messaging with things that are, at the very least, not off-putting to progressives should help advance conversation without as much negative reaction.

Develop and Highlight Community Feedback Mechanisms

As a fairly welfarist and quantitative bunch, internal EA discussion on charity evaluation focuses a lot on cost-benefit analyses and much less on qualitative factors that either inform or complement such analyses when making final recommendations. I don't think this is substantively wrong, but I do think it can give the impression that EAs care a lot about quantified spreadsheet inputs and not human factors like recipients' assessments of charities. The latter should not simply be treated as a "nice to have": if CEAs and users' assessments of a program differ dramatically, we can suspect that something has gone wrong, and end-users/recipients can be extremely valuable sources of feedback and suggestions for improvement.

I am not an expert on GiveWell's evaluation process, and am aware that they do do some of this already, but I still think EA as a community could benefit from maybe roughly doubling(?) our cultural attention to the existence and performance of community feedback mechanisms for human-facing charities. This has been a philosophical commitment since the early days of EA, yet information on how we (or the charities we prioritize) actually confirm with recipients that our programs are having the predicted positive impact on them receives, AFAICT, little attention in EA.[7] It may also be that many top charities simply don't have good user feedback mechanisms because donors don't demand them, in which case we should probably encourage more charities to develop them anyway. Mechanisms like accessible feedback hotlines and recipient ombuds may be worth exploring further.

A Digression on GiveDirectly

GiveDirectly is often highlighted as a standout charity on this point, for good reason: features like GDLive and their customer support centers (and, of course, their general model), generally make clear that they care deeply about trusting and receiving honest feedback from end-users. But to the extent that EAs point to GD when this objection is raised without caring about whether other GiveWell charities (which generally receive more funding) have similar mechanisms in place, it feels like a bit of a motte-and-bailey.

Use the Right Words/Framings

Many EA actions can be accurately framed in ways that are more palatable to a progressive worldview. I often remember this quote from a Yale EA as an example:

For me, taking the Giving What We Can pledge was an expression of my commitment to using my class privilege to contribute to a movement towards a more equitable world for current and future generations

Note how this isn't framed in terms of maximizing QALYs/dollar or generalized impact, but rather as "using class privilege" to achieve "a more equitable world." Not only is this still quite faithful to EA principles, but it's also much more palatable to a progressive audience. Similarly, EAs can consider reframing global health/development work as working towards "global health justice," "global income equality," or "global healthcare access," while also highlighting the tools we use to prioritize between interventions in those cause areas.

Improve DEI

While I think it's very easy to focus too much on DEI efforts at the expense of impact, I also think that improving DEI in leadership at global health charities—and especially inclusion of people from the recipient countries—can send a good signal about the relationship between the charity and the populations it intends to serve. Such leaders can probably also provide valuable perspective about the communities in which the charity is operating. At the very least, I think it poses a huge communications liability for a lot of these charities among Western progressive audiences.[8]

Bring Policy In Earlier

A common way to communicate about EA is to first talk about "finding the most cost-effective charities" or something similar, then explain the true scope of our ambitions (including policy goals) only later. This mirrors EA's internal evolution from global health prioritization to the inclusion of animals and, ultimately, future generations. Policy interventions came pretty late in this evolution.

But as policy becomes an ever-larger part of the EA portfolio, this message makes less and less sense, and reinforces the perception of EA as averse to enacting systemic change. EA should figure out catchy messages about the types of policy work we support, as we have done for our charitable work.

Build Alliances

There are a lot of topics on which EA will have shared interests with typical progressive causes, like environmentalism, climate change, tax justice, welfare spending, immigrants' rights, incarceration reform, and pacifism. Where possible, EA groups should consider showing up for and helping to promote and organize events with common interests. This should enhance our credibility in those spaces.

Things We Shouldn't Do: Reduce Intellectual Rigor

I think there are serious problems with a lack of intellectual rigor and openness in many progressive spaces today. Though I am quite liberal myself, this is one reason I prefer EA spaces to typical progressive ones. I think intellectual rigor remains vitally important to the project of EA, and nothing in this post should be taken to suggest that we should reduce our emphasis on it.


  1. Indeed, EAs tend to be more progressive/left-of-center than the general population. See this post [EA(p) · GW(p)] ↩︎

  2. Inshallah. ↩︎

  3. Christian missionary work is often an archetypal example of this. The ABC approach to AIDS prevention may be another. ↩︎

  4. The rise of "mutual aid" as a framework for aid in leftist circles is an example of this. ↩︎

  5. As measured by revealed preferences in the form of comparative resource allocation. ↩︎

  6. Audre Lorde, Learning from the 60s, in Sister Outsider: Essays & Speeches by Audre Lorde 138 (2007). ↩︎

  7. As an example, after ten minutes of searching I could not find information on GiveWell's overall view on this subject on their website. ↩︎

  8. E.g., as far as I can tell, there's not a single person from sub-Saharan Africa on AMF's current staff, trustees, or Malaria Advisory Group. I think this is a pretty big optics liability for them among progressive audiences, independent of its substantive importance. ↩︎

76 comments

Comments sorted by top scores.

comment by meerpirat · 2021-03-14T09:02:00.774Z · EA(p) · GW(p)

Thanks for writing this, I think this topic is worthy of more discussion.

Of course, this does not consider important tradeoffs, such as the potential for alienating other audiences. This will therefore be most useful to people whose primary audience is progressives.

I wonder how much we should even recommend leaning into the progressive/social justice framing when the audience primarily comes from this ideological bent.

  • I often find talk about privilege unproductive and used in a hostile/shaming kind of way, and I feel mixed about suggesting that this is part of the motivation of EA (which I prefer seeing as sth. like "we share the desire to help others and improve the world as much as possible") and about bringing more people with that mindset into EA
  • people that are not from the social justice bent might be especially worthy to attract in situations where progressives are the main audience, in order to gain intellectual diversity

If I'd read this testimonial on the local EA website, there'd be a solid chance I'd have been significantly less interested, because it doesn't connect to my altruistic motivations and (in my head) strongly signals a political ideology.

For me, taking the Giving What We Can pledge was an expression of my commitment to using my class privilege to contribute to a movement towards a more equitable world for current and future generations

I think some points you mention, like highlighting more that aid recipients’ feedback is strongly taken into account, don’t risk turning off non-social justice people while still connecting to their motivation and worries, so maybe I’d wish to see more of that kind.

Replies from: Julia_Wise
comment by Julia_Wise · 2021-03-15T17:30:44.884Z · EA(p) · GW(p)

I think "The Privilege of Earning to Give [EA · GW]" by Jeff Kaufman (who I'm married to) helped bridge a gap between us and our non-EA friends, who tend to have much more standard leftist views than we do.

comment by Max_Daniel · 2021-03-17T11:20:30.408Z · EA(p) · GW(p)

[I was sympathetic to common progressive/left-wing/social justice views before encountering EA. I'm from Germany, so my experience might not apply as much to the US.]

I'm wondering if another reason why some progressives may not like EA is a much more cynical prior about the intentions of powerful people and institutions, plus an unwillingness to update away from it or inability to identify evidence that would allow for such updates. 

E.g. it strikes me that before I encountered EA the only context in which I ever had heard about the Gates Foundation was in contexts where it was at least implied that obviously we should expect its activities to serve Gates's private interests rather than the common good. It requires some knowledge about the particulars of the Foundation's activities, and context to understand how it differs from the activities of other foundations, to come to a more sympathetic view. 

Replies from: ljusten
comment by ljusten · 2021-03-17T15:55:24.382Z · EA(p) · GW(p)

I think you have a very good point, Max. One thing I have repeatedly witnessed in conversations with progressive college students is a borderline-conspiratorial distrust of big institutions (e.g. government, courts, public education) and a favorable view of "dramatic restructuring of society, in ways that EAs may be skeptical by default of for a variety of reasons."

For example, the student government at my university will almost weekly publish some scathing article or series of tweets about the University's unwillingness to ban police from campus or mandate a second ethnic studies requirement, in which it is assumed that because the university does not immediately bend to the will of progressive students, it is at best unsupportive of students of color and at worst racist. Once branded as such, an institution can quickly fall out of favor even when making many other accommodations or structural changes.

The phenomenon is similar to the single-issue voter, where the only relevant criterion for trusting or mistrusting institutions is their stance on progressive race/gender issues. Even when said institution does a lot of really good things, these can become overlooked or ignored after a single accusation of perpetuating inequality, racism, or some other form of unacceptable behavior.

Like you brought up in your post above, I believe this is in part because progressives care a lot about the 'Fairness/Cheating' foundation, whereas EAs might just look at the outcomes or consequences of an institution and be like "wow, that is doing a lot of great work."

comment by HowieL · 2021-03-14T14:37:25.412Z · EA(p) · GW(p)

Indeed, IIRC, EAs tend to be more progressive/left-of-center than the general population. I can't find the source for this claim right now.

 

The 2019 EA Survey says:


"The majority of respondents (72%) reported identifying with the Left or Center Left politically and just over 3% were on the Right or Center Right, very similar to 2018."

https://forum.effectivealtruism.org/posts/wtQ3XCL35uxjXpwjE/ea-survey-2019-series-community-demographics-and#Politics

Replies from: timunderwood, Cullen_OKeefe
comment by timunderwood · 2021-03-23T09:58:33.435Z · EA(p) · GW(p)

I think the survey is fairly strong evidence that EA has a comparative advantage in terms of recruiting left and center left people, and should lean into that.

The other side though is that the numbers show that there are a lot of libertarians (around 8 percent) and more 'center left' people who responded to the survey than there are 'left' people. There are substantial parts of SJ politics that are extremely disliked amongst most libertarians, and lots of 'center left' people. So while it might be okay from a recruiting and community stability pov to not really pay attention to right wing ideas, it is likely essential for avoiding community breakdown to maintain the current situation where this isn't a politicized space vis a vis left v center left arguments.

Probably the ideal approach is some sort of marketing segmentation where the people in Yale or Harvard EA communities use a different recruiting pitch and message that emphasizes the way that EA is a way to fulfill the broader aim of attacking global oppression, inequity and systemic issues, while people who are talking to Silicon Valley inspired earn-to-give tech bros should keep with the current messages that seem to strongly resonate with them.

More succinctly:  Scott Alexander shouldn't change what he's saying, but a guy trying to convince Yale Law students to join up shouldn't sound exactly like Scott.

Epistemologically this suggests we should spend more time engaging with the ideas of people who identify as being on the right, since this is very likely to be a bigger blindspot than ideas popular with people who are 'left wing'.

comment by Cullen_OKeefe · 2021-03-14T23:00:56.299Z · EA(p) · GW(p)

Thanks!

comment by Max_Daniel · 2021-03-17T11:25:28.008Z · EA(p) · GW(p)

One explanation based on Haidt's Moral foundations theory would be:

  • Progressives tend to be motivated by both the 'Care/Harm' and 'Fairness/Cheating' foundations.
  • EAs tend to be motivated almost exclusively by the 'Care/Harm' foundation, or at least by philosophical views that rationalize 'Care/Harm' intuitions but often recommend to disregard 'Fairness/Cheating' intuitions in a way that seems objectionable to progressives.

I don't know much about how solid moral foundations theory is, and haven't thought much about how plausible I find this explanation or how much of the effect I'd guess it explains.

Replies from: Jordan_Warner
comment by Jordan_Warner · 2021-03-18T09:04:17.400Z · EA(p) · GW(p)

I honestly think that the progressive movement increasingly values Loyalty (i.e. you're not a "real" minority if you're politically conservative) and Sanctity (saying the N-word or wearing blackface makes white people "unclean" in a way that cannot fully be explained by the Care/Harm framework). So if anything, I think Haidt's Moral Foundations theory is more right than even Haidt suspected; the taboos and tribes of the Left are simply still being defined.

Replies from: Max_Daniel
comment by Max_Daniel · 2021-03-18T13:21:57.658Z · EA(p) · GW(p)

Interesting, yeah. That sounds at least partly right to me, though I don't know enough about moral foundations theory or current progressive discourse to have a strong take on how much I believe in this vs. other explanations for the observations you've described.

comment by Benjamin_Todd · 2021-03-16T12:08:34.210Z · EA(p) · GW(p)

Thank you for this summary!

One thought that struck me is that most of the objections seem most likely to come up in response to 'GiveWell style EA'.

I expect the objections that would be raised to a longtermist-first EA would be pretty different, though with some overlap. I'd be interested in any thoughts on what they would be.

I also (speculatively) wonder if a longtermist-first EA might ultimately do better with this audience. You can do a presentation that starts with climate change, and then point out that the lack of political representation for future generations is a much more general problem.

In addition, longtermist EAs favour hits based giving, and that makes it clear that policy change is among the best interventions, while acknowledging it's very hard to measure effects, which seems more palatable than an approach highly focused on measurement of narrow metrics.

Replies from: Stefan_Schubert, JoshYou, EmmaAbele
comment by Stefan_Schubert · 2021-03-16T13:18:25.479Z · EA(p) · GW(p)

There might be a risk that some view the (very) long-run future as a "luxury problem", and that focusing on that, rather than short-term problems in your own country, reveals your privilege. (That attitude may be particularly common concerning causes like AI risk.) My guess is that people are less likely to have such an attitude towards someone who is focusing on global poverty. 

comment by JoshYou · 2021-03-17T01:39:59.116Z · EA(p) · GW(p)

Longtermism isn't just AI risk, but concern with AI risk is associated with an Elon Musk-technofuturist-technolibertarian-Silicon Valley idea cluster. Many progressives dislike some or all of those things and will judge AI alignment negatively as a result.

Replies from: Jordan_Warner
comment by Jordan_Warner · 2021-03-18T09:07:27.203Z · EA(p) · GW(p)

I wonder if it's a good or bad thing that AI alignment (of existing algorithms) is increasingly being framed as a social justice issue. Once you've talked about algorithmic bias, it seems less privileged to then say "I'm very concerned about a future in which AI is given even more power".

comment by EmmaAbele · 2021-06-06T19:55:20.572Z · EA(p) · GW(p)

In talking to many Brown University students about EA (most of whom are very progressive), I have noticed that longtermist-first and careers-first EA outreach does better, seemingly because of these objections that come up in response to 'GiveWell style EA'.

comment by AGB · 2021-03-17T11:18:33.351Z · EA(p) · GW(p)

This has been a philosophical commitment since the early days of EA, yet information on how we (or the charities we prioritize) actually confirm with recipients that our programs are having the predicted positive impact on them receives, AFAICT, little attention in EA.

[Within footnote] As an example, after ten minutes of searching I could not find information on GiveWell's overall view on this subject on their website.

 

FWIW, the most closely related GiveWell article I'm aware of is How not to be a "white in shining armor". Relevant excerpts (emphasis in original):

We fundamentally believe that progress on most problems must be locally driven. So we seek to improve people’s abilities to make progress on their own, rather than taking personal responsibility for each of their challenges. How can we best accomplish this?...

A common and intuitively appealing answer is letting locals drive philanthropic projects...At the same time, we have noted some major challenges of doing things this way. Which locals should be put in charge?...

Another approach to “putting locals in the driver’s seat” is quite different. It comes down to acknowledging that as funders, we will always be outsiders, so we should focus on helping with what we’re good at helping with and leave the rest up to locals...

It’s not that we think global health and nutrition are the only important, or even the most important, problems in the developing world. It’s that we’re trying to focus on what we can do well, and thus maximally empower people to make locally-driven progress on other fronts.

comment by Meadowlark · 2021-03-17T02:48:27.512Z · EA(p) · GW(p)

Great post! I think this is an issue worth a lot of exploration. My sense though—both from reacting to your post and from my own reflection—is that there is probably a pretty low ceiling in terms of how much is possible here. I'll speak from my own experience as both a fan of EA and as a leftist.

  1. It seems to me that EA, right now, has two areas of congregation (very broadly speaking): university/city groups and professional networking circles. So if you're involved in EA, you're probably one of the following: a student, someone with a pretty niche expertise, or someone in between. You might have a graduate degree from a top university, and you might be a serious contender for some pretty "big" jobs at important institutions. Pretty much, you might be (obviously this is a generalization) a member of the "professional-managerial class" (PMC). This class status—which is distinct from both the working class and capital owners—is, I think, what EA will always be, and EA will therefore always appear (understandably) "elitist" to leftists who are sensitive to working-class politics. To many leftists, EA will always appear to be a niche intellectual exercise being done by members of the PMC, and will never be truly available to members of the working class, whom leftists view as the true source of political power.

  2. Smaller point, but I would distinguish between progressives (say, like Elizabeth Warren or Ezra Klein) and leftists (like Bernie Sanders or Elizabeth Bruenig). These two groups have similarities but differ in one of the most important ways: their views on capitalism. The former is more likely, it seems, to be interested in EA (especially people like Klein, who of course already likes EA), but the latter will never be fully into EA because EA, generally speaking, does not make fundamental critiques of global capitalism. You can work on things like diversity, equity, and inclusion, but a failure to criticize capital will lead to a dead-end in how many people on the left are interested, because that's what the left is.

So, I guess my point is that although I think this is important, unless you can make EA look more like a working-class movement (or at least less like a movement created and run by the PMC), there will never be much overlap between leftists and EAs.

As both an EA and a leftist myself, this is of course very troubling to me if true! 

Replies from: Linch, Jordan_Warner, timunderwood
comment by Linch · 2021-03-18T17:17:53.746Z · EA(p) · GW(p)

Speaking descriptively, are most active leftists members of the working class rather than the PMC? My impression is that while many working-class people have implicitly leftist views on economics, the demographics that leftists predominantly draw from for activism is the highly educated PMC class, similar to EA. 

This impression can of course be wrong due to selection bias of who I end up talking with, so I'd personally find it valuable to correct for this bias! 

Replies from: Meadowlark
comment by Meadowlark · 2021-03-18T19:04:24.160Z · EA(p) · GW(p)

Good point! My intuition is that it's probably true that self-identified leftists are often indeed members of the PMC. But this could be in part because of a similar selection bias on my part.

I think the difference, though, is that left politics often draws power from the working class, even though the working class of course contains people of very diverse political viewpoints. Not everyone striking in a labor union is necessarily a self-identified socialist, but the political act they're engaging in arguably is one.

Whereas with EA, both the membership of the community and the places where power is located in the community are mostly PMC (with exceptions). Descriptively, most EAs are well-educated and so on, and most EA solutions are ones that would occur to well-educated people.

comment by Jordan_Warner · 2021-03-18T09:11:55.431Z · EA(p) · GW(p)

I feel EA would be very interested in a socialist running a cost-benefit analysis of the global proletariat revolution; the 20th century has presumably given us enough data to make it less speculative than a lot of things EAs are concerned about.

Replies from: Garrison
comment by Garrison · 2021-03-18T14:33:28.137Z · EA(p) · GW(p)

The thing that dem socs in the US want, a socialist economy and government, hasn't really happened in a rich country. The closest example would be Sweden in the 70s. I don't think there's much value in comparing the results of left-wing revolutions in extremely poor and war-ravaged countries with what might happen if dem socs like Bernie Sanders were able to enact their agendas in rich countries. The most economically left-wing governments and societies in the rich world, i.e. Scandinavia, are some of the best places to live based on a whole host of metrics.

Replies from: Jordan_Warner
comment by Jordan_Warner · 2021-03-19T07:28:42.589Z · EA(p) · GW(p)

I think it's important to be clear that Scandinavian social democracy is not a socialist economy or a socialist government. I'm a big fan of the Nordic countries and think they'd be great to emulate, but (like all countries) Sweden is somewhere in between "capitalism" and "socialism," using taxation and a strong welfare state to ensure that the benefits of capital are widely distributed without total redistribution. Based on the 20th century, I'm pretty confident that the optimal system of government has both free markets and government control.

I see the capitalist/socialist false dichotomy as a relic of the Cold War, with neither side able to admit that the other had a point. Total laissez-faire capitalism is pretty unpleasant for the people on the bottom, but it's the height of hubris to think the government can centrally plan the entire economy. And as soon as the Chinese stopped trying, it turned out pretty well for them!

comment by timunderwood · 2021-03-23T10:10:59.175Z · EA(p) · GW(p)

Possibly the solution should be to not try to integrate everything you are interested in.

By analogy, both sex and cheesecake are good, but it is not troubling that for most people there isn't much overlap between sex and cheesecake. EA isn't trying to be a political movement; it is trying to be something else, and I don't think this is a problem.

Replies from: Meadowlark
comment by Meadowlark · 2021-03-23T13:45:46.114Z · EA(p) · GW(p)

I think this is more or less correct. EA is not destined to be compatible with everything that we care about, and I think we should be thinking hard about what EA is capable of being and that the project of bringing in leftists is way more difficult than a few messaging tweaks. Those tweaks might bring in a few left-liberals, but once many leftists really see EA—i.e. as more than just a "you should donate more effectively" project—they will not be super interested, I think. 

comment by Cullen_OKeefe · 2021-03-24T05:43:33.256Z · EA(p) · GW(p)

Another thought I meant to include with my original post:

These reflections/experiences have also led me to believe that, all else equal, EA groups at colleges are more valuable than ones at grad schools. Anecdotally, One For The World college chapters were much more successful on average than HLS's, despite HLS grads' higher earning potential. My model is that many people adopt the sort of EA-skeptical progressive worldview described here in college, which makes outreach in grad schools harder.

I think making EA a viable alternative or complement that college students are exposed to during their formative years would be very valuable for this reason.

Replies from: jlewars
comment by jlewars · 2021-04-04T04:06:35.424Z · EA(p) · GW(p)

Thanks for the mention :-)

Not sure how helpful this is, but grad schools typically move more money (certainly per pledger/per student/per class, etc., and often in absolute terms). We have no idea yet of the long-term changes in attitudes/actions and how those relate to school type.

Also FWIW someone just started raising OFTW pledges at HLS and is absolutely crushing it - about $20k/annum of pledges in about a fortnight!

Replies from: Cullen_OKeefe
comment by Cullen_OKeefe · 2021-04-04T23:43:17.457Z · EA(p) · GW(p)

Ah great, very happy to hear about the broader success. Seems like the causes may have been more specific to my own approach while leading HLSEA.

comment by Cullen_OKeefe · 2021-03-24T16:03:14.310Z · EA(p) · GW(p)

Discussion of progressive speciesism on the latest 80,000 Hours podcast:

Robert Wiblin: What’s something important that your political fellow travelers get really wrong, in your view?

Ezra Klein: Animal rights. Maybe since I’ve already said that, you want me to do a different one. But I do first want to say just animal rights.

Ezra Klein: I think this is just a tremendous quantity of suffering that a political movement that thinks of itself as concerned with suffering ignores. Not only ignores, but mocks and dismisses. A lot of people who think of themselves as good on all these issues, you say, “Well, how about we don’t torture so many chickens?” They’re like, “Oh, you crazy vegan.”

Robert Wiblin: Yeah.

Ezra Klein: I really don’t like it. I think it’s a way we teach ourselves to be less compassionate.

comment by Harry_Taussig · 2021-03-15T16:49:47.902Z · EA(p) · GW(p)

Thanks for writing! This definitely helped clarify some of the push-back I often get when trying to explain these ideas to friends.

For reasons that elude my comprehension, many progressives do not seem to conceptualize the current assortment of economic and legal policies that cause some countries to be ~100x richer than others to be a relevant form of oppression. If they do, they are unlikely to give it as high a priority as, e.g., within-country racial disparities or within-country economic inequality. 

This will definitely stick with me. It seems the only way to get around this contradiction is to just not think about it, but maybe I'm missing something?

Replies from: Cullen_OKeefe
comment by Cullen_OKeefe · 2021-03-15T23:48:24.639Z · EA(p) · GW(p)

I think it's a matter of prioritization and non-quantification: they either don't really appreciate how much bigger/worse extreme poverty is, or else agree that it's very bad but just don't want to get involved in stopping it because they're worried about being Neo-Colonialist or something similar and it's easier to just focus on the domestic context.

Replies from: sky, Bluefalcon
comment by sky · 2021-03-19T16:45:36.740Z · EA(p) · GW(p)

I haven't read this whole thread, so forgive me if I'm re-stating someone else's point. I think there's another explanation: they have a hypothesis about you/EAs/us that we are not disproving.

My experience has been that people in any numerical or social minority group (e.g. Black Americans, people with disabilities, someone who is the "only" person from a given group at their workplace, etc.) are used to being met with disappointing responses if they try to share their experiences with people who don't have them (e.g. members of the numerical or social majority group that they are different from). Most of us have had this experience at least some of the time, maybe as EAs! People get blank stares, unwanted pity or admiration, or outright dismissal and invalidation (e.g. "it can't be all that bad" or "you're just playing the [race/poverty/privilege/whatever] card"). This is definitely the kind of conversation people see over and over again on the internet. So, until proven otherwise, that's what people expect: majority group members are expected to be ignorant of what life is really like for people who experience it differently. I think this is a rational expectation at least some of the time.

The hypothesis then goes: EAs look like majority group members and often are, ergo anything EAs say about which problems are "most important" is assumed to be somewhat ignorant. Maybe people see it as well-meaning or callous ignorance. Regardless, ignorance is assumed as most probable, because it's true of most people. (I think EAs and progressives also have different models of when ignorance matters most and when differences matter most, but that's a different thread.)

I've usually taken the view that I don't get to assume people will see me as an informed, compassionate person on the progressive left until I disprove the hypothesis above. If the first thing I say is something like why local US poverty issues are "less important" than other issues, I've just reinforced the hypothesis rather than disproven it. It sounds like denying the reality that they know is true -- they've seen the real-life people impacted and/or read their stories or studied the human impact of these issues.  At least in my case, it's not true that they struggle to think of people in other countries as real people too. (My progressive friends have often lived abroad, have family in other countries, or work in immigrant communities). It's a trust issue. If they see me denying that local issues are "real/important," I must be ignorant, and worse, I must be unwilling to be bothered with the real-life experiences of people different from me. Why should they trust anything I say after that about helping people? "But Africa though!" sounds like a deflection, not a genuine consideration or a sincere, compassionate challenge of their own thinking about poverty. 

When I speak first about things we both care about and share sincere examples of the ways that I do see and care about the depth of personal stress that US poverty and racial disparities have on people I actually know, I haven't had a progressive friend respond by saying that poverty in other countries didn't matter.  I brought it up second though, and that seems to make a difference. If someone trusts that I am a caring, informed person, not a callous ignorant one, we can expand the scope of the conversation from there.

Fwiw, I can't think of a time this has led to changed actions on their part. 

comment by Bluefalcon · 2021-03-16T12:06:07.827Z · EA(p) · GW(p)

I think it's because they know women/POC/trans people/people on whatever fashionable domestic axis of inequality you want to look at, but don't know anyone who lives in Burundi, and because the experience of oppressed people in America is still close enough to their own to actually empathize with. It's a lot easier to empathize with your friend who got called a slur than with someone dying of malaria in Africa: both because they are your friend, and because you've probably been called mean names, maybe even by the same type of asshole tossing slurs at them, whereas deadly diseases that affect young, healthy people are hard to even imagine.

Replies from: Meadowlark, ljusten
comment by Meadowlark · 2021-03-17T16:29:36.011Z · EA(p) · GW(p)

This is a useful point but I would add a little bit to it. People on the left often think about racism, transphobia, and homophobia as quite a bit more than a POC friend of theirs being called a slur. Leftists often think of these as fundamentally systemic issues with very real, often physical, consequences. Like, racism in the US can manifest as, say, an entire generation of poor Black families being poisoned by a local CAFO, or an inability to develop intergenerational wealth due to explicitly racist economic policy.

I think sometimes EAs can offer a rather uncharitable take on the left, like that the left's concern with racism is just "SJW safe space" stuff or whatever. Not saying that's what's happening in this thread, but I would just say that if EA wants to be more open to progressives and leftists, it has to take very seriously what they actually believe.

As an example, I was pleased to see that the broad EA take during the BLM summer protests didn't seem to be just "well, people should donate to AMF instead of buying markers and signs," which may have been the take of 2015 EA. Whether EAs agree with them or not, ideas like socialism, progressivism, social justice, and so on are serious ideas and shouldn't be dismissed in the way that I sometimes have seen them dismissed.

Replies from: Bluefalcon
comment by Bluefalcon · 2021-03-17T23:44:39.350Z · EA(p) · GW(p)

I disagree on socialism being a serious idea in American politics. It's a thing left-wing trolls say to offend right-wingers. Any American who is serious about politics would never call themself a socialist as long as there is room to describe their ideas some other way, even if socialism might fit. Elizabeth Warren has a more left-wing voting record than Bernie Sanders. So you could argue she's a socialist. But she doesn't call herself a socialist, because she's trying to actually get legislation done, and there are other ways to describe her beliefs. 

 

I use the slur example not to be dismissive of social justice, but because it's something a kid at Harvard can understand. No matter how privileged you are you've probably been called mean names at some point, and you can easily see how a racial slur is a worse extension of that.  But that same Harvard kid, while thinking themself a dedicated anti-racist, will generally focus on instances like that over even domestic forms of inequality outside their understanding. I hear a lot of SJ talk about student loans, but not so much about the earned income tax credit, for example. 

Replies from: Garrison, Meadowlark
comment by Garrison · 2021-03-18T15:04:36.874Z · EA(p) · GW(p)

Arguments like this certainly don't help win over leftists to EA.

A self-described democratic socialist nearly won the Democratic nomination for president. The Democratic Socialists of America (DSA) helped get dozens of candidates elected to national, state, and city offices over the last 4 years. Polling shows millennials being more sympathetic to socialism than capitalism.

The Warren example would also quickly get your political analysis dismissed by almost anyone on the left. Warren's lifetime voting record is slightly more left than Bernie's according to this site (https://progressivepunch.org/scores.htm?house=senate), but she took office in 2012, while Bernie became a senator in 2006 after serving 16 years in the House. The Democratic Party has moved to the left over the last 30 years, so more recently elected officials will have a more left record, all else equal. Bernie also received far more support from left-wing organizations than Warren. The last point, that Warren is trying to get actual legislation done while Bernie and other socialists aren't, is just wrong. Bernie played a huge role in shaping the recently passed American Rescue Plan, which is estimated to halve child poverty (https://www.politico.com/newsletters/politico-nightly/2021/03/16/bernie-sanders-joe-manchin-492117).

This argument is hugely dismissive of a significant strain of American politics based on flawed analysis and unsupported assertions.

Replies from: Bluefalcon
comment by Bluefalcon · 2021-03-19T00:04:10.928Z · EA(p) · GW(p)

I am not sure there is value in winning over self-described socialists. There is certainly not if the intention is to get them involved in politics, but I suppose they could be valuable contributors in other career paths if they take a more rational approach to their career than their politics. 

Assuming your values are broadly progressive, the net impact of self-declared socialists' political participation is negative. DSA helped get five Congressional candidates elected in safe Dem seats, and nowhere else.  Looking at Wikipedia's list of current DSA members of Congress, the most conservative district any represents is Jamaal Bowman's NY-16, with a Cook PVI of D+24 (meaning a Democrat typically gets 24 percentage points more here than the national average, and would win 74% to 26% in an evenly divided year.) https://en.wikipedia.org/wiki/List_of_Democratic_Socialists_of_America_members_who_have_held_office_in_the_United_States#United_States_House_of_Representatives. DSA national did not endorse a winning Congressional candidate in any swing district in 2018 or 2020. https://electoral.dsausa.org/past-endorsements/. I'm not going to go look through their locals' endorsements but I'd bet $100 the same holds true there if anyone wants to do the research. 

And their impacts when elected are bad. The Borgen Project, a nonprofit focused on advocating for the needs of the global poor, rated Eliot Engel one of the top 10 champions of the global poor in Congress. https://borgenproject.org/tag/eliot-engel/. Jamaal Bowman replaced him with a more isolationist, more nationalist foreign policy. AOC spends millions and millions of dollars that could go to winning a swing-district seat on social media ads for herself in her safe Dem seat.

Sanders did not come close to winning the Democratic nomination. He had a temporary lead that was a quirk of the timing of various primaries. Biden wound up with 2.5x as many delegates as Sanders. 

Replies from: Garrison
comment by Garrison · 2021-03-19T18:11:05.663Z · EA(p) · GW(p)

I'm  a self-described socialist. I also work at an EA-aligned nonprofit and co-organize one of the largest EA groups in the world. I know plenty of other EAs who do great work and identify as socialists or leftists. 

But maybe EA would be better off without us because our political contributions are objectively wrong according to your analysis. 

Your analysis assumes that the goal of anyone with left of center politics is to flip seats from red to blue, but this is not the goal of the DSA. Obviously, winning majorities is essential to enacting legislation, but the composition of those majorities will change what legislation looks like. In the example I linked above, Bernie was able to significantly influence the American Rescue Plan to get more unconditional cash to people who need it, among other things. In New York State, Dems hold super majorities in the Assembly and Senate. All 5 of DSA's endorsed candidates won their primaries (the actually competitive election). One of them was the lead sponsor on the HALT Solitary Confinement Act, which significantly restricts the usage of solitary confinement (i.e. torture) in New York's corrections facilities and just passed the Assembly and Senate with veto-proof majorities yesterday. 

Eliot Engel:

  • supported the Iraq War
  • opposed the Iran deal
  • supported Saudi's war on Yemen 

Source. (The author of this article also wrote a defense of EA 5 years ago)

The nature of presidential primaries is that there is typically a clear front-runner by some point who captures the lion's share of the remaining delegates. Even so, in 2016 the results were far closer, with Hillary receiving 55% of the popular vote to Bernie's 43%.

Honestly, you sound ideologically opposed to socialism, which is fine. What's frustrating is that you're writing about politics with a certitude that doesn't seem to match your understanding of it. You're picking a few random data points and then asserting that this proves some very broad claim, like that socialists participating in politics is bad for progressives or that Engel is better than Bowman. 

Replies from: Bluefalcon
comment by Bluefalcon · 2021-03-19T23:02:44.220Z · EA(p) · GW(p)

I am not ideologically opposed to anything. I am opposed on empirical grounds to Marxism, and approximately indifferent between centrist democrats and what most Americans refer to as "socialism" on the merits.  I am also empirically opposed to anyone referring to themself as a "socialist" in American politics, because it's a bad tactic in the elections that actually affect people's lives. Even in dem-supermajority legislatures, self-described socialists don't make up enough of the caucus to be the deciding vote on an issue that has a clean left-right divide. 

 

I voted Sanders in 2016 because my uneducated instinct was "progressive good" and because I thought Clinton a particularly weak candidate. Then I learned how bad his record is on immigration (well to the right of Joe Biden, for example), and I have deeply regretted that vote ever since. EA has moved me somewhat toward the Dem establishment and away from the Left because it has given me the tools to prioritize effectively between issues I care about. That was something I always knew I should be doing, but I didn't know how to do it before. I always noticed a strain of America Only-ism in some quarters of the Left that I was uncomfortable with, but it's complicated, because I didn't know how to weigh that against, e.g., the Left not listening to idiots like Larry Summers on economic policy, or the different version of xenophobia that a lot of centrists espouse. And it turns out that the answer is that politicians have to be evaluated individually based on their support for the global poor, not based on ideology. On the merits, Bernie Sanders and Jamaal Bowman are pretty bad, Joe Biden meh but better than a Republican, AOC pretty good, Elizabeth Warren and Cory Booker great. Republicans in the Trump era are consistently bad, but someone like, idk, Lincoln Chafee might have been better than a lot of Dems in the 80s. And America Only-ism is definitely more common among self-described socialists than among people like Elizabeth Warren, who is about as Left ideologically but doesn't adopt the identity. At least in my social circles it seems to be even more pronounced among activist types than among politicians.

Replies from: Garrison
comment by Garrison · 2021-03-20T00:55:18.576Z · EA(p) · GW(p)

Thinking that you can be opposed to a broad ideology on empirical grounds is simply mistaken. You can say something like "the countries that adopted self-described Marxist governments fared worse than they would have otherwise," but even that claim requires a lot of evidence to defend! Marxist revolutions didn't happen in already-wealthy countries with stable institutions. I don't even consider myself a Marxist; I'm just trying to make the point that this stuff is too complicated to make a claim that an ideology is empirically right or wrong.

Ideology is like bad breath: you can't smell your own. You have an ideology, whether you'd like to admit it or not!

I share your wish that American politics weren't so focused on Americans and wish that Bernie were more of an internationalist. However, his platform on immigration in 2020 was better than any other candidate's from an EA perspective IMO, even if his record may not have been great on it. 

comment by Meadowlark · 2021-03-18T02:09:03.304Z · EA(p) · GW(p)

Thanks for the clarification—I should clarify as well. By "serious ideas" I don't mean that they necessarily have a lot of purchase in, let's say, American society. They might (it depends on your measures, and as I noted above we're talking about different things; socialism, progressivism, leftism, etc. can be understood differently) or they might not. What I mean is that they have a rich intellectual history, in the case of socialism an intellectual history much older than EA, and that when a person on the left espouses an idea, it should be judged seriously, as opposed to the way that (not necessarily in this thread) I've seen EAs dismiss ostensibly or actually left-leaning concepts without a lot of deep introspection.

I want to again mention the example of the summer BLM protests. It's possible that 2015-ish EA would have given the trite and unhelpful response of "well, there are worse things happening in X country, so people should instead be donating there" or whatever. But EA has meaningfully grown since then, particularly when it comes to issues of race and justice. By taking those (left-leaning) ideas seriously, there has been a tangible shift, I think, in EA's ability to be compatible with progressives and more inclusive generally. As I mention in another comment, I think there's a ceiling to this, but progress is possible.

EA has changed a lot over the past ~8 or so years, and I think it's moving away from the "Elon Musk Silicon Valley" EA that Dylan Matthews noted several years ago and is morphing into something more diverse and interesting. And I think part of that is because left/progressive/social justice ideas like diversity and racial justice are being taken more seriously (while acknowledging that there is a lot of work that still needs to be done).

To summarize, anytime I see a thoughtful person seriously (and disparagingly) refer to progressives and leftists as something like "SJWs" I cringe because of what that could mean for the future of EA. 

I generally agree with your second paragraph! 

Replies from: Jordan_Warner
comment by Jordan_Warner · 2021-03-19T07:43:21.132Z · EA(p) · GW(p)

I worry that there's a danger in taking the ideas of the left too seriously: if I take ideas like "abolish the police" seriously, I want to respond with the best arguments against them in order to have a productive discussion of criminal justice policy, and I end up denying people's lived experience. I think it would be a very bad idea for EA to take the ideas of the Left seriously in any way that risks seeming critical of them.

Whereas if I don't take the idea seriously and understand it merely as an expression of distaste for modern American policing, I can be much more compassionate and understanding. It's probably better to take the sentiment more seriously than the slogans.

Replies from: Meadowlark, Garrison
comment by Meadowlark · 2021-03-19T14:04:31.873Z · EA(p) · GW(p)

I take your point, but I think I still have some slight pushback. Although I am myself unconvinced of the abolish-the-police position, slogan or not, it seems a bit patronizing to assume that a very real policy proposal—which has some support from real academics and philosophers, including utilitarian ones—is just, like, an "expression of distaste." Maybe I'm misunderstanding your point, and if so please let me know, but I guess that's the kind of dismissal of real ideas (real not necessarily as in "good," just as in "supported by thoughtful people and perhaps defensible in some cases") that I worry EA does too much. One reason I love EA is its ability to deliberate and to really deal with many different ideas in a productive way. I'm not sure it's super productive to refuse to take a political idea seriously because the conversation would (truly) be difficult.

Again, I am curious how many EAs view leftists as something like "people who aren't really good at being thoughtful or serious, but maybe sometimes, when they're not being SJWs, have some valid sentiments" or whatever. That dismissal, insofar as it exists, is I think representative of a deep problem with EA, which in its history has been just as naive and stubborn as any left movement.

I guess I would say that maybe EA understands the left just as little as the left understands EA, and if this is true, then EA is destined to never have a movement that involves the left. 

As an anecdotal example, I've had dozens of conversations with EAs about this kind of stuff and, reliably, they will view "socialism" or "leftism" as synonymous with something like "centrally planned, government-run economy." If that's your only understanding of the left, then you don't understand the left any more than someone who thinks EA is nothing more than day traders donating to AMF understands EA.

Replies from: Jordan_Warner
comment by Jordan_Warner · 2021-03-19T22:10:28.275Z · EA(p) · GW(p)

I'm not saying nobody has thought through the ideas; I find the proposed alternatives to police fascinating, although I'm personally sceptical that they'd actually be better than the existing system. That's an essay all on its own!

My point was just that many people repeat slogans to express feelings rather than to advocate for concrete policy proposals, because everyone has feelings but almost nobody has policy proposals. (Myself included: I have opinions about lots of policy issues, but if I'm honest I don't really understand most of them.) I'm not saying we should dismiss ideas just because most people who advocate for them would struggle to defend them; I'm just recommending against getting into arguments over the minutiae of how community-based restorative justice will actually work in the real world with people who have no idea what you're talking about! It's often more tactful to take people seriously but not literally, especially since slogans remove all nuance from the conversation and make it hard to know what people actually believe: saying "defund the police" could signal anything from supporting modest budget reallocation to literal anarchy!

I agree that treating "the Left" or "Progressives" as a monolithic bloc reveals a lack of understanding, but since Stalin and Hitler are much easier to argue against than what people on the left or the right actually believe, I'm not seeing this cheap rhetorical trick going away any time soon. We definitely should refrain from it though!

Replies from: Meadowlark
comment by Meadowlark · 2021-03-20T15:34:34.937Z · EA(p) · GW(p)

Gotcha! Now I think I understand. This makes sense to me

comment by Garrison · 2021-03-19T18:15:56.251Z · EA(p) · GW(p)

Many EAs support open borders, which to me is in the same general ballpark of "abolish the police". Both are radical breaks from how the world currently is.  Both slogans are open to many different interpretations. And both have a lot of literature and research behind them. But one slogan is popular among EAs, and one isn't. 

Replies from: Meadowlark
comment by Meadowlark · 2021-03-20T15:32:55.913Z · EA(p) · GW(p)

This is a really interesting comparison. A lot of leftists also support more open border policies. 

comment by ljusten · 2021-03-17T16:22:23.871Z · EA(p) · GW(p)

Yes, there is a kind of "narcissism of small differences" in which societal progress is measured in the context of wealthy Western countries instead of the broader world. The social justice initiatives in the U.S. do not benefit or extend to people of color in poorer countries, who often suffer under even more pronounced economic or state injustices (e.g. deadly malaria-carrying mosquitoes, malnutrition, lack of access to healthcare, jobs, education, and the internet, and government oppression). I believe this is in part because people in the U.S. don't know how much worse quality of life can be in poorer or more authoritarian countries.

comment by Larks · 2021-03-14T17:25:46.499Z · EA(p) · GW(p)

EAs tend to reject person-affecting views of population ethics. This, however, has uncomfortable implications for some hot-button issues on the left, like reproductive rights and environmental ethics.

 

I can see why left-wing views on abortion would bias people against totalist views, because they do not want to accept the implication that someone's desire to abort their child could be 'outweighed' by the interests of a possible person. And I guess totalism would also imply we should have more children, in contradiction to the idea that we should have fewer to protect the environment. But it would naively seem that being concerned about the environment would make you more amenable to longtermist views (as distinct from totalism), because if you don't care about future people then most of the damage from climate change can be ignored.

Replies from: Cullen_OKeefe, jackmalde
comment by Cullen_OKeefe · 2021-03-15T23:47:02.957Z · EA(p) · GW(p)

And I guess totalism would also imply we should have more children, in contradiction to the idea that we should have fewer to protect the environment.

This is mostly what I was referring to. Matt Yglesias has often said that he gets a lot of pushback against his One Billion Americans book from leftists who implicitly have some sort of prior against both population growth and economic growth.

Also, as Michael says below, I think they (like most people who aren't moral philosophers) just don't really have coherent population ethics.

comment by jackmalde · 2021-03-14T19:57:30.064Z · EA(p) · GW(p)

Also, person-affecting views can lead to the bizarre conclusion that we don't need to worry much about contributing to climate change because the people in the future wouldn't have existed if we hadn't done so - so we won't actually have harmed them (provided their lives are net good).

AKA the non-identity problem.

Replies from: MichaelStJules
comment by MichaelStJules · 2021-03-15T20:28:48.045Z · EA(p) · GW(p)

I would assume that progressives concerned with the welfare of future generations (maybe most?) don't have these specific kinds of person-affecting views, although most probably have not thought that much about population ethics or metaphysical identity issues at all. I think the closest steelman might look like:

  1. the wide and soft asymmetry view here (Thomas) or here (Frick), which does fine on the non-identity problem,
  2. dying is bad, so extinction would at least be bad for the people who die and don't want to,
  3. and maybe they separately value the preservation of humanity, like this (Frick), or in something like an animal-conservationist way, but more humans isn't (always) better. Or, they aren't actually person-affecting, but recognize decreasing marginal value in additional lives as a population increases.
comment by jushy (sanjush) · 2021-03-16T21:24:38.245Z · EA(p) · GW(p)

Related to this, a reasonable question I can see progressives asking is "Why do EAs not prioritise anti-racism/ feminism/ LGBT rights?" 

While EAs could argue that drug decriminalisation and criminal justice reform in America are closely related to anti-racism, I think there are some important philosophical questions to answer here related to how EA chooses to define a cause area, and why we don't seem to think of anti-racism / feminism / LGBT rights  as cause areas. I have no idea what a good answer would look like.

I also don't think that the last discussion [EA · GW] on this forum of how we define cause areas made much progress.

Replies from: AGB
comment by AGB · 2021-03-17T09:40:37.280Z · EA(p) · GW(p)

Whether this is a ‘good’ answer would depend on your audience, but I think one true answer from a typical EA would be ‘I care about those things too, but I think that the global poor/nonhuman animals/future generations are even more excluded from decision-making (and therefore ignored) than POC/women/LGBT groups are, so that’s where I focus my limited time and money’.

I don’t actually think the cause area challenge is quite what is going on here; I can easily imagine advancing those things being considered cause areas if they had a stronger case.

comment by deluks917 · 2021-03-14T06:31:06.184Z · EA(p) · GW(p)

I think of the intersectionality/social justice/anti-oppression cluster as being a bit more specific than just 'progressive' so I will only discuss the specific cluster. Through activism, I met many people in this cluster. I myself am quite sympathetic to the ideology. 

But I have to ask: How do you hold this ideology while attending Harvard Law? From this perspective, Harvard Law is a seat of the existing oppressive power structure, and you are choosing to become part of that power structure by attending. The privileges that come from attending Harvard Law are enormous. Harvard Law graduates earn extremely high salaries (even the starting salaries are high) and often end up with very high net worths. Harvard Law is also obviously strongly connected to many parts of the neoliberal capitalist system.

From a certain perspective, being a leftist at Harvard Law can be viewed as trying to become some sort of 'class traitor' to the neoliberal elite. This does not seem like the obvious thing to do from a leftist perspective. Much leftist analysis would suggest that you're much more likely to just end up part of the neoliberal power structure instead of subverting it.

In your experience how do these people resolve the contradiction?

Replies from: Julia_Wise, xuan, sanjush
comment by Julia_Wise · 2021-03-15T17:15:04.770Z · EA(p) · GW(p)

Ironically, the situation in which I have most frequently been asked about whether EA is elitist is while giving intro talks about EA at MIT, Yale, etc.

Replies from: Cullen_OKeefe
comment by Cullen_OKeefe · 2021-03-15T23:49:37.437Z · EA(p) · GW(p)

This is my experience too.

comment by xuan · 2021-03-21T02:14:44.992Z · EA(p) · GW(p)

Based on my experiences as a Yale undergraduate, I've come away with the perhaps overly pessimistic conclusion that a lot of class-privileged leftists at Ivy+ schools don't actually resolve that contradiction, and are unfortunately not that interested in interrogating and addressing their class privilege, or thinking about redistributing what familial or future wealth / resources they may have access to. I say this as both a former organizer of Yale EA, but also as someone who started a Resource Generation chapter there, and found it difficult to get people to engage. By way of comparison, it was considerably easier to find people interested in the local DSA chapter.

(For context, Resource Generation is a movement that organizes young (USAmerican) people with wealth or class privilege to redistribute their wealth, land, and power, and I see it as perhaps the most viable movement for class-privileged US leftists who are really interested in addressing the contradiction of being both leftist and wealthy. See for example their giving pledge guidelines, which are considerably more ambitious than GWWC's, and have as their goal for the "top 10% to develop plans to redistribute all or almost all (see below) inherited wealth and/or excess income".)

It's hard to have a charitable take in response to that data, but I think it's partly that people find it quite uncomfortable to talk about class, let alone interrogate their own class privilege in a deep way. The other part is that the social incentives in these schools and activist circles tend to reward external-facing leftist actions like fossil fuel divestment protests, not internal-facing actions like confronting one's wealthy family to redistribute their wealth - in part because to do that publicly, you have to reveal your family is wealthy, which isn't exactly celebrated in leftist spaces.

Replies from: Max_Daniel, John_Maxwell_IV, deluks917, timunderwood
comment by Max_Daniel · 2021-03-22T08:35:25.815Z · EA(p) · GW(p)

That's really interesting, thanks for sharing your experience with these efforts.

Only partly on-topic, but I'm wondering if Jerry Cohen's If You're an Egalitarian, How Come You're so Rich? may be a good book for such audiences.

As far as I remember it, it doesn't actually make that strong a case that rich egalitarians ought to redistribute most of their wealth. (I actually think that most of what I got from that book was reflecting on some weird parallels between Marxism and AI risk thought, and the role of philosophers in both.) But it at least raises and somewhat discusses the question, and it's by one of the main 'analytical Marxists' and so might have more initial credibility to leftists.

Replies from: xuan
comment by xuan · 2021-03-26T18:16:09.594Z · EA(p) · GW(p)

I have read the paper, not the book! And have tried to get friends to read it, though unfortunately I don't think it was necessarily very effective either. I did end up writing an op-ed (Reparation, not just Charity) once trying to motivate wealthy students to redistribute more of their wealth, and it received a lot of likes on social media, but I'm not sure that it led to meaningful behavioral change :/ I think behavioral changes and commitments just take a lot more work, and a supportive community to encourage it. 

comment by John_Maxwell (John_Maxwell_IV) · 2021-05-23T05:15:21.440Z · EA(p) · GW(p)

Just for reference, there's a group kinda like Resource Generation called Generation Pledge that got a grant from the EA Meta Fund. I think they've got a bit more of an EA emphasis.

comment by deluks917 · 2021-03-21T20:02:32.918Z · EA(p) · GW(p)

Really cool to learn about Resource Generation. These fellows are hardcore. I promote the following to EA-type people:
-- Donate at least 10% of pre-tax income (I am above this).
-- Be as frugal as you can. Certainly don't spend more than could be supported by the median income in your city.
-- Once you have at least ~500K net worth, give away all additional income. In my opinion, 500K is enough to fund a lean retirement if you are willing to accept a little risk.
-- If you get a big windfall, I suggest either putting it in a trust or just earmarking it for charity instead of immediately donating the whole thing; your cause prioritization may change. (I regret how I donated a big windfall during the first crypto bull market.)

I don't think people should have to work if they don't want to, so I think it's reasonable to 'save yourself'. But don't strive for too much security, and keep your spending lean. I was objectively raised in a far-from-top-10% household and have not received much money from my parents. For example, they contributed zero dollars to my college. But anyone who is able to 'speedrun to 500K while donating' (or even seriously consider it) must be very privileged somehow.

If you actually take my advice seriously it is quite strict. But RG seems a lot more hardcore than that. 

comment by timunderwood · 2021-03-23T10:20:32.684Z · EA(p) · GW(p)

I feel like trying to be charitable here is missing the point.

It mostly is Moloch operating inside of the brains of people who are unaware that Moloch is a thing, so in a Hansonian sense they end up adopting lots of positions that pretend to be about helping the world, but are actually about jockeying for status position in their peer groups.

EA people also obviously are doing this, but the community is somewhat consciously trying to create an incentive dynamic where we get good status and belonging feelings from conspicuously burning resources in ways that are designed to do the most good for people distant in either time or space.

Replies from: tamgent
comment by tamgent · 2021-03-24T13:23:37.023Z · EA(p) · GW(p)

I don't think xuan's main point was about being charitable, although they had a few thoughts in that direction. More generally, trying to be charitable is usually good. Of course it's going to miss some point (what finite comment wouldn't?), but maybe it's making another?

I appreciate you trying to bring the discussion towards what you see as the real reason for lefty positions being held by privileged students (subconscious social status jockeying), but I wonder if there's a more constructive way to speculate about this?

Maybe one prompt is: how would you approach a conversation with such a lefty friend to discover if that is their reason, or not?

You could be direct, put your cards on the table, and say you think they are just interested in the social status stuff, and let them defend themselves (that's usually what happens when you attack someone's subconscious motivation, regardless what's true). Or you could start by asking yourself, what if I was wrong here? Is there is another reason they might hold this position on this topic? That might lead you to ask questions about their reasons. You could test how load-bearing their explanations are, by asking hypotheticals, or for them to be concrete and specific. Maybe you, or they, end up changing/modifying your position or beliefs, or at least have a good discussion, with at least one person having more understanding going out than you had coming in. In any case, I think a conversation that assumes good faith is more likely to lead to a productive discussion.

Circling back to the initial thing: I'm assuming that you do see the value in being charitable and assuming good faith in general, and just feel it is hard to practice this in conversations when people are very attached to their positions. But let me know if not, i.e. if you do genuinely think there is no point in being charitable (as that would be our true disagreement, this seems unlikely).

Please correct me if I've misunderstood you here. 

+ nitpick: you use terms people might not have heard of. If I look up 'Moloch' I don't immediately see the article by Scott Alexander that I think you have in mind, just a Wikipedia article about the god. 

comment by jushy (sanjush) · 2021-03-14T18:37:50.496Z · EA(p) · GW(p)

Not OP or at Harvard Law but anecdotally I know plenty of people who would consider themselves to be leftists, fit in the anti-oppression cluster, but wouldn't think that just going to Harvard Law makes you a class traitor. I think for many it would depend on what the Harvard Law grad actually did as a profession, eg - are you a corporate lawyer (class traitor) or a human rights lawyer (not class traitor).

That being said, I also think that the mainstreaming of social justice issues means that increasing numbers of people in the intersectionality/anti-oppression cluster don't know about / care about / support ideas about class struggle and class war, so aren't really 'leftists' in that sense of the word.

Replies from: Cullen_OKeefe, deluks917
comment by Cullen_OKeefe · 2021-03-15T23:57:14.460Z · EA(p) · GW(p)

I think for many it would depend on what the Harvard Law grad actually did as a profession, eg - are you a corporate lawyer (class traitor) or a human rights lawyer (not class traitor).

This is consistent with my experience. But also, I think a lot of people who end up at HLS don't think in those sorts of Marxist/socialist class terms, but rather just have a strong Rawlsian egalitarian commitment.

I also think many people at HLS are hilariously unaware of their class privilege. In fact, many of them think of themselves as victims of unfair power structures vis-a-vis being students. This is how you get HLS grads advocating for their student loans to be forgiven by the federal government (this was truly a fashionable position when I was there), or generally spending their time advocating for better treatment of HLS students. For example, there were at least two student groups [1] [2] advocating for HLS students to get better financial treatment. The second one explicitly focuses on how large law firms (starting salary: $180-190k) treat early-career lawyers.

Replies from: AGB
comment by AGB · 2021-03-16T16:19:39.075Z · EA(p) · GW(p)

But also, I think a lot of people who end up at HLS don't think in those sorts of Marxist/socialist class terms, but rather just have a strong Rawlsian egalitarian commitment.

I also think many people at HLS are hilariously unaware of their class privilege.

FWIW, I strongly agree with both of these statements for Oxbridge in the UK as well. 

The latter I think is a combination of a common dynamic where most people think they are closer to the middle of the income spectrum than they are, plus a natural human tendency to focus on the areas where you are being treated poorly or unfairly over the areas where you are being treated well. 

comment by deluks917 · 2021-03-14T22:19:45.321Z · EA(p) · GW(p)

From this perspective, a corporate lawyer who went to Harvard is not a class traitor. They are just acting in their own class interests.

Replies from: sanjush
comment by jushy (sanjush) · 2021-03-15T00:05:32.949Z · EA(p) · GW(p)

I agree but I feel that in practice leftists I come across use the term to mean 'working against the class you grew up in', and exclusively use it for people who grew up poor and working class.

comment by tamgent · 2021-03-14T11:47:27.171Z · EA(p) · GW(p)

Thanks for the article, interesting and well-written. I'm sure will be useful as a reference for me in some future conversations.

With reference to your section titled Incompatibility Between Intersectionality and Prioritization - how do you see worldview diversification fitting in?

To me, this perspective incorporates the value of diversification of causes (which intersectionality protects) whilst still being realistic about actually getting things done (which prioritization protects).  Under a worldview diversification lens, prioritization is less about one thing to the exclusion of all others, whilst still not going as far as to say all causes are equal and should have an equal place at the table.

Replies from: Jsevillamol
comment by Jsevillamol · 2021-03-14T23:40:26.387Z · EA(p) · GW(p)

I feel like invoking worldview diversification here is discussing things at the wrong level. 

It's like saying "oh, it's ok that you believe in intersectionality, because from a worldview diversification perspective we want to work on many causes anyway", and failing to address the fundamental disagreement that, within their worldview, an intersectionalist does not find cause prioritization useful.

Like, I feel the crux of intersectionality is about different problems being interwoven in complex, hard-to-understand ways. So as OP pointed out, if you believe this you'll need to address all problems at once by radically restructuring society.

Meanwhile, the crux of worldview diversificationists is that we are not certain of our own values and how they will change, so it is better to hedge your bets by compromising between many views.

Replies from: tamgent
comment by tamgent · 2021-03-15T10:20:01.062Z · EA(p) · GW(p)

That wasn't really what I was saying, and I don't think you're steelmanning the intersectionalist perspective, although I agree with your description of the crux. I think many (maybe most?) people who like intersectionality would agree that prioritization is sometimes necessary and useful.

An attempt to steelman intersectionality for a moment:
- problems are usually interwoven and complex
- separating problems from their contexts can cause more problems
- saying one problem is more important than another has negative side effects, because we are trying to fix a broken hammer with a broken hammer (many progressives believe, I think, that comparison culture is itself a cause of many problems)

I am unsure this is incompatible with prioritization, which in my view is simply a practical consequence of not having infinite resources. I think they'd agree, and would not take issue with, for example, someone dedicating their life to only climate change, as long as that person did not go around saying climate change is more important than all the other important issues, and also saw how climate change is related to, for example, improving international governance, or reducing corruption and worked with those efforts rather than in competition with/undermining them.

I think viewing most intersectionality proponents as people who cannot ever work on one thing because they literally need to address all problems at once is an overly literal interpretation, although it's possible to get this impression if there are a few loud ones like this (I don't know enough to know).

The disagreement seems to be more about whether it is helpful to compare the importance of issues in a public way. Comparing things, whilst necessary and important, can have side effects such as making some people feel bad about the good thing that they are doing because it isn't the best thing a person in theory could be doing. We are familiar with this from 80K's mistakes.

I was focusing more on the marketing side like Cullen, and wondering whether worldview diversification might be a way to better connect with intersectionality proponents via a message like this:

problems are complicated and sometimes entangled, and we can work on many at once, on a group level, but also our resources are finite, so when allocating them, trade-offs will need to be made

Replies from: Jsevillamol
comment by Jsevillamol · 2021-03-15T14:11:22.095Z · EA(p) · GW(p)

I find your steelman convincing (would love more intersectionalists to confirm though!).

Re: downsides of intercause prioritization. Beyond making people feel bad about their work, systematic prioritization can systematically misallocate resources, while a more informal, holistic and intersectional approach is less likely to make this kind of mistake.

Arguably, while EAs are very well aware of the importance of hit-based giving, they are overly focused on a few cause areas. Meanwhile, my (naive) impression is that intersectionalists are successfully tackling a much wider array of problem areas and interventions, from community help to international aid and political lobbying.

I do not think it is a stretch to think that prioritization frameworks are  partly to blame for cause convergence in the EA community.

comment by rootpi · 2021-03-18T10:50:39.242Z · EA(p) · GW(p)

Thanks for this great post. I'm closer to left-libertarian or classical liberal myself, but I have many friends and family (mostly in the US) who are more traditional progressives and much more sympathetic to typical social justice concerns than to EA. I agree with many of the issues identified here (including in the comments); my own experience has been that it is largely that they want to be able to "walk and chew gum at the same time". As an economist, I'm imbued with notions like opportunity cost and only being able to optimize one goal at a time (potentially itself an aggregation of course), but this is very foreign and off-putting to them. Either they don't understand the size of the actual disparities between issues, or... well actually I'm not sure, it's hard for me to wrap my head around.

However I particularly wanted to mention an illuminating recent post by Matt Yglesias (who came up elsewhere in the comments) on his substack:  https://www.slowboring.com/p/slate-star-codex 

The main topic is distinct, but from "The radicalism of effective altruism" onward it is very relevant and informative. On the one hand Yglesias is criticizing the journalist's progressive critique of EA, SSC, Silicon Valley, etc. On the other hand Yglesias (who is definitely on the left, and who likes evidence and reason a lot) doesn't end up very sympathetic to EA himself. He thinks of it as purely consequentialist, extreme, etc. Even if it's hard to attract some full-on progressives, someone like him should be exactly the type of person who  supports EA. Something has gone wrong with the messaging if that isn't the case, and we are missing out.

Replies from: meerpirat
comment by meerpirat · 2021-03-18T12:25:07.920Z · EA(p) · GW(p)

I agree with the last point, and I think EA is doing fairly well on the being-sympathetic-to-Matt-Yglesias front.

comment by Jordan_Warner · 2021-03-18T08:54:59.803Z · EA(p) · GW(p)

I found this helpful. I'm in a similar situation, moving from "social justice" (mainly concerned with homelessness in my own city) to Effective Altruism, so I'm trying to think of good ways to engage people, and am slightly concerned that if we don't phrase things in the correct way, the left may try to destroy us.

I wonder if talking about the causes of international economic inequality makes it seem more like an issue of injustice to be addressed from a progressive/social justice framework? That's one way I'd frame the issue when talking about EA principles to a left-of-centre audience.  I don't subscribe to a zero-sum view of development in which all wealth is taken from someone else, but it's undeniable that most currently wealthy nations benefitted from colonialism at the expense of the rest of the world, and we all continue to participate in an economic system that is pretty clearly constructed to benefit multinational corporations rather than individual producers. I'd  also argue that donating to effective charity should at least be part of living an ethical lifestyle, and that many of the other issues people may find more emotionally compelling, like human trafficking and exploitative employment, are primarily rooted in poverty. 

I also point out how basically everyone in the audience is in the top 10% globally, although I feel like this is probably less effective when talking to students, since their wealth is mostly in the future. I've also found that the very progressive idea that everyone should be treated equally is one argument in favour of international aid; that 100x multiplier goes a long way! However, it is difficult to convince people that life for the poorest 10% of people in the world really is a lot worse than life for the poorest 10% of people in a wealthy country, although access to food, medicine and housing is probably the area that makes this clearest.

Also, use emotional appeals; that's just good advice when trying to persuade humans generally. Ideally, though, use them to support rather than replace facts and evidence, because we probably can't win solely on emotional appeals. This is obviously easiest in the context of global health: AMF has loads of pictures of smiling children holding mosquito nets, and GiveDirectly has loads of personal stories of how people actually spent the money.

comment by Arepo · 2021-03-18T09:09:45.817Z · EA(p) · GW(p)

Helpful post!

What makes you say rejecting person-affecting views has uncomfortable implications for progressive environmental ethics, out of curiosity? I would have thought the opposite: person-affecting views struggle not to treat environmental collapse as morally neutral if it leads to a different set of people existing than would have otherwise.