Posts

Would you like to host an effective giving talk in your workplace this Giving Season? 2022-09-06T11:04:29.174Z
EA in the mainstream media: if you're not at the table, you're on the menu 2022-07-30T16:55:45.184Z
FTX/CEA - show us your numbers! 2022-04-18T12:05:25.707Z
Lessons and results from workplace giving talks 2022-03-10T10:26:06.192Z
Why your EA group should promote effective giving (and how) 2022-02-17T16:02:06.606Z
Join our collaboration for high quality EA outreach events (OFTW + GWWC + EA Community) 2021-02-23T02:49:02.960Z
One for the World - January 2021 update 2021-02-12T05:14:59.433Z
How we promoted EA at a large tech company (v2.0) 2021-01-12T03:15:31.283Z

Comments

Comment by Jack Lewars (jlewars) on Fundraising Campaigns at Your Organization: A Reliable Path to Counterfactual Impact · 2022-09-07T20:51:47.119Z · EA · GW

These are amazing results guys, well done!

How did you select which companies you worked with?

Comment by Jack Lewars (jlewars) on Earn To Give $1M/year or Work Directly? · 2022-09-06T08:19:41.904Z · EA · GW

I agree with your prior - but I would add that I think my chances of getting someone with 10 years more experience than anyone else on my team go up if my first choice gives me $1m/year!

I'm happy to answer for One for the World that I would take the Earning to Give option and readvertise, unless for some reason I thought I had a uniquely good candidate. Happy for you to link this in your main post as well.

Also, to be clear, I 100% support your main point, which is that people should apply for jobs they think they would be good at. I also want to keep EtG as a serious EA pathway, though, as I think EA is going to need a lot more funding and/or ops funding to achieve what it wants to over time.

Comment by Jack Lewars (jlewars) on Earn To Give $1M/year or Work Directly? · 2022-09-05T08:14:26.059Z · EA · GW

These are both true but not quite addressing what I was asking.

I think this is better: 'how sure do you have to be that you found the best person available to turn down the option of having the next best person, plus e.g. $250k in cash?'

Comment by Jack Lewars (jlewars) on Earn To Give $1M/year or Work Directly? · 2022-09-03T17:30:13.630Z · EA · GW

I may be totally off base here because I'm a terrible capitalist (in the sense of being really bad at it), but how does this reasoning sound:

It only seems reasonable to ask someone to work for you instead of giving you $1m if they are uniquely the best person for that job.

I am very, very sceptical of any hiring process, including my own, finding a uniquely good candidate.

Assuming no orgs currently pay $1m salaries, surely you should always take the money, add say $100k to the advertised salary, readvertise, hire the best person you can get then, and keep the change?

I guess another way of framing this is 'how sure do you have to be that you found the best person available to turn down even $250k in cash plus the person you find once you readvertise?'

Outside some technical fields, where you may very genuinely be talking to the best programmer or mathematician in the world, surely you should always take the cash?

Comment by Jack Lewars (jlewars) on EA Giving Tuesday will likely hibernate in 2022, unless handed off to another EA-aligned organization · 2022-08-31T10:38:42.302Z · EA · GW

Hi guys - OFTW is interested in hosting this. I'll reach out by email.

Comment by Jack Lewars (jlewars) on Some updates in EA communications · 2022-08-03T13:53:08.229Z · EA · GW

Superb.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-02T09:16:24.684Z · EA · GW

Yes, although this was a total guess, assuming $200k/person for two senior people and then $100k for a more junior person. I have no idea what good PR professionals earn.

There is also a practical issue of trying to have multiple people covering different markets/languages, but I guess you use consultants strategically to overcome that.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-02T09:15:27.326Z · EA · GW

This is consistent with the other advice I've been given. Apparently you need some people to really 'specialise' in your area, so that they can do high fidelity messaging, be the go-to place for comment etc.; so this seems like a good argument for trying to find a small group of EA-aligned (or 'teachable') professionals and then buy their time.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-02T09:10:52.750Z · EA · GW

Hey - this is great, and very reassuring. It's great that all this exists!

I would just point out one distinction, which is between 'marketing' and 'PR'. I'm not well-versed in this, but this is how I understand it: marketing = trying to get people to do your thing (access your services, donate, come to your conference, join your movement); PR = managing the public's perception of you (getting positive press, combating bad press, crisis management). I think the two often overlap (e.g. in general awareness raising) but aren't the same.

If this is right, a lot of the above is marketing, where I completely agree that there have been great strides recently (e.g. more Comms directors at EA orgs, the EA Market Testing initiative, the new EA digital marketing agency etc. etc.); but there seems to me to be a lot less PR (e.g. managing mainstream press coverage). So while One for the World is an enthusiastic part of message testing to try to boost donations, we're not aware of any efforts to place more positive press coverage of effective donations in general.

All that said, these are all great initiatives and it's exciting to see them come to fruition.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-01T20:26:18.403Z · EA · GW

That's really excellent to hear.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-01T09:57:56.774Z · EA · GW

Thanks Ben. A few things:

  • I read the posts I could find on this topic on the forum, none of which mention a PR agency or hiring a PR team for the movement
  • I've talked about this with a lot of other EA professionals and no one has mentioned that this idea is coming, although all of them thought it was a good idea to some degree
  • I posted it as an idea for FTX and it wasn't taken up, without feedback or any suggestion that it was already happening
  • I visited the many examples of mainstream press criticism of EA cited in other posts and saw no response or comment 'from EA'

But, most importantly, the things you list here don't address the suggestion of this post. Individual orgs being advised by PR professionals, or Longview aiming for more press coverage, has at best partial overlap with the effect of a dedicated PR team for EA.

This is also self-evidently not having the effect of countering op-eds like this one (although that might be low priority work).

So, when OFTW, at least one GiveWell charity and presumably others are forwarded this op-ed by potential or actual major donors, and the response 'from EA' is crickets, I think it's pretty reasonable to say 'this seems like a good idea - can we make it happen?'

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-08-01T09:36:50.989Z · EA · GW

Thanks Max - really good to hear. Will CEA's Head of Comms be focussed more on CEA comms or EA Movement comms? I see some other comments about not over-centralising this, but I also worry about the capacity of one person (or someone with more than one brief) to monitor and be proactive about the whole space.

Definitely a really positive development though.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-07-31T09:01:15.157Z · EA · GW

Yes, I agree. I was just trying to explore briefly why people might think this was a bad use of time/money, and thought 'don't give this stuff oxygen' might be one of those arguments. But it's not one I agree with.

Comment by Jack Lewars (jlewars) on EA in the mainstream media: if you're not at the table, you're on the menu · 2022-07-31T08:59:00.352Z · EA · GW

I agree.

Comment by Jack Lewars (jlewars) on Any corporate donation matching platform? · 2022-07-30T16:57:16.304Z · EA · GW

Do you want the platform to highlight effective charities specifically?

For ease of use, Benevity is probably the best. Others include YourCause in the US, and CAF GAYE in the UK.

But none of these is effectiveness orientated, so it depends how important that is for you.

Comment by Jack Lewars (jlewars) on EA Hub Berlin / German EA Content: Two meta funding opportunities · 2022-06-03T09:21:50.938Z · EA · GW

As a user of the coworking space, I think it's immensely valuable (OFTW already pays for our space there). I think developing the space further would be excellent and would also provide a model for other people interested in creating EA coworking spaces.

Comment by Jack Lewars (jlewars) on My GWWC donations: Switching from long- to near-termist opportunities? · 2022-04-26T13:34:37.048Z · EA · GW

Does the relative amount of evidence and uncertainty affect your thinking at all? I have heard indirectly of people working in longtermism who donate to neartermist causes because they think it hedges the very large uncertainties of longtermism (both longtermist work and donations). 

As you say, the neartermist donation options recommended by EA benefit from very robust evidence, observable feedback loops, tried-and-tested organisations etc., and that could be a good hedge if you're working in an area of much higher uncertainty.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-22T09:21:19.610Z · EA · GW

Very interesting, thanks. I read this as more saying 'we need to be prepared to back unlikely but potentially impactful things', and acknowledging the uncertainty in longtermism, rather than saying 'we don't think expected value is a good heuristic for giving out grants', but I'm not confident in that reading. Probably reflects my personal framing more than anything else.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-22T09:19:02.762Z · EA · GW

Like you, I'm fairly relaxed about asking people publicly to be transparent. Specifically in this context, though, someone from FTX said they would be open to doing this if the idea was popular, which prompted the post.

As a sidenote, I also think that MEL (monitoring, evaluation and learning) consultancies are adept at understanding context quickly and would be a good option (or something that EA could found itself - see Rossa's comment). My wife is an MEL consultant, which informs my view of this. But that's not to say they are necessarily the best option.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-20T08:00:15.029Z · EA · GW

Absolutely. And so the questions are:

  • have we defined that ROI threshold?

  • what is it?

  • are we building ways to learn by doing into these programmes?

The discussions on this post suggest that it's at least plausible that the answers are 'no', 'anything that seems plausibly good' and 'no', which I think would be concerning for most people, irrespective of where you sit on the various debates/continuums within EA.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T18:19:53.103Z · EA · GW

I like this.

I'm not sure I agree with you that I find it equally worrying as moving so fast that we break too many things, but it's a good point to raise. On a practical level, I partly wrote this because FTX is likely to have a lull after their first grant round where they could invest in transparency.

I also think a concern is what seems to be such an enormous double standard. The argument above could easily be used to justify spending aggressively in global health or animal welfare (where, notably, we have already done a serious, serious amount of research and found amazing donation options; and, as you point out, the need is acute and immediate). Instead, it seems like it might be 'don't spend money on anything below 5x GiveDirectly' in one area, and the spaghetti-wall approach in another.

Out of interest, did you read the post as emotional? I was aiming for brevity and directness but didn't/don't feel emotional about it. Kind of the opposite, actually - I feel like this could help to make us more factually aligned and less driven by emotional reactions to things that might seem like 'boondoggles'.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:33:29.740Z · EA · GW

Thanks - I missed that update, and wouldn't have written about CEA above if I had seen it, I think.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:29:39.583Z · EA · GW

Indeed :-) I had understood from this post (https://forum.effectivealtruism.org/posts/FjDpyJNnzK8teSu4J/) that this was the destination, though, so the current rate of spending would be less relevant than having good heuristics before we get to that scale.

I see from Max below, though, that Open Phil is assuming a lot of this spending, so sorry for throwing a grenade at CEA if you're not actually going to be behind a really 'move the needle' amount of campus spending.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:24:59.483Z · EA · GW

Indeed - and to be clear, I wasn't trying to suggest that you shouldn't have made the comment - just that it's very secondary to the substance of the post, and so I was hoping the meat of the discussion would provoke the most engagement.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:20:16.110Z · EA · GW

This would be great. It also closely aligns with what EA expects before and after giving large funding in most cause areas.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:16:10.220Z · EA · GW

I'm not sure why the burden wouldn't fall on people making the distribution of funds? (Incidentally, I'm using this to mean that the funders could also hire external consultancies etc. to produce this.)

But, more to the point, I wrote this really hoping that both organisations would say "sure, here it is" and we could go from there. That might really have helped bring people together. (NB: I realise FTX haven't engaged with this yet.)

In many ways, if the outcome is that there isn't a clear/shared/approved expected value rationale being used internally to guide a given set of spending, that seems to validate some of the concerns that were expressed at EAG.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:07:24.294Z · EA · GW

That's right, and this was very casually phrased, so thanks for pulling me up on it. A better way of saying this would be: "if you're going to distribute billions of dollars in funding, in a way that is unusually capable of being harmful, but don't have the time to explain the reasoning behind that distribution, it's reasonable to ask you to hire people to do this for you (and hiring is almost certainly necessary for lots of other practical reasons)."

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-19T16:02:25.208Z · EA · GW

Hi Max - I took this from CEA's post here (https://forum.effectivealtruism.org/posts/FjDpyJNnzK8teSu4J/), which aims for campus centres at 17 schools controlling "a multi-million dollar budget within three years of starting", and which Alex HT suggested in the comments would top out at $3m/year. This suggested a range of $17m-$54m.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T22:17:55.311Z · EA · GW

Candidly, I'm a bit dismayed that the top voted comment on this post is about clickbait.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T22:01:36.097Z · EA · GW

Thanks - this is exactly what I think is useful to have out there, and ideally to refine over time.

My immediate reaction is that the % changes you are assigning look very generous. I doubt a $15 dinner makes someone 1% more likely to pursue an impactful career, and I especially doubt that a subsidised flight produces a 5% swing. I think these are likely orders of magnitude too high, especially when you consider that other places will also offer free dinners/retreats.

If a $400 investment in anything made someone 5% more likely to pursue an impactful career, that would be amazing.

But I guess what I'm really hoping is that CEA and FTX have exactly this sort of reasoning internally, with some moderate research into the assumptions, and could share that externally.
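For illustration, here's a minimal sketch of the kind of back-of-the-envelope reasoning I'm hoping exists internally. Every number in it is an assumption made up for the example (the value of a counterfactual impactful career, the probability shifts, the counterfactual discount) - none of them is a figure CEA or FTX has published:

```python
# Illustrative BOTEC - every figure below is an assumption for the sake of example,
# not a number from CEA, FTX or anyone else.

career_value = 1_000_000       # assumed expected value ($) of one additional impactful career
counterfactual_discount = 0.5  # assumed share of that value that wouldn't have happened anyway

interventions = {
    # name: (cost per person in $, assumed increase in probability of an impactful career)
    "subsidised dinner": (15, 0.001),
    "subsidised flight to a retreat": (400, 0.01),
}

for name, (cost, prob_shift) in interventions.items():
    ev = prob_shift * career_value * counterfactual_discount
    print(f"{name}: ${cost} spent, ${ev:,.0f} expected value, {ev / cost:.1f}x return")
```

Even a rough table like this, published alongside the assumptions behind it, would let people argue about the inputs (how big a probability shift is plausible per dinner? what counterfactual discount is fair?) rather than guessing at the reasoning.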

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T17:31:32.277Z · EA · GW

I've updated this now: it's a Back Of The Envelope Calculation.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T17:25:58.372Z · EA · GW

Thanks Jessica, this is helpful, and I really appreciate the speed at which you replied.

A couple of things that might be quick to answer and also helpful:

  • is there an expected value of someone working in an EA career that CEA uses? The rationale above suggests something like 'we want to spend as much as top tier employers' but presumably this relates to an expected value of attracting top talent that would otherwise work at those firms?
  • I agree that it's not feasible to produce, let alone publish, a BOTEC on every payout. However, is there a bar that you're aiming to exceed for the manager of a group to agree to a spending request? Or a threshold where you'd want more consideration about granting funding? I'm sure there are examples of things you wouldn't fund, or would see as very expensive and would have some rule-of-thumb for agreeing to (off-site residential retreats might be one). Or is it more 'this seems within the range of things that might help, and we haven't spent >$1m on this school yet?'
  • is there any counterfactual discounting? Obviously a lot of very talented people work in EA and/or have left jobs at the employers you mention to work in EA. So what's the thinking on how this spending will improve the talent in EA?

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T17:18:41.266Z · EA · GW

True, and it seems like a necessary step on its own, but I'm wary of people 'deducing' too much. Right now, a lot of the anxiety seems to be coming from people trying to deduce what funders might be thinking; ideally, they'd tell people themselves.

Comment by Jack Lewars (jlewars) on FTX/CEA - show us your numbers! · 2022-04-18T17:07:38.657Z · EA · GW

In the spirit of this post, maybe you could share these informal BOTECs?

'Here is a BOTEC' is going to help more than 'I've done a BOTEC and it checks out'.

(I appreciate the post isn't actually aimed at you)

Comment by Jack Lewars (jlewars) on Free-spending EA might be a big problem for optics and epistemics · 2022-04-17T18:10:59.794Z · EA · GW

Completely agree. I will write something about this tomorrow.

Comment by Jack Lewars (jlewars) on Legal support for EA orgs - useful? · 2022-04-14T20:21:21.274Z · EA · GW

One other use case: EA orgs are increasingly using Employers Of Record to employ staff in countries where they aren't registered. The agreements with the firms that provide this are important - they are frequently worth >$100k - and can be quite dense. I wouldn't be surprised if you got more demand for reviewing these than for immigration advice (although I have literally no data to support that view).

Comment by Jack Lewars (jlewars) on EA and Global Poverty. Let's Gather Evidence · 2022-04-14T13:12:21.554Z · EA · GW

I have thought this about PR for some time. Do you know anyone in EA who is skilled at PR?

Comment by Jack Lewars (jlewars) on Free-spending EA might be a big problem for optics and epistemics · 2022-04-14T12:56:25.357Z · EA · GW

This is a very interesting point that, for me, reinforces the importance of keeping effective giving prominent in EA. It is both a good thing, and also a defence against accusations of self-serving wastefulness, if a lot of people in the community are voluntarily sacrificing some portion of their income (with the usual caveat of 'if you have actual disposable income').

GWWC, OFTW etc. may be doing EA an increasing favour by enlisting a decent proportion of the community to be altruistic.

It's also noticeable that giving seems to be least popular with longtermists, who also seem to be doing the most lavish spending.

Comment by Jack Lewars (jlewars) on Free-spending EA might be a big problem for optics and epistemics · 2022-04-14T12:46:36.184Z · EA · GW

True, but GiveWell doesn't expect funding to grow at the same rate as top quality funding opportunities, so that $1bn/year is going to need further donors. Unless we believe GiveWell's top programmes/charities will never have a funding shortfall again, the point about where EA prioritises its funding still seems relevant.

Donating to AMF still seems like a good benchmark for cost effectiveness. Unlike George, my instinct is that e.g. a team retreat for an EA Group is likely to produce considerably less impact than spending the money on bednets or other GiveWell top charities.

Comment by Jack Lewars (jlewars) on I feel anxious that there is all this money around. Let's talk about it · 2022-04-07T12:56:42.326Z · EA · GW

Agreed. I wasn't clear in the original post but I particularly had in mind this one attack ad, which is intellectually bad faith.

Comment by Jack Lewars (jlewars) on I feel anxious that there is all this money around. Let's talk about it · 2022-03-31T11:32:03.384Z · EA · GW

Thanks for writing this. I share some of this uneasiness - I think there are reputational risks to EA here, for example by sponsoring people to work in the Bahamas. I'm not saying there isn't a potential justification for this but the optics of it are really pretty bad. 

This also extends to some lazy 'taking money from internet billionaires' tropes. I'm not sure how much we should consider bad faith criticisms like this if we believe we're doing the right thing, but it's an easy hit piece (and has already been done, e.g. a video attacking someone from the EA community who is running for Congress over being part-funded by Sam Bankman-Fried - I'm deliberately not linking to it here because it's garbage).

Finally, I worry about wage inflation in EA. EA already mostly pays at the generous end of nonprofit salaries, and some of the massive EA orgs pay private-sector level wages (reasonably, in my view - if you're managing $600m/year at GiveWell, it's not unreasonable to be well-paid for that). I've spent most of my career arguing that people shouldn't have to sacrifice a comfortable life if they want to do altruistic work - but it concerns me that entry level positions in EA are now being advertised at what would be CEO-level salaries at other nonprofits. There is a good chance, I think, that EA ends up paying professional staff significantly more to do exactly the same work to exactly the same standard as before, which is a substantive problem; and there is again a real reputational risk here.

Comment by Jack Lewars (jlewars) on FTX Future Fund and Longtermism · 2022-03-18T17:08:25.059Z · EA · GW

This format is amazing. More please.

Comment by Jack Lewars (jlewars) on The Future Fund’s Project Ideas Competition · 2022-03-02T20:25:30.211Z · EA · GW

EA influencers

Effective Altruism

More awareness of EA = more talent and money for EA

Pay A-list influencers, with followings independent of EA, to promote EA content and themes. Concentrate on influencers popular with GenZ.

Risks: lack of message fidelity

Comment by Jack Lewars (jlewars) on The Future Fund’s Project Ideas Competition · 2022-03-02T20:21:43.422Z · EA · GW

I think this exists (but could be much bigger and should still be funded by this fund).

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-21T09:37:55.111Z · EA · GW

Hi Peter - thanks for this. To your/their credit, I think EAIF is doing a really good job of filling some of these gaps. As you say, though, the gatekeepers and funder diversity issues do remain.

I'm also conscious that the current EAIF committee has made some really positive changes - but also that I guess the next committee could plausibly feel differently!

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-21T09:34:30.256Z · EA · GW

Thanks Mauricio - I think we are in roughly the same place here :-)

I especially like the idea of groups testing outreach and rebalancing on the results.

To be clear, I would expect most student groups to continue to prioritise non-giving outreach and I think that's great - it's likely impactful and it offers variety, and an entry point for low income students, which is super important.

Our concern is the number of groups doing no giving outreach at all. If every group did their existing programming, but added a giving session each semester (or a pledge drive), we'd be delighted!

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-18T14:05:23.524Z · EA · GW

Thanks for this Bridges, and I'm sorry you had a negative experience with giving. It's definitely a positive that EA has broader programming now and I agree that there is a real danger of alienating people who come from less affluent backgrounds. I'm really delighted that you've found a way back to EA now :-)

A couple of points: I'm not sure I agree that giving isn't a team sport - Giving What We Can and One for the World both see a lot of engagement in our communities, from meet ups to webinars to socials. 

I think our point is that it's a shame to neglect giving entirely. As you say, it can often be part of the menu of EA without significant costs to other aspects; and while you were really inspired by longtermism and careers advice, thousands of people have presumably been inspired by Giving What We Can and One for the World when they've taken our pledges.

There also seems to be good counter-evidence to the idea that talking to students about giving is a bad idea - it's been done successfully in so many places, for so long, within EA and in so many other social movements. Tactics like future-dated donations, pledges that don't start immediately or focussing on trivial amounts while you're still studying can all help. But doing this sensitively is really important and that's part of why we're trying to offer training and resources!

Anyway, in summary, I'm really pleased you're back in EA; and I hope we can mitigate these risks well going forward.

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-18T13:50:00.122Z · EA · GW

Thanks for this Mauricio. It's good to have an alternative perspective added to this, which was written by quite convinced advocates for one way of thinking!

I think you make a good point that this is a theory that seems to align very closely with the reality of EA, rather than an absolutely established phenomenon. So, for example, we don't have data in the EA survey that says 'people say they would likely drop out if they weren't donating' or 'we see higher rates of drop out amongst people who don't donate versus those who do'. That's not to say those statements aren't plausibly true - it's just the survey isn't set up to capture them.

It seems unlikely, though, that it's coincidental that the foremost and most longstanding members of EA have given throughout their engagement and often seem to increase their giving over time (cf. Julia, Will, Toby, everyone at Longview, ~everyone at GiveWell). This also aligns with our experience of talking to the EA community. Obviously anecdotal evidence is weaker than systematic evidence, but if you have a theory that is plausibly true, aligns with common sense and is then supported by a lot of individual cases, that seems enough to think this is 'signal' rather than coincidence.

To address some specific points:

- Careers advice may be more popular than programming about giving - that makes sense, as both parties want the thing on offer. It's the opposite of asking for some sacrifice - you can receive careers advice purely out of self-interest. Equally, though, lots of students are passionate about social justice, making a difference etc. and can be attracted to EA precisely by talking about giving. Career change isn't for everyone, especially when EA careers advice can focus on careers that need significant technical expertise, like biorisk or AI safety. Careers advice also has some hazier routes to impact in its theory of change than a lot of effective giving.
- I'd challenge the idea that the majority of students are charity sceptics. A quick Google suggests exactly the opposite: Gen Z gives more, and more widely, than older generations. Gen Z and Millennials are seen as activist generations, so I'd be really surprised if the median Gen Z-er is a donation sceptic, and the data seems to undermine this idea reasonably firmly.
- I'm surprised a) that you haven't seen the result of any donations and b) that you're sceptical that it has shorter feedback loops than a career change. If you're at university and alter your career plans, I'd guess you'd have to wait at least 2-3 years to see any impact from that? And plausibly way, way longer? If you donate $10 to AMF today, you'll be able to see the bednet distribution you funded in a much shorter timescale. I can see a donation I made in November '21 has already funded nets that are ready in the factory for distribution in the Congo. Maybe this changes depending on what you donate to?
- costly signalling is a widely-referenced theory (the Wikipedia pages on it are instructive), although in fairness it's more broadly cited in relation to signalling to others rather than necessarily deepening your personal commitment (a costly signal is seen as more honest and therefore more powerful)
- Candidly, I think opportunity costs are frequently overstated. We do acknowledge this above and give examples of how giving can be incorporated into existing programming. However, we also think there's a frequent fallacy in EA, where we make all decisions as if they are zero sum (e.g., to pick a particularly odd example, 'we shouldn't give blood because we could spend that time earning $x and giving it to an effective charity', when of course almost everyone in EA can do both simultaneously). Often this choice isn't real. Of course EA groups need to make some decisions about prioritisation; but are most EA groups genuinely so maxed out that they couldn't weave giving into their existing programming or even run an extra session?

Overall, I think you do a good job of laying out possible drawbacks of this approach. I'm not convinced they add up to a really robust argument to neglect effective giving entirely, though. And I'd challenge you in return that maybe you're understating the opportunity costs of only focussing on careers advice, while overstating some of these drawbacks.

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-18T13:04:39.656Z · EA · GW

Hi Yonatan,

A full answer to this would be very detailed, so do fill out our form if you'd like us to share resources and tactics in more detail.

In brief, I think the main thing is to frame giving as an opportunity, rather than an obligation. There are some pretty robust arguments that it actually is an obligation, if we have disposable income in high income countries - but this tends to be less effective as a persuasion strategy and has more risks around people feeling unduly pressured.

If we talk about the incredible opportunity we have to save a life, or improve animal welfare, without really making any noticeable sacrifice in our own lives, we can inspire people to give. I don't think we need to pressure people (e.g. by saying 'you're a bad person' or 'if you don't do this you're not an effective altruist'). But we can absolutely raise awareness and persuade people.

Many people, especially at universities, already have some sense that they are in a position of privilege and would like to 'make a difference', and for these people it's just a case of raising their awareness - you're actually solving a problem for them. Others can be persuaded if we highlight, for example, where the median graduating salary from a university places them in the income distribution of their home country, or indeed globally.

And I think it's worth emphasising that we're not saying that everyone should take a pledge that will meaningfully reduce their income - if you're earning substantially above the median wage, it's likely that you can give something like 1% with literally no effect at all on your material quality of life. So, again, I don't think that explaining this framing to people is pressuring them.

Ultimately, of course, any movement seeks to persuade people - we persuade people to change career plans, or majors, or eat less meat - and persuading them to give falls within this spectrum.

Comment by Jack Lewars (jlewars) on Why your EA group should promote effective giving (and how) · 2022-02-17T21:33:04.524Z · EA · GW

Thanks Luke. You guys are also an option in the contact form, so I'll forward anything relevant.