How I Recommend University Groups Approach the Funding Situation

post by sabrinac (sabrinachwalek) · 2022-07-04T15:16:35.082Z · 19 comments

Contents

  I. Introduction
  II.  How the funding situation uniquely affects university groups
    Degrading Epistemics 
    Expanding the Neartermist-Longtermist Divide
    Attracting rent-seekers
    Creating New Opportunities 
  III. What we can do 
    Don’t be afraid to ask for help
    How to communicate about the increase in funding
      Free stuff is not actually free
      Don’t widely advertise that “EA has money”
      Be careful how you advertise EAG conferences
    Establish healthy group norms around money
      Develop spending heuristics
      Buy the cheaper version of things (or don’t spend extravagantly)
    Consider reintroducing (or maintaining) other altruistic signals
      Have conversations about why you care
      Buy vegan food
      Emphasize effective giving
  IV. FAQ: Talking to group members about money in EA
      Why does EA have so much money? 
      Why do longtermism and existential risk reduction have so much money compared to global health & development and animal welfare?
      If longtermism has a larger funding overhang, why doesn’t EA fund [insert cause] instead?
      Where does the money for EA groups come from?
      Why is EA funding students to go to conferences?
      Why are students getting funding to community-build? Is community-building a Ponzi scheme?
  V. Resources

Thanks to Emma Abele, Harry Taussig, Jemima Jones, George Rosenfeld, James Aung, Emma Williamson and Matt Burtrell for all their helpful feedback and comments. 

I. Introduction

Over the past few months, there’s been a lot of discussion about the funding situation in EA and how it’ll change the community’s culture, epistemics, and public reputation. Like most sub-areas in the community, the funding situation will affect university groups and pose new challenges (and create new opportunities) for community-builders. 

I’m writing this post to address how the increase in funding uniquely affects university groups and to propose how community-builders can mitigate potential risks. While I’m largely optimistic about the trajectory of university groups, I also have a number of community-health concerns, which I outline below. Especially if some of the problems outlined in this post are true, we want to create healthy norms and cultures within EA groups before scaling them. I expect this post will be most useful for anyone who currently runs a university EA group.

TLDR

The increase in funding for community groups is great overall; however, there are a few ways it’ll negatively affect groups in particular:

  1. Degrade group epistemics, especially because young people are more impressionable and drawn to potential employers offering them free stuff 
  2. Expand the neartermist-longtermist divide within EA groups, which might be bad for multiple reasons
  3. Increase rent-seeking behavior  (e.g. students who want funding and free trips to EA conferences) 

Groups can mitigate these concerns in the following ways: 

  1. Ask for help when sensitive situations arise
  2. Communicate carefully about money in EA and why certain opportunities are free for students
  3. Create healthy group norms around spending, such as avoiding buying extravagant things 
  4. Maintain altruistic signals such as buying vegan food, promoting effective giving, and talking candidly about why you got involved in EA

The last section of the post also has an FAQ with questions that community-builders commonly get asked and sample responses. 

If you’re a community-builder who’s had problems with the issues I’ve detailed in this post, especially with problematic individuals in your group, I strongly encourage you to reach out to the community-health team for support. You can fill out this form to contact them or email Catherine Low directly (catherine@centreforeffectivealtruism.org). 

II.  How the funding situation uniquely affects university groups

Funding will create a lot of new opportunities for groups and their members. However, it’ll also change group cultures, expand the opportunity gap between working on neartermist vs. longtermist causes, and attract people who are more rent-seeking than altruistically inclined.

This section outlines the positive and negative effects of increased funding on groups. You can skip ahead to Section III if you just want to read my proposals for what groups should do.

Degrading Epistemics 

I pretty much agree with everything the post quoted below says about money being an epistemic problem for groups, and this is my biggest concern.

“Consider the case of a college freshman. You read your free copy of Doing Good Better and become intrigued. You explore how you can get involved. You find out that if you build a longtermist group in your university, EA orgs will pay you for your time, fly you to conferences and hubs around the world and give you all the resources you could possibly make use of. This is basically the best deal that any student society can currently offer. Given this, how much time are you going to spend critically evaluating the core claims of longtermism? And how likely are you to walk away if you’re not quite sure? Anecdotally, I’ve spoken to several organisers who aren’t convinced of longtermism but default to following the money nevertheless. I’ve even heard (joking?) conversations about whether it’s worth 'pretending' to be EA for the free trip.”

I also want to point out that young people are very impressionable. When I think back to when I first got involved with EA, I was much more likely to defer to other people’s judgement and follow their career advice. When you’re an 18-year-old who wants to make a difference but doesn’t have a clue what to do with their career, it’s very tempting to do whatever 80k or [insert high-status person] says, especially if that person is offering you funding and free stuff.

I’m also concerned that the influx of money into community-building will exacerbate existing problems in group epistemics. I don’t think funding currently incentivizes community-builders to create healthy epistemic environments; it’s up to good community-builders to solve these problems. It’s easy to apply for and get funding to run an intro fellowship and intro talks, but it’s harder to develop high-fidelity models of community-building that focus on helping young EAs reason clearly and build skills.

(Caveat: Funding isn’t necessarily the main problem here. We probably need better models of community-building and more experimentation.)

Expanding the Neartermist-Longtermist Divide

While community-building over the past couple of years, I’ve at times felt sad that I couldn’t offer more support to people in my group who are interested in global health & development, animal welfare, or other neartermist causes. Not everyone is cut out for (or motivated by) work on reducing existential risks, yet because of talent bottlenecks in longtermism, there are more early-career opportunities and more funding for university students interested in movement-building or x-risk reduction than for those interested in animal welfare or global poverty. Compounded by the fact that nearly all of the highly-engaged community-builders I’ve met are either sympathetic to or focused on reducing existential risks, this complicates group dynamics.

When the majority of dedicated people in your EA group are working on reducing existential risks and nearly all early-career opportunities in EA are longtermist-focused, we’ll likely expand the division between neartermists and longtermists. Especially since status in EA now correlates more with doing “high-impact longtermist work,” we risk creating environments where people working on animal welfare or global health & development don’t feel as welcome or respected in their EA group. The prospect of this makes me really sad: I don’t want anyone working on reducing the suffering of a billion people in extreme poverty and trillions of factory-farmed animals to feel belittled or ostracized. Even if it may be true, it’s difficult to tell people “hey, if you care about extreme poverty and animal welfare, you should just earn to give.”

I expect these effects to only grow larger as the number of longtermist opportunities increases and longtermist careers become much sexier than neartermist careers. (For example, paying higher salaries, offering personal assistants, having really nice office spaces, etc.)

There are a number of reasons why expanding this divide could be bad. First, EA as a movement encourages people to reason for themselves about how to do the most good, and pushing group members to adopt strong philosophical worldviews goes against this. Second, you might want EA to still practice worldview diversification, or take actions that are more robust to different moral frameworks. Third, there might be negative community-health effects from expanding this divide. I’ve heard people in Brown EA say that they felt “baited and switched by EA about longtermism,” and I’ve seen plenty of cases where neartermists feel out of place in EA discussions.

Even if you are a longtermist, remember that lots of people get into longtermism through global health & development or animal welfare (myself included). It also seems like, if you accept the belief that we should prioritize longtermist work above all else, then you should run a community group focused explicitly on longtermism or existential risk reduction. (I’m not making claims about whether longtermism or neartermism is more correct, but it seems suboptimal for the two communities to constantly be in tension with each other.)

Attracting rent-seekers

Another side effect of increasing funding for university groups is that it will attract people who are more rent-seeking than altruistic. While I’ve only really heard about a few edge cases of students going to EA conferences for deceptive or selfishly-motivated reasons, I wonder how this will change in the coming years. When I hear of a university group sending 40 students to an EA conference, I’m skeptical that all of those people should really be there. While being able to send students to conferences is great and offers them a transformative experience, I’ve seen cases where people seem more motivated by the free flight than the conference itself. (It also makes the conference worse for the rest of the attendees.) 

Creating New Opportunities 

I want to finish by reiterating that there are a lot of positive effects from increasing funding for EA groups. In discussions about funding and university groups, I find that most people gravitate towards talking about the negative effects and don’t give enough weight to all of the new opportunities resulting from EA groups having more funding. While the majority of this post has a somewhat critical or wary tone, I also want to say that I’m really excited about what university groups could accomplish with more funding. This is probably redundant for most people, but I still think it’s worth capturing how far groups have come. 

Less than ten years ago, EA groups barely existed at universities. Now, there are over ten conferences in 2022 that members of the community can attend. University groups regularly run retreats helping their group members connect and think critically about how they want to use their careers for social good. Community-builders receive funding for their work, allowing them to significantly increase the quality of their group (occasionally even working on their group full-time) and make fewer time-money tradeoffs. Groups themselves have funding to host weekly dinners, invite speakers, travel to conferences, rent office spaces, and buy high-quality advertising materials. Opportunities available for young people—including Open Philanthropy’s Century Fellowship and Biosecurity Scholarships, various funding sources, and paid internships at EA orgs—allow young EAs to skill up and be more ambitious.

All of these opportunities should significantly increase the number of altruistically-inclined young people who learn about EA, get involved in the community, and make the commitment to use their career to tackle the world’s most pressing problems. We just need to remain cognizant of how funding changes the community and its incentive structures.

III. What we can do 

Don’t be afraid to ask for help

I hope most university group organizers already know this, but the community health team is available to help you address sensitive situations within your group. Please have a low barrier for reaching out to them. 

For example, if you know of people in your university group applying to attend an EA conference just because they want a free trip and don’t feel comfortable addressing them, you can flag these individual(s) to the community-health team. (Caveat: there are times where this is clear, and there are times where it’s borderline. If it’s not clear why an individual is applying for an EAG conference, I’d talk to them directly before flagging them to the comm-health team.) 

How to communicate about the increase in funding

(See the FAQ below for sample responses to specific questions that a community-builder might get asked.) 

Free stuff is not actually free

When university students get offered free stuff, they usually don’t think about where that free stuff came from. If one of your group members seems particularly keen on stuff being “free” in EA, I recommend conveying to them that “no, everything isn’t actually free: grantmakers in the EA community think funding [local EA groups/travel to conferences/etc.] is high value, and therefore choose to fund it over other valuable causes.” (I rarely need to tell people this, but it occasionally comes up. I also try to preemptively say stuff along these lines when I’m around newer group members.)

Ultimately, I want my group members to understand the counterfactual value of money. I’d hope they understand that it’s possible to spare the life of a factory-farmed animal for a few dollars or save the life of a person for ~$5,000. EA doesn’t fund people to attend conferences because they’re fun and free; there are serious problems in the world and we want to empower young people to figure out how they can tackle them. 

Don’t widely advertise that “EA has money”

It’s a good norm to not spread the meme that “EA has tons of money” to people who are new to your group. I’d tell them instead about specific opportunities/funding sources that are available to them. And if you want to talk about money in EA, I would avoid having these discussions at large group events and instead have them in smaller groups with more seasoned members of your group. 

Be careful how you advertise EAG conferences

When encouraging your group members to attend an upcoming EA conference, be careful with your language and phrasing. For example, instead of writing that “you can get a free trip to London/Prague,” I’d write, “we don’t want financial barriers to prohibit students from attending the conference, so students can request to have their housing and travel costs covered.” I’d also consider adding another line that says something along the lines of “we recommend that you attend the conference only if you want to seriously engage with the EA community.”

Establish healthy group norms around money

Develop spending heuristics

In university-group organizing, it’s difficult to get good feedback loops on your decisions. I find it more useful to develop heuristics for how you spend money than to try to model the cost-effectiveness of your actions (although that could be a useful group exercise). For example, here are some heuristics I use to determine how we spend money for Brown EA:

(I also recommend reading CEA’s Common Group Expenses Guidelines document.)

Buy the cheaper version of things (or don’t spend extravagantly)

One easy way to encourage a degree of frugality in your group is to buy the cheaper version of things. For example, if you can spend either $10 or $20 per person on a weekly dinner, I’d buy the cheaper food (at least until everyone gets sick of eating pizza!). Or, if you’re running a retreat, I wouldn’t stay at a luxury retreat center.

One thing to note is that people’s impressions of extravagance often don’t reflect the cost of what you’re buying. If you buy a really expensive dinner for your group that costs $500 instead of $200, people might perceive that as more wasteful than the $3,000 retreat you ran at a run-down summer camp.

Consider reintroducing (or maintaining) other altruistic signals

Have conversations about why you care

Personal narratives go a long way in representing what the EA movement is and why you got involved. I think talking about your emotional motivations for joining EA can indirectly counter funding criticisms and the notion that all EAs do is multiply numbers and argue about cost-effectiveness.

Buy vegan food

Most EA groups do this anyway, but I’d recommend buying vegan food for events or dinners. It’s an easy way to show that you care about animal suffering, and it opens up opportunities to educate your group members about factory farming.

Emphasize effective giving

This is somewhat controversial in community-building, but I’d still encourage people to talk about effective giving and effective charities. GWWC doesn’t need to be people’s first introduction to EA, but you can still encourage your group members to think about the marginal value of money. Talking candidly about why you took the GWWC pledge (or a variation of it) can help convey that you take helping people seriously and are willing to make personal sacrifices in order to do so. 

IV. FAQ: Talking to group members about money in EA

Why does EA have so much money? 

Several large-scale donors—Dustin Moskovitz, Cari Tuna, and Sam Bankman-Fried—have committed the majority of their wealth to effective altruist causes, which is why there is significantly more funding in EA now. Open Philanthropy, which is largely funded by Dustin Moskovitz and Cari Tuna, plans to give out $800 million in grants in 2022. The FTX Future Fund, founded by Sam Bankman-Fried, also plans to give out between $100 million and $1 billion this year. It’s worth noting, however, that this is still a very small amount of money relative to other philanthropic efforts. For example, the US government alone aims to invest $44.9 billion in 2023 towards tackling climate change.

Why do longtermism and existential risk reduction have so much money compared to global health & development and animal welfare?

This is a common misconception about EA. In 2019, the two largest recipients of funding in EA were global health & development (44%) and farm animal welfare (13%), followed by biosecurity (10%) and AI safety (10%). While longtermist funding in EA has expanded since 2019, particularly with the foundation of the FTX Future Fund, longtermist causes receive a minority of funding. What’s true, however, is that longtermism and x-risk reduction have a larger funding overhang, meaning that there’s more money dedicated to longtermist causes than there are people and projects able to absorb the funding. 

If longtermism has a larger funding overhang, why doesn’t EA fund [insert cause] instead?

First, most money in EA isn’t transferable across cause areas. If Open Philanthropy has $100 million allocated for grantmaking aimed at improving the long-run future, its grantmakers can’t use that money to fund policies aimed at, for example, improving farmed chicken welfare. Second, although it’s currently harder to fund good longtermist projects and people, this won’t always be the case. Over time, grantmakers will likely find effective ways to distribute funding, and the funding overhang will disappear. Lastly, taking a more patient approach to philanthropy allows the community to maximize its impact over the long run. If we only funded problems with easy funding opportunities, we wouldn’t make progress on the thornier, hard-to-fund but equally important problems.

Where does the money for EA groups come from?

Most EA groups receive funding from Open Philanthropy, CEA’s University Group Accelerator Program (UGAP), or the EA Infrastructure Fund (EAIF). Each organization has its own unique application process and bar for funding. 

Why is EA funding students to go to conferences?

The EA Events Team offers financial support for students to attend conferences because they want to make sure that money doesn’t prohibit anyone from attending a conference. Conferences can be really valuable because they allow students to make personal and professional connections, learn about early career opportunities within the EA community, and get feedback on their career plans. Students also frequently cite conferences as one of their most transformative experiences! 

Why are students getting funding to community-build? Is community-building a Ponzi scheme?

Community-building is a uniquely high-impact opportunity for students because of its multiplier effect. For example, imagine that you have two options: (1) you could pursue a high-impact career yourself, or (2) you could start an EA group at your university and get five other people to pursue equally impactful careers. By working on community-building, you’ve 5x’d your impact! First, grantmakers fund students to run their local EA groups because of the value of community-building. Second, students often have large financial burdens and opportunity costs, and funders don’t want community-builders to have to choose between running their group and taking a paid on-campus job.

People commonly critique community-building by arguing that it’s a Ponzi scheme: if a community-builder finds five more community-builders, and those five community-builders find more community-builders, then nobody is actually making an impact! In practice, however, this isn’t the case. Only a minority of students in university groups work on community-building or plan to do community-building full-time after graduation. (Although it’s debated how much community-building university groups should do.)

V. Resources

19 comments


comment by Luke Chambers · 2022-07-05T11:40:19.238Z

I think this is a good guide, and thank you for writing it. I found the bit on how to phrase event advertising particularly helpful.

One thing I would like to elaborate on is the 'rent-seekers' bit. I'm going to say something that disagrees with a lot of the other comments here. I think we need to be careful about how we approach such 'rent-seeking' conversations. This isn't a criticism of what you wrote, as you explained it really well, but more of a trend I've noticed recently in EA discourse, and this is a good opportunity to mention it.

It's important to highlight that not all groups are equal, demographically. I co-lead a group in a city where the child poverty rate has gone from 24% to a whopping 42% in 5 years, and which remains one of the poorest cities in the UK. I volunteer my time at a food bank and can tell you that it's never been under stronger demand. Simply put, things are tough here. One of the things I am proudest of in our EA group is that we've done a load of outreach to people who face extra barriers to participating in academia and research, and as a result have a group with a great range of life backgrounds. I'm sure it's not the only EA group to achieve this, because I've spoken to other group leads who have made an effort to achieve the same effect.

We've adapted our strategies and events a bit to enable this - e.g. pre-buying public transport tickets for people to attend our events, or wage replacement, where if they attend a day-long event, we'll pay a micro-stipend equivalent to a 10- or 12-hour shift at minimum wage (though this is rare as we're careful about when we arrange stuff). This was because some people literally couldn't afford a day off to attend conferences, or present their research, because that lost day had significant consequences for them. As a result, we've had some fantastic things come from people who otherwise would not have had the opportunity to contribute their valuable ideas and work.

My point is that a lack of funding is an extremely real barrier to many people's participation not just in EA, but in academia/research in general. I understand that there is a very real risk of people using EA events as a 'free holiday' type deal, and it's something that bears mitigating, but we also have to be really careful about unfairly tarring people who rely on full funding to attend events.  I fully expect to encourage as many members of my group as possible to attend the conferences because they have lots to gain and lots to contribute. I understand the 'rent-seeking' fear is that people will use EA conferences to pursue jobs or grant funding for projects, but I don't think this is as high a risk as people say because those are EA-aligned jobs and grants, and those organisations have their own safeguards. They can see through false interest fairly easily. As for reducing the quality of conferences, I'm not sure how you could reliably tell the difference between a 'rent-seeker' and someone who just doesn't know EA in-depth yet, or who is nervous.

Essentially, it boils down to the fact that in some groups only a few people may be suitable to attend the conferences, as in your example. However, there are contextual and geographical factors at play which mean that some groups may make more applications than others, and it may not necessarily be a 'rent-seekers' issue. Some groups may just need more help for more people. As a result, higher numbers of people from x group over y group isn't necessarily an indication of 'rent-seeking'.

I'm always extremely apprehensive about any 'rent-seekers' discourse because it seems to follow a similar trend to class warfare in mainstream media. For example, the demonisation of people on benefits despite the fact that benefits fraud makes up a microscopic rate of overall fraud. The idea of someone taking advantage of the group (whether that's society or an organisation, etc.) is often overinflated compared to its actual risk. I would be very interested to see any confirmed examples of rent-seeking to try and gauge how big the current threat is. I assume the grant-makers check the hotels people are claiming for (not 5-star, etc.) and length of stay (not booking 8 days for a 2-day event). You also sign in to events via a QR code, so checking that people actually went to the event is fairly easy. I assume EA can also access people's agendas, to a degree, and are able to see if people are actively engaging with others. These various safeguards should make this issue quite trackable. If it's a matter of engaging in good faith, that's so immensely hard to measure I'm not even sure it's possible.

A final bit I would like to expand on is this:

"I’ve seen cases where people seem more motivated by the free flight than the conference itself"

There is also a risk of mistaking excitement for motivation. For many people, an EA conference will be the first time they've travelled away to another country (or even city), and so lots of excitement surrounding the actual trip is normal. My first ever EAG London was my first time travelling to my own nation's capital. You can bet I had a walk around the tourist sites after my agenda for the day was finished. And that's okay. I understand there's a concern of people doing it just for the flight, travel, hotel, whatever - but the amount of safeguards would (I assume) prevent this from being the case.

You make really good points, and I think the 'rent-seekers' risk bears watching to see if it becomes a genuine threat, but I am concerned about it becoming an increasing part of EA discourse and if we're not careful it could drive away otherwise great contributors because of entrenched social and class issues. EA already has intellectual diversity issues, and we need to be careful about exacerbating rather than fixing these. I also understand that 'rent-seeker' in no way is intended to mean 'low economic background' - however, my point is that many of the 'rent-seeker' red flags listed here and elsewhere could also be signs of someone overcoming class and social barriers and so there's a risk of mistakenly alienating people from certain backgrounds over others.

Again - I 100% know this isn't what you meant and this was a really helpful guide, but I'm commenting more on the general discourse trend I'm noticing on the forum, on the Twitter group, and in some blogs. I am concerned that the fears of rent-seekers could be overblown compared to the real proportion of the risk, and would be interested to see some evidence-based research in this area.

 

Replies from: levin, Khorton, nathan, sabrinachwalek, Luke Chambers
comment by levin · 2022-07-05T14:20:19.040Z

I agree that it's very important to continue using EA money to enable people who otherwise wouldn't be able to participate in EA to do so, and it certainly sounds like in your case you're doing this to great effect on socioeconomic representation. And I agree that the amount of funding a group member requests is a very bad proxy for whether they're rent-seeking. But I don't agree with several of the next steps here, and as a result, I think the implication — that increased attention to rent-seeking in EA is dangerous for socioeconomic inclusion — is wrong.

I think my disagreement boils down to the claim:

I'm not sure how you could reliably tell the difference between a 'rent-seeker' and someone who just doesn't know EA in-depth yet, or who is nervous.

In my experience, it is actually pretty easy for group organizers to differentiate. People who are excited about EA and excited about the free flight or their first major travel experience/etc do not set off "rent-seeking alarms" in my gut. People who ask a lot of questions about getting reimbursed for stuff do not set these alarms off, either. You're right that these correlate with socioeconomic status (or youth, or random other factors) more than rent-seeking.

It's people who do these things and don't seem that excited about EA that set off these alarms. And assessing how interested someone is in EA is, like, one of the absolute essential functions of group organizers.

I think EA group organizers tend to be hyper-cooperators who strongly default towards trusting people, and generally this is fine. It's pretty harmless to allow a suspected rent-seeker to eat the free food at a discussion, and can be pretty costly to stop them (in social capital, time, drama, and possibly getting it wrong). But it's actually pretty harmful, I think, for them to come to EAGs, where the opportunity costs of people's time and attention — and the default trust people give to unfamiliar faces — are much higher. For me, it takes consciously asking the question, "Wait, do I trust this person?" for my decision-making brain to acknowledge the information that my social-observational brain has been gathering that the person doesn't actually seem very interested. But I think this gut-level thing is generally pretty reliable. I'll put it this way: I would be pretty surprised if EA group organizers incorrectly excluded basically anyone from EAGs in the past year, and I think it's very likely that the bar should be moved in the direction of scrutiny — of just checking in with our gut about whether the person seems sincere.

Replies from: Luke Chambers
comment by Luke Chambers · 2022-07-05T15:45:44.275Z

That's a good point, about community organisers being kind of a filter. I like to think I'd know if someone was looking to extract profit. To be honest, we usually have the other problem. I've heard a few times before from people that they 'don't want to take the p*ss' and I have to convince them it's alright to stay at a 2-star instead of a 1-star! I think the groups function well because it's (in theory for me, never happened yet) possible to tell when someone's shifty. So I agree with that point.

I do still think though that too much focus on the discourse risks socioeconomic exclusion. I know people don't intend it this way, but sometimes the discourse can come off quite elitist in writing when worded incorrectly. It's a risk. But at the same time I would hate to chill someone's free speech, and valid concerns. Communities are always a delicate balancing act! Difficult to get right.

 

Replies from: levin
comment by levin · 2022-07-05T18:08:06.215Z

I think you're probably right that there are elitism risks depending on how it's phrased. Seems like there should be ways to talk about the problem without sounding alienating in this way. Since I'm claiming that the focus really should just be on detecting insincerity, I think a good way to synthesize this would just be to talk about keeping an eye out for insincerity rather than "rent-seeking" per se.

comment by Kirsten (Khorton) · 2022-07-05T11:51:50.625Z

This is a great comment and I think would make a good standalone Forum post - I'd certainly like to link to it.

Replies from: Luke Chambers
comment by Luke Chambers · 2022-07-05T11:54:09.334Z

It's something I would be willing to write if others wanted to read it, unless the original poster would rather do it.

Replies from: Khorton
comment by Kirsten (Khorton) · 2022-07-05T12:00:19.624Z

Please do - at a minimum you could post what you've already written as a comment, but if you have more to say I'd be interested.

comment by Nathan Young (nathan) · 2022-07-05T12:12:00.790Z

Agreed. I'm gonna channel my inner Ollie Base here and say "it's the EAG team's job to accept and pay for those they think will create the most value by attending". I think currently, if you get accepted, go: go joyfully and enjoy the city you go to.

I went to the zoo on the Sunday of EAG Prague. Some of my flights were paid for by CEA because I was cash-strapped at the time. I could have decided that was an inappropriate use of the time, but I think it made me enjoy the EAG more; I still talked to lots of people, and I would be more likely to fly to another EAGx. Signalling matters, yes, but counterfactual impact is more important. If someone applies to an EAG partly for the holiday, then as long as they intend to take the EAG seriously and are honest on their application, more power to them. CEA can read their application and accept them if they want.

Replies from: OllieBase, Luke Chambers
comment by OllieBase · 2022-07-05T15:05:45.982Z

As the real Ollie Base, I agree with this (assuming personal leisure doesn't add non-negligible costs).

Having skimmed Luke's parent comment I also agree and upvoted. Anecdotally, I encounter more people who I wish had applied for travel funding (or more funding) than people who applied for too much. This weakly suggests to me that we should worry more about making sure people are aware of our travel grant policy (and that's on us) than about free-riders, though I could imagine the latter being more costly from a PR/community health perspective per instance.

comment by Luke Chambers · 2022-07-05T12:15:02.577Z

That's an interesting tie-in to the 'burnout' discourse we've been seeing lately that I had not even considered.

comment by sabrinac (sabrinachwalek) · 2022-07-08T01:15:47.295Z

Hi Luke!

Thanks so much for your thoughtful response. Socioeconomic inclusion doesn't get enough attention in EA, and I'd hate for attempts to mitigate rent-seeking behavior to turn into raising barriers for low-income community members.

I think that the questions of how to mitigate rent-seeking behavior and make EA more socioeconomically inclusive can largely be decoupled though. I agree with @levin that it's easy for group organizers to identify when people are motivated in good faith vs. bad faith to get funding to attend a conference. I also don't think that attempts to identify people acting in bad faith necessarily lead to socioeconomically excluding people. Rather, there aren't enough clear resources and opportunities for low-income community members, and most of the community's resources are targeted at affluent, elite students and universities. To me, these seem like much larger barriers to inclusion than the dialogue on funding in EA. 

While the perceived risk of rent-seeking behavior may be overinflated compared to the actual risk, I also think this is difficult to claim. First of all, I think that the most serious incidents of rent-seeking behavior won't be public knowledge in EA. Instead, the community health team and other individuals will likely deal with these incidents privately. Second of all, I've heard of a couple of incidents that really concerned me regarding how some individuals accessed large sums of money for self-motivated and manipulative reasons. EA orgs and individuals definitely have safeguards in place, but I think the high-trust nature of the community at times allows people to access a lot of money with minimal oversight. I'd personally defer to the community-health team and grantmakers in assessing the scope of the risk.

I also think there are good second-order reasons to want to prevent rent-seeking behavior in EA (even if the first-order effect of funding a rent-seeking person isn't that bad). Maintaining a high-trust community makes it a lot easier to get stuff done and fund people. There are a lot of PR risks from individuals accessing money for personal gain. And I want to avoid spreading the meme of "EA will fund students' vacations" on college campuses, giving EA groups a bad reputation.

Replies from: Luke Chambers
comment by Luke Chambers · 2022-07-13T07:26:07.090Z

Thank you for such an informative and well-thought-out reply. I appreciate you taking the time :)

I think you raise some good points here, and yes I have personally found getting access to money much easier than with most other orgs. I still do think that there may be an unintentional chilling effect on people from rent-seeker discourse, but I think we can both agree with @Levin that perhaps using a different term related to good and bad faith may be a good avenue to pursue.

All in all I think you do raise really good points both in the original post and in this reply, but do also think it's worth being mindful, as always, of unintended consequences :)

comment by Luke Chambers · 2022-07-05T11:52:12.107Z

I'm actually going to reply to my own comment here with the cardinal sin of thinking of another point after hitting 'post', but not wanting to disrupt the flow of the original comment!

I believe there IS a case to be made for teaching organisers how to better spend funds smartly. I have been to larger EA events before where I've thought to myself 'this could have been done at half the price'.  Maybe it's the fact I grew up in an environment where you had to make every penny stretch as far as possible, but it blew me away when another group leader mentioned to me they don't negotiate costs with vendors! Like haggle on price for room fees, food etc. Some find it distasteful, and I get that, but a lot could be saved. 

Also, some events can be unnecessarily ostentatious. Like do we really need a room with this much gold and antique clocks? You could have rented a soviet-style office room at half the price like 2 miles away. 

Then again, it's very easy for me to criticise others given my near-zero large-scale event planning experience. Maybe there are other factors I'm not considering. That said, maybe give group leaders some books on negotiation or on frugality tips. That may help a range of the issues highlighted in this post. 

comment by Harrison Gietz (harrygietz@gmail.com) · 2022-07-04T22:06:38.230Z

Thanks for writing this! I think a lot of this is great to keep in mind for university groups.

I especially liked the "free stuff is not actually free" framing. Putting a counterfactual on conference costs can be humbling and really makes one think carefully about attending... if ~$5,000 could save a life elsewhere (say, generate 80 QALYs), then a $500 reimbursement for a trip to a conference is sacrificing 8 years of life. Not a decision to take lightly!

comment by levin · 2022-07-04T23:18:01.600Z

On "attracting rent-seekers" and "be careful how you advertise EAGs": for some reason the rent-seekers seem particularly attracted to the conferences, rather than e.g. free food, etc. This is somewhat interesting because if you were totally uninterested in EA, it would obviously be costlier to go to a conference than to get free food at weekly meetings or something, but I guess it's also the career connections (albeit in sub-spaces that fake-EAs are unlikely to actually want to go into?) and feeling of status that you're getting flown places. I also think it's (maybe obviously) much more damaging for rent-seekers to attend conferences and take up the time of professional EAs who could be meeting non-rent-seekers.

For these reasons, I think EAG's bar for accepting students has gotten a bit too low; specifically, I think they should ask university group leaders for guidance on which group members are high-priority and which shouldn't be accepted. (I know they're capacity-constrained, but this might be worth an additional staff member or something.)

On "Don't advertise 'EA has money'": I endorse your framing throughout this post as "EA doesn't want a lack of money to stop [impactful thing from happening]" rather than "we have all this money, take some and do something with it." I think this both directly attracts rent-seekers and signals that we're in it for the money (both of which probably repel altruists). I totally get why people have the instinct to talk about it, especially mid-funnel people who are just realizing how much there is but don't quite get the nuances and problems described in this post, so it's worth having this conversation with anyone who does community-building in your group.

On humor and talking about EA money in general: In a broad range of IRL social settings, I personally find it very hard not to joke about things. I just naturally gravitate towards observing ironies, referencing memes, and phrasing points in a way that lands on a surprising/humorous beat; when I try to turn this off, e.g. in serious class discussions about heavy topics, I usually fail and have to clarify that I'm not trying to make light of the thing and just go for a tone of "dark irony" instead.

Money in EA is extremely ironic, and it produces lots of opportunities to note surprising results and connections between concepts. When longtime EAs hang out, talking about various funny ways to spend money can be a fun way to push various theories (or maybe brainstorm good galaxy-brain ideas!). But I think it is a very bad look to joke about it in semi-public contexts, and I've worked hard to just not say the things that come to mind because I know it will sound like I'm trivializing suffering, or finding glee in the ridiculous inequality of this situation, or "here for the wrong reasons." Weak anecdotal/subjective evidence: when a top/mid-funnel person has joked about money, it's usually when I'm already smiling/laughing, and when I react with a polite nod but wind down the smile, this seems to actually convey a seriousness/sensitivity that I think is the right vibe. So I've also tried to institute an informal rule of "no jokes about money" and (non-confidently) recommend other group organizers do the same.

Replies from: lukefreeman
comment by Luke Freeman (lukefreeman) · 2022-07-05T03:18:13.758Z

+1 to the comment here about humour. I'm someone who loves a good laugh and has a pretty dry sense of humour but am particularly wary about it when talking about money and suffering (I've seen it go pretty badly in several EA or EA-adjacent contexts).

It's also very important to think about humour in non-EA social contexts where there are a lot of people within the EA community alongside those who aren't. Someone's first exposure to the community might be somewhere like an informal party, and first impressions really count.

comment by Nathan Young (nathan) · 2022-07-05T12:34:06.536Z

Thanks for taking the time to write this

What have people's experiences been of these bad outcomes happening?

My guess would be that there have still only been like 10 or so people who have grifted to fly to conferences in total. Regardless of the free food and flights, it's still quite a lot of effort and requires careful deception.

I made that number up to give my sense of the scale of the problem; feel free to disagree with it.

If we knew that 1% of free flights would go to bad actors that still seems a reasonable gamble, right?

comment by OllieBase · 2022-07-05T15:11:48.346Z

The EA Events Team offers financial support for students to attend conferences because they want to make sure that money doesn’t prohibit anyone from attending a conference. Conferences can be really valuable because they allow students to make personal and professional connections, learn about early career opportunities within the EA community, and get feedback on their career plans. Students also frequently cite conferences as one of their most transformative experiences! 

Speaking on behalf of the CEA Events team, I think this is basically right :)

Two minor points of clarification:

  • It's CEA as an organisation, not "EA" or the "EA Events Team".
  • Financial support is available to everyone who needs it, not just students.
comment by Luke Freeman (lukefreeman) · 2022-07-05T03:12:32.582Z

Thank you so much for writing this up! I particularly like the tangible examples and even exact wordings that make it easier for other organisers, e.g.:

  1. Your suggested language for conference funding (“we don’t want financial barriers to prohibit...attend the conference only if you want...”).
  2. Your heuristics for determining how to spend money for Brown EA.
  3. Your answers to FAQs about the funding situation.

In terms of promoting effective giving, I recommend university group leaders look at the guide for promoting effective giving and the effective giving event guides, and please don't hesitate to reach out to community@givingwhatwecan.org if you want help (such as connecting you with local people who are a few years ahead and can talk about their experience, giving a talk, or providing sponsorship for giving games, etc.).