The Vultures Are Circling

post by CitizenTen · 2022-04-06T05:57:11.096Z · EA · GW · 66 comments

    It might not have shown up on your radar, but the funding situation for EA is currently insane. Like bananas, jumping-off-the-wall insane. Especially with regard to young people. I personally know of 16-year-olds getting more money than the median American salary, and of 21-year-olds getting six- to seven-figure grants. And this isn’t to knock either of those things. There are really well-thought-out reasons why this makes sense. And generally I’d even advocate for more of this crazy risk taking. Normal institutions are extremely risk averse, and it’s nice to see EA buck the trend.

 

But here’s the thing. The message is out. There’s easy money to be had. And the vultures are coming. In many internet circles, there’s been a worrying tone. “You should apply for [insert EA grant], all I had to do was pretend to care about x, and I got $$!” Or, “I’m not even an EA, but I can pretend, as getting a 10k grant is a good instrumental goal towards [insert-poor-life-goals-here].” Or, “Did you hear that a 16-year-old got x amount of money? That’s ridiculous! I thought EAs were supposed to be effective!” Or, “All you have to do is mouth the words ‘community building’ and you get thrown bags of money.” Basically, the sharp increase in rewards has increased the number of people who are optimizing for the wrong thing. Hello Goodhart. Instead of the intrinsically motivated EA, we’re beginning to get the resume padders, the career optimizers, and the type of person that cheats on the entry test for preschool in the hopes of getting their child into a better college. I’ve already heard of Discord servers springing up centered around gaming the admission process for grants. And it’s not without reason. The Atlas Fellowship is offering a 50k, no-strings-attached scholarship. If you want people to throw out any hesitation around cheating the system, having a carrot that’s larger than most adults’ yearly income will do that. TLDR: People are going to begin to optimize really hard around showing [EA grantmakers] what they think they want to see. This will lead to less impactful grants for helping people, and generally less chance of right-tail successes.

    So what to do? I’d like to note that some of the knee-jerk reactions when hearing of the problem are examples of things not to do.

     Finally, I’d like to note that this problem has yet to become an actual problem. It's just a whisper of what's (maybe) to come. It still happens to be the case that the intrinsically motivated EAs far, far outnumber the resume builders. But this might change if we're not careful. And this will begin to make a difference: no matter how good our interview filters are, the false positive rate will continue to increase [LW · GW]. Furthermore, it seems that there are currently plans to massively scale up grant giving. So it would be nice if we could somehow solve this now, while it’s a small problem, instead of later. Money saved is lives saved!

66 comments

Comments sorted by top scores.

comment by Stefan_Schubert · 2022-04-06T07:57:10.956Z · EA(p) · GW(p)

This post uses an alarmist tone to trigger emotions ("the vultures are circling"). I'd like to see more light and less heat. How common is this? What's the evidence?

People have strong aversions to cheating and corruption, which is largely a good thing - but it can also lead to conversations on such issues getting overly emotional in a way that's not helpful.

Replies from: weeatquince, CitizenTen, Samuel Shadrach, howdoyousay?
comment by weeatquince · 2022-04-06T18:13:50.923Z · EA(p) · GW(p)

I might be in the minority view here, but I liked the style this post was written in, emotive language and all. It was flowery language, but that made it fun to read, and I did not find it to be alarmist (e.g. it clearly says “this problem has yet to become an actual problem”).

And more importantly, I think the EA Forum is already a daunting place, and it is hard enough for newcomers to post here without having to face everyone upvoting criticisms of their tone / writing style / post title. It is not a perfect post (I think there is a very valid critique in what Stefan says, in that the post could have benefited from linking to some examples / evidence), but not everything here needs to be in perfect EA-speak. Especially stuff from newcomers.

So welcome, CitizenTen. Nice to have you here and to hear your views. I want to say I enjoyed reading the post (don’t fully agree tho), and thank you for it. :-)

Replies from: Halstead, CitizenTen
comment by John G. Halstead (Halstead) · 2022-04-11T18:20:50.106Z · EA(p) · GW(p)

I also thought that the post provided no support for its main claim, which is that people think that EAs are giving money away in a reckless fashion. 

 Even if people are new, we should not encourage poor epistemic norms. 

Replies from: casebash
comment by Chris Leong (casebash) · 2022-04-13T02:57:22.291Z · EA(p) · GW(p)

The claim sounds plausible to me and that’s enough to warrant a post to encourage people to think about this.

comment by CitizenTen · 2022-04-06T21:06:00.073Z · EA(p) · GW(p)

:-)

comment by CitizenTen · 2022-04-06T13:19:37.852Z · EA(p) · GW(p)

My bad. Any good ideas for what the title should change to? Also, I'd just like to note that this is not yet very common at all. My evidence is just hearsay, anecdotes, and people that I've talked to, so if it was overly alarmist, I'm sorry. That was not my intention. Once again, I'm noting the change in tone in how some people are treating the grants more than anything. Instead of being excited about cause area X and then using the grants as a way to achieve their goals, people are instead excited about cause area X because they can get easy funding. Once again, I don't think we should be alarmist about this, as funding fewer great/risky people would be a failure mode. I just wanted it to be common knowledge that this is happening and (probably?) going to get worse over time.

Replies from: Stefan_Schubert, Charles He
comment by Stefan_Schubert · 2022-04-06T13:23:53.134Z · EA(p) · GW(p)

Fair enough - thanks for your gracious response.

comment by Charles He · 2022-04-06T15:25:11.304Z · EA(p) · GW(p)

My evidence is just hearsay, anecdotes, and people that I've talked to

 

Once again, I don't think we should be alarmist about this

Well, it’s pretty clear you said that you read this on the internet:

In many internet circles, there’s been a worrying tone. “You should apply for [insert EA grant], all I had to do was pretend to care about x, and I got $$!” Or, “I’m not even an EA, but I can pretend, as getting a 10k grant is a good instrumental goal towards [insert-poor-life-goals-here].” Or, “Did you hear that a 16-year-old got x amount of money? That’s ridiculous! I thought EAs were supposed to be effective!” Or, “All you have to do is mouth the words ‘community building’ and you get thrown bags of money.”

Your "boulder splash" is annoying. It pushes on the very issue and the adverse effects you claim to worry about. Noise and heat are self-perpetuating. This transition into a higher-funding environment is delicate, and outcomes depend on initial conditions and the liminal states. Reactions and confidence from the community and nascent leaders are important and benefit from a firm hand and the right tone.

I don’t think this is malice, but it's clumsy and emotionally manipulative.

Replies from: CitizenTen, Charles He
comment by CitizenTen · 2022-04-06T15:51:19.219Z · EA(p) · GW(p)

Yeah, I agree. Pointing to this problem can make the problem worse. It's a little bit of an info-hazard in that respect? But yeah, I'll agree it was slightly clumsy. I wanted to tell everyone that this was a thing that was happening, without creating a backlash that would destroy the genuinely valuable parts of what we're doing. Furthermore, it is genuinely really valuable to have such a high-trust community, and I don't want that to change. I guess whether or not I succeeded in walking this tightrope is for others to decide.

comment by Charles He · 2022-04-06T15:45:29.627Z · EA(p) · GW(p)

One of the devices not mentioned in my comment below is the utility of LessWrong as a filter/proxy for values. This can work but has a weakness, because institutional literacy and intellectual honesty aren't at the right aesthetic. You've demonstrated that with your post and your comment, which is poetic.

 

I know someone who has been trying to work on the problem (which isn't very well elaborated on in your post) with sort of four arms:

  • Show, not tell, instances of the issue you worry about to increase literacy about it (without getting defunded)
  • Show, not tell, issues with intellectual honesty and subcultures (without getting defunded)
  • Setup institutions and mechanisms to solve the issue
  • Increase literacy about the design of EA and how funding is decided

 

It turns out this project is pretty hard. 

The money thing itself isn't that hard. "Meta-AI" stuff in business is everywhere. What's tricky is showing not telling, and handling the cause area activism/proxy and consequent issues. If you're trying to stand in multiple cause areas, which is necessary, it's an absurd situation right now and unfair to work in.

comment by acylhalide (Samuel Shadrach) · 2022-04-06T11:44:17.869Z · EA(p) · GW(p)

IMO a standard norm on whether clickbait EA titles are good or bad might help.

I remember seeing a post once arguing for the exact opposite: that clickbait/catchy titles/summaries, and generally writing styles that draw you in, are good because they draw attention to important issues. So much so that you have a moral obligation to use them, if you believe you're pointing at something important.

Replies from: Stefan_Schubert
comment by Stefan_Schubert · 2022-04-06T11:56:38.511Z · EA(p) · GW(p)

I think you refer to this post [EA · GW]. Note that there was a discussion about the title of that post, and that it was eventually changed.

In general, I think that one should be more careful about being clickbaity regarding sensitive and emotionally charged topics.

Replies from: Samuel Shadrach
comment by acylhalide (Samuel Shadrach) · 2022-04-06T14:22:24.496Z · EA(p) · GW(p)

Yup, and yeah makes sense.

comment by howdoyousay? · 2022-04-06T09:09:54.970Z · EA(p) · GW(p)

Yes to links of what conversations on gaming the system are happening where! Surely this is something that should be shared directly with all funders as well? Are there any (in)formal systems in place for this?

comment by Stefan_Schubert · 2022-04-06T09:13:32.747Z · EA(p) · GW(p)

Fwiw, anecdotally my impression is that a more common problem is that people engage in motivated reasoning to justify projects that aren't very good, and that they just haven't thought through their projects very carefully. In my experience, that's more common than outright, deliberate fraud - but the latter may get more attention since it's more emotionally salient (see my other comment [EA(p) · GW(p)]). But this is just my impression, and it's possible that it's outdated. And I do of course think that EA should be on its guard against fraud.

comment by aogara (Aidan O'Gara) · 2022-04-06T06:06:07.168Z · EA(p) · GW(p)

Would really appreciate links to Twitter threads or any other publicly available versions of these conversations. Appreciate you reporting what you’ve seen but I haven’t heard any of these conversations myself.

Replies from: Jonas Vollmer, howdoyousay?
comment by Jonas Vollmer · 2022-04-07T19:25:11.432Z · EA(p) · GW(p)

I sent a DM to the author asking if they could share examples. If you know of any, please DM me!

comment by howdoyousay? · 2022-04-06T09:10:50.754Z · EA(p) · GW(p)

Yes to links of what conversations on gaming the system are happening where! 

Surely this is something that should be shared directly with all funders as well? Are there any (in)formal systems in place for this?

comment by Michael Townsend (Michael_Townsend) · 2022-04-06T09:59:00.826Z · EA(p) · GW(p)

Like other commenters, to back up the tone of this piece, I'd want to see further evidence of these kinds of conversations (e.g., which online circles are you hearing this in?).

That said, it's pretty clear that the funding available is very large, and it'd be surprising if that news didn't get out. Even in wealthy countries, becoming a community builder in effective altruism might just be one of the most profitable jobs for students or early-career professionals. I'm not saying it shouldn't be, but I'd be surprised if there weren't (eventually) conversations like the ones you described. And even if I think "the vultures are circling" is a little alarmist right now, I appreciate the post pointing to this issue.

On that issue: I agree with your suggestions of "what not to do" -- I think these knee-jerk reactions could easily cause bigger problems than they solve. But what are we to do? What potential damage could there be if the kind of behaviour you described did become substantially more prevalent?

Here's one of my concerns: we might lose something that makes EA pretty special right now. I'm an early-career employee who just started working at an EA org. And something that's struck me is just how much I can trust (and feel trusted by) people working on completely different things in other organisations.

I'm constantly describing parts of my work environment to friends and family outside of EA, and something I often have to repeat is that "Oh no, I don't work with them -- they're a totally different legal entity -- it's just that we really want to cooperate with each other because we share (or respect the differences in) each other's values". If I had to start second-guessing what people's motives were, I'm pretty sure I wouldn't feel able to trust so easily. And that'd be pretty sad.

Replies from: Bella_Forristal, CitizenTen
comment by Bella (Bella_Forristal) · 2022-04-06T13:34:06.430Z · EA(p) · GW(p)

Strong upvote for the erosion of trust being one of the things I'm really worried about.

comment by CitizenTen · 2022-04-06T13:53:29.845Z · EA(p) · GW(p)

Agree strongly. Eroding the high-trust EA community would be really sad. Don't have much to add, except a strong upvote.

Replies from: Pablo_Stafforini
comment by Pablo (Pablo_Stafforini) · 2022-04-06T15:58:50.332Z · EA(p) · GW(p)

How about also adding links to your sources?

comment by Mauricio · 2022-04-06T08:13:35.447Z · EA(p) · GW(p)

So what to do? I’d like to note that some of the knee jerk reactions when hearing of the problem are examples of things not to do.

This seems overly quick to rule out a large class of potential responses. Assuming there are (or will be) more "vultures," it's not clear to me that the arguments against these "things not to do" are solid. I have these hesitations (among others) [edited for clarity and to add the last two]:

  • "The rationale for giving out high risk grants stands and hasn’t changed."
    • Sure, but the average level of risk has increased. So accepting the same level of risk means being more selective.
  • "decreasing the riskiness of the grants just means we backslide into becoming like any other risk averse institution."
    • Even if we put aside the previous point, riskiness can go down without becoming as low as that of typical risk-averse institutions.
  • "Increasing purity tests. [...] As a community that values good epistemics, having a purity test on whether or not this person agrees with the EA consensus on [insert topic here] is a death blow to the current very good MO."
    • There are other costly signals the community could use.
  • "So not funding young people means this type of talent and potential is wasted. Let's not do that."
    • Just because something has downsides doesn't mean we shouldn't do it; maybe it's worth it to waste some talent to avoid many vultures. (I'm not saying that's the case, just that more consideration would be helpful.)
Replies from: Aaron_Scher
comment by Aaron_Scher · 2022-04-06T18:28:58.148Z · EA(p) · GW(p)

Thanks for this comment, Mauricio. I always appreciate you trying to dive deeper – and I think it's quite important here. I largely agree with you. 

Replies from: Mauricio
comment by Mauricio · 2022-04-06T21:21:26.351Z · EA(p) · GW(p)

Thanks, Aaron!

comment by Charlie_Guthmann (Charles_Guthmann) · 2022-04-06T18:00:38.642Z · EA(p) · GW(p)

I feel like this community was never meant to scale. There is little to no internal structure, and, like others have said, so much of this community relies on trust. I don't think this is just an issue of "vultures"; it will also be an issue of internal politics and nepotism.

To me the issue isn't primarily about grantmaking. If you are a good grantmaker, you should see when people's proposals aren't super logical or aligned with EA reasoning. More people trying to get big grants is mostly a good thing, even if many are trying to trick us into giving free money. I think the much larger issue is about status/internal politics, where there is no specific moment at which you can decide how aligned someone is.

But first, to give some evidence of vultures: I have already seen multiple people in the periphery of my life submit apps to EAGs who literally don't even plan on going to the conferences and are just using this as a chance to get a free vacation. I feel sorry to say that they may have heard of EA because of me. More than that, I get the sense that a decent contingent of the people at EAGx Boston came primarily for networking purposes (and I don't mean networking so they can be more effective altruists). At the scale we are at right now, this seems fine, but I seriously think this could blow up quicker than we realize.

Speaking to the internal politics, I believe we should randomly anonymize the names on the forum every few days and see if certain things are correlated with getting more upvotes (more followers on Twitter, a job at a prestigious org, etc.). My intuition has been that having a job at a top EA org means 100-500% more upvotes on your posts here, hell, even on the meme page. Is this what we want? The more people who join for networking purposes, potentially the worse these effects become. That could entail more bias.
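The anonymization experiment proposed above could be analyzed as a simple difference-in-differences check. This is only a sketch: every record below is fabricated for illustration, and the "top org" flag is a hypothetical label a real study would have to define from actual forum vote data.

```python
# Hypothetical sketch of the forum-anonymization experiment described above.
# All records are made up for illustration; a real test would use actual vote
# data from named vs. anonymized periods.
from statistics import mean

# (author_at_top_org, posted_while_anonymized, upvotes) -- fabricated data
posts = [
    (True,  False, 40), (True,  False, 55), (True,  False, 48),
    (False, False, 12), (False, False, 18), (False, False, 15),
    (True,  True,  22), (True,  True,  25), (True,  True,  20),
    (False, True,  14), (False, True,  17), (False, True,  16),
]

def avg(top_org: bool, anon: bool) -> float:
    """Mean upvotes for posts matching the given author/period flags."""
    return mean(u for t, a, u in posts if t == top_org and a == anon)

# "Status premium": extra upvotes top-org authors get over everyone else,
# measured separately in the named and anonymized periods.
premium_named = avg(True, False) - avg(False, False)
premium_anon = avg(True, True) - avg(False, True)

# If the premium shrinks sharply once names are hidden, that suggests votes
# were tracking the author's status rather than the content.
print(f"premium with names shown: {premium_named:.1f}")
print(f"premium while anonymized: {premium_anon:.1f}")
```

With this toy data the premium drops from about 32.7 to 6.7 upvotes, which is the pattern that would support the bias worry; real data could of course show anything.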

I post (relatively) anonymously on Twitter, and the number of (IMO) valid comments I make that don't get responded to makes me worry we are not as different from normal people as we claim, just having intellectual jousts where we want to seem smart among the other high-status people. To be fair, this is an amazing community, and I trust almost everyone here more than almost anyone not in this community to try to be fair about these things.

I get the sense (probably because this is often going on in the back of my mind) that many people are in fact simply optimizing for status in this group, not positive impact as they define it themselves. Of course status in this community is associated with positive impact, BUT as defined by the TOP people in the community. Could this be why the top causes haven't changed much? I don't feel strongly about this, but it's worth considering.

As a former group organizer, there is a strong tension between doing what you think is best for the community vs. for yourself. Here is an example: To build resilience for your group, you should try to get the people who might run the group after you leave to run events/retreats/network with other group organizers, so they are more committed, have practice, and have a network built up. But you get more clout if you run retreats, if you network with other group organizers, etc. It takes an extremely unselfish person to not just default to not delegating a ton of stuff, in no small part for the clout benefits. This tension exists now, so I'm not claiming this would only result from the influx of money, but now that organizers can get jobs after they graduate school, expect this to become a bigger issue.

P.S. If the community isn't meant to scale, then individual choices like vegetarianism are not justified within our own worldview.

Replies from: Charles He
comment by Charles He · 2022-04-07T05:11:15.725Z · EA(p) · GW(p)

I’m not a community builder. Also, just to be careful, relevant to the sentiment of this post and your own comment, I want to disclose that I’m both willing to drop and also take the title/status of being an EA, aligned with "improving the long term future", etc.

In the past, I have been involved in planning and probably understand the work of creating a retreat.

 

I thought your comment and experiences were important and substantive. In particular, this part of your comment seemed really important.

As a former group organizer, there is a strong tension between doing what you think is best for the community vs for yourself. Here is an example: To build resilience for your group, you should try to get the people who might run the group after you leave to run events/retreats/network with other group organizers, so they are more committed,  have practice, and have a network built up. But you get more clout if you run retreats, if you network with other group organizers, etc. It takes an extremely unselfish person to not just default to not delegating a ton of stuff, in no small part for the clout benefits. This tension exists now, so I'm not claiming this would only result from the influx of money, but now that organizers can get jobs after they graduate school, expect this to become a bigger issue.

I wanted to understand more:

For context, this is my basic understanding of how leadership is rewarded in organizations: most successful organizations reward development. Senior people are supposed to, and are rewarded for, dedicating most of their time away from object-level work to managing people and fostering talent. This leadership performance is assessed, and good leaders are promoted to greater status and influence, so organizations end up with conscientious, effective leaders at the top who further develop or replicate these virtues.

In this ideal model, the more active and strong the junior people are, the more credit and status the leaders get. Leaders don’t lose their status no matter how much the junior people do; they get promoted themselves. There is no incentive to squat on duties.

 

It seems like this isn’t true in this situation. This seems important. I wanted to ask questions to learn more:

“It takes an extremely unselfish person to not just default to not delegating a ton of stuff” 

I think you are saying there is an incentive to do the work of running a retreat personally, even when there are talented people who can do this, and you already have experience running a retreat.

  1. I don’t understand, wouldn’t it make sense to get others to do the work, mentor them, and then go on the retreat with them? Maybe you cannot actually attend the retreat? It also just seems more fun and rewarding to work on this cooperatively with good people, compared to hiding them away, or even directly managing them.
  2. It seems like decisions of the leader and the experiences of the juniors can be assessed by “upper management” or the people giving CBGs. Do you think there is adequate assessment, such as interviewing? Is this assessment ineffective or low effort? (Leaders might evade or hide junior people, but this seems like real misconduct.)

Again, the right outcome and common belief would look like everyone saying, "Wow, Guthmann is a hero, he scouted out A, B, and C, who are huge future leaders. Imagine what new people and projects Guthmann can foster!". 

 

I’m uncertain how much I will learn, but others might and it seems worth asking. 

Please let me know if I’m wrong or muddying the water. I also understand if you don’t respond.



Replies from: Charles_Guthmann
comment by Charlie_Guthmann (Charles_Guthmann) · 2022-04-07T08:35:02.072Z · EA(p) · GW(p)

I started Northwestern's EA club with a close friend my sophomore year at Northwestern (2019). My friend graduated at the end of that year, and our club was still nascent. There was an exec board of 6 or 7, but truly only a couple were trustworthy with both getting stuff done and actually understanding EA.

Running the club during covid, having to respond to all these emails, carrying all this responsibility somewhat alone (alone isn't quite fair, but close), never meeting anyone in person, and having to explain to strangers over and over again what EA was, all stressed/tired me a decent bit (I was 19-20). Honestly, I just started to see EA more negatively and not want to engage with the community as much, even though I broadly agreed with it about everything.

I'm not sure I really feel externally higher status in any way because of it. I guess I might feel some internal status/confidence from founding the club, because it is a unique story I have, but I would be lying if I said more than 1 or 2 people hit me up during EAGx Boston (had a great time btw, met really cool people) to talk over Swapcard; meanwhile, my friend who has never interacted with EA outside of NU friends and the fellowship, but has an interesting career, was DMed like 45 times. And the 2 people who hit me up did not even do so because I founded, much less organized, the club. The actual success of the club in terms of current size/avg. commitment and probabilistic trajectory does not seem to be data that anyone in the community would notice if I didn't try to get them to notice. Don't even get me started on whether they would know if I promoted/delegated (to) the right people. At any point during our club's history I could tell you which people were committed and which weren't, but no one ever asked. There are people who work with the university groups, but it's not like they truly knew the ins and outs of the club, and even if I told them how things are truly going, what does that really do for me? It may be the case that they would be more likely to hire or recommend people who are better at delegating, but anecdotally this doesn't even seem true to me. Which is still a far cry from doing impact estimates and funding me based on that. Plus, isn't it possible that people who delegate less just inherently seem like a more important piece of a university's "team"? Maybe there are other people waiting to take over and do an even better job, but they are quite literally competition to their boss in that case. Perhaps it increases my chance of getting jobs? But I'm not sure, and even if it did, it's not like it would be connected to any sort of impact score.

Founding the club has at best a moderate impact on its own. It is the combination of starting the club and giving it a big enough kick to keep going that I believe is where the value is created. Otherwise the club may die and you basically did nothing. A large part of this "kick" is ofc ensuring the people after you are good. Currently, Northwestern's Effective Altruism club is doing pretty well. We seem to be on pace to graduate 50+ fellows this year, and we have had 10-15 people attend conferences. TO BE CLEAR - I have done almost nothing this year. The organizers that (at risk of bragging) I convinced/told last year to do the organizing this year have done a fire job. Much better than I could have. I like to think that if I had put very little effort in last year, or, potentially even worse, not given authority to other positive actors in the club, there would have been a not-tiny chance the club would have just collapsed, though I could be wrong. It does seem as though there is a ton of interest in effective altruism among the young people here, so it's feasible that this wasn't such a path-dependent story.

Still: if I had started the club, put almost no effort into creating any structure to the club / giving anyone else a meaningful role during the covid year, other than running events with people I wanted to meet (and coordinating with them myself, which counterintuitively is easier than delegating), and then not stepped down / maintained control this year so that I could continue doing so, no one would have criticized me, even though this would probably have cost EA 15-30 committed Northwestern students already, and potentially many more down the line. I mean, no one criticized me when I ghosted them last year (lol). If I had had a better sense of the possibility of actually getting paid, currently or after school, for this stuff, I could see it increasing the chance I actually did something like the above. Moreover, if I had had a sense of the potential networking opportunities I might have had access to this year (I did almost all my organizing, except the very beginning, during heavy covid), this probably would have increased my chances of doing something like the above even more than the money.

To be clear, I probably suck at organizing, and even if I hadn't solely used the club as my own status machine, it would have been pretty terrible if I hadn't stepped down and been replaced by the people who currently organize.

To summarize / organize:
 

  • There is a lack of real supervision from the top of what is happening at these clubs (maybe this has changed; like I said, I wasn't super involved this year), and to the extent that you might receive status for success while you organize, it seems highly related to how willing you are to reach out to people at CEA and ask for more responsibility, or to post updates online, or to generally socialize with other EAs about stuff
  • If you correctly step down so someone better can run the club, it’s not clear there is any sort of reward
  • I would be surprised if delegating correctly was noticed.
  • In general, being a good organizer isn’t even something that seems to get you much clout in this community; see the other post today about this (I haven’t read it yet)
  • Thus, the real clout from organizing, esp. If you don’t have an online presence, comes from the access organizing can give you
  • organizing provides opportunities to reach out to anyone in the community
    • BUT, these opportunities often come hand in hand with specific events that your club is participating in. The most “bonding” moments come from helping plan events with other members of EA from different places. There are a finite number of these and each one you delegate is a lost opportunity to talk to someone at CEA, another organizer, a possible speaker, etc.
    • It can feel as though you deserve these opportunities, because if you had just spent the work that you used on organizing on networking in the first place, or on blogging, you would probably be more respected, since organizing doesn’t seem to get much status in the first place. Because there is no real oversight, you definitely are not at risk of getting shamed for using the club as a status machine.
      • So you start attending meetings that someone else in the club should have been at, or emailing people to ask them to speak at the club when you should have let a freshman or sophomore email them.
      • or even giving an intro talk when you should have let a younger student give it, because it means all the other people from your school will see you more as one of the sole leaders of the club, which tbf is less related to the overarching concept of this post. And also I want to give a nod to the discussion on balancing resilience vs. immediate impact, in the sense that you might give a better talk (or so you think), which will convince more people, which might make the club more resilient. But then I would say you should have coached the younger student better.
  • Seems like we might be promoting squeaky wheels. You get paid if you ask for money (I think?), you get status if you take it, etc. This could both provide bad incentives and be incredibly frustrating to the shyer folk.
  • No one has ever reached out to me for advice on starting a club, or asked how my experience went, or asked me if I would be interested in meta work. I have never received a cent for any of my community building work. If I had been paid what I believe my time was worth, which is probably still much, much less than the actual value of my time to EA while I was organizing, I would almost certainly be owed (tens of?) thousands of dollars. My sense that this was a community where you didn’t need to market yourself to get to the top turned out not to be as true as I originally envisioned. At the same time, I don't regret starting the club at all. It is probably one of the few things I have done in my life that I feel proud of.
  • What should we do? Can we federalize clubs? Should we have more data analysts and researchers and CEA people work on this? Would we actually audit a college club? Should we pay organizers more? <- but wouldn't this increase "vulturism"?
  • The core realization should be that EA needs an institution (or institutions) that doesn't exist yet. Without more complex institutions we are basically being culty and trusting each other on a variety of dimensions. I hope the trust remains, but why not build resiliency (unless, of course, you believe gatekeeping is the solution)?

 I know I didn't precisely answer your questions and more just rambled. Let me know if you have questions, and obviously, if I said stuff that sounds wrong, disagree. Even though this post is long, it's lacking a lot of nuance I would have liked to include, but I felt it was best to post it like this. 


 

Replies from: jessica_mccurdy, Bluefalcon, Linda Linsefors
comment by jessica_mccurdy · 2022-04-12T14:44:30.547Z · EA(p) · GW(p)

Hi Charles,

I am not writing in an official CEA capacity, but just wanted to respond with a couple of quick personal thoughts that don't cover everything you mentioned.

  • I am sorry you had negative experiences while organizing.
  • I do think a lot has changed in the community building space in the past year.
  • Right now CEA has about 1.5 FTE covering ~100 groups, so it isn't possible to keep completely up to date on each group, but we are working to expand capacity so we can offer some additional support. In particular, our new University Group Accelerator Program aims to add a lot more oversight and support. I wish it had existed when you were starting up your group. It provides mentorship, stipends, and support for people starting groups.
  • Even though we are expanding support, we strongly encourage groups to be public about how they are doing, for instance by writing on the forum. I think this is helpful for other groups to see as well and drives innovation, collaboration, and progress in the community. 
  • When I am personally thinking about hiring, one of the things that impresses me a lot is how successful someone was at passing off their group. I am also impressed by organizers who act as "senior advisors", helping on some strategy-level stuff but not the object-level group organizing. I am generally more impressed with someone who does this well than with someone who kept doing active organizing until they graduated and let their club die. 
  • I think there is some clout around doing really good organizing, but that requires being publicly engaged. I also hope people aren't just doing it for the clout, though.
Replies from: Charles_Guthmann
comment by Charlie_Guthmann (Charles_Guthmann) · 2022-04-12T20:58:13.495Z · EA(p) · GW(p)

Hi,

Thanks for the thoughtful reply, appreciate it. Super valid points. Upon re-reading, it seems I may have come off insultingly towards the community building contingent of EA. Certainly not my intention! I think y'all are doing a great job, and I def don't want to give the impression that I would have a better plan in mind. I am somewhat familiar with the recent initiatives with universities and think they will def be solid also. 

  • Makes me a bit sad that you need to be publicly engaged to receive recognition. I understand this is probably just a truism about life, not anyone in particular's fault. 
  • Good to hear things are moving forward, def rooting for the success of the new initiatives. 
  • Can you comment on why there are only 1.5 FTE covering uni groups? Does no one want those jobs? Trying to be very careful about scaling? It seems remarkably low considering the potential cost-benefit, but I haven't thought about it enough. I don't think it would be crazy to have as many as 25 FTE, but maybe that is completely ridiculous (maybe this is happening w/ UGAP?).
  • Good to hear that you care about delegation/passing off. I wonder if you think it's worth making it clear to people that this incentive exists? Or do you think it is clear already? Moreover, if you hire people at the end of their senior year of college, how do you know whether or not they did a good job passing off the group?
  • I wonder what you think would happen if you were a nepotist, say you advantaged the community builders you had closer relationships with in hiring/referral decisions. Would you expect to be fired, and how quickly? 

Again I just want to clarify that I don't think EA community builders are doing anything specifically wrong per se, and I don't think most of these issues are even super specific to the community building sector of EA. I think the issues I brought up would be present in pretty much any new social movement that is fast scaling and has lots of opportunities. 

Replies from: jessica_mccurdy
comment by jessica_mccurdy · 2022-04-18T17:09:30.078Z · EA(p) · GW(p)

Just another super quick response that doesn't cover everything and is purely my own thoughts and not necessarily accurate to CEA:

  • We are currently expanding the groups team :) We are careful about scaling too fast and want to make high-quality offers. You can read some more on hiring in previous CEA reports.
  • Ideally, people have entirely passed off their group by the end of their senior year (ie: someone else has been running the group and they have just been advising). 
  • Much of the groups team's hiring process is blinded and has clear guidelines and rubrics to help reduce unintentional biases here. (I also think if we were hiring faster this would be even more of a concern!) I think it is basically impossible to remove all biases here (especially in referral decisions, since those really rely on having context on the person), but this is something we take seriously, and we do not tolerate people acting with conflicts of interest. 
comment by Vilfredo's Ghost (Bluefalcon) · 2022-04-18T16:07:21.172Z · EA(p) · GW(p)

I think you are vastly overestimating the access one gains from organizing events. You don't need to organize anything to get access to people. You just have to have something interesting to talk about. I've had access to VIPs in my field since I was 16 because I was working on interesting projects, and my experience within the EA community has been similar--the VIPs are easy to reach as long as you have a reason. And if you are managing someone else who is organizing an event, this should be easy to do,  e.g. you can check up on your subordinates' performance. 

comment by Linda Linsefors · 2022-04-18T15:39:07.576Z · EA(p) · GW(p)
  • In general, being a good organizer isn’t even something that seems to get you much clout in this community, see other post today about this (i haven’t read it yet)

 

Which post is this?

comment by aderonke (QuadBee_Aderonke) · 2022-04-06T06:45:02.963Z · EA(p) · GW(p)

It's already happening. The two people I vetted for an FTX Future Fund grant application didn't pass muster, but, I dare say, they've perfected the worst version of "isomorphic mimicry" (for lack of a better phrase). I'm not sure I can share details of this private conversation, but it's good someone is pointing it out. 

comment by Benjamin_Todd · 2022-04-07T11:03:58.212Z · EA(p) · GW(p)

Just my impression based on anecdotes, but I've heard about more people from outside of the  community trying to get FTX grants than I've noticed in the past (e.g. with Open Phil). The word seems to have 'gotten out' to a greater degree than before.

To bring in some numbers, FTX had a huge number of applications, so most of these must have come from outside the current community. And it seemed like the number was greater than expected.

To bring in some theory, it would make sense this is happening based on SBF's fame and the strategy they're taking (rapid, low overhead grants for a wider range of things).

So overall it seems plausible to me this is now happening to a greater degree than in the past, but I'm very unsure how much more, and how much of a problem it is. It's probably semi-inevitable as you get bigger.

Replies from: Lukas_Gloor
comment by Lukas_Gloor · 2022-04-07T11:23:50.466Z · EA(p) · GW(p)

Interesting! This makes me wonder if it could also be related to the crypto connection. Crypto is full of opportunities to exploit systems for money, and some people do that full-time with zero ethics. That would lend plausibility to the claim in the OP about discords dedicated to it.

Replies from: Benjamin_Todd
comment by Benjamin_Todd · 2022-04-08T18:15:00.529Z · EA(p) · GW(p)

Agree that seems plausible. I also heard FTX had a lot of crypto projects applying for funding.

comment by Stephen Clare · 2022-04-06T15:19:40.208Z · EA(p) · GW(p)

I downvoted this post because it doesn't present any evidence to back up its claims. Frankly, I also found the tone off-putting ("vultures"? really?) and the structure confusing. 

I also think it underestimates the extent to which the following things are noticeable to grant evaluators. I reckon they'll usually be able to tell when applicants (1) don't really understand or care about x-risks, (2) don't really understand or care about EA, (3) are lying about what they'll spend the money on, or (4) have a theory of change that doesn't make sense. Of course grant applicants tailor their application to what they think the funder cares about. But it's hard to fake it, especially when questioned.

Also, something like the Atlas Fellowship is not "easy money". Applicants will be competing against extremely talented and impressive people from all over the world. I don't think the "bar" for getting funding for EA projects has fallen as much as  this post, and some of the comments on this post [EA · GW], seem to assume.

Replies from: MichaelPlant
comment by MichaelPlant · 2022-04-06T17:08:19.701Z · EA(p) · GW(p)

it doesn't present any evidence to back up its claims

I appreciate this and it's annoying, but I'm supposing OP didn't think they could do this without revealing who they are, which they wanted to avoid.

I also think it underestimates the extent to which the following things are noticeable to grant evaluators

I agree that grant makers are probably aware of these things, but I would like them to demonstrate it and say how they plan to mitigate it. I note the Atlas Fellowship doesn't talk about this in its FAQ (admittedly the FAQ seems aimed at applicants, not critics, but still). 

I'm not sure how easy it is for grantmakers to tell sincere from insincere people, particularly at the high-school level, when there hasn't been so much opportunity to engage in costly signalling. 

I am genuinely worried about what effect it has on people's epistemics if they even think that they will be rewarded for holding certain beliefs. You can imagine impressionable students not wanting to even raise doubts because they worry this might be held against them later. 

Replies from: CitizenTen
comment by CitizenTen · 2022-04-06T17:51:28.136Z · EA(p) · GW(p)

Didn't know how to say it originally, but yes, I did not want to reveal/out sources. It does make the argument hold less punch (and you should be rightly skeptical), but on net I thought it would be enough without.  

comment by david_reinstein · 2022-04-06T16:34:12.297Z · EA(p) · GW(p)

Increasing purity tests

If you call them 'purity tests' that has a bad connotation.

whether or not this person agrees with the EA consensus on [insert topic here]

Obviously that test would be terrible for the intellectual and epistemic environment of EA. We shouldn't screen on 'whether someone agrees with the outcome'...

But it is reasonable to consider 'epistemic virtues' as inputs ... 'whether someone engages in honest debate, their reasoning is transparent' ... something less stringent than the [CEA principles](https://www.centreforeffectivealtruism.org/ceas-guiding-principles), perhaps.

I also think considerations like 'does this person have a track record of engaging with EA and EA-adjacent activities before applying for this' should yield some good signaling/screening value.

(I see Mauricio made a similar point)

comment by Ramiro · 2022-04-06T14:37:49.193Z · EA(p) · GW(p)

Thanks for the post. I share your concerns, and I even enjoy the kind of alarmist tone. However, I think some possible objections would be:
a) Perhaps job applications are more effective at marketing EA than other strategies. Publish a good job offer, and you can make dozens or hundreds of talented and motivated people dive into EA concepts.

b) Maybe false positive rates are increasing, but what about recall? It's all about trade-offs, right? There are probably many people with EA potential out there; how many vultures are you willing to let in to attract them?

c) I don't have a problem with "effective" vultures. If they can, e.g., solve the alignment problem or fill the operational needs of an EA organization, does it matter a lot that they are just building career capital?

comment by quinn · 2022-04-09T15:49:32.271Z · EA(p) · GW(p)

“You should apply for [insert EA grant], all I had to do was pretend to care about x, and I got $$!”

I can speak of one EA institution, which I will not name, that suffers from this. Math and cognitive science majors can get a little too  far in EA circles just by mumbling something about AI Safety, and not delivering any actual interfacing with the literature or the community. 

So, thanks for posting. 

Replies from: Linch
comment by Linch · 2022-04-09T20:33:31.569Z · EA(p) · GW(p)

Have you told the institution about this? Seems like a pretty important thing for them to know!

Replies from: Linch
comment by Linch · 2022-05-13T13:57:17.343Z · EA(p) · GW(p)

Update: I had a chance to talk to quinn irl about this, and speaking in very broad strokes, I consider the problem (at least for the specific example they gave) an order of magnitude less significant than when I read it the first time on this forum.

comment by Jack R (JackRyan) · 2022-04-08T03:18:33.673Z · EA(p) · GW(p)

I might make it clearer that your bullet points are what you recommend people not do. I was skimming and at first was close to taking away the opposite of what you intended.

comment by Jonas Vollmer · 2022-04-07T19:14:27.495Z · EA(p) · GW(p)

Atlas Fellowship cofounder here. Just saw this article. Currently running a workshop, so may get back with a response in a few days.

For now, I wanted to point out that the $50,000 scholarship is for educational purposes only. (If it says otherwise anywhere, let me know.)

Replies from: portack, alexrjl
comment by portack · 2022-04-07T21:11:33.827Z · EA(p) · GW(p)

the $50,000 scholarship is for educational purposes only

That's not how I understood the scholarship when I read the information on the website.

 

The FAQ says

Scholarship money should be treated as “professional development funding” for award winners. This means the funds could be spent on things like professional travel, textbooks, technology, college tuition, supplementing unpaid internships, and more.

and

Once the student turns 18, they have two options:

  1. Submit an award disbursement request every year, indicating the amount of scholarship the student would like to withdraw for what purposes. Post-undergrad, the remainder of the funds are sent to the student. This helps avoid scholarship displacement.
     
  2. Receive the scholarship funds as a lump-sum payment sent directly to the student.

From this, I concluded that once the student turns 18, they can use the money for everything that could be defended as plausibly leading to their professional development.

If that's the case, then though the scholarship is not exactly "no strings attached" as the OP claims, that's still a description that to me seems closer to reality than "educational purposes only".

edit: a typo

Replies from: Ben Pace, Jeff_Kaufman
comment by Ben Pace · 2022-04-08T03:05:40.019Z · EA(p) · GW(p)

I remember hearing that the money was just for the person and I felt alarmed, thinking that so many random people in my year at school would've worked their asses off to get $50k — it's more than my household earned in a year. 

Sydney told me scholarships like this are much more common in the US; then I updated to thinking it's only to be paid against college fees, which is way more reasonable. But I guess this is kind of ambiguous still? It does seem like two radically different products.

comment by Jeff Kaufman (Jeff_Kaufman) · 2022-04-23T07:52:07.219Z · EA(p) · GW(p)

If you start from the premise that someone is trying to game the system, then since there seems to be no oversight of what happens after they choose to take a $50k transfer to their bank account, it's effectively no strings attached.

comment by alexrjl · 2022-04-07T20:30:47.224Z · EA(p) · GW(p)

Don't people have the option to take it as a lump sum? If that is the case, presumably if they are willing to game the system to get the money they will not be particularly persuaded by a clear instruction to "only spend it on education".

comment by Pranay K · 2022-04-07T20:04:37.634Z · EA(p) · GW(p)

Thank you for writing this post. The discussion and critiques brought up are important and valuable, and I just want to say that I'm grateful you put this out there, since I've been very worried about the same things. 

comment by james.lucassen · 2022-04-06T23:56:31.841Z · EA(p) · GW(p)

I'll hop on the "I'd love to see sources" train to a certain extent, but honestly we don't really need them. If this is happening it's super important, and even if it isn't happening right now it'll probably start happening somewhat soon. We should have a plan for this.

comment by Harrison Durland (Harrison D) · 2022-04-06T14:46:42.941Z · EA(p) · GW(p)

Vultures ≈ Death, typically

EA = Not dead, quite the opposite in fact

Replies from: Jeff_Kaufman
comment by Jeff Kaufman (Jeff_Kaufman) · 2022-04-23T07:58:26.354Z · EA(p) · GW(p)

The "circling vultures" metaphor is generally used to mean "X is in danger". The idea is that vultures can tell X is at an elevated risk of death and are preparing to swoop down once that happens.

Here my interpretation is that the vultures represent grifters, and X is something like "EA money, protected by diligence and community trust". When the protection falls the grifters swoop in on the money.

comment by Chris Leong (casebash) · 2022-04-06T10:45:11.341Z · EA(p) · GW(p)

I think we should be careful about how we communicate. Maybe instead of just saying that there is "lots of funding available" we should clarify that we mean that there's lots of funding available for people who can deliver. That is less likely to draw in vultures.

Replies from: Jeff_Kaufman
comment by Jeff Kaufman (Jeff_Kaufman) · 2022-04-23T08:00:21.424Z · EA(p) · GW(p)

I haven't seen very much in terms of mechanisms that ensure funding is only distributed to people who can deliver? And it seems like that's in opposition to a "hits-based" approach, where many people doing what we want them to do will still come away without having accomplished their goal?

comment by mic (michaelchen) · 2022-04-06T06:58:01.036Z · EA(p) · GW(p)
As a community that values good epidemics

good epistemics?

Thanks for posting about this; I had no idea this was happening to a significant extent.

comment by Simon Skade · 2022-04-12T18:33:59.104Z · EA(p) · GW(p)

I think it is important to keep in mind that we are not very funding constrained. It may be OK to have some false positives; false negatives may often be worse, so I wouldn't be too careful.

I think grantmakers are probably still too reluctant to fund stuff that has a small chance of high impact, especially if they are uncertain because the people aren't EAs.
For example, I told a very exceptional student (who has like 1-in-a-million problem-solving ability) to apply for the Atlas Fellowship, although I don't know him well, because from my limited knowledge it increases the chance that he will work on alignment from 10% to 20-25%, and the $50k is easily worth that.

Though of course having more false positives causes more people who only pretend to do something good to apply, which isn't easy to handle for our current limited number of grantmakers. We definitely need to scale up grantmaking capacity anyway.

I think that non-EAs should know that they can get funding if they do something good/useful. You shouldn't need to pretend to be an EA to get funding, and defending against people who pretend their projects do good seems easier in many cases; e.g., you can often start with a little funding and promise more later if they show progress.

(I also expect that we/AI-risk-reduction gets even much more funding as the problem gets more known/acknowledged. I'd guess >$100B in 2030, so I don't think funding ever becomes a bottleneck, but not totally sure of course.)

comment by acylhalide (Samuel Shadrach) · 2022-04-06T10:51:15.855Z · EA(p) · GW(p)

One solution is to give less money per person but give to more people, and to scale-up giving to the same person or org over time. (I'm assuming this is already being done, but didn't find it in the post.)

Replies from: AndreaM
comment by Andrea_Miotti (AndreaM) · 2022-04-06T13:00:27.882Z · EA(p) · GW(p)

This may be counterproductive as many projects we would like to see funded face economic barriers to entry.

E.g., if starting any effective new advocacy org requires at minimum a 0.5 FTE salary of X and initial legal costs of Y, for a total of X+Y=Z, funding some people 20% below Z won't lead to a 20% less developed advocacy org, but no advocacy org at all.

Fixed costs also vary across projects, and only providing initial funding below a certain threshold could lead to certain high-value but high-fixed cost projects being de-prioritized compared to low-fixed cost, lower-value ones.

Replies from: Samuel Shadrach
comment by acylhalide (Samuel Shadrach) · 2022-04-07T11:37:01.854Z · EA(p) · GW(p)

Agreed, and this feels compatible with what I'm saying.

That being said this whole topic is completely outside my expertise, so I'm not sure I should engage further. Sorry. Thank you for replying though.

comment by Charles He · 2022-04-06T08:18:24.959Z · EA(p) · GW(p)

It seems like there are tools to deal with this:

  • Grants ramp up and demand for talent seems stable, so most of the financial value is in the future. So defecting, whether by taking the money and running or in other ways, is costly.
  • The incentives are even stronger because grants ramp up in size and independence.
  • Most grants, including community building grants, have observable outputs in early stages.
  • The larger the grant is, the higher the output and more skill demanded.
  • A $50k development grant for extraordinary young people has been done before. In the Thiel instance, it has been really successful, with a large number of high-value projects and people.
     

Finally, this isn't what will need to happen, but an available, robust strategy is to focus on giving grants to people with high opportunity costs or high outside options. These outside options are observable and also correlated with effectiveness, for reasons most people find acceptable. 

comment by A_donor · 2022-05-03T19:11:44.970Z · EA(p) · GW(p)

I was considering writing something like this up a while back, but didn't have enough direct evidence; I was mostly working off too few examples as a grantmaker, plus general models. Glad this concern is being broadcast.

I did come up with a proposal for addressing parts of the problem over on the "Please pitch ideas to potential EA CTOs [EA · GW]" post. If you're a software dev who wants to help build a tool that might make the vultures less able to eat at least parts of EA, please read over the proposal and ping me if interested.

comment by hath · 2022-04-08T17:05:40.623Z · EA(p) · GW(p)

For what it's worth, Atlas has two different steps of online application, and then a final interview. This doesn't make it impossible to Goodhart, but it buys us time.

comment by Peter S. Park · 2022-04-06T18:15:06.668Z · EA(p) · GW(p)

My prior is that one's degree of EA-alignment is pretty transparent. If there are any grifters, they would probably be found out pretty quickly, and we could retract funding/cooperation from that point on. 

Also, people who are at a crossroads of either being EA-aligned or non-EA-aligned (e.g., people who want to be a productive member of a lively and prestigious community) could be organizationally "captured" and become EA-aligned, if we maintain a high-trust, collaborative group environment.