Posts

Does the Forum Prize lead people to write more posts? 2021-09-21T03:09:30.837Z
Who do intellectual prizewinners follow on Twitter? 2021-08-25T15:26:19.293Z
[PR FAQ] Improving tag notifications 2021-08-09T10:23:25.239Z
Our plan to share "PR FAQs" for new Forum features 2021-07-29T04:10:26.253Z
Some 2021 CEA Retention Statistics 2021-07-09T17:11:29.933Z
Thoughts on being overqualified for EA positions 2021-04-30T03:19:41.186Z
[Job Ad] Help us make this Forum better 2021-03-25T23:23:26.801Z
Layman’s Summary of Resolving Pascalian Decision Problems with Stochastic Dominance 2021-03-12T03:51:24.215Z
Retention in EA - Part II: Possible Projects 2021-02-05T19:09:31.361Z
Retention in EA - Part I: Survey Data 2021-02-05T19:09:18.450Z
Retention in EA - Part III: Retention Comparisons 2021-02-05T19:02:05.324Z
EA Group Organizer Career Paths Outside of EA 2020-07-14T23:44:10.799Z
Are there robustly good and disputable leadership practices? 2020-03-19T01:46:38.484Z
Harsanyi's simple “proof” of utilitarianism 2020-02-20T15:27:33.621Z
Quote from Strangers Drowning 2019-12-23T03:49:51.205Z
Peaceful protester/armed police pictures 2019-12-22T20:59:29.991Z
How frequently do ACE and Open Phil agree about animal charities? 2019-12-17T23:56:09.987Z
Summary of Core Feedback Collected by CEA in Spring/Summer 2019 2019-11-07T16:26:55.458Z
EA Art: Neural Style Transfer Portraits 2019-10-03T01:37:30.703Z
Is pain just a signal to enlist altruists? 2019-10-01T21:25:44.392Z
Ways Frugality Increases Productivity 2019-06-25T21:06:19.014Z
What is the Impact of Beyond Meat? 2019-05-03T23:31:40.123Z
Identifying Talent without Credentialing In EA 2019-03-11T22:33:28.070Z
Deliberate Performance in People Management 2017-11-25T14:41:00.477Z
An Argument for Why the Future May Be Good 2017-07-19T22:03:17.393Z
Vote Pairing is a Cost-Effective Political Intervention 2017-02-26T13:54:21.430Z
Ben's expenses in 2016 2017-01-29T16:07:28.405Z
Voter Registration As an EA Group Meetup Activity 2016-09-16T15:28:46.898Z
You are a Lottery Ticket 2015-05-10T22:41:51.353Z
Earning to Give: Programming Language Choice 2015-04-05T15:45:49.192Z
Problems and Solutions in Infinite Ethics 2015-01-01T20:47:41.918Z
Meetup : Madison, Wisconsin 2014-10-29T18:03:47.983Z

Comments

Comment by Ben_West on Open Thread: September 2021 · 2021-09-22T20:54:29.710Z · EA · GW

Welcome Jen!

Comment by Ben_West on Open Thread: September 2021 · 2021-09-22T17:23:46.850Z · EA · GW

Hey David! Congratulations on publishing your first post :)

Comment by Ben_West on Does the Forum Prize lead people to write more posts? · 2021-09-21T20:05:17.999Z · EA · GW

Thanks Larks! I somewhat regularly encounter people who are hesitant to post on the forum, and can't recall a time when telling them about the existence of the prize made them seem more likely to post. I can, however, think of people who have told me that they were more willing to post after having received the prize or other recognition for their work.

My guess is that something like imposter syndrome is more of a barrier to people posting than money is.

Comment by Ben_West on [Creative Writing Contest] [Poetry] [Referral] "The Bell-Buoys" · 2021-09-20T20:33:31.042Z · EA · GW

Thanks for posting this! I ended up liking it, although it took me a while to figure out what the poem was trying to say. In case others have the same confusion, here's the Kipling Society's summary:

From its place out over the shoals, the Bell Buoy’s voice is lifted to issue warning and protect human life while the church bell safe in its tower, knows nothing of these dangers and stands aloof. Its voice is one controlled by the authority of the church and limited by the church’s interests. In contrast the bell buoy glories in its independence and in the vital work it performs.

Comment by Ben_West on [Creative Writing Contest] [Fiction] The Reset Button · 2021-09-18T20:57:56.487Z · EA · GW

I found this motivational. Thanks for posting!

Comment by Ben_West on [Creative Writing Contest] [Fiction] [Referral] A Common Sense Guide to Doing the Most Good, by Alexander Wales · 2021-09-13T18:04:02.428Z · EA · GW

Thanks for posting this! I had read some of their other stuff, but hadn't come across this story.

Comment by Ben_West on Buck's Shortform · 2021-09-08T21:51:22.319Z · EA · GW

Thanks! "EA organizations are bad" is a reasonable answer.

(In contrast, "for-profit organizations are bad" doesn't seem like a reasonable answer for why for-profit entrepreneurship exists, as adverse selection isn't something better organizations can reasonably get around. It seems important to distinguish these, because it tells us how much effort EA organizations should put into supporting entrepreneur-type positions.)

Comment by Ben_West on Buck's Shortform · 2021-09-08T13:20:57.428Z · EA · GW

Thanks for writing this up. At the risk of asking an obvious question, I'm interested in why you think entrepreneurship is valuable in EA.

One explanation for why entrepreneurship has high financial returns is information asymmetry/adverse selection: it's hard to tell whether someone is a good CEO other than by observing whether their business does well, so their compensation is forced to be tied closely to business outcomes (rather than to something like "does their manager think they are doing a good job"), and those outcomes have high variance. Because of that variance, and because people are risk-averse, expected returns need to be high to compensate entrepreneurs.
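
To make that mechanism concrete, here is a minimal sketch with purely illustrative, made-up numbers, assuming a founder with log utility:

```python
import math

# Illustrative numbers only: a risk-averse founder (log utility) compares a
# safe salary against a high-variance entrepreneurial payoff.
safe_salary = 150_000                           # hypothetical guaranteed pay
p_success, win, lose = 0.10, 3_000_000, 50_000  # hypothetical venture outcomes

expected_value = p_success * win + (1 - p_success) * lose
expected_utility = p_success * math.log(win) + (1 - p_success) * math.log(lose)
certainty_equivalent = math.exp(expected_utility)

print(f"venture expected value: ${expected_value:,.0f}")      # ~$345,000
print(f"certainty equivalent:   ${certainty_equivalent:,.0f}")  # ~$75,000
print(f"safe salary:            ${safe_salary:,.0f}")
# The venture's expected value is well above the safe salary, yet its
# certainty equivalent is below it, so expected returns have to be high
# before a risk-averse founder prefers the risky compensation.
```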

It's not obvious to me that this information asymmetry exists in EA. E.g. I expect "Buck thinks X is a good group leader" correlates better with "X is a good group leader" than "Buck thinks X will be a successful startup" correlates with "X is a successful startup".

It seems like there might be a "market failure" in EA where people can reasonably be known to be doing good work, but are not compensated appropriately for their work, unless they do some weird bespoke thing.

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-09-08T12:55:37.058Z · EA · GW

Sure, those other things are also ways in which I would say that Bob is underqualified, not overqualified.

Comment by Ben_West on JP's Shortform · 2021-09-08T10:47:17.240Z · EA · GW

I see. My model is something like: working uses up some mental resource, and that resource being diminished presents as "it's hard for you to work more hours without some sort of lifestyle change." If you can work more hours without a lifestyle change, that seems to me like evidence your mental resources aren't diminished, and therefore I would predict you to be more productive if you worked more hours.

As you say, the most productive form of work might not be programming, but instead talking to random users etc.

Comment by Ben_West on JP's Shortform · 2021-09-06T17:10:36.617Z · EA · GW

Thanks for writing this up – I'm really interested in answers to this and have signed up for notifications to comments on this post because I want to see what others say.

I find it hard to talk about "working harder" in the abstract, but if I think of interventions that would make the average EA work more hours I think of things like: surrounding themselves by people who work hard, customizing light sources to keep their energy going throughout the day, removing distractions from their environment, exercising and regulating sleep well, etc. I would guess that these interventions would make the average EA more productive, not less.

(nb: there are also "hard work" interventions that seem more dubious to me, e.g. "feel bad about yourself for not having worked enough" or "abuse stimulants".)

One specific point: I'm not sure I agree regarding the benefits of a "fresh perspective". It can sometimes happen that I come back from vacation and realize a clever solution I had missed, but more often the context I've lost on a project makes my performance worse, not better.

Comment by Ben_West on Who do intellectual prizewinners follow on Twitter? · 2021-08-26T00:50:17.198Z · EA · GW

Thanks! I added these to the comment (because of how BigQuery works, I can't easily add them to the dashboard).

Comment by Ben_West on Who do intellectual prizewinners follow on Twitter? · 2021-08-25T23:30:52.036Z · EA · GW

I've received a couple of kind suggestions of people I should have included as "EA adjacent influencers". Please reply to this comment (or DM me) if you see more.

  • SBF_FTX - 1 IPW Follower
  • LinchZhang - 1 IPW Follower
  • sapinker - 24 IPW Followers
  • tylercowen - 28 IPW Followers
  • mattyglesias - 24 IPW Followers
  • dominic2306 - 0 IPW Followers
  • lxrjl - 1 IPW Follower
  • chanamessinger - 0 IPW Followers
  • nathanpmyoung - 0 IPW Followers

Comment by Ben_West on Analyzing view metrics on the EA Forum · 2021-08-11T23:57:58.641Z · EA · GW

Thanks SEADS for your help with this research, and for taking the time to share it publicly!

Comment by Ben_West on [PR FAQ] Improving tag notifications · 2021-08-10T19:14:34.020Z · EA · GW

Thanks! This is helpful to know.

A digest for notifications is an interesting idea. And I agree the current UI gives more information on desktop, which is probably unavoidable given that there's just more real estate, but I do think we want our mobile experience to be good. (About half of our users are on mobile devices.)

Comment by Ben_West on [PR FAQ] Improving tag notifications · 2021-08-10T19:12:00.894Z · EA · GW

Thanks! I appreciate the feedback

Comment by Ben_West on [PR FAQ] Improving tag notifications · 2021-08-10T19:11:33.780Z · EA · GW

Thanks! Your comment is a helpful signal that we should make tag subscriptions easier to find, even if we don't change the underlying functionality.

Comment by Ben_West on [PR FAQ] Adding profile pictures to the Forum · 2021-08-10T17:56:47.354Z · EA · GW

But to the extent these come at the cost of rational discussion, this is a cost we should be happy to pay.

What do you think the effect size is of adding pictures? My guess is that it's pretty small.

For example, the "beauty premium" in employment compensation is usually considered to be small (<10%),[1] and I would expect that to be much larger than the effect of profile pictures on a forum, because a) how your coworkers look is much more salient than how some commenter with a tiny picture looks, and b) beauty is more plausibly correlated with productivity in certain jobs (e.g. sales) than it is with forum post quality.


  1. This is based on my memory from the last time I looked into this, but it seems to be confirmed by this article, which is the most recent review article a quick search turned up. ↩︎

Comment by Ben_West on Our plan to share "PR FAQs" for new Forum features · 2021-08-06T16:18:00.640Z · EA · GW

Thanks Nathan! I look forward to hearing your feedback on them

Comment by Ben_West on Part 1: EA tech work is inefficiently allocated & bad for technical career capital · 2021-08-04T01:53:28.327Z · EA · GW

Hi Arepo, I think you are describing a tiny portion of EA software development but using the term "EA tech" for that small portion. I would suggest reframing the post as something like "small EA organizations should hire an agency instead of hiring <=1 FTE of developers" and dropping the term "EA tech work" unless the claim genuinely applies to all EA tech work.

The claim “EA tech work is bad for technical career capital” seems particularly unsubstantiated.

I care about this not so much because it affects your agency proposal, but because I worry that software developers who read this won't understand that the experiences you describe are not representative, unless they read very closely.

As some justification: Perhaps the most obvious definition of "EA tech work" is to filter the 80,000 Hours job board for "engineering" positions. When I do this, the current top positions are at the UK government, DeepMind, OpenAI, the Rockefeller Foundation, and Microsoft. These positions generally do not suffer from the problems you mentioned, like looking bad on a resume.

The 80,000 Hours job board is sometimes criticized for being too longtermist-oriented. My guess is that most short-termist EA engineers are at places like Wave, which employ dozens to hundreds of developers and similarly don't suffer from the difficulties you mention here, though there is no equivalent job board to check.

In part II you say:

At any given time in the last few years, there have been perhaps 5-10 software developers working full time in EA nonprofits.

CEA, my current employer, single-handedly employs this many full-time software developers.[1] The same is true of my former employer Ought, and I expect it's also true of Redwood and Anthropic. It's also true of EA-aligned animal rights organizations like The Humane League and of global health and development charities like GiveDirectly. So I'm guessing you are also excluding from consideration "EA nonprofits which have dedicated software development teams."

My best guess is that you are considering only EA organizations which hire <=1 FTE of software developers. This is an important target audience to consider, but is very different from all of "EA tech work".

You noted that 100% of the people who said they were worried about compensation being too low were just factually wrong about EA compensation. I suspect a similar thing is true regarding career capital, and would not want your post to reinforce that misimpression.


  1. Note that CEA includes some umbrella projects like EA Funds and GWWC ↩︎

Comment by Ben_West on Is effective altruism growing? An update on the stock of funding vs. people · 2021-08-02T21:04:35.104Z · EA · GW

I feel like these conversations often get confusing because people mean different things by the term "entrepreneur", so I wonder if you could define what you mean by "entrepreneur" and what you think they would do in EA?

Even with very commercializable EA projects like cellular agriculture, my experience is that the best founders are closer to scientists than to traditional CEOs, and once you get to things like disentanglement research the best founders have almost no skills in common with e.g. tech company founders, despite both technically being "entrepreneurs" in some sense.

Comment by Ben_West on Part 3: Comparing agency organisational models · 2021-08-02T16:03:17.381Z · EA · GW

Thanks for writing this up! 

An EA-specific agency would have to be low-bono, offering major discounts to EA orgs - otherwise it would be indistinguishable from the countless existing for-profit agencies.

This is building a bit on what Sanjay said, but I think this sentence deserves more highlighting.

If the only advantage of an EA agency is that it has lower prices, then it's equivalent to donating more money to the organization which hires the developers.
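
As a toy illustration of that equivalence (hypothetical rates and hours, not actual figures):

```python
# Hypothetical numbers: an org needs 1,000 developer-hours this year.
market_rate = 150   # $/hour a commercial agency would charge
low_bono_rate = 90  # $/hour a hypothetical low-bono EA agency would charge
hours = 1_000

implicit_subsidy = (market_rate - low_bono_rate) * hours
print(f"Hiring the low-bono agency is worth ${implicit_subsidy:,} to the org")
# ...which is the same benefit the org would get from a $60,000 donation
# earmarked for hiring a commercial agency at the market rate.
```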

Donating more money has a lot of advantages over building an entire new agency: we don't have to go through all the hassle of identifying talent, structuring contracts, etc. (Indeed, the benefit of avoiding all that hassle is why contract agencies exist.)

An adjacent project I would be excited about is someone creating a vetted list of non-EA agencies. This seems to provide many of the benefits of an EA agency, without some of the costs.

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-07-29T21:43:47.633Z · EA · GW

Congrats on the comment prize!

Would you agree that, if Bob were more politically skilled, he would be a better fit for this position? (E.g. he would be better able to convince Carol to do this ambitious project.)

If so, then maybe you want to say that he is "overqualified in technical knowledge and underqualified in political ability" or something, but chalking the problem up to being "overqualified" across-the-board seems misleading.

If you are a junior employee then sure, it's your manager's responsibility to listen to your ideas. But as you become more senior, it becomes more of your responsibility to get buy-in. E.g.:

One of Steve’s direct reports told a story about a debate he had with Steve. Eventually, he backed down not because Steve had convinced him, but because he was afraid to keep arguing the point. When events proved that Steve had been wrong in his position, he stormed into his employee’s office and demanded, “Why did we do this??” When his employee pointed out that it had been Steve’s call, Steve exclaimed, “Well, it was your job to convince me I was wrong, and you failed!” - What Steve Jobs Taught Me About Debate in the Workplace

Comment by Ben_West on What grants has Carl Shulman's discretionary fund made? · 2021-07-04T02:02:40.462Z · EA · GW

Are you or the grantee planning to publish the results of the creatine investigation? I think it would be helpful for many in the community, even if it's a null result.

Comment by Ben_West on Project Ideas in Biosecurity for EAs · 2021-06-28T01:26:38.436Z · EA · GW

Write / find a single solid reference on “list-based sequence screening is flawed”

Do you have non-single/solid references about this? I know someone who might be interested in doing this write-up, but they're first trying to get more background understanding.

Comment by Ben_West on EA Infrastructure Fund: Ask us anything! · 2021-06-23T18:34:42.566Z · EA · GW

Cool, for what it's worth my experience recruiting for a couple EA organizations is that labor supply is elastic even above (say) $100k/year, and your comments seem to indicate that you would be happy to fund at least some people at that level.

So I remain kind of confused why the grant amounts are so small.

Comment by Ben_West on EA Infrastructure Fund: Ask us anything! · 2021-06-04T18:36:17.202Z · EA · GW

This means that when we do encounter such an opportunity, we should most likely take it, even if it seems expensive or unlikely to succeed... Some EAs doing direct work could literally earn >$1,000 per hour if they pursued earning to give, but it's generally agreed that direct work seems more impactful for them

I notice that the listed grants seem substantially below $1,000/hour; e.g. Rethink getting $250,000 for seven FTEs implies ~$35,000/FTE, or roughly $18/hour. *
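
For reference, the back-of-the-envelope arithmetic behind those figures, assuming roughly 2,000 working hours in a full-time year:

```python
grant = 250_000            # total grant amount
ftes = 7                   # full-time equivalents funded
hours_per_year = 2_000     # assumed hours in a full-time year

per_fte = grant / ftes                 # ≈ $35,714 per FTE
per_hour = per_fte / hours_per_year    # ≈ $17.86 per hour
print(f"${per_fte:,.0f}/FTE, ${per_hour:.2f}/hour")
```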

Is this because you aren't getting those senior people applying? Or are there other constraints?

* (Maybe this is off by a factor of two if you meant that they are FTE but only for half the year etc.)

Comment by Ben_West on [Job Ad] Help us make this Forum better · 2021-06-01T23:53:54.001Z · EA · GW

Thanks for commenting! Unfortunately applications for this position have closed, but I hope you will apply in a future round, or to one of the other positions for which we are currently hiring, if they are relevant to your skill set.

Comment by Ben_West on Retrospective on Catalyst, a 100-person biosecurity summit · 2021-05-28T00:31:58.643Z · EA · GW

Congratulations on such a successful event! 

  1. Regarding your NPS of 74: I think that's quite good; this page describes an NPS of 60+ as "very, very special, your only daughter probably just got married". 
  2. "We sent a lot of emails in the few weeks leading up to the summit, with various actions for participants to take, and that seemed like a good way to build enthusiasm / engagement." – could you say more about what you asked the attendees to do?
  3. I recently found out about Gather and was also pretty impressed. Do you happen to have filled-out versions of the worksheets you could share? I'd be particularly interested in the "designer's agenda" you came up with.

Comment by Ben_West on AMA: Working at the Centre for Effective Altruism · 2021-05-27T13:53:54.425Z · EA · GW

Positive: The people I work with, both at CEA and in the wider EA community, are often impressive, talented, and kind.

Negative: I'm not a morning person, and living in Pacific time while working with Brits means I have to be up early a lot.

Comment by Ben_West on AMA: Working at the Centre for Effective Altruism · 2021-05-25T20:02:19.596Z · EA · GW

I sometimes speak to people who aren't aware of how many career paths in community building there are, even outside of EA. I do think this causes there to be fewer community builders than there "should" be.

It feels hard to make really broad statements though; some people's skills and interests are pretty clearly not a fit for community building, and I don't think they should try to force it.

Comment by Ben_West on Seven things that surprised us in our first year working in policy - Lead Exposure Elimination Project · 2021-05-18T18:27:00.184Z · EA · GW

Thanks for writing this up! Really helpful to hear about your experiences with governments, and it's cool that you've been able to make so much progress.

Comment by Ben_West on Animal Welfare Fund: Ask us anything! · 2021-05-13T19:42:35.579Z · EA · GW

I speak with a lot of people with software engineering backgrounds who are looking for impactful projects. Are there any software projects you wish people would take on?

I sometimes refer engineers to the Cultivated Meat Modeling Consortium, but that group doesn't seem very active.

Comment by Ben_West on Animal Welfare Fund: Ask us anything! · 2021-05-13T19:40:25.463Z · EA · GW

I've heard that academic research is funding constrained, in the sense that there are academics who would be willing to do research, particularly in the field of cellular agriculture, but they can't get grants. (I think this funding constraint is partially a reflection of biological research being pretty expensive.) I noticed that very few of your grantees are formally affiliated with an academic institution.

Is this just because you don't get applications from academics, or are there reasons against funding them (e.g. the minimum grant size is too high)?

Comment by Ben_West on Animal Welfare Fund: Ask us anything! · 2021-05-13T19:40:06.066Z · EA · GW

I sometimes hear from people who are interested in working on cellular agriculture or other meat alternatives, and want to do a PhD, but can't find an advisor who is working on one of those subjects, so they instead plan to research e.g. tissue engineering or cell modeling for the purpose of treating human disease (or some other better funded domain).

In your request for proposals, you seem mostly interested in people who are working full-time on animal-related research.

I'm curious if you have advice for people who are in the situation I described (including "it's really a lot better to immediately research impactful things, so you should try as hard as you can to do that"), and/or if there are any things people in this position could do that you would be excited to fund?

Comment by Ben_West on The Impossibility of a Satisfactory Population Prospect Axiology · 2021-05-13T04:24:05.105Z · EA · GW

Thanks for posting this! If I understand your "risky" assumptions correctly, they seem to be targeted at people who believe (as a simple example):

  1. Apples are better than oranges, and furthermore no amount of oranges can equate to one apple
  2. Nonetheless, it's better to have a high probability of receiving an orange than a small probability of getting one apple

Is that correct?

If so, what is the argument for believing both of these? My assumption is that someone who thinks that apples are lexically better than oranges would disagree with (2) and believe that any probability of an apple is better than any probability of an orange.

Side question: the "risky" axioms seem quite similar to the Archimedean axiom in some variants of the VNM utility theorem. I think you also assume completeness and transitivity – are they enough to recover the entire VNM theorem? (I.e. do your axioms imply that there is a real-valued utility function whose expectation we must be trying to maximize?)
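
For reference, the standard VNM continuity (Archimedean) axiom, which these "risky" axioms seem to parallel, can be written as follows (a sketch of the textbook statement, not taken from the post):

```latex
% Standard VNM continuity (Archimedean) axiom for lotteries A, B, C:
A \succ B \succ C \;\Longrightarrow\;
\exists\, p, q \in (0, 1):\quad
p A + (1 - p) C \;\succ\; B \;\succ\; q A + (1 - q) C
```

Note that the full expected-utility representation also requires the independence axiom, not just completeness, transitivity, and continuity.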

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-05-07T23:48:23.273Z · EA · GW

Sure, but that's also a reason against appropriately qualified people working there, right?

What I'm pushing against is the assumption that employees love outsiders coming in and telling them all the things they are doing wrong, and if they don't like you pointing out their mistakes it must mean you are "overqualified".

I actually hear the opposite more frequently: having a more junior title makes it easier for people to listen to your suggestions, because it's less threatening for you to point out mistakes.

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-05-07T00:04:21.217Z · EA · GW

Thanks! I agree with the concern, but I think I disagree about the root cause:

She likes Bob's ideas and wants to find ways to implement them, but doesn't like Bob's leadership style so doesn't want to put him in a leadership position

In general, I'm skeptical about "putting" people in leadership positions, especially when their colleagues don't want to be led by them.

If people aren't listening to Bob because they don't like his leadership style, then I would say that Bob is a bad culture fit (or, to be blunt, not a good leader). I wouldn't describe this as the organization "not letting him thrive."

I do agree that it's harder to hire senior people though:

It's ... higher stakes to hire someone with more seniority

There's a related thing you might be pointing to, something like: "in a big organization, Bob can just come up with ideas and someone else will implement them, diminishing the costs of his abrasive leadership style. But in a smaller organization he has to both come up with ideas and execute, and maybe he's not enough of a generalist for that." I definitely agree with this concern.

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-05-04T01:18:12.332Z · EA · GW

Interesting point – my interpretation of that statistic is that external people are hired into more senior roles than internal people. I guess it's also consistent with the hypothesis that external people get less mentorship though.

Comment by Ben_West on Thoughts on being overqualified for EA positions · 2021-05-01T22:54:08.020Z · EA · GW

Thanks! I agree that the amount of career capital a position will generate is an important factor in any career decision, "overqualified" or not.

I'm curious about your "it can be hard to reverse" statement though: how frequently do you think this happens? At least in US tech, it's pretty common for people to take a year off to organize against Trump or whatever, and a year of charity work is definitely not irreversible. When I've talked to recruiters they basically just ignore charity work, at worst.

Anecdotally, it always feels like now is the wrong time to leave your job because (good thing) is right around the corner, but in actuality it's usually perfectly fine to leave. Received wisdom (supported by some evidence) is that regularly switching jobs actually makes you more successful.

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-22T02:42:47.651Z · EA · GW

Thanks for correcting my mistaken impression Jakub! I've updated my comment to link to yours.

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-19T19:07:27.504Z · EA · GW

I guess I don't know the OP's goals, but yes: if their goal is to publicly shame ACE, then publicly shaming ACE is a good way to accomplish that goal.

My point was a) sending a quick email to someone about concerns you have with their work often has a very high benefit-to-cost ratio, and b) despite this, I still regularly talk to people who have concerns about some organization but have not sent them an email.

I think those claims are relatively uncontroversial, but I can say more if you disagree.

Comment by Ben_West on How much does performance differ between people? · 2021-04-17T03:59:42.112Z · EA · GW

Basic statistics question: the GMA predictors research seems mostly to use the Pearson correlation coefficient, which I understand to measure linear correlation between variables.

But a linear correlation would imply that billionaires have an IQ of 10,000 or something, which is clearly implausible. Are these correlations actually measuring something which could plausibly be linearly related (e.g. z-scores for both IQ and income)?

I read through a few of the papers cited and didn't see any mention of this. I expect this to be especially significant at the tails, which is what you are looking at here.
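
As a toy illustration (a made-up simulation, not drawn from the cited papers) of why the scale matters:

```python
import numpy as np

# Toy model, not taken from the cited papers: IQ is normal and *log* income is
# linear in the IQ z-score plus noise, so raw income is heavy-tailed.
rng = np.random.default_rng(0)
n = 100_000
iq = rng.normal(100, 15, n)
z_iq = (iq - 100) / 15
log_income = 10.5 + 0.6 * z_iq + rng.normal(0, 1.3, n)
income = np.exp(log_income)

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

print("r(IQ, log income):", round(pearson(iq, log_income), 2))  # ~0.4 here
print("r(IQ, raw income):", round(pearson(iq, income), 2))      # noticeably lower
# Pearson r is unchanged by linear rescaling, so r(IQ, log income) equals r on
# their z-scores; the published coefficients are effectively statements about
# (roughly) standardized measures, not raw dollars, which is why they don't
# extrapolate to "billionaires have an IQ of 10,000".
```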

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-16T19:42:41.011Z · EA · GW

Yep, definitely don't want people to swing too far in the opposite direction. Just commenting that "talk to people about your concerns with them" is a surprisingly underutilized approach, in my experience.

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-16T19:39:45.720Z · EA · GW

Thanks! I had interpreted "We are yet to see how successful the leadership transition turns out" as a pretty strong statement, but I agree that the review doesn't specify how the different factors they list are weighted and your interpretation could be correct. I hope someone from ACE can clarify.

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-16T15:49:21.364Z · EA · GW

I do wish we could be having this discussion in a more productive and conciliatory way, which has less of a chance of ending in an acrimonious split.

At the risk of stating the obvious: emailing organizations (anonymously, if you want) is a pretty good way of raising concerns with them.

I've emailed a number of EA organizations (including ACE) with question/concerns, and generally find they are responsive.

And I've been on the receiving side of such emails as well; I often didn't even consider that there could be some confusion or misinterpretation of what I said, and I appreciate it when people point it out.

Comment by Ben_West on Concerns with ACE's Recent Behavior · 2021-04-16T14:49:28.118Z · EA · GW

Edit: Jakub says that ACE's evaluation was based on the Facebook comments, not the leadership transition. The below is kept for historical purposes. Also, I should have noted in this post my appreciation for Anima's transparency – it wouldn't have been possible for me to post something like this about most organizations, because they would state that their CEO stepped down to "spend more time with her family" or something similar.

Nevertheless, given the overall positive assessment, it's strange that Anima was awarded a "weak" rating in this category, and I think it's likely that Anima is being heavily punished for the public comments made by staff members.

Last year, Anima fired their CEO. The public statement said:

However, no matter how much we value her merits, there are issues in regards to everyday behaviour towards employees that we as an organization cannot accept. In Anima International we have to be a team that strongly supports each other.

I think ACE's rating about poor leadership and culture was based on that rather than Facebook comments made by staff members.

Comment by Ben_West on Meta-EA Needs Models · 2021-04-06T00:50:49.895Z · EA · GW

Good point

Comment by Ben_West on Meta-EA Needs Models · 2021-04-05T23:47:19.715Z · EA · GW

Thanks for sharing this! 

Feels like all the top people in EA would have gotten into EA anyway?

Possibly you don't endorse this statement and were just using it as an intro, but I think your interlocutor's response (1) is understated: I can't think of any products which don't benefit from having a marketing department. If EA doesn't benefit from marketing (broadly defined), it would be an exceptionally unusual product.

I imagine taking my best guess at the "current plan of meta-EA" and giving it to Paul Graham and him not funding my startup because the plan isn't specific/concrete enough to even check if it's good and this vagueness is a sign that the key assumptions that need to be true for the plan to even work haven't been identified.

For what it's worth, CEA's plans seem more concrete than mine were when I interviewed at YC. CLR's thoughts on creating disruptive research teams are another thing which comes to mind as having key assumptions which could be falsified.