How important is it for new hires to be EA-aligned?

post by sawyer · 2020-10-02T20:47:08.731Z · score: 12 (10 votes) · 1 comment

This is a question post.

When deciding whom to hire for a role in your organization, there are lots of factors to consider: relevant experience, demonstrated capability, education, perceived fit, etc. One variable that exists for EA orgs (but, I think, less so for non-EA orgs) is how strongly the person identifies as an effective altruist. This is directly related to value alignment and perceived fit, and possibly related to dedication, capability, and other factors.

Job postings for EA orgs often recommend "familiarity with EA concepts" or something similar. Applications and interviews will sometimes ask about this as well. Anecdotally, I've experienced several job searches at EA orgs (from the employer's side of things) in which "are they an EA?" is a question given somewhat high importance.

To be clear, I'm pretty convinced that value alignment in general is important for a new hire. If you're hiring someone for animal rights advocacy, you probably want someone who cares deeply about animal welfare. If you're hiring someone to research AI safety, you probably want someone who is passionate about making AI safe.

But for both the animal welfare hire and the AI safety hire, how important is it that they identify as an effective altruist? What if they're truly passionate about animal welfare, but they think that EA ideas around cause analysis are pointless?

Should hiring managers prioritize EA alignment? Or should they focus only on value alignment with their specific organization's mission?

(I struggled with the framing of this question, so feel free to reframe it if that's helpful.)



Comments sorted by top scores.

comment by Peter_Hurford · 2020-10-02T21:43:25.102Z · score: 12 (7 votes)

A lot of it is likely specific to the role (e.g., you probably don't need an EA accountant) and to the strategy of the organization.