What are the most impactful roles that EAs are currently not entering (and why)?

post by Cillian Crosson (cilliancrosson@gmail.com) · 2021-12-01T16:58:41.427Z · EA · GW · 5 comments

This is a question post.

The EA community has identified a lot of promising career paths. I want to collect a list of high-impact roles which few EAs are entering and potential reasons for why this might be.

Feel free to include:

Answers here could be directly useful for informing Training for Good’s 2022 strategy. However, please do include answers for which you think “training” is probably not the solution as I imagine others (eg. group leaders) might also be interested in this question.


answer by MaxRa · 2021-12-06T16:37:40.658Z · EA(p) · GW(p)

At EAG London, Will MacAskill suggested that the EA community is lacking professional historians. He argued that in scenarios of looming civilizational collapse, a better understanding of how civilizations develop would likely help increase the odds that broadly moral, advanced societies reemerge.

answer by acylhalide (Samuel Shadrach) · 2021-12-01T17:38:42.299Z · EA(p) · GW(p)

Somebody to run this anonymously as a full-time job and assume legal risk:

https://forum.effectivealtruism.org/posts/Zxiugmj5EnS6SXYnS/scihub-backups-for-open-research

answer by Chris Leong (casebash) · 2021-12-03T09:24:29.568Z · EA(p) · GW(p)

AI ethicists and bioethicists. Covid has demonstrated how people in these roles can really mess things up if they spout nonsense, and I think we should assume the same applies to AI as well.

comment by Vilfredo's Ghost (Bluefalcon) · 2021-12-06T08:53:40.747Z · EA(p) · GW(p)

I think the problem is that unethical people have an insurmountable competitive advantage in getting jobs as ethicists. At least if these are academic roles, you have to publish to be a viable candidate; it's a lot easier to say a new and false thing about ethics than a new and true thing, and reality won't slap you in the face for being wrong the way it would in science. So you'd probably need to aim to influence the hiring process somehow without being subject to those perverse incentives.

comment by gooder · 2021-12-06T04:43:06.914Z · EA(p) · GW(p)

I like this idea, and am also curious about whether such ethicists would be able to influence the trajectory of these fields, and what that might look like.

Would the MO be to directly influence tech & bio leaders? Or indirectly influence them by getting citizens interested and applying pressure?

I find ethics critical, but unfortunately I'm not sure how engagement around ethics could be enforced for people who don't grok its importance. Maybe that's where governance and treaties come in.
