Database of orgs relevant to longtermist/x-risk work

post by MichaelA · 2021-11-19T08:50:43.284Z · EA · GW · 56 comments


  Key points
  How, why, and when to use the database
  Why I made this
  Possible next steps
  See also

Here’s a version of the database that you can filter and sort however you wish, and here’s a version you can add comments to.

Update: I've been slow to properly update the database, but am collecting additional orgs in this thread [EA(p) · GW(p)] for now.

Key points

Here’s a snippet of what the database looks like (from the "view" focused on "Funders/funding-influencers"):

I made this database and wrote this post in a personal capacity, not as a representative of my employers.

How, why, and when to use the database

(This is all how I use the database myself.)

You can filter, sort, and search the database based on the causes/topics and types of work (e.g., grantmaking vs policy advising vs research) you’re interested in.

You can use the database to:

  1. Generally learn about the landscape of actors in a given area
  2. Get ideas about what orgs could “provide inputs to you” (funding, advice, feedback, connections)
  3. Get ideas about what orgs could act as “nodes on your path to impact”, e.g. whose actions could be improved by a research project you’re considering doing or who could translate and transmit your findings on to key decision-makers

This could be useful in situations such as when you’re:

  1. Getting oriented to a new area
  2. Trying to build career capital in an area
  3. Generating project ideas, generating theories of change for those project ideas, and prioritising among them
  4. Conducting a project
  5. Helping someone else do any of the above things

(For elaboration on points 3 and 4 in the context of research projects specifically, see here, especially Slides 14-15. Those points are more relevant the more you aim to operate like a consultancy [EA · GW] or think tank.)

These benefits could occur via:

  1. The database making you aware of orgs you didn’t know about
  2. The database making you aware of info you lacked on some orgs, or
  3. The database “jogging your memory”
    • I find it’s easier to notice that an org is worth mentioning to someone I’m giving advice to or considering when making a project plan if I’m scanning a filtered list of maybe-relevant orgs than if I’m just doing free recall

Why I made this

Answer 1: As noted, I’m addicted [EA · GW] to creating [EA · GW] collections [EA · GW].

Answer 2: 18 months ago, I thought EAs should post more summaries and collections [EA · GW], and I still think that, and people seem to often like it when I do that.

Answer 3: 12 months ago, I made a smaller version of this database in hopes that it’d benefit the work of Rethink Priorities’ longtermism team (which I’m a part of) in the ways outlined in the previous section. I feel like it has indeed been useful (though mostly just through guiding my own work and my suggestions to other people; I think other people rarely use it directly). And I’ve also ended up fairly often using the database when giving people career or project advice (e.g., to remind myself what orgs I should suggest a person talk to or check out the work of if they’re interested in nuclear risk or forecasting), or sharing snippets of it with people. So I figured I should make a publicly accessible version.


Caveats

Mainly just what I said earlier, but I’ll say it again in bold for good measure:

  1. I aimed for (but likely missed) comprehensive coverage of orgs that are substantially focused on longtermist/x-risk-related issues and are part of the EA community
  2. I also included various orgs that are relevant despite being less focused on longtermism/x-risks and/or not being part of the EA community. But one could in theory include at least hundreds of such orgs, whereas I just included a pretty arbitrary subset of the ones I happen to know of.
  3. I created this fairly quickly and based partly on memory & guesswork

Other caveats:

Possible next steps

See also

If this database seems useful to you, you may also be interested in one or more of the following:


I drew on Pablo Stafforini’s and Jamie Gittins’ [EA · GW] lists of EA-related orgs. An earlier version of the database benefitted from comments by Janique Behman, David Rhys Bernard, Juan Gil, and perhaps other people who I’m forgetting. The current version of the database and/or this post benefitted from comments from Will Aldred, Aaron Gertler, Jaime Sevilla, Ben Snodin, Pablo Stafforini, and Max Stauffer.

  1. ...well, I haven’t actually entered that info, but I’ve made fields for it in hopes of crowdsourcing it from you. ↩︎


Comments sorted by top scores.

comment by Yonatan Cale (hibukki) · 2021-11-19T13:04:50.783Z · EA(p) · GW(p)

Two lists I'm considering making:

  1. Software developers who are interested in doing paid EA work (According to 80,000 Hours, it seems to be hard to hire software developers for EA orgs even though lots of software developers seem to exist in our community. Seems confusing. This would be a cheap first try at solving it)
  2. Pain points that could potentially be solved by software - from EA orgs (see #6 here [EA(p) · GW(p)]. The post is about looking for places to invest in software. I think the correct place to approach this would be to start from actual needs. But there's no place for orgs to surface such needs beyond posting a job)

Any thoughts?

Replies from: oagr, MichaelA
comment by Ozzie Gooen (oagr) · 2021-11-21T16:21:02.818Z · EA(p) · GW(p)

I'll note:

  1. When you say "paid", do you mean full-time? I've found that "part-time" people often drop off very quickly. Full-time people would be the domain of 80,000 Hours, so I'd suggest working with them on this.
  2. "no place for orgs to surface such needs beyond posting a job" -> This is complicated. I think that software consultancy models could be neat, and of course, full-time software engineering jobs do happen. Both are a lot of work. I'm much less excited about volunteer-type arrangements, outside of being used to effectively help filter candidates for later hiring.

I think that a lot of people just really can't understand or predict what would be useful without working in an EA org or in an EA group/hub. It took me a while! The obvious advice for people who want to really kickstart things is to first try to work in, or right next to, an EA org for a year or so; then you'll have a much better sense.

Replies from: hibukki, Guy Raveh
comment by Yonatan Cale (hibukki) · 2021-11-22T19:53:11.596Z · EA(p) · GW(p)
  1. Developers who'd like to do EA work: Not only full time
  2. I'm talking about discovering needs here. I'm not talking at all about how the needs would be solved

Working at an EA org to discover needs: This seems much slower than asking people who work there, no? (I am not trying to guess the needs myself)

Replies from: oagr
comment by Ozzie Gooen (oagr) · 2021-11-23T09:41:33.215Z · EA(p) · GW(p)

Working at an EA org to discover needs: This seems much slower than asking people who work there, no? (I am not trying to guess the needs myself)

It really depends on how sophisticated the work is and how tied it is to existing systems.

For example, if you wanted to build tooling that would be useful to Google, it would probably be easier just to start a job at Google, where you can see everything and get used to the codebases, than to try to become a consultant for Google, where you'd be asked for very narrow tasks that don't require you to be part of their confidential workflows and similar.

Replies from: hibukki
comment by Yonatan Cale (hibukki) · 2021-11-23T13:13:42.572Z · EA(p) · GW(p)

I agree I won't get everything


Still, I don't think Google is a good example. It is full of developers who have a culture of automating things, and they even get free time every week to do side projects. This is really extreme.


A better example would be some organization that has 0 developers. If you ask someone in such an organization if there's anything they want to automate, or some repetitive task they're doing a lot, or an idea for an app (which is probably terrible but will indicate an underlying need) - things come up

Replies from: hibukki
comment by Yonatan Cale (hibukki) · 2021-11-23T13:16:06.391Z · EA(p) · GW(p)

But also, I tried [EA · GW], and I think 0 such needs surfaced

That's what the experimental method is for, so that we don't have to resolve things just by arguing


comment by Guy Raveh · 2021-11-22T16:29:29.143Z · EA(p) · GW(p)

Just throwing a thought: if many EA orgs have software needs and are struggling to employ people who'll solve them; and on the other hand, part-time employees or volunteer directories don't help that much - would it make sense to start a SaaS org aimed at helping EA orgs?

Replies from: oagr
comment by Ozzie Gooen (oagr) · 2021-11-22T18:46:41.893Z · EA(p) · GW(p)

I could see a space for software consultancies that work with EA orgs, that basically help build and maintain software for them. 

I'm not sure what you mean by SaaS in this case. If you only have 2-10 clients, it's sort of weird to have a standard SaaS business model. I was imagining more of the regular consultancy payment structure.

Replies from: hibukki
comment by Yonatan Cale (hibukki) · 2021-11-22T19:59:25.325Z · EA(p) · GW(p)

EA Software Consultancy: In case you don't know these posts:


Part 1 [EA · GW]

In part 1, I argue that tech work at EA orgs has three predictable problems:[1] [EA · GW]

  • It’s bad for developing technical skills
  • It's inefficiently allocated
  • It’s difficult to assess hires

Part 2 [? · GW]

In this part I argue that each problem could be mitigated or even fixed by consolidating the workers into a single agency. I focus here on the benefits common to any form of agency.

Part 3 [? · GW]

This post explicitly compares the low-bono option with various others on two axes: on entity type (ie individual or agency) and on different funding models.

Replies from: oagr
comment by Ozzie Gooen (oagr) · 2021-11-23T09:44:14.608Z · EA(p) · GW(p)

Yea, I was briefly familiar. 

I think it's still tough, and agree with Ben's comment here. [EA(p) · GW(p)]

But I think consultancy engineers could be a fit for maybe ~20-40% of EA software talent. 

comment by MichaelA · 2021-11-19T13:08:20.056Z · EA(p) · GW(p)

Both sound to me probably at least somewhat useful! I'm ~agnostic on how likely they are to be very useful, how they compare to other things you could spend your time on, or how best to do them, which is mostly because I haven't thought much about software development.

I expect some other people in the community (e.g., Ozzie Gooen, Nuno Sempere, JP Addison) would have more thoughts on that. But it might make sense to just spend like 0.5-4 hours on MVPs before asking anyone else, if you already have a clear enough idea in your head.

I can also imagine that a Slack workspace (or a channel in an existing workspace) for people in EA who are doing, or are interested in, software development could perhaps be useful.

(Sidenote: You may also be interested in posts tagged software engineering [? · GW] and/or looking into their authors/commenters.)

comment by Jan-Willem (Jan-WillemvanPutten) · 2021-11-19T09:25:35.879Z · EA(p) · GW(p)

Great work, Michael. I've already included this Airtable in the curriculum of Training For Good's upcoming impactful policy careers workshop. Well done; this work is of high value!

Replies from: MichaelA
comment by MichaelA · 2021-11-19T10:03:49.673Z · EA(p) · GW(p)

Glad to hear that you think this'll be helpful!

(Btw, your comment also made me realise I should add Training For Good to the database, so I've now done so. )

comment by MichaelA · 2021-12-23T14:41:36.215Z · EA(p) · GW(p)

Also note that there are EA Forum Wiki entries [? · GW] for many of the orgs in this database, which will in some cases be worth checking out either for the text of the entry itself, for the links in the Bibliography section, or for the tagged posts.

comment by BrianTan · 2021-11-20T11:19:11.130Z · EA(p) · GW(p)

Cool that you made this, and that you even made a Softr page! Although I think the Softr page is worse than just sharing a public grid view of the Airtable.

I realize it would be cool to have a similar database for all EA-related organisations. Jamie Gittins made one on Notion and has a Forum post here [EA · GW] listing EA orgs, but neither is easily filterable. It could have similar attributes to your Airtable. I saw that Taymon also has a Google Sheet, but it would be nice to have it in an Airtable with more attributes, to make it more easily filterable and more colorful.

Replies from: MichaelA
comment by MichaelA · 2021-11-20T11:48:30.582Z · EA(p) · GW(p)

Can you share a public grid view of the Airtable in a way that allows people to filter and/or sort however they want but then doesn't make that the filtering/sorting that everyone else sees? I wasn't aware of how to do that, which is the sole reason I added the Softr option. I think the set of Airtable views I also link people to is probably indeed better if people are happy with the views (i.e., combos of filters and orders) that I've already set up.

Agreed that an all-of-EA version of this would also be useful, and that Airtable would be better for that than Notion, a Forum post, or a Google Sheet. I also expect it's something that literally anyone reading this could set up in less than a day, by:

  • duplicating my database
  • manually adding things from Gittins' and Taymon's database
  • maybe removing anything that was in mine that might be out of scope for them (e.g., if they want to limit the scope to just orgs that are in or "aware of & friendly to" EA, since a database of all orgs that are merely quite relevant to any EA cause area may be too large a scope)
  • looking up how to do Airtable stuff whenever stuck (I found the basics fairly easy, more so than expected)
Replies from: BrianTan
comment by BrianTan · 2021-11-20T13:13:56.508Z · EA(p) · GW(p)

You can share this link instead, which is better than the Softr view, and this means people don't need to get comment access to be able to view the Airtable grid. It also prevents people from being able to see each other's emails if they check the base collaborators. To find that link, I just pressed "Share" at the top right of the base, and scrolled down to the bottom of that modal/pop-up to find the link.

Replies from: MichaelA
comment by MichaelA · 2021-11-20T14:19:30.998Z · EA(p) · GW(p)

Ah, nice, thanks for that! It seems that that indeed allows for changing both "Filtered by" and "Sorted by", including from each of my pre-set views, without that changing things for other people, so that's perfect!

I still want to provide the comment access version as well, so people can more easily make suggestions on specific entries. But I'll edit my post to swap the softr link for the link you suggested and to make the comment access link less prominent.

Replies from: BrianTan
comment by BrianTan · 2021-11-20T14:34:00.905Z · EA(p) · GW(p)

No problem!

comment by EricHerboso · 2022-03-01T20:06:53.227Z · EA(p) · GW(p)

I just wanted to leave a note saying that I found this database useful in my work.

comment by MichaelA · 2022-07-22T07:21:38.840Z · EA(p) · GW(p)

I suggested as one possible next step "People could duplicate and then adapt this database in order to make [a] version that’s relevant to all EA cause areas"

I think such a database has now been made! (Though I'm not sure if that was done by duplicating & adapting my one.) Specifically, Michel Justen has made A Database of EA Organizations & Initiatives [EA · GW]. I imagine this'd be useful to some people who find their way to this post.*

Here's the summary section of their post, for convenience:

"I’ve created a new database of EA organizations and initiatives that I host on the recently revamped EA Opportunities page. Here’s the raw Airtable

  • I think this is the most comprehensive collection of organizations in or closely involved with EA to date. It features orgs explicitly within or adjacent to EA, as well as a non-comprehensive list of other orgs working on global catastrophic risks, even if they have little involvement with EA. As of writing this, there are 276 organizations in this database. Of these, 130 are labeled as “Part of EA community” and the rest are labeled as either “aware of and friendly to EA” or uninvolved.
  • I still recommend this database [EA · GW] as the most valuable database of organizations doing longtermist/x-risk work given its more comprehensive indicators for how orgs are aiming to reduce x-risk.
  • If you see any mistakes in this database, please let us know. You can also submit new organizations."

*I guess I should flag that I haven't looked closely at Michel's post or database, so can't personally vouch for its accuracy, comprehensiveness, etc.

comment by MichaelA · 2022-04-10T10:32:46.932Z · EA(p) · GW(p)

Some orgs that should maybe be added (I'd be keen for someone to fill in the form to add them, including relevant info on them): 

Replies from: MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, MichaelA, Pablo_Stafforini
comment by MichaelA · 2023-02-12T10:39:57.965Z · EA(p) · GW(p)

Labour for the Long Term

Is Britain prepared for the challenges ahead?
We face significant risks, from climate change to pandemics, to digital transformation and geopolitical tensions. We need social-democratic answers to create a fair and resilient future.

Our vision
A leading role for the UK
Many long-term issues have an important political dimension in which the UK can play a leading role. Building on the work of previous Labour governments, we see a future where the UK can play a larger role in areas such as in reducing international tensions and in becoming a world leader in green technology.

comment by MichaelA · 2023-03-25T08:38:28.847Z · EA(p) · GW(p)

Apart Research

A*PART is an independent ML safety research and research facilitation organization working for a future with a benevolent relationship to AI.

We run AISI, the Alignment Hackathons, and an AI safety research update series.

comment by MichaelA · 2023-03-25T08:37:17.752Z · EA(p) · GW(p)

Also the European Network for AI Safety (ENAIS) [EA · GW]

TLDR; The European Network for AI Safety is a central point for connecting researchers and community organizers in Europe with opportunities and events happening in their vicinity. Sign up here to become a member of the network, and join our launch event on Wednesday, April 5th from 19:00-20:00 CET!

comment by MichaelA · 2023-03-06T10:03:38.148Z · EA(p) · GW(p)

Riesgos Catastróficos Globales

Our mission is to conduct research and prioritize global catastrophic risks in the Spanish-speaking countries of the world. 

There is a growing interest in global catastrophic risk (GCR) research in English-speaking regions, yet this area remains neglected elsewhere. We want to address this deficit by identifying initiatives to enhance the public management of GCR in Spanish-speaking countries. In the short term, we will write reports about the initiatives we consider most promising. [Quote from Introducing the new Riesgos Catastróficos Globales team [EA · GW]]

comment by MichaelA · 2023-03-06T10:01:38.534Z · EA(p) · GW(p)


We’re a team of researchers investigating and forecasting the development of advanced AI.

comment by MichaelA · 2023-03-05T10:22:27.018Z · EA(p) · GW(p)

International Center for Future Generations

The International Center for Future Generations is a European think-and-do-tank for improving societal resilience in relation to exponential technologies and existential risks.

As of today, their website lists their priorities as:

  • Climate crisis
  • Technology [including AI] and democracy
  • Biosecurity
comment by MichaelA · 2023-02-14T12:13:03.267Z · EA(p) · GW(p)

Harvard AI Safety Team (HAIST), MIT AI Alignment (MAIA), and Cambridge Boston Alignment Initiative (CBAI)

These are three distinct but somewhat overlapping field-building initiatives. More info at Update on Harvard AI Safety Team and MIT AI Alignment [EA · GW] and at the things that post links to.

comment by MichaelA · 2023-02-12T10:38:00.044Z · EA(p) · GW(p)

Policy Foundry

an Australia-based organisation dedicated to developing high-quality and detailed policy proposals for the greatest challenges of the 21st century. [source]

comment by MichaelA · 2023-02-12T10:36:03.164Z · EA(p) · GW(p)

The Collective Intelligence Project

We are an incubator for new governance models for transformative technology.

Our goal: To overcome the transformative technology trilemma.

Existing tech governance approaches fall prey to the transformative technology trilemma. They assume significant trade-offs between progress, participation, and safety.

Market-forward builders tend to sacrifice safety for progress; risk-averse technocrats tend to sacrifice participation for safety; participation-centered democrats tend to sacrifice progress for participation.

Collective flourishing requires all three. We need CI R&D so we can simultaneously advance technological capabilities, prevent disproportionate risks, and enable individual and collective self-determination.

comment by MichaelA · 2023-01-22T02:56:32.200Z · EA(p) · GW(p)

Also Cavendish Labs:

Cavendish Labs is a 501(c)(3) nonprofit research organization dedicated to solving the most important and neglected scientific problems of our age.

We're founding a research community in Cavendish, Vermont that's focused primarily on AI safety and pandemic prevention, although we’re interested in all avenues of effective research.


comment by MichaelA · 2023-01-03T09:38:14.084Z · EA(p) · GW(p)

Also the Forecasting Research Institute [EA · GW]

The Forecasting Research Institute (FRI) is a new organization focused on advancing the science of forecasting for the public good. 

[...] our team is pursuing a two-pronged strategy. One is foundational, aimed at filling in the gaps in the science of forecasting that represent critical barriers to some of the most important uses of forecasting—like how to handle low probability events, long-run and unobservable outcomes, or complex topics that cannot be captured in a single forecast. The other prong is translational, focused on adapting forecasting methods to practical purposes: increasing the decision-relevance of questions, using forecasting to map important disagreements, and identifying the contexts in which forecasting will be most useful.

[...] Our core team consists of Phil Tetlock, Michael Page, Josh Rosenberg, Ezra Karger, Tegan McCaslin, and Zachary Jacobs. We also work with various contractors and external collaborators in the forecasting space.

comment by MichaelA · 2023-01-03T09:36:36.455Z · EA(p) · GW(p)

Also School of Thinking [EA · GW]

School of Thinking (SoT) is a media startup.

Our purpose is to spread Effective Altruist, longtermist, and rationalist values and ideas as much as possible to the general public by leveraging new media. We aim to reach our goal through the creation of high-quality material posted on an ecosystem of YouTube channels, profiles on social media platforms, podcasts, and SoT's website. 

Our priority is to produce content in English and Italian, but we will cover more languages down the line. We have been funded by the Effective Altruism Infrastructure Fund (EAIF) and the FTX Future Fund.

comment by MichaelA · 2022-10-15T11:54:02.292Z · EA(p) · GW(p)

Also AFTER (Action Fund for Technology and Emerging Risk)

comment by MichaelA · 2022-10-01T15:22:50.714Z · EA(p) · GW(p)

Also Future Academy (but maybe that's not an org and instead a project of EA Sweden?).

comment by MichaelA · 2022-09-29T20:16:26.190Z · EA(p) · GW(p)

Also anything in Alignment Org Cheat Sheet [LW · GW] that's not in here. And maybe adding that post's 1-sentence descriptions to the info this database has on each org listed in that post.

comment by MichaelA · 2022-09-16T12:20:50.205Z · EA(p) · GW(p)

Also fp21 and maybe Humanity Forward.

(Reminder: This is a database of orgs relevant to longtermist/x-risk work, and includes some orgs that are not part of the longtermist/x-risk-reduction community, don't associate with those labels, and/or don't focus specifically on those issues.)

comment by MichaelA · 2022-07-08T11:54:48.758Z · EA(p) · GW(p)

Also EA Engineers [EA · GW]

comment by Pablo (Pablo_Stafforini) · 2022-04-10T21:07:00.641Z · EA(p) · GW(p)

To the best of my knowledge, Samotsvety is a group of forecasters, not an organization (although some of its members have recently launched or will soon launch forecasting-related orgs).

comment by NunoSempere · 2021-12-20T14:24:56.539Z · EA(p) · GW(p)

Times I have used this post in the course of my research: II.

Replies from: MichaelA
comment by MichaelA · 2021-12-20T15:21:17.166Z · EA(p) · GW(p)

Is that 11 or 2?

(Either way, thanks for letting me know :) )

Replies from: NunoSempere
comment by NunoSempere · 2021-12-22T11:40:56.780Z · EA(p) · GW(p)

2. Cheers.

comment by Davidmanheim · 2021-11-21T09:08:56.242Z · EA(p) · GW(p)

How do I submit notes / corrections on orgs in the table?

Replies from: MichaelA
comment by MichaelA · 2021-11-21T09:56:18.261Z · EA(p) · GW(p)

"If you spot any errors or if you know any relevant info I failed to mention about these orgs, let me know via an EA Forum message or via following this link and then commenting there."

(The very first link I provide in this post allows changing the filtering & sorting, but not commenting, so you have to instead either send a message or use that other link.)

Thanks for your interest in suggesting extra info / correction :)