Open Philanthropy is seeking proposals for outreach projects

post by abergal, ClaireZabel · 2021-07-16T20:34:52.023Z · EA · GW · 7 comments


  Proposals we are interested in
    Programs that engage with promising young people
      Some reasons why we think this work has high expected value
      Made-up examples of programs we think could be impactful
    Projects aiming at widespread dissemination of relevant high-quality content
      Some reasons why we think this work has high expected value
      Made-up examples of projects we think could be impactful
  Application process

Open Philanthropy is seeking proposals from applicants interested in growing the community of people motivated to improve the long-term future via the kinds of projects described below.[1]

Apply to start a new project here; express interest in helping with a project here.

We hope to draw highly capable people to this work by supporting ambitious, scalable outreach projects that run for many years. We think a world where effective altruism, longtermism, and related ideas are routine parts of conversation in intellectual spaces is within reach, and we’re excited to support projects that work towards that world.

In this post, we describe the kinds of projects we’re interested in funding, explain why we think they could be very impactful, and give some more detail on our application process.

Proposals we are interested in

Programs that engage with promising young people

We are seeking proposals for programs that engage with young people who seem particularly promising in terms of their ability to improve the long-term future (and may have interest in doing so).

Here, by “particularly promising”, we mean young people who seem well-suited to building aptitudes [EA · GW] that have high potential for improving the long-term future. Examples from the linked post include aptitudes for conducting research, advancing into top institutional roles, founding or supporting organizations, communicating ideas, and building communities of people with similar interests and goals, among others. Downstream, we hope these individuals will be good fits for what we believe to be priority paths for improving the long-term future, such as AI alignment research, technical and policy work reducing risks from advances in synthetic biology, career paths involving senior roles in the national security community, and roles writing and speaking about relevant ideas, among others.

We’re interested in supporting a wide range of possible programs, including summer or winter camps, scholarship or fellowship programs, seminars, conferences, workshops, and retreats. We think programs with the following characteristics are most likely to be highly impactful:

Examples of such programs that Open Philanthropy has supported include SPARC, ESPR, the SERI [EA · GW] and FHI [EA · GW] summer research programs, and the recent EA Debate Championship [EA · GW]. However, we think there is room for many more such programs.

We especially encourage program ideas which:

We encourage people to have a low bar for submitting proposals to our program, but note that we view this as a sensitive area: we think programs like these have the potential to do harm by putting young people in environments where they could have negative experiences. Nicole Ross at the Centre for Effective Altruism is available to provide advice on these kinds of risks.

Some reasons why we think this work has high expected value

A priori, we would guess that people are more likely to get interested in new ideas and opportunities when they are relatively young and have fewer preexisting commitments. This guess is consistent with the results of a survey Open Philanthropy recently ran—we surveyed approximately 200 people who our advisors suggested had the potential to do good longtermist work, most of whom had recently made career changes that we thought were positive from a longtermist perspective. As part of this survey, we asked respondents several questions regarding the age at which they first encountered effective altruism or effective altruism-adjacent ideas.

Survey respondents often mentioned that hearing about EA before starting university would have been particularly helpful because they could have planned how to use their time at university better, e.g. what to major in.

We also asked survey respondents to brainstorm open-endedly about how to get people similar to them interested in these ideas. 10% of responses mentioned starting outreach programs at younger ages, particularly in high school. Several respondents mentioned that SPARC and ESPR had been helpful for them and that they would recommend these programs to similar people. (Certain [EA · GW] other [EA · GW] high school outreach projects have reported less success, but we don’t think these less-targeted programs provide much evidence about how promising targeted high school outreach is likely to be overall, as discussed here [EA · GW].)

Our survey also showed that EA groups, particularly university groups, have had a lot of impact on longtermist career trajectories. On a free-form question asking respondents to list the top few things that increased their expected impact, respondents listed EA groups more commonly than any other factor. On other measures of impact we used in our survey analysis, EA groups ranked between second and fourth among potential factors, above many EA organizations and popular pieces of writing in the EA-sphere. Most of this impact (65-75% on one measure) came from university groups. We think this suggests that, more generally, offering high-quality opportunities for university students to get involved is a promising kind of intervention.

Made-up examples of programs we think could be impactful

These examples are intended to be illustrative of the kinds of programs we’d be interested in funding. This is not intended to be a comprehensive list, nor a list of the programs we think would be most impactful.

We think these programs are unlikely to work fully as written. Founders generally have to dive deep into a project plan to figure out what’s tenable, altering their plan multiple times as they get a better understanding of the space, and we haven’t done that work. As such, we’d like these examples to serve as inspiration, not as instructions. We think programs of this kind are more likely to be successful when the founders develop their own vision and understanding of their target audience.

We would ultimately like to support dedicated teams or organizations that run programs for young people at scale. That said, we are likely to recommend that applicants with less of a track record start by trying out a small pilot of their program and iterating while maximizing program quality and target fit, rather than scaling immediately.

Example 1: A free two-week summer school in Oxford that teaches content related to longtermism to promising high school students. The program could have a similar structure to SPARC and ESPR, but with a more explicitly longtermist focus, and it could engage a broader range of gifted high school students.

Example 2: A monthly AI safety workshop for computer science undergraduates, covering existing foundational work in AI safety.

Example 3: A one-week summer program about effective altruism in Berkeley combined with a prestigious $20,000 merit-based scholarship for undergraduate students. The scholarship would involve an application process that required substantial engagement with ideas related to effective altruism, e.g. a relevant essay and an interview.

Example 4: A monthly four-day workshop teaching foundational rationality content to promising young people.

Example 5: A fall jobs talk and follow-up discussion that’s held at top universities describing career paths in defensive work for future biological catastrophes.

Projects aiming at widespread dissemination of relevant high-quality content

We are also seeking proposals for projects that aim to share high-quality, nuanced content related to improving the long-term future with large numbers of people. Projects could cover wide areas such as effective altruism, rationality, longtermism, or global catastrophic risk reduction, or they could have a more specific focus. We’re interested in supporting people both to create original content and to find new ways to share existing content.

Potential project types include:

Existing projects along these lines include the 80,000 Hours Podcast, Robert Miles’s AI alignment YouTube channel, and Vox’s Future Perfect.

We encourage projects that involve content in major world languages other than English, especially by native speakers of those languages—we think projects in other languages are especially likely to reach people who haven’t had as many opportunities to engage with these ideas.

We would like interested people to have a low bar for submitting a proposal, but we think projects that misrepresent relevant ideas or present them uncarefully can do harm by alienating individuals who would otherwise have been sympathetic to them. We also think it’s important to be cognizant of potential political and social risks that come with content creation and dissemination projects in different countries. Nicole Ross at the Centre for Effective Altruism is available to provide advice on these kinds of risks.

Some reasons why we think this work has high expected value

Our sense from talking to people doing longtermist work we think is promising has been that, for many, particular pieces of writing or videos were central to their turn towards their current paths.

This seems broadly in line with the results of the survey we conducted mentioned above. The bodies of written work of Nick Bostrom, Eliezer Yudkowsky, and Peter Singer were in the top 10 sources of impact on longtermist career trajectories (out of a list that included organizations, people, and bodies of work) across several different measures. On one measure, Nick Bostrom’s work by itself had 68% of the impact of the most impactful organization and 75% of the impact of the second most impactful organization. When asked what outreach would attract similar people to longtermist work, 8% of respondents in the survey gave free-form responses implying that they think simply exposing similar people to EA/EA-adjacent ideas would be sufficient.

These data points suggest to us that even absent additional outreach programs, sharing these ideas more broadly could ultimately result in people turning towards career activities that are high-value from a longtermist perspective. For many who could work on idea dissemination, we think increasing the reach of existing works with a strong track record, like those given above, may be more impactful per unit of effort than creating new content.

Made-up examples of projects we think could be impactful

As above, these examples are intended to be illustrative of the kinds of programs we’d be interested in funding. This is not intended to be a comprehensive list, nor a list of the programs we think would be most impactful. We think these programs are unlikely to work fully as written and would like these projects to serve as inspiration, not as instructions.

Example 1: Collaborations with high-profile YouTube creators to create videos covering longtermist topics.

Example 2: Targeted social media advertising for episodes of the 80,000 Hours Podcast. The project would aim to maximize downloads of episodes that come via social media referrals.

Example 3: A website that delivers free physical books, e-books, or audiobooks to people with a .edu email address who request them, focusing on books that seem helpful for understanding how to do an outsized amount of good.

Example 4: A MOOC covering existing AI safety work.

Example 5: A new magazine that covers potentially transformative technologies and ways in which they could radically transform civilization in positive or negative ways.

Application process

Primary application

If you think you might want to implement either of the kinds of outreach projects listed above, please submit a brief pre-proposal here. If we are interested in supporting your project, we will reach out to you and invite you to submit more information. We encourage submissions from people who are uncertain if they want to found a new project and just want funding to seriously explore an idea. If it would be useful for applicants developing their proposals, we are open to funding them to do full-time project development work for 3 months. We are happy to look at multiple pre-proposals from applicants who have several different project ideas.

We may also be able to help some applicants (e.g. by introducing them to potential collaborators, giving them feedback about plans and strategy, providing legal assistance, etc.) or be able to help find others who can. We are open to and encourage highly ambitious proposals for projects that would require annual budgets of millions of dollars, including proposals to scale existing projects that are still relatively small.

We intend to reply to all applications within two months. We have also been in touch with the Effective Altruism Infrastructure Fund and the Long-Term Future Fund, and they have expressed interest in funding proposals in the areas we describe above. If you want, you can choose to have them also receive your application via the same form we are using.

There is no deadline to apply; rather, we will leave this form open indefinitely until we decide that this program isn’t worth running, or that we’ve funded enough work in this space. If that happens, we will update this post noting that we plan to close the form at least a month ahead of time.

Collaborator application

If you aren’t interested in starting something yourself, but you would be interested in collaborating on or helping with the kinds of outreach projects listed above (either full or part-time), let us know here. We will connect you to project leads if we feel like there is a good fit for your skills and interests.

If you have any questions, please contact

  1. Our work in this space is motivated by a desire to increase the pool of talent available for longtermist work. We think projects like the ones we describe may also be useful for effective altruism outreach aimed at other cause areas, but we (the team running this particular program, not Open Philanthropy as a whole) haven’t thought through how valuable this work looks from non-longtermist perspectives and don’t intend to make that a focus. ↩︎


Comments sorted by top scores.

comment by nonn · 2021-07-19T11:49:53.867Z · EA(p) · GW(p)

Minor suggestion: those forms should send a confirmation after you submit, or give the option "would you like to receive a copy of your responses?"

Otherwise, it may be hard to confirm whether a submission went through, or the details of what you submitted

Replies from: abergal
comment by abergal · 2021-07-20T00:06:29.205Z · EA(p) · GW(p)

Changed, thanks for the suggestion!

comment by BrianTan · 2021-07-17T09:04:35.832Z · EA(p) · GW(p)

Thanks for this detailed post! This is interesting. I wanted to highlight this part for people who might not have read it, and I have a question about it:

We are open to and encourage highly ambitious proposals for projects that would require annual budgets of millions of dollars, including proposals to scale existing projects that are still relatively small.

Is OpenPhil willing to say how much they are willing to give in total per year for these kinds of outreach projects? I'm curious to know, and others might be too.

Replies from: abergal
comment by abergal · 2021-07-19T23:58:53.502Z · EA(p) · GW(p)

There's no set maximum; we expect to be limited by the number of applications that seem sufficiently promising, not the cost.

comment by lukefreeman · 2021-07-19T00:21:03.375Z · EA(p) · GW(p)

I'm thrilled to hear about this!

comment by eca · 2021-07-22T12:02:37.650Z · EA(p) · GW(p)

One more unsolicited outreach idea while I’m at it: high school career / guidance counselors in the US.

I’m not sure how idiosyncratic this was of my school, but we had this person whose job it was to give advice to older highschool kids about what to do for college and career. Mine’s advice was really bad and I think a number of my friends would have glommed onto 80k type stuff if it was handed to them at this time (when people are telling you to figure out your life all of a sudden). This probably hits the 16yo demographic pretty well.

Could look like adding a bit of entrypoint content geared at pre-college students to 80k, then making some info packets explaining 80k to counselors as a nonprofit career planning resource with handouts for students, and shipping them to every high school in the US or smth (possibly this is also an international thing, IDK).

comment by eca · 2021-07-22T11:51:02.275Z · EA(p) · GW(p)


This is probably not the best place to post this, but I’ve been learning recently about the success of hacking games in finding and training computer security people, including a game I got excited about in high school.

I think there might be something to an EA/rationality game: something with a save-the-world-but-realistic plot and game mechanics built around useful skills like Fermi estimation. This is a random gut feeling I’ve had for a while, not something well thought through, so it could be obviously wrong.

A couple advantages over the typical static content like videos or written intro sequences:

  • games can be “stickier”
  • ppl seem to enjoy intricate, complex games even while avoiding complex static media for lack of time; this is true of many high-school aged ppl in my experience
  • games can tailor different angles into EA material depending on the user’s input
  • games can both educate and filter for/identify people who are high aptitude, unlike written content or video
  • because games can collect info about user behavior, you might have a much richer sense of where people are dropping out to prototype/ AB test on
  • anecdotally, smart ppl I went to highschool with seemed to have their career aspirations shaped by videogames, primarily toward wanting to do computer science to be game developers. Maybe this could be channelled elsewhere?

A few downsides of games

  • limited to a particular demographic interested in videogames
  • a lot of rationality/ EA stuff seems maybe quite hard to gamify?
  • maybe a game makes EA stuff seem fantastical
  • maybe a game would degrade nuance/ epistemics of content
  • maybe games are quite expensive to make for what they are?

I have zero expertise or qualifications except occasionally playing games, but feel free to DM me anyway if you are interested in this :)