CEA's Plans for 2021

post by MaxDalton (Maxdalton) · 2020-12-11T00:27:08.774Z · EA · GW · 10 comments



This post lays out the Centre for Effective Altruism's plans for 2021. We've also published a review of our progress in 2020.


Long-term approach

Our mission is to build a community of students and professionals acting on EA principles, by nurturing high-quality discussion spaces.

We think we should focus on facilitation: curating spaces (online, at events, within groups) for quality discussion, and helping more experienced community members to onboard newer members, rather than writing content ourselves or doing personalized outreach.

To break this down, we plan to work across three broad areas: recruitment, retention, and risk reduction.

We aim to focus our recruitment efforts in 2021 around student groups.

Young people seem to be the most likely to get deeply involved, probably because they have more free time, flexibility in their plans, and openness to new ideas. We also have a strong track record with this group, and they are easy to reach. The main downsides are the long time it takes for students to make an impact and the possibility of value drift,[1] but we still think this should be a part of EA’s community building portfolio.

Groups are one of the top ways that young people get more deeply involved (as demonstrated by the EA survey and interviews with community members). We are especially focused on supporting groups at highly ranked universities, which have a high concentration of people who will be influential in the future. We think that targeted events and resources can bolster student groups, and we aim to integrate events and the EA Forum more closely with groups work (as we did with the introductory events and Student Summit this year).

We aim for these people to take significant actions (a career plan/change or a 10% giving pledge) based on a good understanding of EA principles. We plan to carry out interviews to learn what actions they’ve taken (and why), but we don’t feel that we’re well-placed to rank individuals or career options, so we will be relatively agnostic as long as they seem to have a good understanding of the ideas. We hope that our focus on extremely promising populations and quality/in-depth presentations of key ideas will allow us to recruit excellent people. Throughout this work, we aim to be welcoming to people of different backgrounds.

We hope to retain these young people as they progress into exceptionally high-impact roles: this allows them to change their priorities in the future, develop knowledge and networks that will help them, and help to onboard new people. Related projects include EA Global, the Forum, and city groups.

We plan to maintain risk reduction work that preserves EA’s ability to grow and produce value in the future. This includes work on PR, diversity, risky actors, and epistemics.

We do not plan to work on reaching mid-career professionals or high net worth individuals. We also do not plan to work on fundraising, grantmaking (other than to groups), cause-specific community building, career advice, or research.

2021 plans

We are not confident that these are the best focuses, and they don’t cover all the ways we’ve created value in the past. However, we think that it’s useful to be more focused, and my impression is that these goals are facilitating collaboration between teams, generating new ideas, and stimulating open discussion about how the goals should be improved and changed. I think that our events and groups teams got more done in the past five months in significant part because we had clearer goals.


We aim to help onboard new students and young professionals, particularly from top universities and underrepresented groups. We plan to do this by following a cohort of students through the academic year,[2] and helping them to engage with quality content, make connections, and take action.


Existing community members need to continue to develop their understanding and networks, and to sustain their motivation. We aim to increase the total value that highly engaged EAs get from EA discussion spaces by 30%, via two areas: engaging with high-quality content (which will encourage continued learning) and making social/professional connections.

Continued learning

Quality connections

Risk reduction

We will focus on the following areas:

In addition, we will maintain other aspects of community health work. For more details, see the community health section below.

Executive team and operations

We plan to:

We also plan to hire to support the organizational goals above, likely by hiring additional groups support.

Plan for 2022

We expect to continue to improve on our work in 2021. Whilst I expect our recruitment work to be increasingly student-focused and goal-oriented, I anticipate that our work to support existing community members and reduce risks will remain broader.


All program plans are our current drafts, which we may update. They are not intended to be commitments.

Community Building Grants

Baseline Budget: $1.29M

Expansion Budget: $1.68M

FTE: 1.0


  1. Increase community building capacity of priority groups by the equivalent of 5 FTEs.
  2. The majority of grants made are highly valuable based on well-defined criteria.
  3. CBGs are considered to be one of the most promising career options by top group organizers.

Location-specific application rounds

Our top priority is to make better grants. Roughly, the best grants come when we fund a strong organizer in an important location. Our experience in NYC and Cambridge suggests that location-specific application rounds are a good way of finding unusually strong organizers in important locations. They are also more likely to be counterfactual than reactive grants. We think these benefits outweigh the increased evaluation costs, and plan to run more such rounds this year.

Enhanced support

In order to attract top organizers, the program needs to be considered one of the best available career opportunities for them. Therefore, we would also like to provide individual mentorship, training, networking opportunities, and a smooth experience for grantees. We might hire a contractor or staff member to provide this extra support.

Defining success

The goals refer to priority groups and grants that are highly valuable. We would like to develop those definitions for different types of groups. We generally feel like we understand these qualities for university groups better than we understand city- or national-level strategy. We are planning to spend some time thinking about strategy for these groups.

Group Support

Baseline Budget: $523,000

Expansion Budget: $734,000

FTE: 3.0, plus 6 contractors


  1. Build a system for onboarding new EAs.
  2. Improve the understanding and networks of group organizers and EAs in key cities by sharing content and supporting knowledge sharing.
  3. Maintain (or improve) our existing support for groups.
  4. Maintain our work to mitigate key risks in groups by referring cases to the Community Health team.

Onboarding new EAs

The groups team will take primary responsibility for the organization-wide recruitment goal. We’ll focus on following a cohort of students[7] through the academic year, and helping them to engage with quality content, make connections, and take action. Following a cohort of students will help us to understand their bottlenecks and develop the most useful resources.

One-to-one follow-up

We are testing a system in which individuals who attended the EA Student Summit receive a follow-up from their local group leaders or another experienced member of the community. We also plan to continue notifying them about relevant content and future events.

Fellowships and resource development

We think that one of the best ways to give students a high-fidelity introduction to EA principles is through fellowships. A multi-week fellowship increases commitment and provides a space to read, discuss, and process core materials. As a result, we’re prioritizing:

We’re still collecting data on which other resources attendees of EA introductory fellowships and the Student Summit want, but we’ve seen initial demand for career fellowships, cause area fellowships, social events, and podcast/book discussion groups.

University group support

We are exploring making a hire to add capacity to support university groups. Assuming we can make this hire, we plan to:

Underrepresented groups

We are exploring opportunities to provide an additional personal touchpoint for students from underrepresented demographic groups who aren’t yet highly engaged after the student summit / introductory fellowship. Following this cohort of students will allow us to better understand any specific barriers these groups face, so that we can better tailor our support.

Improve the understanding and networks of group organizers and EAs in hub cities

Our largest focus within the groups team is on recruitment, but within our retention goal above, we plan to focus on two user groups: existing EAs in EA hub cities, and group organizers.

Broad support and organizer training

In addition to the above, we aim to maintain or improve the resources and training available to all organizers.

This will likely include:

In addition, we will track and aim to improve organizer satisfaction with our support.

Mitigate key risks posed by groups or group members

The Groups team interacts with hundreds of group organizers, so we’re well placed to scan for risks and escalate cases to the Community Health team. These include PR risks, risks of low-fidelity translation, conflicts within the group, and cultural/epistemic issues. We aim to maintain our current work here.


We are currently hiring someone to mentor university groups and create quality group resources. We currently have approximately two FTE contractors, who are piloting and sharing resources for groups, and we have set aside funding to add four more contractors (for a total of six) across this area, to scale it quickly while minimizing risk.

Effective Altruism Forum and online content

Baseline Budget: $361,000 (baseline is the same as expansion)

FTE: 2.5


Potential activities

We think that the following activities might help us to achieve these goals:


Baseline Budget: $1.328M (baseline is the same as expansion)

FTE: 4.5 (new hire under expansion)

We plan to run a similar program of events to last year. However, our 2021 plans are still quite uncertain, because:

The table below shows our best guess under different scenarios. The plans below will depend on advice from our COVID Advisory Board. Our current best estimates are a 30-35% chance that the London event can go ahead in person and a 60-70% chance that the San Francisco event can.

We are planning to run six events in 2021; three of these may be held either virtually or in person, depending on the advice of our advisory board.

Event plans

In addition, the team will provide funding and support to selected EAGx organizers. These events may be virtual or in-person, depending on COVID.

Community health

Baseline Budget: $471,000

Expansion Budget: $566,000

FTE: 3.5 (new hire under expansion)


In Q3, we conducted an internal analysis of which of the risks we monitor[8] to focus on. We made the following estimates:

Risk table

As a result, we’re starting to put more effort into the first two risk areas.


In general, a good chunk of our work is responsive, and we want to stay flexible as new opportunities arise. Below is a proposal of what we might do given our current capacity and understanding of key risks.

See the groups support section for more on our proposed work on DEI.

In addition, we aim to keep the following areas stable compared to 2020:


Baseline Budget: $1.08M

Expansion Budget: $1.13M

FTE: 4.3 (new hire under expansion)


Max's priorities:

Joan will be focused on driving forward our work on recruiting students, especially from top universities, and on gathering data to track and inform progress towards our goals.

Other plans:

Plus ongoing staff support work, like tracking and addressing staff morale issues; biannual feedback rounds; staff satisfaction interviews; digital content to support remote staff engagement; updating our compensation policy; encouraging upward feedback; and tracking and enacting potential role changes that better match people’s responsibilities to their areas of strength.


Baseline Budget: $626,000

Expansion Budget: $640,000

FTE: 5.25 (new hire under expansion)


Our key team metric is the internal satisfaction score from the annual survey of all operations customers (employees and grantmakers, weighted towards those we serve most directly).

Budget for 2021


For 2021, we have set a baseline budget of $5.60M and an expansion budget of $6.28M, compared to last year’s budget of $6.02M.

The chart below shows CEA’s plans for 2021 under both baseline and expansion, against the 2020 budget:

As discussed in the executive summary above, CEA is focusing on three key areas: recruitment of EAs, retention of highly engaged community members, and risk reduction. By grouping our programs into their respective focus areas, we can see that our biggest investment is in recruitment to EA.

2021 budget (fixed)

Net budget

We derive our net budget by summing the budgeted expenditure on programs and deducting non-fundraising income, such as EA Global tickets.

2021 net budget (fixed)

Funding gap

As discussed above, we believe we’ve made good progress overall in 2020, and we believe we’re set up to improve further. Over the past two years, we have increased our focus on the most important aspects of our work whilst improving execution and keeping costs relatively stable.

In 2020, our board agreed that in order to increase resilience, in line with other EA organizations and advised standard practice, CEA should aim to have 12 months of runway at all times. This will also help us to make multi-year commitments for Community Building Grants (increasing stability for grantees), make multi-year commitments to event venues (reducing costs), and attract staff. As such, we are seeking funding through to December 2022.

Our funding gap estimate is equal to our two-year budget, less our balance on hand and our fundraising target from some of our major long-standing donors. We estimate our funding gap to be $2.2M on our expansion budget.
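The arithmetic behind this estimate can be sketched as follows. Only the 2021 expansion budget ($6.28M) comes from this post; the flat-2022-budget assumption, the balance on hand, and the long-standing-donor target are hypothetical placeholders, since the post does not state them:

```python
# Sketch of the funding gap arithmetic described above.
# Only the annual expansion budget ($6.28M) is taken from the post;
# the 2022 budget, balance on hand, and donor target are hypothetical.
annual_expansion_budget = 6.28  # $M, from the 2021 budget section

# Two-year budget, assuming (hypothetically) a flat budget for 2022.
two_year_budget = 2 * annual_expansion_budget  # runway through December 2022

balance_on_hand = 6.00            # $M, hypothetical placeholder
longstanding_donor_target = 4.36  # $M, hypothetical placeholder

funding_gap = two_year_budget - balance_on_hand - longstanding_donor_target
print(f"Funding gap: ${funding_gap:.2f}M")
```

With those placeholder figures, the gap works out to $2.20M, matching the stated estimate; with CEA's actual balance and donor commitments the intermediate numbers would differ.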

Funding gap

If you are considering making a major donation, please get in touch and we will be able to give you more details of our current funding situation.

  1. Value drift occurs when someone’s values change over time, leading them to e.g. be less interested in effective altruism at age 25 than they were at age 20. ↩︎

  2. All students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020. ↩︎

  3. Measured by 30 day trailing average of 0.5 people/day referred from Forum to groups, events, or one-to-one introductions by EOY 2021. This might involve developing Forum profiles (adding pictures, extra fields for information, tagging, etc.), although we’re not sure if we will actually do this. ↩︎

  4. Our best guess is that we want this to increase by 30% year-on-year. ↩︎

  5. Question: “Roughly, how many new valuable connections did you make over the past year? For example, how many new people do you feel comfortable asking for a favour?” We're aiming for a 30% increase in total connections. ↩︎

  6. Same measure as above: 30 day trailing average of 0.5 people/day referred from Forum to groups, events, or one-to-one introductions by EOY 2021. This might involve developing Forum profiles (adding pictures, extra fields for information, tagging, etc.). ↩︎

  7. All students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020. ↩︎

  8. Overall risk: “What’s the likelihood something happens (an event, or a series of events) that causes the expected number of engaged EAs to be net 200 less in the next 5 years [than it otherwise would have been]?”

    Maximum potential tractability: “If we had the entire CEA community health team + 80k and Open Phil taking effort to mitigate this risk, how much of the risk would we reduce?” ↩︎ ↩︎

  9. As rated by key users. ↩︎

  10. By the end of the year, Louis is able to reduce his working hours on core finance tasks by 20%. ↩︎

  11. Requires ~35% less executive time. ↩︎

  12. Improved retention, and donors report a high level of satisfaction with the service they receive. ↩︎


Comments sorted by top scores.

comment by EdoArad (edoarad) · 2020-12-11T07:59:00.981Z · EA(p) · GW(p)

Thanks for this informative write-up!

The mission as stated is 

to build a community of students and professionals acting on EA principles, by nurturing high-quality discussion spaces.

The focus on improving discussion spaces is relatively narrow compared to possible alternatives. Off the top of my head, some other (not necessarily good) alternatives might be:

  1. Directly manage content creation
  2. improving coordination among donors 
  3. lead a centralized information platform
  4. lead a common research agenda
  5. develop a single "brand" for EA and for main cause areas
  6. Support promising individuals and organizations
  7. obtain and develop tools and infrastructure to support EA researchers 
  8. Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)
  9. ...

I think that it makes a lot of sense to keep the focus on discussion platforms, and it seems that you are also naturally working on stuff outside of this focus as needed. I'd be interested to hear a bit more about what you'd want other community initiatives to take on - things you could have taken on yourselves but decided to withdraw from. I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.

Replies from: Maxdalton
comment by MaxDalton (Maxdalton) · 2020-12-11T16:30:02.376Z · EA(p) · GW(p)

Hi Edo, This is something that we’re keen to clarify and might publish more on soon. So thanks for giving me the opportunity to share some thoughts on this!

I think you’re right that this is a narrower mission: this is deliberate.

As we say on our mistakes page:

Since 2016, CEA has had a large number of projects for its size... Running this wide array of projects has sometimes resulted in a lack of organizational focus, poor execution, and a lack of follow-through. It also meant that we were staking a claim on projects that might otherwise have been taken on by other individuals or groups that could have done a better job than we were doing (for example, by funding good projects that we were slow to fund).

Since we wrote that, we have closed EA Grants and spun Giving What We Can out (while continuing to provide operational support), and we’re exploring something similar with EA Funds. I think that this will allow us to be more focused and do an excellent job of the things we are doing.

As you note, there are still many things in the area of building the EA community that we are not doing. Of course these things could be very impactful if well-executed (even though we don’t have the resources to take them on), so we want to let people know what we’re not doing, so they can consider taking them on.

I’ll go through some of the alternatives you mention and talk about how much I think we’ll work in this space. I’ll also share some rough thoughts about what might be needed, but I’m really not an expert in that question - I’d tend to defer to grantmakers about what they’re interested in funding.

A theme in what I write below is that I view CEA as one organization helping to grow and support the EA community, not the organization determining the community’s future. I think it’s mostly good for there not to be one organization determining the community’s future. I think that this isn’t a real change: the EA community’s development was always influenced by a coalition of organizations. But I do think that CEA sometimes aimed to determine the community’s future, or represented itself as doing so. I think this was often a mistake.

Directly manage content creation

We don’t have plans to create more content. We do curate content when that supports productive discussion spaces (e.g. inviting speakers to events, developing curricula for fellowships at groups). We also try to incentivize the creation of quality content via giving speakers a platform and giving out Forum prizes.

80,000 Hours is maybe the most obvious organization creating new content, but many research organizations are also creating useful content, and I think there’s room for more work here (while having high quality standards).

improving coordination among donors

We are currently running EA Funds, which I see as doing some work in this space (e.g. I think Funds and the donor lottery play some of this role). There might be room for extra work in this space (e.g. coordination between major donors), but I think some of this happens informally anyway, and I don’t have a sense of whether there’s a need for more at the moment.

lead a centralized information platform

I’m not sure quite what you have in mind here. I think the Forum is playing this role to some extent: e.g. it has a lot of posts/crossposts of important content, sequences, user profiles, and a tag/wiki system. We also work on the EA Hub resources. We don’t have plans beyond further developing these.

lead a common research agenda

We are not doing this, and we haven’t been doing research since ~2017. I think there are lots of great research organizations (e.g. Global Priorities Institute, Open Philanthropy, Rethink Priorities) that are working on this (though maybe not a single leader - I think this is fine).

develop a single "brand" for EA and for main cause areas

We do not plan to do this for specific cause areas. We do plan to do some work on testing/developing EA’s brand (as mentioned above in the community health section). However, I think that other organisations (e.g. 80,000 Hours) also play an important role, and I think it’s OK (maybe good) if there are a few different representations of EA ideas (which might work well for different audiences).

Support promising individuals and organizations

Supporting organizations: as mentioned in our annual review, we do some work to support organizations as they work through internal conflicts/HR issues. We also currently make grants to other organizations via EA Funds. We also provide operational support to 80,000 Hours, Forethought Foundation, GWWC, and a long-termist project incubator. Other than this, we don’t plan to work in this space.

Supporting individuals: Again, we currently do this to some extent via EA Funds. Historically we focused a bit more on identifying and providing high-touch support to individuals. I think that our comparative advantage is to focus more on facilitating groups and discussion, rather than identifying promising individuals. So this isn’t a current focus, although we do some aspects of this via e.g. support for group leaders. I think that some of this sort of work is done via career development programs like FHI’s Research Scholars Program or Charity Entrepreneurship’s internship program. I also think that lots of organizations do some of this work via their hiring processes. But I think there might be room for extra work identifying and supporting promising individuals.

In terms of non-financial support, the groups team provides support and advice to group organizers, and the community health team provides support to community members experiencing a problem or conflict within the community.

obtain and develop tools and infrastructure to support EA researchers

I think that the Forum provides some infrastructure for public discussion of research ideas. Apart from that, I don’t think this is our comparative advantage and we don’t plan to do this.

Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)

We do some work to convene discussion on this between key organizations/individuals (e.g. I think this sometimes happens on the Forum, and our coordination forum event allows people to discuss such questions, and where they can build relationships that allow them to coordinate more effectively). But we don’t do things that lead to “common answers or community-wide decisions”.

I actually don’t think we need to have a common answer to a lot of these questions: I think it’s important for people to be sharing their reasoning and giving feedback to each other, but often it’s fine or good if there are some different visions for the community’s future, with people working on the aspect of that which feels most compelling to them. For instance, I think that CEA has quite a different focus now from GWWC or Charity Entrepreneurship or OP or GPI, but I think that our work is deeply complementary and the community is better off having a distribution of work like this. I also think that it works pretty well for individuals (e.g. donors, job-seekers) to decide which of those visions they most want to support, thus allowing the most compelling visions to grow.

For similar reasons, I think it would be bad to have a single organization “leading” the community. I think that CEA aspired to play this role in the past but didn’t execute it well. I think that the current slightly-more-chaotic system is likely more robust and innovative than a centralized system (even if it were well-executed). (Obviously there’s some centralization in the current system too - e.g. OP is by far the biggest grantmaker. I don’t have a strong view about whether more or less centralization would be better on the current margin, but I am pretty confident that we don’t want to be a lot more centralized than we currently are.)

Some other things we’re not planning to focus on:

  • Reaching new mid- or late-career professionals (though we are keen to retain mid- or late- career people and to make them feel welcome, we’re focused on recruiting students and young professionals)
  • Reaching or advising high-net-worth donors
  • Fundraising in general
  • Cause-specific work (such as community building specifically for effective animal advocacy, AI safety, biosecurity etc)
  • Career advising
  • Research, except about the EA community

Some of our work will occasionally touch on or facilitate some of the above (e.g. if groups run career fellowships, or city groups do outreach to mid-career professionals), but we won’t be focusing on these areas.

As I mentioned, we might say more on this in a separate post soon.

I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.

I expect this mission to be our long-term focus.

Replies from: edoarad
comment by EdoArad (edoarad) · 2020-12-12T08:22:49.730Z · EA(p) · GW(p)

Thanks! I think that this is clear enough for me to be able to mostly predict how you'd think about related questions :)

I am personally very confused about the benefits of centralization vs. decentralization and how to compare these in particular cases and can find myself drawn to either in different cases. For what it's worth, I like the general heuristic of centralized platforms for decentralized decision-making. 

To investigate it a bit further, I opened this question about possible coordination failures.

comment by AnonymousEAForumAccount · 2020-12-11T14:39:39.434Z · EA(p) · GW(p)

CEA’s Values document (thank you for sharing this) emphasizes the importance of “specific, focused goals.” It’s helpful to see the specific goals that specific teams have, but what do you see as the most important specific goals for CEA as an organization in 2021? I feel like this writeup gives me a sense of your plans for the year, but not the well-defined criteria you currently expect to use at the end of 2021 to judge whether the year was a success.

Replies from: Maxdalton
comment by MaxDalton (Maxdalton) · 2020-12-11T17:35:48.675Z · EA(p) · GW(p)

Hi, thanks for your question! 

The section on 2021 plans is intended to be a summary of these criteria, sorry that wasn’t clear.

  1. One target is focused on recruitment: building a system for onboarding people to EA (to the level where they are taking significant action based on a good understanding of EA principles). Specifically, we aim to help onboard 125 people to this level.
  2. The second target is focused around retention: for people who are already highly engaged EAs, growing the amount of time they spend engaging with high-quality content via our work (e.g. Forum view time or watching a talk from one of our events on YouTube) by 30%, and also growing the number of new connections (e.g. at events) they make by 30%. 
  3. The third target is focused on risk-reduction: this is covered in the community health targets above (which are slightly more tightly specified and fleshed out internally).

Internally we obviously have more details about how we plan to measure/assess these things, but we wanted to just give a summary here. We expect that most of these org-wide goals will be achieved as a collaboration between teams, but we have a single person responsible for each of the org-wide goals. (Operations and executive goals are a bit more complex, and are covered above.)

Replies from: AnonymousEAForumAccount
comment by AnonymousEAForumAccount · 2020-12-11T20:12:50.484Z · EA(p) · GW(p)

This is super helpful - thank you! I feel like I’ve got a much better understanding of your goals now. It really cleared things up to learn which of your multiple goals you're prioritizing most, as well as the precise targets you have for them (since you have a specific recruitment goal it might be worth editing the OP to add that).

I have two followup questions about the recruitment goal.

  1. How did you set your target of recruiting 125 people? That’s much lower than I would have guessed based on other recruitment efforts (GWWC has run a two-month pledge drive that produced three times as many pledges, plus a bunch of people signing up for Try Giving). And with $2.5 million budgeted for recruitment, the implied $20,000 per recruit seems quite high. I feel like I might be misunderstanding what you mean about "following a cohort of students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020" (discussed in the second bullet point).
  2. The recruitment section discusses a “plan to put additional effort into providing mentorship and opportunities for 1:1 connections for group members from demographic groups underrepresented in EA.” Do you have any specific goals for these efforts? For example, I could imagine having a goal that the cohort you recruit be more diverse than the current EA population along certain dimensions. If you don’t have specific goals, what do you plan to look at to know whether your efforts are having the desired effect?
Replies from: Maxdalton
comment by MaxDalton (Maxdalton) · 2020-12-14T20:12:16.377Z · EA(p) · GW(p)

Thanks for your questions. 

Re: target of 125 people. This is a relatively high bar: it’s focused on people who have taken significant action based on a good understanding of EA principles. So the bar is somewhat higher than the GWWC pledge, because we interview people and get a sense of why they chose the path they’re in and what would change their mind. We think that for most people this means >100 hours of engagement with quality content, plus carefully thinking through their career plans and taking action toward those plans (which might include significant donations).

I actually think that $20,000 per person in this position would still be a good deal: the expected lifetime value of a GWWC pledge might be around $73,000, and some people might be doing things significantly more promising than the average GWWC pledge. I also don’t think that will be the full cost, since these people will probably also benefit from e.g. the Forum or 80,000 Hours' resources. However, I also think that these 125 people only represent some of the value that groups work creates (e.g. groups also help persuade people to take less intensive action, and to retain and empower people who are already engaged). I also think there’s a fair chance that we beat this target.

We arrived at 125 by estimating the number of individuals we think met this definition in 2019, applying a ~30% growth rate in the community, and then increasing this number further within key populations. One of our internal benchmarks is that the cohort of engaged EAs recruited in 2021 is more demographically diverse than the cohort of engaged EAs recruited in 2020.

Replies from: AnonymousEAForumAccount
comment by AnonymousEAForumAccount · 2020-12-15T15:54:26.686Z · EA(p) · GW(p)

Thanks for the explanations Max!

comment by MarisaJurczyk · 2020-12-14T02:13:13.174Z · EA(p) · GW(p)

Thanks for the thorough post! I appreciate the transparency CEA has been keeping in its strategy and plans. 

Small question: does EEAs = engaged EAs? Is that defined by a specific metric?

Replies from: Maxdalton
comment by MaxDalton (Maxdalton) · 2020-12-14T08:55:48.085Z · EA(p) · GW(p)

Hey Marisa, thanks, I'm glad you appreciated this! 

Yes, EEAs=highly-engaged EAs (I've now edited this throughout, so that it's a bit less jargon-y). This is a term that we're using internally to refer to people who are taking significant action (e.g. a career plan or a significant giving pledge or similar) based on a detailed understanding of EA ideas.