Posts

CEA update: Q4 2020 2021-01-15T09:55:09.408Z
CEA's strategy as of 2021 2021-01-15T09:42:14.569Z
Things CEA is not doing 2021-01-15T09:33:42.070Z
Giving What We Can & EA Funds now operate independently of CEA 2020-12-22T03:47:48.140Z
CEA's Plans for 2021 2020-12-11T00:27:08.774Z
CEA's 2020 Annual Review 2020-12-10T23:45:51.497Z
What are you grateful for? 2020-11-27T09:47:15.026Z
Some extremely rough research on giving and happiness 2020-09-09T08:33:06.084Z
CEA Mid-year update (2020) 2020-08-11T10:06:41.512Z
CEA's Plans for 2020 2020-04-23T07:50:44.921Z
CEA's 2019 Annual Review 2020-04-23T07:39:59.289Z
The Frontpage/Community distinction 2018-11-16T17:54:15.072Z
Why the EA Forum? 2018-11-07T23:24:49.981Z
Which piece got you more involved in EA? 2018-09-06T07:25:01.218Z
Announcing the Effective Altruism Handbook, 2nd edition 2018-05-02T07:58:24.124Z
Announcing Effective Altruism Grants 2017-06-09T10:28:15.441Z
Returns Functions and Funding Gaps 2017-05-23T08:22:44.935Z
Should we give to SCI or fund research into schistosomiasis? 2015-09-23T15:00:36.615Z

Comments

Comment by maxdalton on CEA update: Q4 2020 · 2021-01-15T22:00:28.439Z · EA · GW

Yes - student groups will be our main priority for additional support in 2021 (we say a bit more about why here, and we discuss what that means concretely for our groups team here). But we’ll be maintaining or expanding the support we give to all groups, including new initiatives like Virtual Programs.

Comment by maxdalton on CEA update: Q4 2020 · 2021-01-15T22:00:11.179Z · EA · GW

We plan to do most hiring through public vacancies, but we will make occasional exceptions when we think we’re very likely to be aware of the top candidates. 

In the first case we wanted to hire someone who had experience leading a successful university group, and since we work closely with many group leaders we felt like we had a good enough sense of the talent pool to do a closed round (where we invited a small number of candidates to do work trials, interviews, etc.). We might do this sort of thing again.

We brought on Sara and Aadil through the same hiring round, for an executive assistant position. Longview Philanthropy and 80,000 Hours had recently advertised a public position for an operations/executive assistant role. With the help of those organizations and their applicants, we were able to cut out some of the early advertisement/screening steps, and focus on some of their top candidates. I think this saved us a fair amount of time without compromising much on accessibility/fairness. We might do similar things in the future.

But I acknowledge the costs you share, and we plan to normally invite public applications (as for this new contractor role, and for a finance/data role that we plan to post soon).

Comment by maxdalton on Things CEA is not doing · 2021-01-15T21:58:29.340Z · EA · GW

Thanks for sharing this feedback Ozzie, and for acknowledging the tradeoffs too.

I have a different intuition on the centralization tradeoff - I generally feel like things will go better if we have a lot of more focused groups working together vs. one big organization with multiple goals. I don’t think I’ll be able to fully justify my view here. I’m going to timebox this, so I expect some of it will be wrong/confused/confusing.

Examples: I think that part of the reason why 80,000 Hours has done well is that they have a clear and relatively narrow mission that they’ve spent a lot of time optimizing towards. Similarly, I think that GWWC has a somewhat different goal from CEA, and I think both CEA and GWWC will be better able to succeed if they can focus on figuring out how to achieve their goals. I hope for a world where there are lots of organizations doing similar things in different spaces. I think that when CEA was doing grantmaking and events and a bunch of other things it was less able to learn and get really good at any one of those things. Basically, I think there are increasing returns to work on lots of these issues, so focusing more on fewer issues is good.

It matters to get really good at things because these opportunities can be pretty deep: I think returns don’t diminish very quickly. E.g. we’re very far from having a high-quality widely-known EA group in every highly-ranked university in the world, and that’s only one of the things that CEA is aiming to do in the long run. If we tried to do a lot of other things, we’d make slower progress towards that goal. Given that, I think we’re better off focusing on a few goals and letting others pick up other areas.

I also think that, as a movement, we have some programs (e.g. Charity Entrepreneurship, the longtermist entrepreneurship project, plus active grantmaking from Open Philanthropy and EA Funds) which might help to set up new organizations for success.

We will continue to do some of the work we currently do to help to coordinate different parts of the community - for instance the EA Coordination Forum (formerly Leaders Forum), and a lot of the work that our community health team do. The community health team and funders (e.g. EA Funds) also do work to try to minimize risks and ensure that high-quality projects are the ones that get the resources they need to expand.

I also think your point about ops overhead is a good one - that’s why we plan to continue to support 80k, Forethought, GWWC, EA Funds, and the longtermist incubator operationally. Together, our legal entities have over 40 full time staff, nearly 100 people on payroll, and turnover of over $20m. So I think we’re reaping some good economies of scale on that front. 

Finally, I think a more decentralized system will be more robust - I think that previously CEA was too close to being a single point of failure.

Comment by maxdalton on CEA's Plans for 2021 · 2020-12-14T20:12:16.377Z · EA · GW

Thanks for your questions. 

Re: target of 125 people. This is a relatively high bar: it’s focused on people who have taken significant action based on a good understanding of EA principles. So the bar is somewhat higher than the GWWC pledge, because we interview people and get a sense of why they chose the path they’re in and what would change their mind. We think that for most people this means >100 hours of engagement with quality content, plus carefully thinking through their career plans and taking action toward those plans (which might include significant donations).

I actually think that $20,000 per person in this position would still be a good deal: the expected lifetime value of a GWWC pledge might be around $73,000, and some people might be doing things significantly more promising than the average GWWC pledge. I don’t think that will be the full cost - these people will probably also benefit from other resources, e.g. the Forum or 80k resources. However, I also think that these 125 people only represent some of the value that groups work creates (e.g. groups also help persuade people to take less intensive action, and to retain and empower people who are already engaged). I also think there’s a fair chance that we beat this target.

We arrived at 125 by estimating the number of individuals we think met this definition in 2019, applying a ~30% growth rate in the community, and then increasing this number further within key populations. One of our internal benchmarks is that the cohort of engaged EAs recruited in 2021 is more demographically diverse than the cohort of engaged EAs recruited in 2020.

Comment by maxdalton on CEA's Plans for 2021 · 2020-12-14T08:55:48.085Z · EA · GW

Hey Marisa, thanks, I'm glad you appreciated this! 

Yes, EEAs=highly-engaged EAs (I've now edited this throughout, so that it's a bit less jargon-y). This is a term that we're using internally to refer to people who are taking significant action (e.g. a career plan or a significant giving pledge or similar) based on a detailed understanding of EA ideas.  

Comment by maxdalton on CEA's Plans for 2021 · 2020-12-11T17:35:48.675Z · EA · GW

Hi, thanks for your question! 

The section on 2021 plans is intended to be a summary of these criteria, sorry that wasn’t clear. 

  1. One target is focused on recruitment: building a system for onboarding people to EA (to the level where they are taking significant action based on a good understanding of EA principles). Specifically, we aim to help onboard 125 people to this level.
  2. The second target is focused around retention: for people who are already highly engaged EAs, growing the amount of time they spend engaging with high-quality content via our work (e.g. Forum view time or watching a talk from one of our events on YouTube) by 30%, and also growing the number of new connections (e.g. at events) they make by 30%. 
  3. The third target is focused on risk-reduction: this is covered in the community health targets above (which are slightly more tightly specified and fleshed out internally).

Internally we obviously have more details about how we plan to measure/assess these things, but we wanted to just give a summary here. We expect that most of these org-wide goals will be achieved as a collaboration between teams, but we have a single person responsible for each of the org-wide goals. (Operations and executive goals are a bit more complex, and are covered above.)

Comment by maxdalton on CEA's 2020 Annual Review · 2020-12-11T17:35:17.342Z · EA · GW

Hi Brian, thanks for your question, and I’m glad the update was useful!

You’re correct about the overall approach we’re using (multiplying the expected value of the change by how much of that change is attributable to the group). I’ll flag this comment to Harri and he might follow up with some more details, publicly or privately.

Comment by maxdalton on CEA's 2020 Annual Review · 2020-12-11T17:24:39.122Z · EA · GW

Hi Brian, Thanks for your question! I’m not sure how much we can comment on the investment strategy or grantmaking of this fund, but I’ll flag your questions to Carl.

Comment by maxdalton on CEA's Plans for 2021 · 2020-12-11T16:30:02.376Z · EA · GW

Hi Edo, This is something that we’re keen to clarify and might publish more on soon. So thanks for giving me the opportunity to share some thoughts on this!

I think you’re right that this is a narrower mission: this is deliberate.

As we say on our mistakes page:

Since 2016, CEA has had a large number of projects for its size...Running this wide array of projects has sometimes resulted in a lack of organizational focus, poor execution, and a lack of follow-through. It also meant that we were staking a claim on projects that might otherwise have been taken on by other individuals or groups that could have done a better job than we were doing (for example, by funding good projects that we were slow to fund).

Since we wrote that, we have closed EA Grants and spun Giving What We Can out (while continuing to provide operational support), and we’re exploring something similar with EA Funds. I think that this will allow us to be more focused and do an excellent job of the things we are doing.

As you note, there are still many things in the area of building the EA community that we are not doing. Of course these things could be very impactful if well-executed (even though we don’t have the resources to take them on), so we want to let people know what we’re not doing, so they can consider taking them on.

I’ll go through some of the alternatives you mention and talk about how much I think we’ll work in this space. I’ll also share some rough thoughts about what might be needed, but I’m really not an expert in that question - I’d tend to defer to grantmakers about what they’re interested in funding.

A theme in what I write below is that I view CEA as one organization helping to grow and support the EA community, not the organization determining the community’s future. I think it’s mostly good for there not to be one organization determining the community’s future. I think that this isn’t a real change: the EA community’s development was always influenced by a coalition of organizations. But I do think that CEA sometimes aimed to determine the community’s future, or represented itself as doing so. I think this was often a mistake.

Directly manage content creation

We don’t have plans to create more content. We do curate content when that supports productive discussion spaces (e.g. inviting speakers to events, developing curricula for fellowships at groups). We also try to incentivize the creation of quality content via giving speakers a platform and giving out Forum prizes.

80,000 Hours is maybe the most obvious organization creating new content, but many research organizations are also creating useful content, and I think there’s room for more work here (while having high quality standards).

improving coordination among donors

We are currently running EA Funds, which I see as doing some work in this space (e.g. I think Funds and the donor lottery play some of this role). There might be room for extra work in this space (e.g. coordination between major donors), but I think some of this happens informally anyway, and I don’t have a sense of whether there’s a need for more at the moment.

lead a centralized information platform

I’m not sure quite what you have in mind here. I think the Forum is playing this role to some extent: e.g. it has a lot of posts/crossposts of important content, sequences, user profiles, and a tag/wiki system. We also work on the EA Hub resources. We don’t have plans beyond further developing these.

lead a common research agenda

We are not doing this, and we haven’t been doing research since ~2017. I think there are lots of great research organizations (e.g. Global Priorities Institute, Open Philanthropy, Rethink Priorities) that are working on this (though maybe not a single leader - I think this is fine).

develop a single "brand" for EA and for main cause areas

We do not plan to do this for specific cause areas. We do plan to do some work on testing/developing EA’s brand (as mentioned above in the community health section). However, I think that other organizations (e.g. 80,000 Hours) also play an important role, and I think it’s OK (maybe good) if there are a few different representations of EA ideas (which might work well for different audiences).

Support promising individuals and organizations

Supporting organizations: as mentioned in our annual review, we do some work to support organizations as they work through internal conflicts/HR issues. We also currently make grants to other organizations via EA Funds. We also provide operational support to 80,000 Hours, Forethought Foundation, GWWC, and a long-termist project incubator. Other than this, we don’t plan to work in this space.

Supporting individuals: Again, we currently do this to some extent via EA Funds. Historically we focused a bit more on identifying and providing high-touch support to individuals. I think that our comparative advantage is to focus more on facilitating groups and discussion, rather than identifying promising individuals. So this isn’t a current focus, although we do some aspects of this via e.g. support for group leaders. I think that some of this sort of work is done via career development programs like FHI’s Research Scholars Program or Charity Entrepreneurship’s internship program. I also think that lots of organizations do some of this work via their hiring processes. But I think there might be room for extra work identifying and supporting promising individuals.

In terms of non-financial support, the groups team provides support and advice to group organizers, and the community health team provides support to community members experiencing a problem or conflict within the community.

obtain and develop tools and infrastructure to support EA researchers

I think that the Forum provides some infrastructure for public discussion of research ideas. Apart from that, I don’t think this is our comparative advantage and we don’t plan to do this.

Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)

We do some work to convene discussion on this between key organizations/individuals (e.g. I think this sometimes happens on the Forum, and our coordination forum event allows people to discuss such questions and build relationships that allow them to coordinate more effectively). But we don’t do things that lead to “common answers or community-wide decisions”.

I actually don’t think we need to have a common answer to a lot of these questions: I think it’s important for people to be sharing their reasoning and giving feedback to each other, but often it’s fine or good if there are some different visions for the community’s future, with people working on the aspect of that which feels most compelling to them. For instance, I think that CEA has quite a different focus now from GWWC or Charity Entrepreneurship or OP or GPI, but I think that our work is deeply complementary and the community is better off having a distribution of work like this. I also think that it works pretty well for individuals (e.g. donors, job-seekers) to decide which of those visions they most want to support, thus allowing the most compelling visions to grow.

For similar reasons, I think it would be bad to have a single organization “leading” the community. I think that CEA aspired to play this role in the past but didn’t execute it well. I think that the current slightly-more-chaotic system is likely more robust and innovative than a centralized system (even if it were well-executed). (Obviously there’s some centralization in the current system too - e.g. OP is by far the biggest grantmaker. I don’t have a strong view about whether more or less centralization would be better on the current margin, but I am pretty confident that we don’t want to be a lot more centralized than we currently are.)

Some other things we’re not planning to focus on:

  • Reaching new mid- or late-career professionals (though we are keen to retain mid- or late- career people and to make them feel welcome, we’re focused on recruiting students and young professionals)
  • Reaching or advising high-net-worth donors
  • Fundraising in general
  • Cause-specific work (such as community building specifically for effective animal advocacy, AI safety, biosecurity etc)
  • Career advising
  • Research, except about the EA community

Some of our work will occasionally touch on or facilitate some of the above (e.g. if groups run career fellowships, or city groups do outreach to mid-career professionals), but we won’t be focusing on these areas.

As I mentioned, we might say more on this in a separate post soon.

I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.

I expect this mission to be our long-term focus.

Comment by maxdalton on What are you grateful for? · 2020-11-27T10:06:12.470Z · EA · GW

I'm grateful to colleagues who have worked hard through a sometimes-difficult year, been willing to try out new things (like online events), and somehow kept a sense of fun through it all.

I'm especially grateful when they point out ways I could do better and help me to grow. 

Comment by maxdalton on What are you grateful for? · 2020-11-27T10:02:39.197Z · EA · GW

I'm grateful to group leaders: running a group can be difficult and most people do it on top of full time work or studies. It requires so many different skills - being socially adept, knowing the latest research, and being able to orchestrate complex plans. 

And I think it's really important work: it creates a personal and sustained way for people to learn about EA and decide to take action. Empirically, loads of great people got into EA this way.

Comment by maxdalton on What are you grateful for? · 2020-11-27T09:58:03.017Z · EA · GW

I'm grateful that effective altruism gives me a sense of purpose and a global community. 

It feels like it fills some of the human need I have to be part of a village.

Comment by maxdalton on What are you grateful for? · 2020-11-27T09:48:51.348Z · EA · GW

(I know I'm one day late for Thanksgiving! I hope that people who celebrated it had a good day.)

Comment by maxdalton on What quotes do you find most inspire you to use your resources (effectively) to help others? · 2020-11-19T09:36:00.944Z · EA · GW

"One day we ... may have the luxury of going to any length in order to prevent a fellow sentient mind from being condemned to oblivion unwillingly. If we ever make it that far, the worth of a life will be measured not in dollars, but in stars.

"That is the value of a life. It will be the value of a life then, and it is the value of a life now.

"So when somebody offers $10 to press that button, you press it. You press the hell out of it. It's the best strategy available to you; it's the only way to save as many people as you can. But don't ever forget that this very fact is a terrible tragedy.

"Don't ever forget about the gap between how little a life costs and how much a life is worth. For that gap is an account of the darkness in this universe, it is a measure of how very far we have left to go."

 - Nate Soares, The Value of a Life

Comment by maxdalton on What quotes do you find most inspire you to use your resources (effectively) to help others? · 2020-11-19T09:30:15.934Z · EA · GW

"Three passions, simple but overwhelmingly strong, have governed my life: the longing for love, the search for knowledge, and unbearable pity for the suffering of mankind. These passions, like great winds, have blown me hither and thither, in a wayward course, over a great ocean of anguish, reaching to the very verge of despair.

"I have sought love, first, because it brings ecstasy - ecstasy so great that I would often have sacrificed all the rest of life for a few hours of this joy. I have sought it, next, because it relieves loneliness--that terrible loneliness in which one shivering consciousness looks over the rim of the world into the cold unfathomable lifeless abyss. I have sought it finally, because in the union of love I have seen, in a mystic miniature, the prefiguring vision of the heaven that saints and poets have imagined. This is what I sought, and though it might seem too good for human life, this is what--at last--I have found.

"With equal passion I have sought knowledge. I have wished to understand the hearts of men. I have wished to know why the stars shine. And I have tried to apprehend the Pythagorean power by which number holds sway above the flux. A little of this, but not much, I have achieved.

"Love and knowledge, so far as they were possible, led upward toward the heavens. But always pity brought me back to earth. Echoes of cries of pain reverberate in my heart. Children in famine, victims tortured by oppressors, helpless old people a burden to their sons, and the whole world of loneliness, poverty, and pain make a mockery of what human life should be. I long to alleviate this evil, but I cannot, and I too suffer.

"This has been my life. I have found it worth living, and would gladly live it again if the chance were offered me."

 - Prologue to Bertrand Russell's Autobiography.

Comment by maxdalton on What types of charity will be the most effective for creating a more equal society? · 2020-10-13T10:06:33.522Z · EA · GW

Thanks! Those are both good points. I think you're right that they're open to changing their minds about some important aspects of their worldview (though I do think that "Please, if you disagree with me, carry your precious opinion elsewhere." is some evidence that there are aspects that they're not very open to changing their mind about).

I also think that I reacted too strongly to the emotionally laden language - I agree this can be justified and appropriate, though I think it can also make collaborative truth-seeking harder. This makes me think that it's good to acknowledge, feel, and empathize with anger/sadness, whilst still being careful about the potential impact it might have when we're trying to work together to figure out what to do to help others. I do still feel worried about some sort of oversimplification/overconfidence wrt "all other problems are just derivatives".

To be clear, I always thought it was good to engage in discussion here rather than downvote, but I'm now a bit more optimistic about the dialogue going well.

Comment by maxdalton on What types of charity will be the most effective for creating a more equal society? · 2020-10-13T05:57:15.186Z · EA · GW

I didn't downvote, but I imagine people are reacting to a couple of phrases in the OP:

Please, if you disagree with me, carry your precious opinion elsewhere. I am only interested in opinions on how to most effectively create a more equal society.

I think that being open to changing your mind is an important norm. I think you could read this sentence as a very reasonable request to keep this discussion on topic, but I worry that it is a more general stance. (I also find the phrasing a bit rude.)

Some of the other phrases (e.g. "conviction", "deeply sick", "all other problems are just derivatives") make me worry about whether this person will change their mind, make me worry that they're overconfident, and make me worry that they'll use heated discourse in arguments rather than collaboratively truth seeking. All of these (if true) would make me a bit less excited about welcoming them to the community.

I also think that I could be reading too much into such phrases - I hope this person will go on to engage open-mindedly in discussion.

I really liked your answer - I think it's absolutely worth sharing resources, gently challenging, and reinforcing norms around open-minded cause prio. I personally think that that's a better solution than downvoting, if people have the time to do so.

Comment by maxdalton on CEA Mid-year update (2020) · 2020-10-08T09:35:52.415Z · EA · GW

Did you do any benchmarking for retention?

We did some light googling and also got some data from other EA orgs. Retention varies a lot by industry, but broadly I think our retention rates are now comparable to or a bit better than the relevant average.

Do you know what the convention is for counting part-time versus full-time, interns, student employees, etc.?

Not sure what the convention is, but we looked at full-time employees.

Voluntary versus involuntary turnover?

Again, not sure what the convention is, but we're including both. I think that voluntary turnover is also a mildly negative sign (of bad hiring or training).

Comment by maxdalton on EA Survey Series 2019: How many EAs live in the main EA hubs? · 2020-09-03T10:04:05.266Z · EA · GW

Thanks for the extra analysis, that's interesting. Good point that it depends on your purpose.

Also, just to be clear, I didn't intend this as a criticism of the OP at all - this point just came up in conversation yesterday and I thought it was worth sharing! I find these posts really helpful and keep coming back to them as I think through all sorts of problems.

Comment by maxdalton on EA Survey Series 2019: How many EAs live in the main EA hubs? · 2020-09-03T08:43:07.116Z · EA · GW

I think some people might look at this to choose which EA hub to live in or where to found an organization (of course, not everyone can/should live in a hub).

I think it is easy to overlook the density of EAs when making such a decision: e.g. Oxford's population is ~60x smaller than London's and its land area is maybe 100-300x smaller. So the travel time to visit another random EA tends to be much lower, and it's a lot more likely that you bump into people on the street. (My impression is that Berkeley is somewhere between Oxford and London, but I don't know the details.)

Comment by maxdalton on CEA Mid-year update (2020) · 2020-08-13T19:02:39.234Z · EA · GW

Me too!

Comment by maxdalton on CEA Mid-year update (2020) · 2020-08-13T19:01:56.753Z · EA · GW

Thanks for sharing this! I found it somewhat surprising that the scale of effect looks like it's bigger for comments vs. posts. (I imagine that the difference in significance is also partly that the sample size for posters is much smaller, so it's harder to reach a significance threshold.)

Comment by maxdalton on EA Forum feature suggestion thread · 2020-06-30T07:43:07.640Z · EA · GW

I don't know if you've seen ea.greaterwrong.com - that has a dark mode (in the left hand menu). 

Comment by maxdalton on What are some good charities to donate to regarding systemic racial injustice? · 2020-06-04T14:50:09.018Z · EA · GW

I think that applying EA principles and concepts to different areas is really valuable, even if they’re areas that EA hasn’t focused on a lot up to this point. I’m glad you asked this question!

Comment by maxdalton on CEA's Plans for 2020 · 2020-05-04T09:52:23.213Z · EA · GW

I think a very common failure mode for CEA over the past ~5 years has been: CEA declares they are doing X, now no one else wants to or can get funding to do X, but CEA doesn't actually ever do X, so X never gets done.

I agree with this.  I think we've been making progress both in following through on what we say we'll do and in welcoming others to fill neglected roles, and I'd like to see us continue to make progress, particularly on the latter.

Comment by maxdalton on CEA's Plans for 2020 · 2020-04-27T15:34:41.149Z · EA · GW

I agree that it’s important that CEA reliably and verifiably listens to the community.

I think that we have been listening, and we published some of that consultation - for instance in this post and in the appendix to our 2019 review (see for instance the EA Global section).

Over the next few months we plan to send out more surveys to community members about what they like/dislike about the EA community, and as mentioned above, we’re thinking about using community member satisfaction as a major metric for CEA. If it did become a key metric, it’s likely that we would share some of that feedback publicly.

We don’t currently have plans for a democratic structure, but we’ve talked about introducing some democratic elements (though we probably won’t do that this year). 

Whilst I agree that consultation is vital, I think the benefits of democracy over consultation are unclear. For instance, voters are likely to have spent less time engaging with arguments for different positions, and there is a risk of factionalism. Also, the increased number of stakeholders means that the space of feasible options is reduced, because there are few options that a wide spread of the community could agree on, which makes it harder to pursue more ambitious plans.

I think you’re right that this would increase community support for CEA’s work and make CEA more accountable. I haven’t thought a lot about the options here, and it may be that there are some mechanisms which avoid the downsides. I’d be interested in suggestions.

Anyway, I definitely think it’s important for CEA to listen to the community and be transparent about our work, and I hope to do more of that in the future.

Comment by maxdalton on CEA's Plans for 2020 · 2020-04-27T15:16:25.443Z · EA · GW

Yes, we’ve thought about this. We currently think that it’s probably best for them to spin off separately, so that’s the main option under consideration, but we might change our minds (for instance as we learn more about which candidates are available, and what their strategic vision for the projects would be). 

This is a bit of a busy week for me, so if you’d like me to share more about our considerations, upvote this comment, and I’ll check back next week to see if there’s been sufficient interest.

Comment by maxdalton on CEA's Plans for 2020 · 2020-04-24T15:42:58.739Z · EA · GW

I think this is a really important point, and one I’ve been thinking a lot about over the past month. As you say, I do think that having a strategy is an important starting point, but I don’t want us to get stuck too meta. We’re still developing our strategy, but this quarter we’re planning to focus more on object-level work.  Hopefully we can share more about strategy and object-level work in the future. 

That said, I also think that we’ve made a lot of object-level progress in the last year, and we plan to make more this year, so we might have underemphasized that. You can read more in the (lengthy, sorry!) appendix to our 2019 post, but some highlights are:

  • Responding to 63 community concerns (ranging from minor situations (request for help preparing a workshop about self-care at an event) to major ones (request for help working out what to do about serious sexual harassment in a local group)).
  • Mentoring 50 group organizers, funding 80 projects run by groups, running group organizer retreats and making 30 grants for full-time organizers, with 25 case studies of promising people influenced by the groups we funded.
  • Helping around 1000 EA Global attendees make eight new connections on average, with 350 self-reported minor plan changes and 50 self-reported major plan changes (note these were self-reports, so are nowhere near as vetted as e.g. 80k plan changes).
  • 85% growth over 10 months in our key Forum metric and 34% growth in views of EA Global talks.
  • ~50% growth in donations to EA Funds, and millions in reported donations from GWWC members.

Of course, there are lots of improvements we still need to make, but I still feel happy with this progress, and with the progress we made towards more reliably following through on commitments (e.g. addressing some of the problems with EA Grants). 

Comment by maxdalton on CEA's Plans for 2020 · 2020-04-24T15:40:36.432Z · EA · GW

Sorry, that paragraph wasn’t clear. Before we had offices in Oxford and Berkeley. The change is to close the Berkeley office (for reasons discussed above) and keep the Oxford office open. We think it’s useful to be in Oxford because that’s where a lot of our staff are currently based, and because it allows us to keep in touch with other EA orgs (e.g. the Global Priorities Institute) who share our office in Oxford. 

Comment by maxdalton on CEA's Plans for 2020 · 2020-04-23T14:00:56.986Z · EA · GW

Thanks for your comments! 

>Wasn't GWWC previously independent, before it was incorporated into CEA in 2016?

Essentially, yes. Giving What We Can was founded in 2009. CEA was set up as an umbrella legal entity for GWWC and 80,000 Hours in 2011, but the projects had separate strategies, autonomous leadership etc. In 2016, there was a restructure of CEA such that GWWC and some of the other activities under CEA’s umbrella came together under one CEO (Will MacAskill at that time), whilst 80,000 Hours continued to operate independently. 

>What's changed over the last 5 years to warrant a reversal?

To be honest, I think it’s less that the strategic landscape has changed, and more that the decision 5 years ago hasn’t worked out as well as we hoped. 

(I wasn’t around at the time the decision was made, and I’m not sure if it was the right call in expectation. Michelle (ex GWWC Executive Director) previously shared some thoughts on this on the Forum.)

As discussed here, from 2017 to 2019 CEA did not invest heavily in Giving What We Can. Communications became less frequent and the website lost some features. 

We’ve now addressed the largest of those issues, but the trustees and I think that Giving What We Can is an important project that hasn’t lived up to its (high) potential under the current arrangement (although pledges continue to grow).

Giving What We Can is one of the most successful parts of CEA. Over 4500 members have logged over $125M in donations. Members have pledged to donate $1.5B.  Beyond the money raised, it has helped to introduce lots of people (myself included) to the EA community. This means that we are all keen to invest more in GWWC.

I also think it’s important to narrow CEA’s focus. That focus looks like it’s going to be nurturing spaces for people to discuss and apply EA principles. GWWC is more focused on encouraging a particular activity (pledging to donate to charities). Since it was successfully run as an independent project in the past, trying to spin it out seemed like the right call. I’m leading on this process and trustees are investing a lot of time in it too, and we’ll work very closely with new leadership to test things out and make sure the new arrangement works well.

Comment by maxdalton on Why I'm Not Vegan · 2020-04-10T06:35:52.738Z · EA · GW

"I think there's a very large chance they don't matter at all, and that there's just no one inside to suffer" - this strikes me (for birds and mammals at least) as a statement in direct conflict with a large body of scientific evidence, and to some extent, consensus views among neuroscientists (e.g. the Cambridge Declaration on Consciousness https://en.wikipedia.org/wiki/Animal_consciousness#Cambridge_Declaration_on_Consciousness).

I think that the Cambridge Declaration on Consciousness is weak evidence for the claim that this is a "consensus view among neuroscientists".

From Luke Muehlhauser's 2017 Report on Consciousness and Moral Patienthood:

1. The document reads more like a political document than a scientific document. (See e.g. this commentary.)

2. As far as I can tell, the declaration was signed by a small number of people, perhaps about 15 people, and thus hardly demonstrates a “scientific consensus.”

3. Several of the signers of the declaration have since written scientific papers that seem to treat cortex-required views as a live possibility, e.g. Koch et al. (2016) and Laureys et al. (2015), p. 427.

Comment by maxdalton on EA Leaders Forum: Survey on EA priorities (data and analysis) · 2019-12-08T13:24:18.292Z · EA · GW

(I was the interim director of CEA during Leaders Forum, and I’m now the executive director.) 

I think that CEA has a history of pushing longtermism in somewhat underhand ways (e.g. I think that I made a mistake when I published an “EA handbook” without sufficiently consulting non-longtermist researchers, and in a way that probably over-represented AI safety and under-represented material outside of traditional EA cause areas, resulting in a product that appeared to represent EA, without accurately doing so). Given this background, I think it’s reasonable to be suspicious of CEA’s cause prioritisation. 

(I’ll be writing more about this in the future, and it feels a bit odd to get into this in a comment when it’s a major-ish update to CEA’s strategy, but I think it’s better to share more rather than less.) In the future, I’d like CEA to take a more agnostic approach to cause prioritisation, trying to construct non-gameable mechanisms for making decisions about how much we talk about different causes. An example of how this might work is that we might pay for an independent contractor to try to figure out who has spent more than two years full time thinking about cause prioritization, and then surveying those people. Obviously that project would be complicated - it’s hard to figure out exactly what “cause prio” means, it would be important to reach out through diverse networks to make sure there aren’t network biases etc.

Anyway, given this background of pushing longtermism, I think it’s reasonable to be skeptical of CEA’s approach on this sort of thing.

>When I look at the list of organizations that were surveyed, it doesn’t look like the list of organizations most involved in movement building and coordination. It looks much more like a specific subset of that type of org: those focused on longtermism or x-risk (especially AI) and based in one of the main hubs (London accounts for ~50% of respondents, and the Bay accounts for ~30%).* Those that prioritize global poverty, and to a lesser extent animal welfare, seem notably missing. It’s possible the list of organizations that didn’t respond or weren’t named looks a lot different, but if that’s the case it seems worth calling attention to and possibly trying to rectify (e.g. did you email the survey to anyone or was it all done in person at the Leaders Forum?)

I think you’re probably right that there are some biases here. How the invite process worked this year was that Amy Labenz, who runs the event, draws up a longlist of potential attendees (asking some external advisors for suggestions about who should be invited). Then Amy, Julia Wise, and I voted yes/no/maybe on all of the individuals on the longlist (often adding comments). Amy made a final call about who to invite, based on those votes. I expect that all of this means that the final invite list is somewhat biased by our networks, and some background assumptions we have about individuals and orgs. 

Given this, I think that it would be fair to view the attendees of the event as “some people who CEA staff think it would be useful to get together for a few days” rather than “the definitive list of EA leaders”. I think that we were also somewhat loose about what the criteria for inviting people should be, and I’d like us to be a bit clearer on that in the future (see a couple of paragraphs below). Given this, I think that calling the event “EA Leaders Forum” is probably a mistake, but others on the team think that changing the name could be confusing and have transition costs - we’re still talking about this, and haven’t reached resolution about whether we’ll keep the name for next year.

I also think CEA made some mistakes in the way we framed this post (not just the author, since it went through other readers before publication). I think the post kind of frames this as “EA leaders think X”, which I expect would be the sort of thing that lots of EAs should update on. Even though it does try to explicitly disavow this interpretation (see the section on “What this data does and does not represent”), I think the title suggests something that’s more like “EA leaders think these are the priorities - probably you should update towards these being the priorities”. I think that the reality is more like “some people that CEA staff think it’s useful to get together for an event think X”, which is something that people should update on less. 

We’re currently at a team retreat where we’re talking more about what the goals of the event should be in the future. I think that it’s possible that the event looks pretty different in future years, and we’re not yet sure how. But I think that whatever we decide, we should think more carefully about the criteria for attendees, and that will include thinking carefully about the approach to cause prioritization.

Comment by maxdalton on Movement Collapse Scenarios · 2019-09-05T12:22:09.987Z · EA · GW

Thanks for raising these points, John! I hadn't considered the "cash prize for criticism" idea before, but it does seem like it's worth more consideration.

I agree that CEA could do better on the front of generating criticisms from outside the organization, as well as making it easier for staff to criticize leadership. This is one of the key things that we have been working to improve since I took up the Interim Executive Director role in early 2019. Back in January/February, we did a big push on this, logging around 100 hours of user interviews in a few weeks, and sending out surveys to dozens of community members for feedback. Since then, we've continued to invest in getting feedback, e.g. staff regularly talk to community members to get feedback on our projects (though I think we could do more); similarly, we reach out to donors and advisors to get feedback on how we could improve our projects; we also have various (including anonymous) mechanisms for staff to raise concerns about management decisions. Together, I think these represent more than 0.1% of CEA's staff time. None of this is to say that this is going as well as we'd like - maybe I'd say one of CEA's "known weaknesses" is that I think we could stand to do more of this.

I agree that more of this could be public and transparent also - e.g. I'm aware that our mistakes page (https://centreforeffectivealtruism.org/our-mistakes) is incomplete. We're currently nearing the end of our search for a new CEO, and one of the things that I think they're likely to want to do is to communicate more with the community, and solicit the community's thoughts on future plans.

Comment by maxdalton on New protein alternative produced from CO2 · 2019-08-13T16:50:22.189Z · EA · GW

I wonder if this is also a thing that ALLFED might be interested in - I haven't looked into this much, but the article claims that the process only requires water, CO2, and electricity, which we might have in lots of disaster scenarios. So if production of this were scaled up in the short term, that might be helpful for ALLFED's mission.

Comment by maxdalton on Optimizing Activities Fairs · 2019-07-11T09:20:52.316Z · EA · GW

Thanks for the writeup! I really appreciate people taking the time to share what they've learned. I agree that activities fairs are a really high leverage time for student groups.

My summary of this approach is "Try to get as many email addresses as possible, and anticipate that many people will unsubscribe/never engage". I'd be interested to hear more about why this approach is recommended over others.

I think that this could well be the right approach, but it's not totally clear to me. It could be that having slightly longer conversations with people would build more rapport, give them a better sense of the ideas, and make them a lot more likely to continue to engage, so you get more/higher quality people lower down your funnel. My memory of going to freshers' fairs was that if I had a proper conversation with someone it did make some difference to the likelihood that I engaged later on.

I also worry a bit about the maximizing for email addresses approach coming across as unfriendly.

It does seem right to me that arguing with people isn't worth the time.

I'd be interested in why Eli and Aaron think that the "maximize for email addresses" approach is correct long-term. I could well imagine that they've tried both approaches, and seen more engagement lower down the funnel with the "max for email addresses" approach.

[Speaking from my experience as a groups organizer, not on behalf of CEA]

Comment by maxdalton on Impact investing is only a good idea in specific circumstances · 2018-12-06T12:22:03.949Z · EA · GW

I strong upvoted this. I think it's great to have a reference piece on this, and particularly one which has such a good summary.

Comment by maxdalton on What's Changing With the New Forum? · 2018-11-12T10:23:55.076Z · EA · GW

That's right, this is intended as a feature. All comments and posts start with a weak upvote (we assume you think the thing is good, or you wouldn't have posted it). You can strong upvote your own content, which is designed as a way for you to signal-boost contributions that you think are unusually valuable. Obviously, we don't want people to strong-upvote all their content, and we'll keep an eye on whether that happens.

Comment by maxdalton on Even non-theists should act as if theism is true · 2018-11-09T08:58:20.106Z · EA · GW

To link this to JP's other point, you might be right that subjectivism is implausible, but it's hard to tell how low a credence to give it.

If your credence in subjectivism + model uncertainty (+ I think also constructivism + quasi-realism + maybe others?) is sufficiently high relative to your credence in God, then this weakens your argument (although it still seems plausible to me that theistic moralities end up with a large slice of the pie).

I'm pretty uncertain about my credence in each of those views though.

Comment by maxdalton on Even non-theists should act as if theism is true · 2018-11-09T08:41:04.334Z · EA · GW

Upvote for starting with praise, and splitting out separate threads.

Comment by maxdalton on Burnout: What is it and how to Treat it. · 2018-11-08T18:14:52.298Z · EA · GW

I found the Manager Tools basics podcasts, and the Effective Manager a great way to cover the basics. (But I know others have found them less helpful.)

A great piece on this from the Forum is: Ben West's post on Deliberate Performance in People Management.

Comment by maxdalton on How to use the Forum · 2018-11-08T14:41:52.402Z · EA · GW

As long as you make clear how it's relevant to figuring out how to do as much good as possible, that sort of content is welcome.

Comment by maxdalton on Why the EA Forum? · 2018-11-08T11:48:40.597Z · EA · GW

That's right - one of the main goals of having posts sorted by karma (as well as having two sections) - is to allow people to feel more comfortable posting, knowing that the best posts will rise to the top.

Comment by maxdalton on Which piece got you more involved in EA? · 2018-11-08T11:37:21.945Z · EA · GW

If you highlight the text, a hover appears above the text, and the link icon is one of the options - click on it, paste the url, and press enter.

Comment by maxdalton on Burnout: What is it and how to Treat it. · 2018-11-08T11:14:31.018Z · EA · GW

I sleep a lot better when I'm cooler, and I've found this helpful: https://www.chilitechnology.com/. Others recommend https://bedjet.com/.

Comment by maxdalton on Burnout: What is it and how to Treat it. · 2018-11-08T11:12:03.534Z · EA · GW

Link to Zvi's sequence on LessWrong, which includes the posts you mentioned: https://www.lesswrong.com/s/HXkpm9b8o964jbQ89

Comment by maxdalton on What's Changing With the New Forum? · 2018-11-08T11:03:30.716Z · EA · GW

Hi Richard, I think you're right that "basic concepts" is incorrect: I agree that it's important to discuss advanced ideas which build off each other. We'd want both of the posts you mention to be frontpage posts. I'll suggest an edit to Aaron.

By default, we're moving all content to either Frontpage or Community, since we're trying to have a slightly less active moderation policy than LessWrong. We might revisit this at some point. You can still click on a user's name to see their personal feed of posts.

Comment by maxdalton on Why the EA Forum? · 2018-11-08T10:31:04.352Z · EA · GW

Moderation notice: stickied on community.

Comment by maxdalton on What's Changing With the New Forum? · 2018-11-08T10:30:21.025Z · EA · GW

Moderation notice: Stickied in Community to give context for people familiar with the old Forum.

Comment by maxdalton on Keeping Absolutes in Mind · 2018-11-07T10:36:35.775Z · EA · GW

I agree with your point about subjective expected value (although realized value is evidence for subjective expected value). I'm not sure I understand the point in your last paragraph?

Comment by maxdalton on Keeping Absolutes in Mind · 2018-11-06T12:28:32.289Z · EA · GW

Strong upvote. I think this is an important point, nicely put.

A slightly different version of this, which I think is particularly insidious, is feeling bad about doing a job which is your comparative advantage. If I think Cause A is the most important, it's tempting to feel that I should work on Cause A even if I'm much better at working on Cause B, and that's my comparative advantage within the community. This also applies to how one should think about other people - I think one should praise people who work on Cause B if that's the thing that's best for their skills/motivations.