Posts

CEA update: Q1 2021 2021-04-22T05:36:36.592Z
CEA update: Q4 2020 2021-01-15T09:55:09.408Z
CEA's strategy as of 2021 2021-01-15T09:42:14.569Z
Things CEA is not doing 2021-01-15T09:33:42.070Z
Giving What We Can & EA Funds now operate independently of CEA 2020-12-22T03:47:48.140Z
CEA's Plans for 2021 2020-12-11T00:27:08.774Z
CEA's 2020 Annual Review 2020-12-10T23:45:51.497Z
What are you grateful for? 2020-11-27T09:47:15.026Z
Some extremely rough research on giving and happiness 2020-09-09T08:33:06.084Z
CEA Mid-year update (2020) 2020-08-11T10:06:41.512Z
CEA's Plans for 2020 2020-04-23T07:50:44.921Z
CEA's 2019 Annual Review 2020-04-23T07:39:59.289Z
The Frontpage/Community distinction 2018-11-16T17:54:15.072Z
Why the EA Forum? 2018-11-07T23:24:49.981Z
Which piece got you more involved in EA? 2018-09-06T07:25:01.218Z
Announcing the Effective Altruism Handbook, 2nd edition 2018-05-02T07:58:24.124Z
Announcing Effective Altruism Grants 2017-06-09T10:28:15.441Z
Returns Functions and Funding Gaps 2017-05-23T08:22:44.935Z
Should we give to SCI or fund research into schistosomiasis? 2015-09-23T15:00:36.615Z

Comments

Comment by MaxDalton (Maxdalton) on Thoughts on being overqualified for EA positions · 2021-05-07T09:47:30.006Z · EA · GW

If people aren't listening to Bob because they don't like his leadership style, then I would say that Bob is a bad culture fit (or, to be blunt, not a good leader). I wouldn't describe this as the organization "not letting him thrive."

I could also imagine it being that the org has a bad culture (e.g. they systematically don't listen to the ideas of people in more junior roles)

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-26T05:53:50.068Z · EA · GW

For groups support calls, one staff member's NPS was 83% and another's was 55%. (They were talking to different user groups, which probably explains some of the discrepancy.)

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-26T05:26:27.689Z · EA · GW

Thanks for explaining! The guess about how people use the scale seems pretty plausible to me.

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-25T10:00:15.923Z · EA · GW

Hmm, I still think the threshold effects are kinda weird, and so NPS shouldn't be the main measure. (I know you're just asking for it as supplementary info, and I think we'd maybe both prefer mean + histogram.)

There's a prima facie case that goes like this: the threshold effects say that you care totally about the 6/7 and 8/9 boundaries, and not-at-all about the 5/6, 7/8, 9/10 boundaries. That's weird!

I could imagine a view that's like "it's really important to have enthusiastic promoters because they help spread the word about your product" or something, but then why would that view want you to care not-at-all about the 9/10 boundary? I imagine 10s are more enthusiastic promoters, and it seems plausible to me that the 9/10 differential is the same as or greater than the 8/9 differential.

And why would it want you to care not-at-all about the 7/8 boundary? I imagine 8s could be enthusiastic promoters, more so than 7s.

And similar comments apply to a view that's like "it's really important to avoid having detractors, because they put people off".

I could also imagine a kinda startup-y view that's like "it's really important to get excellent product-market fit, which means focusing on getting some people to really love your product, rather than a large group of people to like it". But on that view, why ignore the 9/10 boundary? And why care about detractors?

I also think that maybe all of the above views make more sense when your aim is to predict whether your product will grow virally (not our focus), vs. whether it's generally high quality/providing something that people want (more our focus). So they might just not carry over well to our case.
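The threshold structure under discussion can be made concrete with a short sketch (illustrative Python, not anything CEA actually uses; it follows the standard NPS bucketing of promoters = 9-10, passives = 7-8, detractors = 0-6):

```python
# NPS buckets 0-10 ratings into promoters (9-10), passives (7-8),
# and detractors (0-6); the score is %promoters minus %detractors.

def nps(ratings):
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

def mean(ratings):
    return sum(ratings) / len(ratings)

a = [9, 9, 9, 9]     # four barely-promoters
b = [10, 10, 10, 6]  # three enthusiastic 10s plus one detractor

print(nps(a), mean(a))  # 100.0 9.0
print(nps(b), mean(b))  # 50.0 9.0  (same mean, very different NPS)
```

Moving a rating from 8 to 9 shifts the score by a full promoter's worth, while moving a 9 to 10 (or a 6 down to 1) changes nothing: that is the threshold effect described above. The mean, by contrast, treats every one-point move alike.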

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-23T16:26:17.259Z · EA · GW

Thanks - I'll pass this on to the people involved! 

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-23T16:24:36.185Z · EA · GW

EA Global: Reconnect NPS was 20%

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-23T16:22:39.313Z · EA · GW

Sure! I've asked the relevant people to respond with the NPS figures if it's quick/easy for them to do so, but they might prioritize other things.

Btw, I disagree about how useful NPS is. I think it's quite a weird metric (with very strong threshold effects between 6/7 and 8/9, and no discrimination between a 6 and a 1). That's why we switched to the mean. I do think that looking at a histogram is often useful though - in most cases the mean doesn't give you a strong sense of the distribution.

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-23T12:46:34.435Z · EA · GW

Yes, that's right.

Comment by MaxDalton (Maxdalton) on CEA update: Q1 2021 · 2021-04-23T07:58:04.111Z · EA · GW

These terms generally refer to 19 university groups to which we give some additional support (e.g. we offer extra 1:1 calls with them, and we pilot some programs with them). This is on top of the support we offer all groups (e.g. online resources, funding for events, 1:1 calls, advice over Slack/email).

The groups are chosen primarily based on the university’s track record of having highly influential graduates (e.g. Nobel prize winners, politicians, major philanthropists). We also place some weight on university rankings, universities in regions with rapidly-growing global influence, the group’s track record, and leader quality. 

Current focus university groups in no particular order: Harvard, Swarthmore, Oxford, London School of Economics (LSE), Cambridge, Georgetown, Stanford, Hong Kong University, Yale, Princeton, MIT, Caltech, Berkeley, University of Chicago, Columbia, Penn.

Comment by MaxDalton (Maxdalton) on Cause Area: Human Rights in North Korea · 2021-04-09T07:45:11.927Z · EA · GW

I changed the publish date of this post back to Nov 20, 2017, since it seems like you wanted to do that.

Comment by MaxDalton (Maxdalton) on Announcing "Naming What We Can"! · 2021-04-01T16:02:23.949Z · EA · GW

Can I suggest that Max Daniel change his name to Max Ipok?

Comment by MaxDalton (Maxdalton) on AMA: JP Addison and Sam Deere on software engineering at CEA · 2021-03-26T11:21:25.482Z · EA · GW

Do you want effectivealtruism.org to reflect the views of the EA community? Engaged EAs? CEA? EA “leaders”?

I touched on this in an earlier comment:

In the future, I’d like CEA to take a more agnostic approach to cause prioritisation, trying to construct non-gameable mechanisms for making decisions about how much we talk about different causes. An example of how this might work is that we might pay for an independent contractor to try to figure out who has spent more than two years full time thinking about cause prioritization, and then surveying those people. Obviously that project would be complicated - it’s hard to figure out exactly what “cause prio” means, it would be important to reach out through diverse networks to make sure there aren’t network biases etc.

Although we haven’t yet commissioned that research, that’s still the spirit I want us to have as we create content. We are consulting with non-longtermists as we develop the content. I agree that it’s a shame that the EA.org resources are still quite similar to the handbook content. We’re working on a replacement which should be more up to date, but I’m not sure when we’ll make the relevant changes.

 Would CEA ever consider temporarily or permanently transferring the broader ownership of effectivealtruism.org to another person/organization?

We’d consider offers (contact us), but I think we’re more likely to aim to develop the capacity to do this in-house rather than finding someone external to take this on (though I don’t want to make specific commitments). 

Comment by MaxDalton (Maxdalton) on AMA: JP Addison and Sam Deere on software engineering at CEA · 2021-03-26T11:18:13.600Z · EA · GW

This problem should be fixed now too.

Comment by MaxDalton (Maxdalton) on EA Funds has appointed new fund managers · 2021-03-25T11:34:16.023Z · EA · GW

By the way, EA Funds ran this application process and EA Funds now operates independently of CEA.

Comment by MaxDalton (Maxdalton) on Some quick notes on "effective altruism" · 2021-03-25T08:45:58.864Z · EA · GW

I asked my team about this, and Sky provided the following information. This quarter CEA did a small brand test, with Rethink’s help. We asked a sample of US college students if they had heard of “effective altruism.” Some respondents were also asked to give a brief definition of EA and a Likert scale rating of how negative/positive their first impression was of “effective altruism.”

Students who had never heard of “effective altruism” before the survey still had positive associations with it. Comments suggested that they thought it sounded good - effectiveness means doing things well; altruism means kindness and helping people. (IIRC, the average Likert scale score was 4+ out of 5). There were a small number of critiques too, but fewer than we expected. (Sorry that this is just a high-level summary - we don't have a full writeup ready yet.)

Caveats: We didn't test the name “effective altruism” against other possible names. Impressions will probably vary by audience. Maybe "EA" puts off a small-but-important subsection of the audience we tested on (e.g. unusually critical/free-thinking people).

I don't think this is dispositive - I think that testing other brands might still be a good idea. We're currently considering trying to hire someone to test and develop the EA brand, and help field media enquiries. I'm grateful for the work that Rethink and Sky Mayhew have been doing on this.

Comment by MaxDalton (Maxdalton) on How to run an effective stall (we think) · 2021-03-24T10:35:21.078Z · EA · GW

Thanks for writing this up and sharing it! People might also be interested in this post on the same topic, and this guide.

Comment by MaxDalton (Maxdalton) on Responses and Testimonies on EA Growth · 2021-03-23T19:22:22.294Z · EA · GW

In my view, (1-3) did not directly slow growth.

This surprised me - wouldn't you expect 1 and 2 to directly slow growth somewhat, e.g. by putting people off or causing growth projects to fail to meet their goals? (Maybe you just don't think these were very significant?)

"fundamentally, why has growth not sped up?"

I think it's good to ask "what was the relative importance of these factors?", but the framing of "fundamentally, why has growth not sped up?" seems to be implicitly pushing towards there being a single explanation. I think there were probably multiple significant factors.

Re your last para, I hope that CEA's plans are part of the answer, although I think it's good for us to pursue a variety of approaches - e.g. I also think it's good for GWWC to spread its message somewhat more quickly/widely.

Comment by MaxDalton (Maxdalton) on Responses and Testimonies on EA Growth · 2021-03-23T15:11:06.862Z · EA · GW

I agree that this is a (significant) part of the explanation. For instance, I think there are a variety of things I could have done last year that would have helped our groups support improve more quickly.

Plug: if you have feedback about mistakes CEA is making or has made, I'd love to hear it. You can share thoughts (including anonymously) here.

Comment by MaxDalton (Maxdalton) on A ranked list of all EA-relevant (audio)books I've read · 2021-02-18T09:51:27.604Z · EA · GW

I haven't read everything on your list, but I broadly agree with your rankings for the things I have read (with some tweaks - e.g. I'd probably put Inadequate Equilibria higher and Thinking Fast and Slow lower). 

I feel a bit confused still about how many/which things should be canonical. Maybe I want canonical ideas rather than canonical books? E.g. I think some of the ideas in the sequences are important, and should be more widely known/used even in EA. But I also think it contains some less important stuff, and some people find the presentation off-putting (while others love it). So I guess I'd ideally like there to be a few different presentations of the same ideas, and people can read the presentation that works best for them (bold academic book, super-well-evidenced academic papers, spicy blog posts, fanfic etc.). Maybe we now have this in some domains - e.g. you listed several presentations of AI safety?

I won't do a full list of things I like right now, but some quick thoughts:

  • I think The Great Courses can sometimes be great: I remember particularly liking one on biology. My understanding of biology overall is still quite imprecise, but the course gave me images of how a bunch of cellular biology works mechanistically which I think would be good scaffolding for a better understanding. I also really liked one on Chinese history (I think this one, which is a bit broader but still quite China-focused). I think the quality varies a bit between courses though.
  • I also love the In Our Time podcast. Especially the science ones - they have a version of the podcast that's only science. I like that they have several academics, which means you can get a variety of perspectives, and which makes me less worried that I'm only hearing one side of a debate.

P.s. I agree with a lot of your points in the other comment too, and I'm glad you posted this list!

Comment by MaxDalton (Maxdalton) on A ranked list of all EA-relevant (audio)books I've read · 2021-02-17T19:52:15.080Z · EA · GW

I agree there are some costs to having some canonical books, but I think there are also some real benefits: for instance it helps to create common knowledge, which can facilitate good discussion and coordination. Also maybe some books are sufficiently important and high-quality that ~all EAs should read them before reading a broader variety of books (e.g. maybe all EAs should read The Precipice and a few other books, but then they should branch out and read a variety of things). 

I don't think that everything on Michael's list should be canonical, but I think probably some of his top recommendations should be.  I agree that some of the things on the list are probably over-canonized too.

Comment by MaxDalton (Maxdalton) on Why do EA events attract more men than women? Focus group data · 2021-02-16T09:48:27.415Z · EA · GW

Thanks for doing this investigation, it's always helpful to have more data on this sort of thing.

Small note: I just re-skimmed this, and I found it really helpful that the headings were quite long: you can read the table of contents as a kind of summary, then click on bits that are interesting to you. I'm not sure if that was intentional, but I found it very helpful.

Comment by MaxDalton (Maxdalton) on CEA update: Q4 2020 · 2021-01-22T06:10:02.102Z · EA · GW

Interesting! Thanks for sharing.

Comment by MaxDalton (Maxdalton) on CEA update: Q4 2020 · 2021-01-21T15:56:36.240Z · EA · GW

That's interesting - I'm surprised by that and wonder if it's due to some differences between systems? In the UK people often begin to think about internships in their first or second years, and then look for jobs in the 3rd year, so I think there's quite a lot of ability to influence and discuss career plans early on. In the US degrees are longer, but early on people are trying to decide their major, which is also a significant career decision. I also think that students have a lot more time and interest in engaging with new things, and they tend to be easier to reach (e.g. because they all come to activity fairs). How do you find/target these early-career people? And aren't they already normally in employment/set on a career path?

CBGs remain open to non-student groups.

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-21T15:49:19.793Z · EA · GW

No, I think you understood the original post right and my last comment was confusing. When I said "grow" I was imagining "grow in terms of people/impact in the areas we're currently occupying and other adjacent areas that are not listed as "things we're not doing"".

I don't expect us to start doing the things listed in the post in the next 4-10 years (though I might be wrong on some counts). We'll be less likely to do something if it's less related to our current core competencies, and if others are doing good work in the area (as with fundraising).

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-19T09:33:36.930Z · EA · GW

Yeah, I agree that "be able to usefully scale" is a pretty positive instrumental goal (and maybe one I should pay attention to more). 

Maybe there are different versions of "be able to usefully scale":

  1. I'm mostly thinking of this in terms of "work on area X such that you have a model you can scale to make X much higher impact". I think this generally doesn't help you much to explore/scale area Y. (Maybe it helps if X and Y are really related.)
  2. I think you're thinking of this mostly in terms of "Be able to hire, manage, and coordinate large numbers of people quickly". I think this is also useful, and it's also something I'm aiming towards (though probably less than more object-level goals, like figuring out how to make our programs great, at the moment).
Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-19T09:30:25.686Z · EA · GW

Yeah, agree that experimentation and listening to customers is good. I mostly think this is a different dimension from things like "expand into a new area", although maybe the distinction is a bit fuzzy.

I also agree that aiming to find scaleable things is a great model. 

I do think of CEA as looking for things that can scale (e.g. I think the Forum, groups, and EAGx could all be scaled relatively well once we work out the models), but we also do things that don't scale, which I think is appropriate while we're still searching for a really strong product-market fit.

(Edited to add: below I draw a distinction between scaling programs and scaling staff numbers. My sense is that startups' first goal is to find a scalable business model/program, and that only after that should they focus on being able to scale staff numbers to capitalize on the opportunity.)

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-19T09:27:02.203Z · EA · GW

This all seems reasonable. Some scattered thoughts:

  • To be clear, I'm not claiming 1); I'm more like "I'm still figuring out how fast/how to grow"
  • I still think that "expanding staff/focus" is getting a bit too much emphasis: I think that if we focus on the right things we might be able to scale our impact faster than we scale our staff
Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T20:00:11.973Z · EA · GW

Another argument: I think that startup folk wisdom is pretty big on focus. (E.g. start with a small target audience, give each person one clear goal).  I think it's roughly "only start doing new things when you're acing your old things". But maybe startup folk wisdom is wrong, or maybe I'm wrong about what startup folk wisdom says.

I also think most (maybe basically all?) startups that got big started by doing one smaller thing well (e.g. Google did search, Amazon did books, Apple did computers, Microsoft did operating systems). Again, I think this was something like "only start new products when your old products are doing really well" ("really well" is a bit vague: e.g. Amazon's distribution systems were still a mess when they expanded into CDs/toys, but they were selling a lot of books).

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T19:55:08.823Z · EA · GW

I think my view was that (while they still think they're cost-effective) orgs should be on the Pareto frontier of {size, quality}. However, they should also try to communicate clearly about what they won't be able to take on, so that others can take on those areas.

I imagine your reply being something like "That's OK, but org-1 is essentially creating an externality on potential-org-2 by growing more slowly: it means that potential-org-2 has to build a reputation, set up ops etc., rather than just being absorbed into org-1. It's better for org-1 to not take in potential-org-2, but it's best for the world for org-1 to take in potential-org-2."

I think the externality point is correct, but I'm still not sure what's best for the world: you also need to account for the benefits of quality/focus/learning/independence, which can be pretty significant (according to me). 

I think the answer is going to depend a bit on the situation: e.g. I think it's better for org-1 to take in potential-org-2 if potential-org-2 is doing something more closely related to org-1's area of expertise, and quite bad for it to try to do something totally different. (E.g. I think it would probably be bad for CEA to try to set up your org with you as PM.) I also think it's better for org-1 to take on new things if it's doing its current things well and covering more of the ground (since this means that the opportunity cost of its growth-attention is lower).

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T19:48:26.032Z · EA · GW

I agree with your points about it being easier to find a PM than an ED, brand, centralizing operations etc., and I think these are costs of setting up new non-profits.

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T19:43:57.773Z · EA · GW

I also agree with your point about different forms of centralization, and I think that we are in a world where e.g. funding is fairly centralized.

I also wanted to emphasize that I agree with Edo that it's good to have centralized discussion platforms etc. - e.g. I think it would probably be bad if someone set up a competitor to the Forum that took 50% of the viewers. (But maybe fine if they quickly stole 100% of the Forum's viewers by being much better.)

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T19:40:24.571Z · EA · GW

I generally feel like there's a bit too much focus in your model on number of people vs. getting those people to do high-quality work, or directing those people towards important problems. I also think it's worth remembering that staff (time, compensation) come on the cost side of the cost-effectiveness calculation.

E.g. I don't think that GW succeeded because it had more staff than 80k. I think they succeeded because they were doing really high-quality work on an important problem. To do that they had to have a reasonable number of staff, but that was a cost they had to pay to do the high-quality work. 

And then related to the question about how fast to grow, it looks like it took them 6 years to get to 10 staff, and 9 years to get to 20. They had 18 staff around the time that Good Ventures was founded. So I think that simply being big wasn't a critical factor in their success, and I suspect that that relatively slow growth was in fact critical to figuring out what the org was doing and keeping quality high. 

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-18T19:29:43.505Z · EA · GW

Thanks for the detailed reply!  I'll give a few shorter comment responses. I'm not running these by people at CEA, so this is just my view.

Organizational growth

First, I want to clarify that I'm not saying that I want CEA/other orgs to stay small. I think we might well end up as one of those big orgs with >40 people.

Given that I think there are some significant opportunities here, choosing to be focused doesn't mean we can't get big. We'd just have lots of people focused on one issue rather than lots focused on lots of issues. For some of the reasons I gave above about learning how to do an area really well, I think that would go better.

I also think that as we grow, we might then saturate the opportunities in the area we're focused in and then decide to expand to a new area.

So then there are questions about how fast to grow and when to stop growing. I think that if you grow quickly it's harder to grow in a really high-quality way. So you need to make a decision about how much to focus on quality vs. size. I think this is a tricky question which doesn't have a great general answer. If I could make CEA big, really high quality, and focused on the most important things, I'd obviously do that in a heartbeat. But I think in reality it's more about navigating these tradeoffs and improving quality/size/direction as quickly as you can.

Comment by MaxDalton (Maxdalton) on CEA update: Q4 2020 · 2021-01-15T22:00:28.439Z · EA · GW

Yes - student groups will be our main priority for additional support in 2021 (we say a bit more about why here, and we discuss what that means concretely for our groups team here). But we’ll be maintaining or expanding the support we give to all groups, including new initiatives like Virtual Programs.

Comment by MaxDalton (Maxdalton) on CEA update: Q4 2020 · 2021-01-15T22:00:11.179Z · EA · GW

We plan to do most hiring through public vacancies, but we will make occasional exceptions when we think we’re very likely to be aware of the top candidates. 

In the first case we wanted to hire someone who had experience leading a successful university group, and since we work closely with many group leaders we felt like we had a good enough sense of the talent pool to do a closed round (where we invited a small number of candidates to do work trials, interviews etc.). We might do this sort of thing again.

We brought on Sara and Aadil through the same hiring round, for an executive assistant position. Longview Philanthropy and 80,000 Hours had recently advertised a public position for an operations/executive assistant role. With the help of those organizations and their applicants, we were able to cut out some of the early advertisement/screening steps, and focus on some of their top candidates. I think this saved us a fair amount of time without compromising much on accessibility/fairness. We might do similar things in the future.

But I acknowledge the costs you share, and we plan to normally invite public applications (as for this new contractor role, and for a finance/data role that we plan to post soon).

Comment by MaxDalton (Maxdalton) on Things CEA is not doing · 2021-01-15T21:58:29.340Z · EA · GW

Thanks for sharing this feedback Ozzie, and for acknowledging the tradeoffs too.

I have a different intuition on the centralization tradeoff - I generally feel like things will go better if we have a lot of more focused groups working together vs. one big organization with multiple goals. I don’t think I’ll be able to fully justify my view here. I’m going to timebox this, so I expect some of it will be wrong/confused/confusing.

Examples: I think that part of the reason why 80,000 Hours has done well is that they have a clear and relatively narrow mission that they’ve spent a lot of time optimizing towards. Similarly, I think that GWWC has a somewhat different goal from CEA, and I think both CEA and GWWC will be better able to succeed if they can focus on figuring out how to achieve their goals. I hope for a world where there are lots of organizations doing similar things in different spaces. I think that when CEA was doing grantmaking and events and a bunch of other things it was less able to learn and get really good at any one of those things. Basically, I think there are increasing returns to work on lots of these issues, so focusing more on fewer issues is good.

It matters to get really good at things because these opportunities can be pretty deep: I think returns don’t diminish very quickly. E.g. we’re very far from having a high-quality widely-known EA group in every highly-ranked university in the world, and that’s only one of the things that CEA is aiming to do in the long run. If we tried to do a lot of other things, we’d make slower progress towards that goal. Given that, I think we’re better off focusing on a few goals and letting others pick up other areas.

I also think that, as a movement, we have some programs (e.g. Charity Entrepreneurship, the longtermist entrepreneurship project, plus active grantmaking from Open Philanthropy and EA Funds) which might help to set up new organizations for success.

We will continue to do some of the work we currently do to help to coordinate different parts of the community - for instance the EA Coordination Forum (formerly Leaders Forum), and a lot of the work that our community health team do. The community health team and funders (e.g. EA Funds) also do work to try to minimize risks and ensure that high-quality projects are the ones that get the resources they need to expand.

I also think your point about ops overhead is a good one - that’s why we plan to continue to support 80k, Forethought, GWWC, EA Funds, and the longtermist incubator operationally. Together, our legal entities have over 40 full time staff, nearly 100 people on payroll, and turnover of over $20m. So I think we’re reaping some good economies of scale on that front. 

Finally, I think a more decentralized system will be more robust - I think that previously CEA was too close to being a single point of failure.

Comment by MaxDalton (Maxdalton) on CEA's Plans for 2021 · 2020-12-14T20:12:16.377Z · EA · GW

Thanks for your questions. 

Re: target of 125 people. This is a relatively high bar: it’s focused on people who have taken significant action based on a good understanding of EA principles. So the bar is somewhat higher than the GWWC pledge, because we interview people and get a sense of why they chose the path they’re on and what would change their mind. We think that for most people this means >100 hours of engagement with quality content, plus carefully thinking through their career plans and taking action toward those plans (which might include significant donations).

I actually think that $20,000 per person in this position would still be a good deal: the expected lifetime value of a GWWC pledge might be around $73,000, and some people might be doing things significantly more promising than the average GWWC pledge. I don’t think that will be the full cost - e.g. these people will probably also benefit some from e.g. the Forum or 80k resources. However, I also think that these 125 people only represent some of the value that groups work creates (e.g. groups also help persuade people to take less intensive action, and to retain and empower people who are already engaged). I also think there’s a fair chance that we beat this target.

We arrived at 125 by estimating the number of individuals we think met this definition in 2019, applying a ~30% growth rate in the community, and then increasing this number further within key populations. One of our internal benchmarks is that the cohort of engaged EAs recruited in 2021 is more demographically diverse than the cohort of engaged EAs recruited in 2020.

Comment by MaxDalton (Maxdalton) on CEA's Plans for 2021 · 2020-12-14T08:55:48.085Z · EA · GW

Hey Marisa, thanks, I'm glad you appreciated this! 

Yes, EEAs=highly-engaged EAs (I've now edited this throughout, so that it's a bit less jargon-y). This is a term that we're using internally to refer to people who are taking significant action (e.g. a career plan or a significant giving pledge or similar) based on a detailed understanding of EA ideas.  

Comment by MaxDalton (Maxdalton) on CEA's Plans for 2021 · 2020-12-11T17:35:48.675Z · EA · GW

Hi, thanks for your question! 

The section on 2021 plans is intended to be a summary of these criteria, sorry that wasn’t clear. 

  1. One target is focused on recruitment: building a system for onboarding people to EA (to the level where they are taking significant action based on a good understanding of EA principles). Specifically, we aim to help onboard 125 people to this level.
  2. The second target is focused around retention: for people who are already highly engaged EAs, growing the amount of time they spend engaging with high-quality content via our work (e.g. Forum view time or watching a talk from one of our events on YouTube) by 30%, and also growing the number of new connections (e.g. at events) they make by 30%. 
  3. The third target is focused on risk-reduction: this is covered in the community health targets above (which are slightly more tightly specified and fleshed out internally).

Internally we obviously have more details about how we plan to measure/assess these things, but we wanted to just give a summary here. We expect that most of these org-wide goals will be achieved as a collaboration between teams, but we have a single person responsible for each of the org-wide goals. (Operations and executive goals are a bit more complex, and are covered above.)

Comment by MaxDalton (Maxdalton) on CEA's 2020 Annual Review · 2020-12-11T17:35:17.342Z · EA · GW

Hi Brian, thanks for your question, and I’m glad the update was useful!

You’re correct about the overall approach we’re using (multiplying the expected value of the change by how much of that change is attributable to the group). I’ll flag this comment to Harri and he might follow up with some more details, publicly or privately.

Comment by MaxDalton (Maxdalton) on CEA's 2020 Annual Review · 2020-12-11T17:24:39.122Z · EA · GW

Hi Brian, Thanks for your question! I’m not sure how much we can comment on the investment strategy or grantmaking of this fund, but I’ll flag your questions to Carl.

Comment by MaxDalton (Maxdalton) on CEA's Plans for 2021 · 2020-12-11T16:30:02.376Z · EA · GW

Hi Edo, This is something that we’re keen to clarify and might publish more on soon. So thanks for giving me the opportunity to share some thoughts on this!

I think you’re right that this is a narrower mission: this is deliberate.

As we say on our mistakes page:

Since 2016, CEA has had a large number of projects for its size...Running this wide array of projects has sometimes resulted in a lack of organizational focus, poor execution, and a lack of follow-through. It also meant that we were staking a claim on projects that might otherwise have been taken on by other individuals or groups that could have done a better job than we were doing (for example, by funding good projects that we were slow to fund).

Since we wrote that, we have closed EA Grants and spun Giving What We Can out (while continuing to provide operational support), and we’re exploring something similar with EA Funds. I think that this will allow us to be more focused and do an excellent job of the things we are doing.

As you note, there are still many things in the area of building the EA community that we are not doing. Of course these things could be very impactful if well-executed (even though we don’t have the resources to take them on), so we want to let people know what we’re not doing so that others can consider taking it on.

I’ll go through some of the alternatives you mention and talk about how much I think we’ll work in this space. I’ll also share some rough thoughts about what might be needed, but I’m really not an expert in that question - I’d tend to defer to grantmakers about what they’re interested in funding.

A theme in what I write below is that I view CEA as one organization helping to grow and support the EA community, not the organization determining the community’s future. I think it’s mostly good for there not to be one organization determining the community’s future. I think that this isn’t a real change: the EA community’s development was always influenced by a coalition of organizations. But I do think that CEA sometimes aimed to determine the community’s future, or represented itself as doing so. I think this was often a mistake.

Directly manage content creation

We don’t have plans to create more content. We do curate content when that supports productive discussion spaces (e.g. inviting speakers to events, developing curricula for fellowships at groups). We also try to incentivize the creation of quality content via giving speakers a platform and giving out Forum prizes.

80,000 Hours is maybe the most obvious organization creating new content, but many research organizations are also creating useful content, and I think there’s room for more work here (while having high quality standards).

improving coordination among donors

We are currently running EA Funds, which I see as doing some work in this space (e.g. I think Funds and the donor lottery play some of this role). There might be room for extra work in this space (e.g. coordination between major donors), but I think some of this happens informally anyway, and I don’t have a sense of whether there’s a need for more at the moment.

lead a centralized information platform

I’m not sure quite what you have in mind here. I think the Forum is playing this role to some extent: e.g. it has a lot of posts/crossposts of important content, sequences, user profiles, and a tag/wiki system. We also work on the EA Hub resources. We don’t have plans beyond further developing these.

lead a common research agenda

We are not doing this, and we haven’t been doing research since ~2017. I think there are lots of great research organizations (e.g. Global Priorities Institute, Open Philanthropy, Rethink Priorities) that are working on this (though maybe not a single leader - I think this is fine).

develop a single "brand" for EA and for main cause areas

We do not plan to do this for specific cause areas. We do plan to do some work on testing/developing EA’s brand (as mentioned above in the community health section). However, I think that other organizations (e.g. 80,000 Hours) also play an important role, and I think it’s OK (maybe good) if there are a few different representations of EA ideas (which might work well for different audiences).

Support promising individuals and organizations

Supporting organizations: as mentioned in our annual review, we do some work to support organizations as they work through internal conflicts/HR issues. We also currently make grants to other organizations via EA Funds. We also provide operational support to 80,000 Hours, Forethought Foundation, GWWC, and a long-termist project incubator. Other than this, we don’t plan to work in this space.

Supporting individuals: Again, we currently do this to some extent via EA Funds. Historically we focused a bit more on identifying and providing high-touch support to individuals. I think that our comparative advantage is to focus more on facilitating groups and discussion, rather than identifying promising individuals. So this isn’t a current focus, although we do some aspects of this via e.g. support for group leaders. I think that some of this sort of work is done via career development programs like FHI’s Research Scholars Program or Charity Entrepreneurship’s internship program. I also think that lots of organizations do some of this work via their hiring processes. But I think there might be room for extra work identifying and supporting promising individuals.

In terms of non-financial support, the groups team provides support and advice to group organizers, and the community health team provides support to community members experiencing a problem or conflict within the community.

obtain and develop tools and infrastructure to support EA researchers

I think that the Forum provides some infrastructure for public discussion of research ideas. Apart from that, I don’t think this is our comparative advantage and we don’t plan to do this.

Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)

We do some work to convene discussion of these questions between key organizations/individuals (e.g. this sometimes happens on the Forum, and our coordination forum event allows people to discuss such questions and build relationships that allow them to coordinate more effectively). But we don’t do things that lead to “common answers or community-wide decisions”.

I actually don’t think we need to have a common answer to a lot of these questions: I think it’s important for people to be sharing their reasoning and giving feedback to each other, but often it’s fine or good if there are some different visions for the community’s future, with people working on the aspect that feels most compelling to them. For instance, I think that CEA has quite a different focus now from GWWC or Charity Entrepreneurship or OP or GPI, but I think that our work is deeply complementary and the community is better off having a distribution of work like this. I also think that it works pretty well for individuals (e.g. donors, job-seekers) to decide which of those visions they most want to support, thus allowing the most compelling visions to grow.

For similar reasons, I think it would be bad to have a single organization “leading” the community. I think that CEA aspired to play this role in the past but didn’t execute it well. I think that the current slightly-more-chaotic system is likely more robust and innovative than a centralized system (even if it were well-executed). (Obviously there’s some centralization in the current system too - e.g. OP is by far the biggest grantmaker. I don’t have a strong view about whether more or less centralization would be better on the current margin, but I am pretty confident that we don’t want to be a lot more centralized than we currently are.)

Some other things we’re not planning to focus on:

  • Reaching new mid- or late-career professionals (though we are keen to retain mid- or late-career people and to make them feel welcome, we’re focused on recruiting students and young professionals)
  • Reaching or advising high-net-worth donors
  • Fundraising in general
  • Cause-specific work (such as community building specifically for effective animal advocacy, AI safety, biosecurity etc)
  • Career advising
  • Research, except about the EA community

Some of our work will occasionally touch on or facilitate some of the above (e.g. if groups run career fellowships, or city groups do outreach to mid-career professionals), but we won’t be focusing on these areas.

As I mentioned, we might say more on this in a separate post soon.

I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.

I expect this mission to be our long-term focus.

Comment by MaxDalton (Maxdalton) on What are you grateful for? · 2020-11-27T10:06:12.470Z · EA · GW

I'm grateful to colleagues who have worked hard through a sometimes-difficult year, been willing to try out new things (like online events), and somehow kept a sense of fun through it all.

I'm especially grateful when they point out ways I could do better and help me to grow. 

Comment by MaxDalton (Maxdalton) on What are you grateful for? · 2020-11-27T10:02:39.197Z · EA · GW

I'm grateful to group leaders: running a group can be difficult and most people do it on top of full time work or studies. It requires so many different skills - being socially adept, knowing the latest research, and being able to orchestrate complex plans. 

And I think it's really important work: it creates a personal and sustained way for people to learn about EA and decide to take action. Empirically, loads of great people got into EA this way.

Comment by MaxDalton (Maxdalton) on What are you grateful for? · 2020-11-27T09:58:03.017Z · EA · GW

I'm grateful that effective altruism gives me a sense of purpose and a global community. 

It feels like it fills some of the human need I have to be part of a village.

Comment by MaxDalton (Maxdalton) on What are you grateful for? · 2020-11-27T09:48:51.348Z · EA · GW

(I know I'm one day late for Thanksgiving! I hope that people who celebrated it had a good day.)

Comment by MaxDalton (Maxdalton) on What quotes do you find most inspire you to use your resources (effectively) to help others? · 2020-11-19T09:36:00.944Z · EA · GW

"One day we ... may have the luxury of going to any length in order to prevent a fellow sentient mind from being condemned to oblivion unwillingly. If we ever make it that far, the worth of a life will be measured not in dollars, but in stars.

"That is the value of a life. It will be the value of a life then, and it is the value of a life now.

"So when somebody offers $10 to press that button, you press it. You press the hell out of it. It's the best strategy available to you; it's the only way to save as many people as you can. But don't ever forget that this very fact is a terrible tragedy.

"Don't ever forget about the gap between how little a life costs and how much a life is worth. For that gap is an account of the darkness in this universe, it is a measure of how very far we have left to go."

 - Nate Soares, The Value of a Life

Comment by MaxDalton (Maxdalton) on What quotes do you find most inspire you to use your resources (effectively) to help others? · 2020-11-19T09:30:15.934Z · EA · GW

"Three passions, simple but overwhelmingly strong, have governed my life: the longing for love, the search for knowledge, and unbearable pity for the suffering of mankind. These passions, like great winds, have blown me hither and thither, in a wayward course, over a great ocean of anguish, reaching to the very verge of despair.

"I have sought love, first, because it brings ecstasy - ecstasy so great that I would often have sacrificed all the rest of life for a few hours of this joy. I have sought it, next, because it relieves loneliness--that terrible loneliness in which one shivering consciousness looks over the rim of the world into the cold unfathomable lifeless abyss. I have sought it finally, because in the union of love I have seen, in a mystic miniature, the prefiguring vision of the heaven that saints and poets have imagined. This is what I sought, and though it might seem too good for human life, this is what--at last--I have found.

"With equal passion I have sought knowledge. I have wished to understand the hearts of men. I have wished to know why the stars shine. And I have tried to apprehend the Pythagorean power by which number holds sway above the flux. A little of this, but not much, I have achieved.

"Love and knowledge, so far as they were possible, led upward toward the heavens. But always pity brought me back to earth. Echoes of cries of pain reverberate in my heart. Children in famine, victims tortured by oppressors, helpless old people a hated burden to their sons, and the whole world of loneliness, poverty, and pain make a mockery of what human life should be. I long to alleviate this evil, but I cannot, and I too suffer.

"This has been my life. I have found it worth living, and would gladly live it again if the chance were offered me."

 - Prologue to Bertrand Russell's Autobiography.

Comment by MaxDalton (Maxdalton) on What types of charity will be the most effective for creating a more equal society? · 2020-10-13T10:06:33.522Z · EA · GW

Thanks! Those are both good points. I think you're right that they're open to changing their minds about some important aspects of their worldview (though I do think that "Please, if you disagree with me, carry your precious opinion elsewhere." is some evidence that there are aspects that they're not very open to changing their mind about).

I also think that I reacted too strongly to the emotionally laden language - I agree this can be justified and appropriate, though I think it can also make collaborative truth-seeking harder. This makes me think that it's good to acknowledge, feel, and empathize with anger/sadness, whilst still being careful about the potential impact it might have when we're trying to work together to figure out what to do to help others. I do still feel worried about some sort of oversimplification/overconfidence wrt "all other problems are just derivatives".

To be clear, I always thought it was good to engage in discussion here rather than downvote, but I'm now a bit more optimistic about the dialogue going well.

Comment by MaxDalton (Maxdalton) on What types of charity will be the most effective for creating a more equal society? · 2020-10-13T05:57:15.186Z · EA · GW

I didn't downvote, but I imagine people are reacting to a couple of phrases in the OP:

Please, if you disagree with me, carry your precious opinion elsewhere. I am only interested in opinions on how to most effectively create a more equal society.

I think that being open to changing your mind is an important norm. I think you could read this sentence as a very reasonable request to keep this discussion on topic, but I worry that it is a more general stance. (I also find the phrasing a bit rude.)

Some of the other phrases (e.g. "conviction", "deeply sick", "all other problems are just derivatives") make me worry that this person won't change their mind, that they're overconfident, and that they'll use heated discourse in arguments rather than collaborative truth-seeking. All of these (if true) would make me a bit less excited about welcoming them to the community.

I also think that I could be reading too much into such phrases - I hope this person will go on to engage open-mindedly in discussion.

I really liked your answer - I think it's absolutely worth sharing resources, gently challenging, and reinforcing norms around open-minded cause prio. I personally think that that's a better solution than downvoting, if people have the time to do so.