EA Debate Championship & Lecture Series 2021-04-05T16:30:52.204Z
Announcing "Naming What We Can"! 2021-04-01T10:17:28.990Z
Early Alpha Version of the Probably Good Website 2021-03-01T15:13:41.709Z
Cost-effectiveness analysis of a program promoting a vegan diet 2020-11-12T17:24:28.721Z
Introducing Probably Good: A New Career Guidance Organization 2020-11-06T14:50:56.726Z
EA Israel Strategy 2020-21 2020-09-26T13:22:37.169Z


Comment by sella on EA Debate Championship & Lecture Series · 2021-04-08T19:41:33.246Z · EA · GW

Hi Ben, thanks for these questions.

Regarding whether we achieved “deep” engagement. We have not formally followed up with participants to be able to answer this meaningfully. I can say anecdotally that a couple of participants I know personally have since been active on EA-related Facebook groups, but I don’t know if this generalizes. We’ve also collected the contact information of participants in the study and are able to follow up with additional surveys (of those interested) in the future, exactly for analysis such as this. Also, just a minor clarification - what we meant by the original sentence was a “deep” engagement in a rather informal sense - we felt people meaningfully engaged with the material. We did not mean to claim we achieved deep engagement in a stronger sense than that, e.g. as the term is used in the model of an EA group.

Regarding analysis of how likely debaters are to reach global positions of influence - I’m not aware of any proper measurement of this. My impression that this is true comes not from looking at the distribution of careers of past debaters, but rather from looking at the portion of top political leaders with experience as successful debaters, and getting the impression that this is substantially higher than the portion of debaters in the general population (but again - just an impression, no proper analysis here). In a quick search for any analysis on this I haven’t found anything truly reliable, but I did aggregate the examples I found in this document in case it’s helpful to anyone. I’d be interested to hear if anyone has done, or knows of, more reliable data on this.

Comment by sella on Early Alpha Version of the Probably Good Website · 2021-03-02T15:12:26.930Z · EA · GW

Hi Brian, thanks for the feedback. While we do hope to add other indicators of credibility, we don’t plan on featuring Effective Altruism Israel specifically in the website. Though both Omer and I are heavily involved in EA Israel, and though it seems likely that Probably Good would not exist had EA Israel not existed, it is a separate org and effort from EA Israel. It is “supported by EA Israel” in the sense that I think members of EA Israel are supportive of the project (and I hope members of many other communities are too), but it is not “supported by EA Israel” in the sense that we receive any funding or resources from the group. I mention this mainly because our mission and intended audience are global, and connecting the website or organization with EA Israel may lead to confusion or discourage those who are not from Israel from engaging with us.

Comment by sella on Early Alpha Version of the Probably Good Website · 2021-03-02T15:01:46.541Z · EA · GW

Hi Peter, thanks for these suggestions!

I hadn’t seen the doc you linked to before, and it is indeed a good starting point. We’re actively working on our internal M&E strategy at the moment, so this is particularly useful to us right now.

I agree with the other suggestions, and those are already planned. Their full implementation might take a while, but I expect us to have some updates related to this soon. 

Comment by sella on Early Alpha Version of the Probably Good Website · 2021-03-02T11:51:52.193Z · EA · GW

Thanks for this detailed feedback - I’m happy to hear you think the article would be useful to people in situations you’ve been in. All three of the points you raised seem reasonable - some touch on nuances that I already have down in my full notes but that were dropped for brevity, while others are things we hadn’t yet heard from the people we interviewed (including those acknowledged in the article, and several others who preferred to remain anonymous). In consultation with others, I’ll look into incorporating some of these nuances, though I apologize in advance that not all of them will make it in.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-13T09:58:27.586Z · EA · GW

We’re definitely taking into account the different comments and upvotes on this post. We appreciate people upvoting the views they’d like to support - this is indeed a quick and efficient way for us to aggregate feedback.

We’ve received recommendations against opening public polls about the name of the organization from founders of existing EA organizations, and we trust those recommendations so we’ll probably avoid that route. But we will likely look into ways we can test the hypothesis of whether a “less controversial” name has positive or negative effects on the reaction of someone hearing this name for the first time.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-13T09:56:33.622Z · EA · GW

Hi Manuel, thanks for this comment. I think I agree with all your considerations listed here. I want to share some thoughts about this, but as you’ve mentioned - this is one of our open questions and so I don’t feel confident about either direction here.

First, we have indeed been giving general career coaching for people in Israel for several years now, so in a sense we are implementing your recommended path and are now moving onto the next phase of that plan. That being said, there still remain reasons to continue to narrow our scope even at this stage.

Second, you mention partnering with experts in the various cause areas to ensure accurate content - I completely agree with this, and wouldn’t dream of providing concrete career advice independently in fields I don’t have experience in. For the content we are writing right now, we require interviewing at least 7 experts in the field to provide high-confidence advice, and at least 3 experts in the field even for articles we mark as low confidence (which we warn readers to treat with caution). So it’s really important to me to clarify that none of the concrete career-specific advice we provide will be based exclusively on our own opinions or knowledge - even within the fields we do have experience in.

Finally, I think at least some of the issues you’ve (justifiably) raised are mitigated by the way we aim to provide this advice. As opposed to existing materials, which more confidently aim to provide answers to career-related questions, we have a larger emphasis on providing the tools for making that decision depending on your context. As community organizers, one of the things that pushed us to start this effort is the feeling that many people, who don’t happen to be from the (very few) countries that EA orgs focus on, have very little guidance and resources, while more and more is invested in optimizing the careers of those within those countries. We believe that doing highly focused work on Israel would not serve the community as well as providing guidance on what needs to be explored and figured out to apply EA career advice to your own context. As such, we want to provide recommendations on how to check for opportunities within the scope that’s relevant to you (e.g. country or skillset), rather than aiming to provide all the answers as final conclusions on our website. This applies most to our career guide, but also to specific career path profiles - where we want to first provide the main considerations one should look into, so that we provide valuable preliminary guidance for a wide range of people, rather than end-to-end analysis for fewer people.

The mitigations described above can be much better evaluated once we have some materials online, which will allow others to judge their implementation (and not only our aspirations). We plan on soliciting feedback from the community before we begin advocating for them in any meaningful way - hopefully that will help make these responses less abstract and still leave us time to collect feedback, consider it and try to optimize our scope and messaging.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-09T22:06:27.958Z · EA · GW

Hi Jack, thanks for the great question. 

In general, I don’t think there’s one best approach. Where we want to be on the education/acceptance trade-off depends on the circumstances. It might be easiest to go over examples (including ones you gave) and give my thoughts on how they’re different.

First, I think the simplest case is the one you ended with. If someone doesn’t know what cause area they’re interested in and wants our help with cause prioritization, I think there aren’t many tradeoffs here - we’d strongly recommend relevant materials to allow them to make intelligent decisions on how to maximize their impact. 

Second, I want to refer to cases where someone is interested in cause areas that don’t seem plausibly compatible with EA, broadly defined. In this case we believe in tending towards the “educate” side of the spectrum (as you call it), though in our writing we still aim not to make it a prerequisite for engaging with our recommendations and advice. That being said, these nuances may be irrelevant in the short-term future (at least months, possibly more), as due to prioritization of content, we probably won’t have any content for cause areas that are not firmly within EA.

In the case where the deliberation is between EA cause areas (as is the case in your example), there are some nuances that will probably be evident in our content even from day one (though they may change over time). Our recommended process for choosing a career will involve engaging with important cause prioritization questions, including who deserves moral concern (e.g. those far from us geographically, non-human animals, and those in the long-term future). Within more specific content, e.g. specific career path profiles, we intend to refer to these considerations but not try to force people to engage with them. To take your global health example, in a career path profile about development economics we would highlight that one of the disadvantages of this path is that it is mainly promising from a near-term perspective and unclear from a long-term perspective, with links to relevant materials. That being said, someone who has decided they’re interested in global health, doesn’t follow our recommended process for choosing a career, and navigates directly to global health-related careers will primarily be reading content related to this cause area (and not material on whether this is the top cause area). Our approach to 1:1 consultation is similar - our top recommendation is for people to engage with relevant materials, but we are willing to assist people with more narrow questions if this is what they’re interested in (though, much like the non-EA case, we expect demand to exceed our capacity in the foreseeable future, and may in practice prioritize those who are pursuing all avenues to increase their impact).

Hope this provides at least some clarity, and let me know if you have other questions.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-07T20:32:28.925Z · EA · GW

I agree this is an important question that would be of value to other organizations as well. We’ve already consulted with 80K, CE and AAC about it, but still feel this is an area we have a lot more work to do on. It isn’t explicitly pointed out in our open questions doc, but when we talk about measuring and evaluating our counterfactual benefits and harms, this question has been top of mind for us.

The short version of our current thinking is separated into short-term and long-term measurement. We expect that this kind of evaluation will be easier in the longer term - since we’ll at least have career trajectories to evaluate. Counterfactual impact estimation is always challenging without an experimental setup, which is hard to do at scale, but I think 80K and OpenPhil have put out multiple surveys that try to extract estimates of counterfactual impact and do so reasonably well given the challenges, so we’ll probably do something similar. Also, at that point, we could compare our results to theirs, which could be a useful barometer. In the specific context of our effect on people taking existing priority paths, I think it’ll be interesting to compare the chosen career paths of people who discovered 80K through our website relative to those who discovered 80K from other sources.

Our larger area of focus at the moment is how to evaluate the effect of our work in the short term, when we can’t yet see our long-term effect on people’s careers. We plan on measuring proxies, such as changes to their values, beliefs and plans. We expect whatever proxy we use in the short term to be very noisy and based on a small sample size, so we plan on relying heavily on qualitative methods. This is one of the reasons we reached out to a lot of people who are experienced in this space (and we’re incredibly grateful they agreed to help) - we think their intuition is an invaluable proxy to figuring out if we’re heading in the right direction.

This is an area that we believe is important and we still have a lot of uncertainty about, so additional advice from people with significant experience in this domain would be highly appreciated.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-07T10:15:24.049Z · EA · GW

Hi dglid, I agree with your comment. I think there is a lot of value in making career guidance more available to the masses, even without 80K personally being involved.

I see local groups as being the primary type of organization responsible for this type of work - making EA information accessible and personalized for new people and communities. We don’t see ourselves taking over that role. That being said, we are interested in being involved in the process. We know there’s a lot of interest in creating content / tools / support in the career guidance space, both because we’ve seen it in EA Globals and group organizers’ groups, and also because we are group organizers ourselves, and it’s this need that has set us on this path (originally in our own local group).

All of this is to say - I think working with and empowering local EA groups to provide these services is a great way to improve careers at scale, and would especially love any feedback, requests and comments from local group organizers or anyone else on what you believe would be most helpful to you in this area.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-07T10:13:46.116Z · EA · GW

Hi Michael, as you mention - the issue of accurately defining our scope is still an important open question to us. I’m happy to share our current thinking about this, but we expect this thinking to evolve as we collect feedback and gain some more hands-on experience.

I think it’s worth making a distinction between two versions of this question. The first is the longer-term question of what is the set of all cause areas that should be within scope for this work. That’s a difficult question. At the moment, we’re happy to use the diversity of views meaningfully held in the EA community as a reasonable proxy - i.e. if there’s a non-negligible portion of EAs that believe a certain cause area is promising we think that’s worth investigating. As such, all three of the examples you mention would be potentially in-scope in my view. This is not, in and of itself, a cohesive and well-defined scope, and as I mentioned, it is likely to change. But I hope this gives at least an idea of the type of scope we’re thinking of.

The second version of this question is what we actually intend to work on in the upcoming months, given that we are just getting started and are still constrained in time and resources. This question will dominate our actual decisions for the foreseeable future. Within the large scope mentioned above, we want to initially focus on areas based on two criteria: first, unmet needs within the EA community, and second, cause areas that are easier to evaluate. Both of these are very weak signals for where we want to focus long-term, but they drastically influence how quickly we can experiment, evaluate whether we can provide significant value, and start answering some of our open questions. As a concrete example, we believe Global Health & Development fits this bill quite well, and so at least some of our first career path profiles will be in this space.

I hope this helps clarify some of these questions. I apologize if there are more open questions here than answers - it’s just really important to us to experiment first and make long-term decisions about priorities and scope afterwards rather than the other way around.

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-07T10:02:31.313Z · EA · GW

That’s actually a great idea. I’ve now added a link from each clean doc to a commentable version. Feel free to either comment here, email us, or comment on the commentable version of the doc. Thanks!

Comment by sella on Introducing Probably Good: A New Career Guidance Organization · 2020-11-06T17:01:31.009Z · EA · GW

Great point Pablo.

I think the analogy to ImpactMatters is insightful and relevant, and indeed reaching a broader audience/scope (even at the cost of including less impactful career paths) is part of the justification for this work. I think the difference between inter-cause elasticity and intra-cause elasticity may be even larger when discussing careers, because in addition to people's priorities and values, many people have education, experience and skills which make it less likely (or less desirable) that they move to a completely different cause area.

I do, however, also want to highlight that I think there are justifications for this view beyond just a numbers game. As we discuss in our overview and in our core principles, we think there are disagreements within EA that warrant some agnosticism and uncertainty. One example is the more empiricist view, which focuses on measurable interventions and is skeptical of speculative work that cannot be easily evaluated or validated, vs. the more hits-based approach, which focuses on interventions that are less certain but are estimated to have orders of magnitude more impact in expectation. These views are (arguably) at the crux of comparisons between top cause areas that are a core part of the EA community (e.g. global poverty & health vs. existential risk mitigation). For many people working in each of these cause areas, we genuinely believe careers within their field are the most promising thing they could do.

Additionally, we believe that broader career advice is not only useful in optimizing the impact of those who would not choose top priority paths, but may actually lead to more people joining top priority paths in the focus areas of existing career orgs in the long run. As we mention in our overview and in our paths to impact, and based on our experience in career guidance so far, we believe that providing people answers to the questions they already care about, while discussing crucial considerations they might not often think about, is a great way to expose people to impact maximization principles. Our hope is that even if we cared exclusively about top priority paths already researched by 80K and others, this organization would end up having a net positive effect on the number of people who pursue these paths. Whether this will be the case, of course, remains to be seen - but we intend to measure and evaluate this question as one of our core questions moving forward.

Comment by sella on We're Lincoln Quirk & Ben Kuhn from Wave, AMA! · 2020-10-28T14:49:14.558Z · EA · GW

Thank you both for your thoughtful answers.

To clarify, I don't have a strong opinion on this comparison myself, and would love to hear more points of view on this. Sadly I'm not aware of any reading materials on this topic, but have heard the following arguments made in one on one conversations:

  1. For-profit entrepreneurship has built-in incentives that already cause many entrepreneurs to try to implement any promising opportunities. As a result, we'd expect it to be drastically less neglected, or at least drastically less neglected relative to nonprofit opportunities that are similarly promising. This can affect both our estimate of how many good opportunities we'd expect to find still lying around, and also how we'd estimate our counterfactual impact (as if we hadn't implemented a profitable intervention, there's a higher likelihood someone else would have).
  2. The specific cause areas that the EA movement currently sees as the most promising - including global poverty and health, animal welfare, and the longterm future - all serve recipients who (to different degrees) are incapable of significantly funding such work. This could be seen as directly related to the first point, but even if the first point is false, one could still argue that it just happens to be the case that the most promising cause areas are not a good fit for for-profit entrepreneurship.  I think the case here applies more strongly to animals and future people (who clearly can't pay for services), but to a lesser extent can also apply to the extremely poor who can pay only very little.
  3. For-profit organizations may create incentives that make it unlikely founders will make the decisions that end up producing enormous impact (in the EA sense of that term). One variation of this argument is that revenue/growth needs tend to come first (I can't do any good if I don't exist), which leaves little freedom to optimize for impact. Another variation argues that even if one could optimize for impact, these incentives, alongside the environment, can cause significant value drift, and many people following this path will end up not optimizing for impact.
  4. Finally, I've also heard from several people the claim that EA today has an immense amount of funding, and if you're a competent person founding a charity that works according to EA principles it is incredibly easy to get non-trivial amounts of funding. This is not necessarily an argument for nonprofits, but it potentially mitigates what is perhaps the strongest argument against them - access to capital. Somewhat like point 2, this is a circumstantial argument rather than an inherent one.

Finally, the fact that I listed arguments in favor of nonprofit entrepreneurship over for-profit entrepreneurship may give the impression that this is my opinion, so I want to clarify again that it is not and I am highly uncertain about this topic.

Comment by sella on We're Lincoln Quirk & Ben Kuhn from Wave, AMA! · 2020-10-27T19:51:19.729Z · EA · GW

Hi Lincoln and Ben, thanks for doing this! I would love to hear your perspective on the following topic:

Nonprofit entrepreneurship is a dominant career path within EA, with many people excited about the impact that it can achieve. Impact-focused for-profit entrepreneurship is rarely discussed or recommended by EA organizations, with a 2016 article about your startup being one of the only materials on this topic. I have also heard multiple people argue that for-profit entrepreneurship is an inherently less promising path than nonprofit entrepreneurship for various reasons.

What is your view on the value of for-profit entrepreneurship from an EA perspective? Do you believe this career path is undervalued by the EA community and its organizations today? If so, what do you believe people interested in for-profit entrepreneurship should do to found highly impactful organizations? Are there any specific opportunities you think are particularly interesting or exciting in this space?

Thanks in advance!

Comment by sella on EA Israel Strategy 2020-21 · 2020-10-02T09:43:55.708Z · EA · GW

Hi Brian, thanks for the kind words and the insightful feedback!

Here are my thoughts on the points you raise (not necessarily coordinated or representative of EA Israel in general):

1. I totally agree with your point about having separate metrics for proficiency with EA vs. engagement with EA Israel. In practice, our contributors and participants groups are actually some mix between these two metrics. For example, EAs professionally working in high-priority paths were often included in them (if they were interested) even if they were not actively engaging with EA Israel. All of this is to say - I don't think we're neglecting people who are proficient or involved in EA in practice, but this only strengthens the case for referencing these metrics explicitly.

2. I agree there is room for more long-term planning. To be honest, until not long ago it was unclear whether EA Israel would survive months, let alone years. It was only fairly recently, when Gidi started working part time (enabled by CEA's community building grant), that we've been able to organize and plan more meaningfully. We've started by defining this strategy, which focuses mostly on the upcoming year, and have not yet done meaningful longer-term planning (we will probably only do so after our new mode of operation settles a bit). Regarding many of your specific questions, such as the number of sub-groups and how those would be divided - I genuinely don't know, and prefer an experimental/empirical approach rather than trying to dictate our end goal from first principles.

3. I think the comment about student groups is a great point, and I want to share a few thoughts on it. Students are actually our top-priority audience, and a majority of our outreach efforts are focused on students (including a new academic course, a new fellowship, thesis consulting with Effective Thesis, and more). In fact, reading this comment I re-read the strategy doc and was surprised to find that students are not mentioned even once, given how much we prioritize and discuss outreach to students internally - we should definitely update the doc to reflect this. The other aspect of your comment concerns the specific framework for engaging students, of which formal student groups are one possibility. I am generally in favor of them, though I'm less certain whether this is the right way for us to get students involved. Formal student groups are less common and popular in Israel than in many other countries (see David's comment about some reasons why), so we're still thinking about how to onboard and organize students. We're going to try fellowships this year; there are also "cells", which are groups of students organizing to take action on a specific topic (e.g. sustainability, or various political causes). Long story short, engaging students is one of our top priorities, though we're not yet willing to commit to a specific framework for doing so.

4. I would roughly estimate that about 80% of our time is dedicated to community building & outreach, while only about 20% is dedicated to direct work projects. As for the optimal ratio, this is a bit challenging to phrase in a way that isn't misleading. As we mention in the strategy doc, a main driver of our involvement in direct work projects is new members' interest in doing them, and the fact that we think it's a good way to get people more involved. Since one of our top priorities is to grow the pipeline of people becoming deeply involved, and (given our strategy) this would mean more people working on direct work projects (at least in the early stages of their involvement), I would love to see the percentage of direct work go up. However, I would not want people working on community building & outreach to switch to working on direct work projects.

Thanks again, and please keep the questions and feedback coming!