Posts

Does participation in Effective Altruism predict success when applying to CEA? 2022-09-27T18:34:13.851Z
[Job] Project Manager: Community Health 2022-09-10T18:40:08.667Z
[TikTok] Comparability between suffering and happiness 2022-09-01T16:31:19.771Z
Product Managers: the EA Forum Needs You 2022-06-22T16:22:19.777Z
You should join an EA organization with too many employees 2022-05-21T19:38:21.160Z
EA Tours of Service 2022-05-10T17:03:37.510Z
Why CEA Online doesn’t outsource more work to non-EA freelancers 2022-05-04T14:52:04.769Z
Is it still hard to get a job in EA? Insights from CEA’s recruitment data 2022-04-29T17:39:50.392Z
Comparative advantage does not mean doing the thing you're best at 2022-04-29T15:49:41.310Z
Advice on people management from EA Global 2022-04-17T15:26:23.333Z
TikTok DGB Giveaway Results 2022-04-15T09:51:56.978Z
Software Developers: Should you apply to work at CEA? 2022-03-16T13:19:21.930Z
Why CEA has stopped using Net Promoter Score 2022-02-03T15:43:20.629Z
Supporting Video, Audio, and other non-text media on the Forum 2021-12-19T18:58:15.488Z
Creating Individual Connections via the Forum 2021-11-30T22:48:17.195Z
EA Communication Project Ideas 2021-11-19T19:56:58.008Z
TikTok EA Book Giveaway Intermediate Results 2021-11-11T23:53:39.050Z
EA Forum engagement doubled in the last year 2021-11-04T10:44:33.247Z
[Creative Writing Contest] An AI Safety Limerick 2021-10-18T19:11:32.184Z
Ben_West's Shortform 2021-10-17T19:59:24.343Z
Does the Forum Prize lead people to write more posts? 2021-09-21T03:09:30.837Z
Who do intellectual prizewinners follow on Twitter? 2021-08-25T15:26:19.293Z
[PR FAQ] Improving tag notifications 2021-08-09T10:23:25.239Z
Our plan to share "PR FAQs" for new Forum features 2021-07-29T04:10:26.253Z
Some 2021 CEA Retention Statistics 2021-07-09T17:11:29.933Z
Thoughts on being overqualified for EA positions 2021-04-30T03:19:41.186Z
[Job Ad] Help us make this Forum better 2021-03-25T23:23:26.801Z
Layman’s Summary of Resolving Pascalian Decision Problems with Stochastic Dominance 2021-03-12T03:51:24.215Z
Retention in EA - Part II: Possible Projects 2021-02-05T19:09:31.361Z
Retention in EA - Part I: Survey Data 2021-02-05T19:09:18.450Z
Retention in EA - Part III: Retention Comparisons 2021-02-05T19:02:05.324Z
EA Group Organizer Career Paths Outside of EA 2020-07-14T23:44:10.799Z
Are there robustly good and disputable leadership practices? 2020-03-19T01:46:38.484Z
Harsanyi's simple “proof” of utilitarianism 2020-02-20T15:27:33.621Z
Quote from Strangers Drowning 2019-12-23T03:49:51.205Z
Peaceful protester/armed police pictures 2019-12-22T20:59:29.991Z
How frequently do ACE and Open Phil agree about animal charities? 2019-12-17T23:56:09.987Z
Summary of Core Feedback Collected by CEA in Spring/Summer 2019 2019-11-06T11:52:48.390Z
EA Art: Neural Style Transfer Portraits 2019-10-03T01:37:30.703Z
Is pain just a signal to enlist altruists? 2019-10-01T21:25:44.392Z
Ways Frugality Increases Productivity 2019-06-25T21:06:19.014Z
What is the Impact of Beyond Meat? 2019-05-03T23:31:40.123Z
Identifying Talent without Credentialing In EA 2019-03-11T22:33:28.070Z
Deliberate Performance in People Management 2017-11-25T14:41:00.477Z
An Argument for Why the Future May Be Good 2017-07-19T22:03:17.393Z
Vote Pairing is a Cost-Effective Political Intervention 2017-02-26T13:54:21.430Z
Ben's expenses in 2016 2017-01-29T16:07:28.405Z
Voter Registration As an EA Group Meetup Activity 2016-09-16T15:28:46.898Z
You are a Lottery Ticket 2015-05-10T22:41:51.353Z
Earning to Give: Programming Language Choice 2015-04-05T15:45:49.192Z

Comments

Comment by Ben_West on To those who contributed to Carrick Flynn in Oregon CD-6 - Please help now · 2022-10-04T01:31:18.827Z · EA · GW

a principled distinction between the two isn't apparent to me.

Indeed, there was an explicit tit-for-tat donation of out-of-state money to Salinas intended to offset donations to Flynn:

Bold PAC, the Congressional Hispanic Caucus (CHC) campaign arm, is preparing the seven-figure independent expenditure in favor of its endorsed candidate, state Rep. Andrea Salinas (D), in the race for Oregon’s newly created 6th District.

... The new investment is a tit-for-tat blow against the Democratic leadership PAC, House Majority PAC (HMP), which earlier this month pledged a similar investment to prop up political newcomer Carrick Flynn in that race.

Comment by Ben_West on I'm interviewing prolific AI safety researcher Richard Ngo (now at OpenAI and previously DeepMind). What should I ask him? · 2022-10-03T15:01:07.403Z · EA · GW

What does he think about rowing versus steering in AI safety? I.e., does he think we are basically going in the right direction and we just need to do more of it, or do we need to do more thinking about the direction in which we are heading?

Comment by Ben_West on Yonatan Cale's Shortform · 2022-09-28T12:44:36.599Z · EA · GW

Grantmakers are welcome to ask me for a reference. Yonatan is aligned and very dedicated, and is both knowledgeable about and helpful to many software engineers (see reviews here). He's also been directly helpful to us with recruiting, and I've referred him to multiple EA orgs that are trying to hire software engineers.

Comment by Ben_West on On how various plans miss the hard bits of the alignment challenge · 2022-09-23T20:19:54.681Z · EA · GW

Thanks! That sounds right to me, but I had thought that Nate was making a stronger objection, something like "looking at nonhuman brains is useless because you could have a perfect understanding of a chimpanzee brain but still completely fail to predict human behavior (after a 'sharp left turn')."

Is that wrong? Or is he just saying something like "looking at nonhuman brains is 90% less effective, and given long enough timelines these research projects will pan out – I just don't expect us to have long enough timelines?"

Comment by Ben_West on [deleted post] 2022-09-19T17:10:37.756Z

Also, this is a remarkably unhelpful graph. Like, you have to genuinely put effort in to make data this hard to understand.

Comment by Ben_West on [deleted post] 2022-09-19T17:09:44.995Z

It's interesting to see how strongly typed languages have taken over the mindshare of engineers: https://insights.stackoverflow.com/survey/2021#most-loved-dreaded-and-wanted-language-love-dread

Comment by Ben_West on Ben_West's Shortform · 2022-09-16T23:45:40.723Z · EA · GW

Democratizing risk post update
Earlier this week, a post was published criticizing "Democratizing Risk". This post was deleted by its (anonymous) author. The forum moderation team did not ask them to delete it, nor are we aware of their reasons for doing so.
We are investigating some likely Forum policy violations, however, and will clarify the situation as soon as possible.

Comment by Ben_West on EA Forum feature suggestion thread · 2022-09-14T15:54:31.059Z · EA · GW

Thanks! I've added this to the issue tracking your original suggestion.

Comment by Ben_West on EA Forum feature suggestion thread · 2022-09-14T15:50:13.434Z · EA · GW

Thanks for the suggestion! I passed this on to our events team.

Comment by Ben_West on Who's hiring? (May-September 2022) · 2022-09-10T18:35:51.765Z · EA · GW

CEA's Community Health team is hiring a project manager:

With recent media attention, increased funding, and growing ambition among community members, this is one of the most exciting times for EA, but also one of the riskiest. We need an ops-minded generalist to help address these risks, both through end-to-end ownership of targeted projects and by building broader systems and processes to support other team members.

Example projects you might own:

  1. Create and manage a fund for community members’ physical and mental health
  2. Categorize the EA community into useful segments and conduct interviews to discover and understand their challenges, then summarize this feedback for stakeholders
  3. Organize a retreat for team members to discuss and work on key problems in the EA community

If this sounds like you, please apply!

Comment by Ben_West on On how various plans miss the hard bits of the alignment challenge · 2022-09-09T17:16:37.222Z · EA · GW

You maintain this pretty well as it walks up through to primate, and then suddenly it takes a sharp left turn and invents its own internal language and a bunch of abstract concepts, and suddenly you find your visualization tools to be quite lacking for interpreting its abstract mathematical reasoning about topology or whatever.

Empirically speaking, scientists who are trying to understand human brains do spend a lot (most?) of their time looking at nonhuman brains, no?

Is Nate's objection here something like "human neuroscience is not at the level where we deal with 'sharp left turn' stuff, and I expect that once neuroscientists can understand chimpanzee brains very well they will discover that there is in fact a whole other set of problems they need to solve to understand human brains, and that this other set of problems is actually the harder one?"

Comment by Ben_West on How might we align transformative AI if it’s developed very soon? · 2022-09-08T00:21:51.141Z · EA · GW

Is anyone working on detecting symmetric persuasion capabilities? Does it go by another name? Searches here and on LessWrong don't turn up much.

Comment by Ben_West on Would creating and burying a series of doomsday chests to reboot civilization be a worthy use of resources? · 2022-09-07T03:36:15.230Z · EA · GW

This idea was discussed in some depth on this podcast: https://80000hours.org/podcast/episodes/lewis-dartnell-getting-humanity-to-bounce-back-faster/

Comment by Ben_West on Celebrations and gratitude thread · 2022-09-02T16:57:38.876Z · EA · GW

The earning to give company I started got acquired.

Comment by Ben_West on Celebrations and gratitude thread · 2022-09-02T16:50:42.012Z · EA · GW

And thanks to everyone who reads!

(Our target was a 50% increase. We did quite a bit better.)

Comment by Ben_West on Celebrations and gratitude thread · 2022-09-02T16:48:16.813Z · EA · GW

This forum has taken off over the past year. Thanks to all the post authors who have dedicated so much time to writing content for us to read!

Number of posts per day has ~4x'd in the last year

Comment by Ben_West on Brain preservation to prevent involuntary death: a possible cause area · 2022-09-02T16:26:18.796Z · EA · GW

Thanks for writing this! I'm interested in brain preservation, and thought maybe this area would be competitive with existential risk stuff given certain philosophical pre-commitments, but now I'm actually not sure.

Even on a person-affecting view, I'm not sure this is cost-competitive with x-risk. If there are ~8 billion people who would be alive at the time of a human extinction event, then reducing the risk of extinction by 0.01% would save 800,000 of them in expectation. Doing this might cost $100M-$1B, for a cost of $125-$1,250 per person.

This is cheaper than I think even fairly optimistic estimates of the cost of brain preservation.
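
As a minimal sketch of that arithmetic (all inputs are the rough assumptions above, not established estimates):

```python
# Back-of-the-envelope cost per life saved, using the assumptions above.
population = 8_000_000_000   # people alive at the time of an extinction event
risk_reduction = 0.0001      # an absolute 0.01% reduction in extinction risk
expected_lives_saved = population * risk_reduction  # 800,000 in expectation

for cost in (100e6, 1e9):    # assumed $100M-$1B program cost
    print(f"${cost:,.0f} -> ${cost / expected_lives_saved:,.0f} per person")
# Prints $125 and $1,250 per person, matching the range above.
```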

Caveats:

  1. This analysis assumes that the people who survive an extinction event would go on to have a similar quantity and quality of life as those who have been preserved. I think this is a reasonable assumption if the extinction event is AI-singularity-shaped, but not if it's something like a pandemic.
  2. Certain methods of reducing extinction risk (e.g. civilizational refuges) still result in almost everyone dying, so the cost-effectiveness of x-risk reduction on person-affecting grounds is probably lower than what I'm assuming above.
  3. These caveats might get you the extra 10-100x you need to become cost competitive, but I'm not sure, and even then you're only getting to cost competitiveness, not being much better.

Comment by Ben_West on Ben_West's Shortform · 2022-09-02T03:25:43.701Z · EA · GW

Person-affecting longtermism

This post points out that brain preservation (cryonics) is potentially quite cheap on a $/QALY basis because people who are reanimated will potentially live for a very long time with very high quality of life.

It seems reasonable to assume that reanimated people would funge against future persons, so I'm not sure this is persuasive for those who don't adopt person-affecting views, but for those who do, it's plausibly very cost-effective.

This is interesting because I don't hear much about person-affecting longtermist causes.

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-02T00:04:44.159Z · EA · GW

Thanks! I appreciate the detailed feedback.

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-01T22:55:58.546Z · EA · GW

I am not sure, and I think this points to a good objection to my suggestion.

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-01T22:12:52.127Z · EA · GW

This is a good point; I agree.

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-01T19:24:36.742Z · EA · GW

Thanks for the question!

I'm merely claiming that statements like "one unit of suffering is worth two units of happiness" are coherent because "unit" can be defined in reference to particles. I'm not claiming that any particular ratio is correct, only that it's coherent to talk about ratios other than 1:1.
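
One minimal way to make the coherence claim formal (the notation here is illustrative, not from the original discussion): define net welfare as

$$W = H - r\,S$$

where $H$ and $S$ are totals of happiness and suffering measured in the same physical unit, and $r$ is the exchange ratio. $r = 1$ gives the symmetric view, while $r = 2$ says one unit of suffering offsets two units of happiness; any positive $r$ yields a coherent view, which is all the claim requires.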

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-01T17:59:03.295Z · EA · GW

Note: I couldn't find anyone making this specific response to Toby anywhere, but the basic idea has been around for quite some time, maybe originating in this 2012 blog post from Carl.

Comment by Ben_West on [TikTok] Comparability between suffering and happiness · 2022-09-01T16:37:08.397Z · EA · GW

Note: TikTok embeds don't seem to work fully here – if you can't see the video, try refreshing the page.

Comment by Ben_West on On longtermism, Bayesianism, and the doomsday argument · 2022-09-01T16:35:35.716Z · EA · GW

Thanks for writing this! If you're interested, there has been a decent amount of discussion about this on the forum, largely under the banner of whether we are living at the "hinge of history". E.g. in Will's post introducing the phrase, he gives what he calls the "outside view" argument, which is similar to what you call the Doomsday Argument.

Comment by Ben_West on Earn To Give $1M/year or Work Directly? · 2022-08-30T05:33:50.332Z · EA · GW

I was interpreting the question to mean that you would donate $1M (and the amount you are making but not donating is left unspecified).

Comment by Ben_West on Earn To Give $1M/year or Work Directly? · 2022-08-29T23:57:53.898Z · EA · GW

Ben Millwood's summary seems basically right to me.

And yup, the team has tripled over the past year, but scaling the team further is a top priority, hence my interest in getting people to apply :)

Comment by Ben_West on How to do theoretical research, a personal perspective · 2022-08-19T23:51:56.547Z · EA · GW

Thanks, this was helpful. I often hear from people who don't really know what it means to do certain types of theoretical research, and I think this will go a decent way towards addressing their uncertainty.

Comment by Ben_West on Interactively Visualizing X-Risk · 2022-07-29T19:23:55.725Z · EA · GW

This is cool – thanks for doing this.

Comment by Ben_West on Leaning into EA Disillusionment · 2022-07-23T23:06:28.711Z · EA · GW

That makes sense, thanks!

Comment by Ben_West on Leaning into EA Disillusionment · 2022-07-22T20:33:29.988Z · EA · GW

Thanks for sharing this! Do you have a sense for what the denominator is? I've previously tried to get some sense of this, and found it pretty challenging (mostly for obvious reasons like "people who have left EA are by definition harder for me to contact").

I'm guessing 3-5 people is like 1 in 50 of the EAs you know, over the course of a ~decade?

Comment by Ben_West on Leaning into EA Disillusionment · 2022-07-22T20:31:58.863Z · EA · GW

Do you have a sense for how common this kind of disillusionment is? I've previously tried to get some sense of this, and found it pretty challenging (mostly for obvious reasons like "people who have left EA are by definition harder for me to contact").

You say you know a handful of people for whom this is true, which I'm guessing is like 1 in 50 of the EAs you know, over the course of a decade?

Comment by Ben_West on If you're unhappy, consider leaving · 2022-07-22T20:23:45.693Z · EA · GW

Thanks for considering this. It does seem possible to me that more people should change how they interact with EA.

I feel a little surprised by you describing this as "leaving" though – I think basically everyone I know in EA has "left", by your definition. Do you have a different experience?

E.g. I personally meet all three of your criteria: my friends are mostly non-EAs; even though I work at an EA organization, I don't rely on it for income; and my sense of self-worth is not wholly tied to impact. But I would feel pretty surprised if someone described me as having left EA?

Comment by Ben_West on Is it still hard to get a job in EA? Insights from CEA’s recruitment data · 2022-07-20T20:21:39.876Z · EA · GW

Hmm, if we are still talking about comparing CEA versus Ashby, I'm not sure this carves reality at the joints: it's certainly true that people with zero experience have an uphill battle getting hired, but I don't think CEA is unusual in this regard. (If anything, I would guess that we are more open to people with limited experience.)

Comment by Ben_West on Is it still hard to get a job in EA? Insights from CEA’s recruitment data · 2022-07-18T13:12:12.439Z · EA · GW

the thing I find confusing is that you “didn't have particularly strong opinions about whether EA jobs are still hard to get.”... So I don’t really understand why you present a lot of data that all points the same way, yet remain unconvinced by the conclusion they lead to.

I think I'm largely like "bruh, literally zero of our product manager finalist candidates had ever had the title 'product manager' before, how could we possibly be more selective than Ashby?"[1]

Some other data points:

  1. When I reach out to people who seem like good fits, they often decline to apply, meaning that they don't even get into the data set evaluated here
  2. When I asked some people who are well-connected to PMs to pass the job on to others they know, they declined to do so because they thought the PMs they knew would be so unlikely to want it that it wasn't worth even asking

I acknowledge that, if you rely 100% on the data set presented here, maybe you will come to a different conclusion, but I really just don't think that data set is that compelling.

  1. ^

    As mentioned, our candidates are impressive in other ways, and maybe they are more impressive than the average Ashby candidate overall, but I just don't think we have the evidence to confidently say that.

Comment by Ben_West on Is it still hard to get a job in EA? Insights from CEA’s recruitment data · 2022-07-16T02:41:55.417Z · EA · GW

offer rate is more relevant to selectivity (if you disagree, could you explain why?)

I think it's pretty uncontroversial that our applicants are more dedicated (i.e. more likely to accept an offer). My understanding of Ashby is that it's used by a bunch of random tech recruiting agencies, and I would guess that their applicants have ~0 pre-existing excitement about the companies they get sent to. 

I don’t see any metrics that suggest the opposite is true, or even that it’s a close call or ambiguous in any way.

The statement in the post is "CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large". This seems consistent with the view that CEA is selective? (It also just implies that Ashby is selective, which is a reasonable thing to believe.[1])

--

As a meta point: I kind of get the sense that you feel that this post is intended to be polemical, like we are trying to convince people that CEA isn't selective or something. But as you originally said: "the authors don’t seem to take an explicit stance on the issue" – we just wanted to share some statistics about our hiring and, at least as evidenced by that first comment of yours, we were somewhat successful in conveying that we didn't have particularly strong opinions about whether EA jobs are still hard to get.

This post was intended to provide some statistics about our hiring, because we were collecting them for internal purposes anyway so I figured we might as well publish. We threw in the Ashby thing at the end because it was an easily accessible data point, but to be honest I kind of regret doing that – I'm not sure the comparison was useful for many people, and it caused confusion.

  1. ^

    It sounds to me like you think Ashby is selective: "the Ashby benchmark (which itself likely captures selective jobs)."

Comment by Ben_West on Is it still hard to get a job in EA? Insights from CEA’s recruitment data · 2022-07-15T16:53:10.104Z · EA · GW

Thanks – yeah, sorry: there is a larger percentage drop-off for Ashby at the on-site -> hired stage, but because we start with a smaller pool we are still more selective. 1 in 7 versus 1 in 5 is the correct comparison.

This data shows clear, consistent, and large differences all suggesting that CEA is much more selective than the industry benchmark

I guess I'm flattered that you trust the research we did here so much, but I think it's very much not clear:

  1. The number of applicants we get is very heavily influenced by how widely we promote the position, whether the job happens to get posted to a job aggregator site, etc. To take a concrete example: six months ago we hired for a PM and got 52 applicants; last month we opened another PM position which got onto some non-EA job boards and received 113 applicants. If we hire one person from each round, I think you will say that we have become more than twice as selective, which is I guess kind of true, but our hiring bar hasn't really changed (the person we hired last time would be a top candidate this time). See the sketch after this list.
  2. I don't really know what Ashby's candidate pool is like, but I would guess their average applicant has more experience than ours. For example: none of our final candidates last round had ever held the job title 'product manager' before (though they had held related roles), and in the current round neither of the two people furthest along in the process has ever had a PM role. I would be pretty surprised if Ashby's final rounds were consistently made up of people who had never been PMs before.[1]
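
To make the point in item 1 concrete, here is a minimal sketch (the applicant counts are the real figures quoted above; the one-hire-per-round outcome is the hypothetical posed there):

```python
# Naive selectivity (hires / applicants) for two PM rounds, assuming one
# hire per round as in the hypothetical above.
rounds = {"six months ago": 52, "last month": 113}
hires_per_round = 1  # hypothetical outcome, not actual hiring data

for label, applicants in rounds.items():
    print(f"{label}: 1 hire per {applicants // hires_per_round} applicants")
# Measured "selectivity" more than doubles (1 in 52 vs. 1 in 113), even
# though the hiring bar (who would actually get hired) is unchanged.
```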

The conclusion of this post was "Overall, CEA might be slightly more selective than Ashby’s customers, but it does not seem like the difference is large" and that still seems basically right to me: 1/7 vs. 1/5 is more selective, but well within the margin of error given how much uncertainty I have.

I think the OP’s summary of the industry benchmark exercise is extremely misleading

Thanks – I just cut that sentence since my inability to communicate my view even with our substantial back-and-forth makes me pessimistic about making a summary.

  1. ^

    In general, I would guess that CEA's applicants have substantially less experience than their for-profit counterparts, as EA is quite young, but our applicants are more impressive given their age. E.g. we get a lot of college student applicants, but those students are going to prestigious universities.

Comment by Ben_West on Co-Creation of the Library of Effective Altruism [Information Design] (1/2) · 2022-07-14T15:24:53.830Z · EA · GW

Some thoughts:

  1. Entrepreneurship and strategy: 
    1. I think The Lean Startup seems better than everything there.
    2. If you're looking for something in particular to drop, maybe Good Strategy, Bad Strategy?
  2. Change of Heart: I don't think this has survived the replication crisis well; I would drop it.
  3. Why We Love Dogs, Eat Pigs, and Wear Cows: I didn't think this was a very epistemically rigorous book; I would drop it.

Comment by Ben_West on Is it still hard to get a job in EA? Insights from CEA’s recruitment data · 2022-07-12T19:59:23.085Z · EA · GW

Sorry for my slow response here, I missed the notification about your comment.

If EOIs are hard to get, that seems relevant to the question of whether EA jobs are hard to get since EOIs are quite sought after

I think maybe we just didn't explain what EOIs are well. As an example: we had a product manager EOI; once we opened a full hiring round for PMs, we contacted all the people who had filled out the EOI and said "hey, are you still looking for a PM position?" and then moved the ones who said "yes" into the PM hiring round.[1]

I’m not sure why Ben thinks hires as a “percent of applicants who get to the people ops interview stage” (the only stage where CEA is more likely to hire, and not an apples-to-apples comparison since CEA has a work trial before it and Ashby doesn’t) is the right metric

My conclusion was: "in some ways CEA is more selective, and in other ways we are less; I think the methodology we used isn't precise enough to make a stronger statement than 'we are about the same.'"

I don't think any one of these comparison points is the "right metric" – they all have varying degrees of usefulness, and you and I might disagree a bit about their relative value, but, given their contradictory conclusions, I don't think you can draw strong conclusions other than "we are about the same".

  1. ^

    Sometimes exceptional candidates are hired straight from an EOI; the example I give is specific to that role. I think in retrospect we should have just left EOIs off, as the data was more confusing than helpful.

Comment by Ben_West on The Future Might Not Be So Great · 2022-07-06T02:21:56.736Z · EA · GW

The thing I have most changed my mind about since writing the post of mine you cite is adjacent to the "disvalue through evolution" category: basically, I've become more worried that disvalue is instrumentally useful. E.g. maybe the most efficient paperclip maximizer is one that's really sad about the lack of paperclips.

There's some old writing on this by Carl Shulman and Brian Tomasik; I would be excited for someone to do a more thorough write-up/literature review for the red teaming contest (or just in general).

Comment by Ben_West on The Future Might Not Be So Great · 2022-07-06T02:09:43.383Z · EA · GW

My anecdata is also that most people have thought about it somewhat, and "maybe it's okay if everyone dies" is one of the more common initial responses I've heard to existential risk.

But I agree with OP that I more regularly hear "people are worried about negative outcomes just because they themselves are depressed" than "people assume positive outcomes just because they themselves are manic" (or some other cognitive bias).

Comment by Ben_West on CEA’s operations team is growing quickly! You might be a good fit · 2022-07-05T23:23:55.488Z · EA · GW

We currently employ over 60 staff and we expect that to grow to 110 in 2022.

The "we" in the sentence is CEA, not the ops team, right? I might suggest clarifying that in the post, as the rest of the post uses "we" to refer to the ops team.

Comment by Ben_West on Let's not have a separate "community building" camp · 2022-07-05T20:49:57.770Z · EA · GW

Thanks! I spend a fair amount of time reading technical papers, including the things you mentioned, mostly because I spend a lot of time on airplanes and this is a vaguely productive thing I can do there – but honestly this mostly just results in me being better able to make TikToks about obscure theorems.

Maybe my confusion is: when you say "participate in object level discussions" you mean less "be able to find the flaw in the proof of some theorem" and more "be able to state what's holding us back from having more/better theorems"? That seems more compelling to me.

Comment by Ben_West on Let's not have a separate "community building" camp · 2022-07-05T20:26:55.097Z · EA · GW

Cool, yeah that split makes sense to me. I had originally assumed that "talking to people about models of whether ELK helps anything" would fall into a "community building track," but upon rereading your post more closely I don't think that was the intended interpretation.[1]

FWIW the "only one track" model doesn't perfectly map to my intuition here. E.g. the founders of DoorDash spent time using their own app as delivery drivers, and that experience was probably quite useful for them, but I still think it's fair to describe them as being on the "create a delivery app" track rather than the "be a delivery driver" track.

I read you as making an analogous suggestion for EA community builders, and I would describe that as being "super customer focused" or something, rather than having only one "track".

  1. ^

    You say "obsessing over the details of what's needed in direct work," and talking to experts definitely seems like an activity that falls in that category.

Comment by Ben_West on Let's not have a separate "community building" camp · 2022-07-01T18:14:17.969Z · EA · GW

I would find it helpful to have more precision about what it means to "participate more in object level discussion".

For example: did you think that I/the forum was more impactful after I spent a week doing ELK? If the answer is "no," is that because I need to be at the level of winning an ELK prize to see returns in my community building work? Or is it about the amount of time spent rather than my skill level (e.g. I would need to have spent a month rather than a week in order to see a return)?

Comment by Ben_West on The Future Might Not Be So Great · 2022-07-01T04:23:38.147Z · EA · GW

Minor technical comment: the links to subsections in the topmost table link to the Google docs version of the article, and I think it would be slightly nicer if they linked to the forum post version.

Comment by Ben_West on The inordinately slow spread of good AGI conversations in ML · 2022-06-30T18:24:03.130Z · EA · GW

Thanks for posting this! I thought the points were interesting, and I would have missed the conversation since I'm not very active on Twitter.

Comment by Ben_West on Product Managers: the EA Forum Needs You · 2022-06-27T00:55:17.096Z · EA · GW

Thanks! It's a new feature we are piloting – authors can turn it on in their settings.

We will have an announcement post about it coming out soon.

Comment by Ben_West on Product Managers: the EA Forum Needs You · 2022-06-24T15:02:15.800Z · EA · GW

Thanks! LessWrong is currently experimenting with multidimensional voting; if you haven't already, I would suggest trying that out and giving them feedback.

Comment by Ben_West on Product Managers: the EA Forum Needs You · 2022-06-24T14:35:21.392Z · EA · GW

Thanks!