Is pursuing EA entrepreneurship becoming more costly to the individual?
post by konrad
This is a question post.
This question grew out of a reaction I had to Rob Wiblin's "Consider a wider range of jobs, paths and problems if you want to improve the long-term future [EA · GW]".
While reading, one thing struck me as a question that merits more attention. I quote:
I suggest paying a bit more respect to the courage or initiative shown by those who choose to figure out their own unique path or otherwise do something different than those around them.
Given the entrepreneurial slant of EA culture, I worry that some people will end up concluding "we should celebrate risk-taking even more than we already do".
I wonder whether an EA who is encouraged by this might be more likely to underestimate how costly entrepreneurship is, and whether it is in fact becoming more costly to the individual?
answer by konrad
[Epistemic status: a patchwork of unsystematically tracked personal impressions since first encountering EA in 2014, noted down over the course of a work day]
So here's an attempt at partially explaining, from a historical perspective, why it might be getting more difficult to fulfil the necessary conditions to start independent EA projects without burning your "EA" career capital (and why that might have been different in the early days).
This perspective seems important if true because it would imply making more of an effort to update common EA career advice and culture.
It is only a partial answer, and I am sharing this because I would appreciate other perspectives.
With the increasing establishment of any field, such as EA, the likelihood of success of new projects decreases while the cost of failure for first-time founders increases. If you aren't yet part of one of the core groups that provide a safety net for their entrepreneurs, I would build career capital via safer paths first.
Trust-based networks scale badly
Networks provide access to their resources in a mostly trust-based manner. Resources are often allocated in somewhat formalized ways but still heavily reliant on personal references.
Verification of alignment is always costly
The ways in which a trust-based network can grow are limited: either through explicit entry criteria or through recommender-systems. Designing explicit entry criteria that work is hard, so our civilization mostly relies on opaque arbitration systems running on human brains.
EA organizations and thought leaders do an extraordinary job at documenting their thinking. But no matter how well you are aligned, verification will always remain fairly costly.
Core groups are increasingly hard to access
As an EA founder, you have to make verification of your alignment as cheap as possible. That's costly for a resource-strapped project in its starting phase. Thus, you want to get maximally relevant feedback - ideally from the gatekeepers of the network you want to join.
But if you are on the other side of things, as somebody with a lot of "EA capital", and get asked for feedback by someone you barely know, you are unlikely to engage. You don't have enough time to properly vet them and their project. Even being associated with it could suggest you endorse it, and you have better things to do than to create a nuanced write-up explaining your engagement and the lessons learned from it.
As a result, important EA people will not engage with cool new independent projects unless the founder has sufficient "EA capital" prior to going "off track".
Losses look worse relative to safe bets
Even with relevant feedback it is difficult to build up a successful project. And if you don't have much status, few will recognize a worthy attempt in case of failure, least of all the busy, important people.
Unless you are already highly skilled or have a lot of resources to bring your project to fruition, you're likely to fail at a stage that doesn't provide much information about your skill level. In that case, any failure provides at least a little bit of evidence that you are not good enough (assuming success is not pure luck).
As there are more and more smart youngsters, it gets more and more difficult not to be burnt by failure, simply because failure looks worse next to everyone else who is playing it safe or has succeeded.
EA used to be different because it was young
In the early days of EA, a handful of smart youngsters could start a bunch of projects because everyone knew everyone else. There was a lot of excitement about people exploring. The early entrepreneurs received status just by starting things - even if they weren't immediately successful.
Today, just starting a new project earns you less and less "EA capital". The network has grown and the space has been carved out. A larger network means more unknown people, so core groups are becoming more prudent about whom to extend their trust to.
This is not a bad thing; it's a sign of maturity. I just worry that it hasn't been made explicit often enough:
Nowadays you are less likely to get credit for trying, because it's harder to interact with the key nodes of the network: they now have to protect themselves more vigilantly against incurring significant opportunity and reputational costs.
If this were the dominant dynamic, entrepreneurship would not be worth going into for most people who currently self-identify as EAs. Especially the younger ones, let's say under 30. Even more so if being part of EA is already considered weird in your support circle. Not having a "proper" job might cost you too much social capital to have a larger impact later on in life (when you'd likely have most of your impact).
Given EA demographics, there are only a few highly skilled, wealthy, or well-supported people who can afford to resist these incentive structures. Most are better off continuing on the beaten career paths until they have accumulated enough capital not to lose status when taking risky bets, no matter how well calculated.
↑ comment by NunoSempere · 2021-03-12T19:27:55.871Z
This answer raises valuable points, but it rubbed me the wrong way. After thinking about it, I think it feels partisan because all of your points go in one direction; since the question isn't asking about a tautology, I'd expect there to be competing considerations.
Here are some competing considerations:
- Charity Entrepreneurship now exists, and makes entrepreneurship much, much easier. I think that this effect is stronger than any other effect. Note that they offer a stipend.
- I think you're confusing selection effects for environmental effects. As EA becomes larger, it will include people who are less hardcore, but for a given level of hardcoreness, it's unclear whether entrepreneurship is easier or harder. For example, Toby Ord pledged to donate everything he earned above £18,000, whereas EAs today seem at least a tad softer.
- Hits-based giving has become institutionalized and popularized. This makes, e.g., an organization like ALLFED possible. I think that your "Losses look worse relative to safe bets" point might be dependent on your specific social circle, and maybe moot throughout most of EA.
- As EA becomes larger, (entrepreneurial) specialization becomes possible. This counteracts the effect of low-hanging fruit having been picked, to some extent.
- There are more EAs, meaning that network effects are stronger.
- I don't think that "burning your EA career capital" is a dynamic I've seen much.
- Asking for feedback is still relatively easy, just by posting an idea on the EA forum.
- I also have some impressions based on my own experience, but I don't think that these generalize.
Overall, my bottom line is that I'm uncertain, though I'm assigning slightly higher probability to it being harder.
Note that this is a different question than whether we should "pay a bit more respect to the courage or initiative shown by those who choose to figure out their own unique path or otherwise do something different than those around them", which one could model as balancing the marginal disillusionment of those who try and fail, and the high expected value of those who succeed. Note that if we also cherish those who try and fail, we can sort of have our cake and eat it too.
↑ comment by tamgent · 2021-03-12T10:53:58.998Z
Thank you for sharing your analysis of what I also see as a major challenge for us to overcome (the challenge of EA entrepreneurship becoming more costly). I agree with many things in your answer, but strongly disagree with the conclusion or 'bottom line'. It seems very bleak, like giving up. Instead I think we should be creating better systems for mentorship and vetting. There are some initiatives trying to do things in this space, such as Charity Entrepreneurship and the longtermist incubator project. I am also excited about the new management and reform of EA Funds (see for example, this post on the ways in which EA Funds is more flexible than you might think [EA · GW]). To me, these are all positive signs that the ecosystem of mentorship and vetting is maturing a bit too. However, I think there is still a lot more work to be done in this area, and would like to see more initiatives (or better understand what those initiatives are bottlenecked on).
Also on your 'bottom line' - one does not need to choose necessarily between having a safe career and doing EA entrepreneurship. I'm doing both, and I think as long as you make bets that are proportional to feedback and have good contingencies, it can be done. Sometimes you do want to go 'all out' on an entrepreneurial venture, but you want to probably build up a track record and start with cheaper ventures first.
↑ comment by ryan_b · 2021-03-11T16:56:29.733Z
And if you don't have much status, no one will recognize a worthy attempt in case of failure.
This is a fairly harsh indictment of community norms. It directly implies there is nothing different about EA norms in this dimension relative to society at large, which is kind of a problem because there are well-known areas with superior norms: a well-conducted trial reflects well on lawyers even when they lose.
Doesn't make it wrong, naturally. But if true, it seems like it would definitely merit specific attention from the group.
answer by IanDavidMoss
I generally think that full-time social entrepreneurship (in the sense of being dependent on contributed income) early in one's career is quite risky and a bad idea for most people no matter what context or community you're talking about. I would say that, if anything, EA has made this proposition seem artificially attractive in recent years because of a) the unusual amount of money it's been able to attract to the cause during its first decade of existence and b) the high profile of a few outlier founders in the community who managed to defy the odds and become very successful. But the fundamental underlying reality is that it's really hard to scale anything without a self-sustaining business model, and without the promise of scale on the other side it's really hard to justify taking risks.
With that being said, I do think that risk-taking is really valuable to the community and EA is unusually well positioned to enable it without forcing founders to incur the kinds of costs you're talking about. One option, as tamgent mentioned in another comment, is to encourage entrepreneurship as a side project to be pursued alongside a job, full-time studies, or other major commitment. After all, that's how GiveWell, Giving What We Can, and 80,000 Hours all got started, and the lack of a single founder on the job full-time at the very beginning certainly didn't harm their growth. Another option, as EA Funds is now encouraging, is to make a point of generously funding time-limited experiments or short-term projects that provide R&D value for the community without necessarily setting back a founder or project manager in their career. Finally, EA funders could seek to form stronger relationships with funders outside of the community that are aligned on specific cause areas or other narrow points of interest to be better referral sources and advocates for projects that expect to require significant funds over an extended period.
But coming back to your core point, I would definitely encourage most EAs to pursue full-time employment outside of the EA community, even if they choose to stay within the social sector broadly. It's a vast, vast world out there, and all too easy to draw a misleading line from EA's genuinely impressive growth and reach to a wild overestimate of the share of relevant opportunities it represents for anyone trying to make the world a better place.
↑ comment by tamgent · 2021-03-12T15:00:32.770Z
I agree with much of this answer. However, I'm not sure it's the lack of promise of scale that makes projects not get funded, but rather other reasons [EA(p) · GW(p)]. I am also excited about EA Funds now encouraging time-limited all-in experiments.
↑ comment by IanDavidMoss · 2021-03-12T15:37:10.126Z
To clarify, when I wrote "without the promise of scale on the other side it's really hard to justify taking risks," I was speaking from the perspective of the founder pouring time and career capital into a project, not a funder deciding whether to fund it.
Comments
comment by Dicentra · 2021-03-10T22:06:32.203Z
Sorry to say I had difficulty parsing what you were trying to say in the post here.
↑ comment by konrad · 2021-03-11T08:13:23.327Z
Thanks for the feedback! I gave it another pass. Is there anything concrete that threw you off or still does? I'd appreciate pointers, as I had other people look at it before.
↑ comment by tamgent · 2021-03-12T11:05:15.156Z
Here are a few minor things I think you could modify for clarity:
Replace 'The sentence that made me think it's worth writing up a reaction was:' with 'From the article:'
Also, you repeat yourself at the end. The last two one-sentence paragraphs could just be one paragraph that says:
'Given the entrepreneurial slant of EA culture, I worry that some people will end up concluding "we should celebrate risk-taking even more than we already do". Isn't this dangerous career advice for the average EA?'