Is EA growing? Rather than speculating from anecdotes, I decided to collect some data. This is a continuation of the analysis started last year. For each trend, I collected the raw data and also highlighted in green where the highest point was reached (though this may be different from the period with the largest growth, depending on which derivative you are looking at). You can download the raw data behind these tables here.
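To make the derivative point concrete, here is a minimal sketch (with entirely hypothetical numbers, not any of the actual metrics below) of why the year with the highest level can differ from the year with the largest growth:

```python
def differences(series):
    """First differences of a list of yearly values."""
    return [b - a for a, b in zip(series, series[1:])]

yearly_totals = [100, 180, 300, 340]   # hypothetical yearly metric
growth = differences(yearly_totals)    # first derivative: [80, 120, 40]
acceleration = differences(growth)     # second derivative: [40, -80]

# The level peaks in the final year, but growth peaked a year earlier,
# and the second derivative shows growth decelerating.
```

This is the pattern several of the metrics below exhibit: record-high levels alongside a clearly declining rate of change.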
This year, I decided to separate growth stats into a few different categories, looking at how growth changes when we talk about people learning about EA through reading; increasing their commitment through joining a newsletter, joining a Facebook group, joining the EA Forum, or subscribing to a podcast; increasing engagement by committing -- self-identifying as EA on the EA Survey and/or taking a pledge; and having an impact by doing something, like donating or changing their careers.
When looking at this, it appears that there has been a decline in people searching for and discovering EA (at least in the ways we track), with the exception of 80,000 Hours pageviews and EA Reddit page subscriptions, which continued to grow but at a much slower pace. When we look at the rate of change, we can see a fairly clear decline across all metrics:
We can also see that when it comes to driving initial EA readership and engagement, 80,000 Hours is very clearly leading the pack while other sources of learning about EA are declining a bit:
In fact, the two sources of learning about EA that seem to best represent natural search -- Google interest and Wikipedia pageviews -- appear somewhat correlated and are now both declining together.
However, more people are engaging with EA in closer ways (what I termed “joining”) -- while growth in the EA Newsletter and 80K Newsletter has slowed down, the EA Facebook group is more active, the EA Reddit and total engagement from 80K continue to grow, and new avenues like Vox's Future Perfect and 80K's podcast have opened up. That said, this view of growth can change depending on which derivative you look at. Looking at the next derivative makes clear that there was a large explosion of interest in 2017 in the EA Reddit and the EA Newsletter that wasn’t repeated in 2018:
Additionally, Founders Pledge continues to grow and OFTW has had explosive growth, though GWWC has stalled out a bit. The EA Survey has also recovered from a sluggish 2017 to break records in 2018. Looking at the rate of change shows Founders Pledge clearly increasing, GWWC decreasing, and OFTW seeing fairly rapid growth in 2018 after a slowdown in 2017.
Lastly, the part we care about most seems to be doing the strongest -- while tracking the actual impact of the EA movement is really hard and very sensitive to outliers, nearly every doing/impact metric we do track was at its strongest in either 2017 or 2018, with only GiveWell and 80K seeing a slight decline in 2018 relative to 2017. However, looking at the actual rate of change shows a bleaker picture that we may be approaching a plateau.
Like last year, it still remains a bit difficult to infer broad trends given that a decline for one year might be the start of a true plateau or decline (as appears to be the case for GWWC) or may just be a one-time blip prior to a bounce back (as appears to be the case for the EA Survey).
Overall, the decline in people first discovering EA (reading) and the growth of donations / career changes (doing) makes sense, as it is likely the result of the intentional effort across several groups and individuals in EA over the past few years to focus on high-fidelity messaging and on growing the impact of pre-existing EAs, along with deliberate decisions to stop mass marketing, Facebook advertising, etc. The hope is that while this may bring in fewer total people, the people it does bring in will be much higher quality on average. Based on this, while EA is maybe not growing as fast as it could if we optimized for short-run growth, I’d provisionally conclude that EA is growing in exactly the way one would expect and intend it to. Additionally, clear growth in pledges and money raised from Founders Pledge, Effective Giving, and One For The World shows a potentially promising new path for future growth that could lead to many more donations in the future.
However, I don’t think we should take this at face value to assume the EA movement is safe from decline -- if fewer people are initially discovering EA, this could lead to much slower or reduced growth in impact a few years down the line as fewer people are growing the overall pie of EAs who can be counted on to have an impact in later years. Indeed, looking at the rate of change in these metrics shows a bleaker picture, with EA having gone through a critical acceleration period that has now mostly ended, potentially bringing about a future plateau in some of these statistics.
We’ll continue to monitor and see if these trends hold up in future years or if there emerge causes for concern. Also please feel free to mention other metrics we should consider adding for next year.
3 June 2019 - Founders Pledge information was originally given in thousands when it should have been in millions. Additionally, the totals given were slightly off (especially for 2015). This has now been corrected in the table, graphs, and downloadable CSV. This made the rate of change for Founders Pledge clearly positive, and as such I added a bit of optimism to the conclusion. Additionally, I amended FN18 to credit Callum for new data and explain lumpiness in the estimations.
3 June 2019 - FN13 specifying the EA Newsletter was updated with info from Aaron's comment. I also updated the conclusion slightly to explicitly mention discontinuing advertising campaigns as a deliberate reason for lower growth.
4 June 2019 - The highlighting for GiveWell's monthly unique visitors (excluding adwords) incorrectly identified 2016 as the year with the most visitors. That has been corrected to show 2015 as the year with the most visitors. (The underlying data was not wrong, just the highlighting.)
: See Google Trends data. These numbers are not search volumes -- they’re the mean relative “score” for that year, relative to the search volume for the highest day between January 2004 and the end of December 2018. Each volume number is the number as of the last day of December of the reported year.
: Data from 2016 and earlier was collected by Vipul Naik. Data for 2017 and after is available, but I am told that it would take too long to collect, so in the interest of publishing this post in a remotely timely manner, I will save collecting this data for next year.
: Vipul’s data only starts in mid-September 2014, so it seems most accurate not to count this year.
: These data come from the moderator panel for the subreddit, which I can access as a moderator. This panel is unfortunately not accessible to non-moderators, and it only goes back one year at a time.
: Because the Reddit moderator panel only goes back one year at a time, I have to use old data covering September 2016 to August 2017 as the estimate for 2017.
: These data are only slightly off -- they actually represent Jan 5, 2018 to Jan 4, 2019.
: These data come from asking Kelsey Piper, Vox Future Perfect staff writer.
: Both r/EffectiveAltruism and r/smartgiving have existed as EA subreddits since September 2012. r/smartgiving was the default EA subreddit until an intentional migration on 28 Feb 2016. I use r/smartgiving numbers for the 2014-2015 period and r/effectivealtruism numbers for all periods after that, to reflect the transition. Note that this growth will therefore involve some inherent double-counting, as people who were subscribed to r/smartgiving re-subscribed on r/effectivealtruism. Pageviews for Reddit were calculated via http://redditmetrics.com/.
: Data after 2016 comes from asking Aaron Gertler and is more reliable. Data from before 2016 comes from archived Rethink Charity data and is much more approximate. It should be noted that the large growth in 2017 came from a heavy Facebook advertising campaign that was not continued into 2018.
: As I’m a moderator of the EA Facebook group, I was able to collect these data from the moderator panel that comes with Facebook. This panel is not accessible to non-moderators. Unfortunately, I only have data going back to July 2017, when there were 8629 active users. At the end of 2018, there were 9104 active users.
: Data was only available going back to July 2017, so I fudge here by doubling that half-year number to estimate the full year.
: These data come from asking Julia Wise.
: These data come from Steve Hindi. Note that these data are for school years (thus the “2014” period here represents July 2013 to June 2014, etc.).
: These data come from asking Marie Paglinghi and Callum Calvert. Note that pledge totals may be a bit jumpy, as they can be sensitive to small changes in larger donors.
: Both 2015 and 2016 donations were recorded as of the 2017 EA Survey (as no EA Survey was run in 2016). This means that 2015 donations could be artificially low due to survivorship bias, if some donors didn’t fill out the EA Survey two years later. (Not to mention the fact that there are likely many donors who don’t fill out the EA Survey even one month later.)
: These data will be recorded in the forthcoming 2019 EA Survey.
: These data were collected via Vipul Naik. The data are preliminary and have not been completely vetted and normalized. Money from the Open Philanthropy Project is counted for the year in which the grant is announced, which may be different from the year the grant is decided or the year the grant money is actually disbursed. Note that this might make 2018 artificially low, as some 2018 grants may not yet be announced (or may have been announced but not recorded) as of the time of this writing.
: Note that, according to Julia, 2017 was the last year when a staff member sent repeated emails to people reminding them to record their donations and Julia suspects the lower number in 2018 is a result of that. Julia notes that in Spring 2019, they will resume recontacting people, so they will see if this increases reported 2018 donations.
: Note that this separation is mainly for illustrative purposes. While it may be tempting to arrange this into some sort of EA funnel, it is not quite that, as we don’t have any evidence to back up this categorization. In fact, reading the EA Forum may actually signify fairly deep engagement despite being categorized as reading, whereas being in the EA Survey panel is categorized as committing but could just describe a person who reads the EA Newsletter. Getting better metrics on EA engagement, as well as putting more effort into figuring out what the EA funnel may actually empirically consist of, is an ongoing project of the EA Survey.
: However, it appears that the large growth in 2018 EA Survey takers was primarily driven by more people taking the EA Survey from the EA Newsletter, where the EA Survey was placed a lot more prominently (receiving a dedicated email) than in prior years and where the EA Newsletter itself had just undergone large growth the year before (between the 2017 and 2018 EA Surveys). This would suggest that the EA Survey growth might stagnate or decline in the future as sources of people finding out about the EA Survey also stagnate.
: We’re considering adding data on pledges secured by Effective Giving, total pledge counts (not money) from Founders Pledge, total OpenPhil grants (amount + number of grants, and to more than just GiveWell), growth at the EA Hub, growth in traffic at effectivealtruism.org, sales of various EA books (e.g., Doing Good Better), metrics for the EA Forum (such as page views, total accounts, active users, total posts, total comments, total upvotes), and metrics around local groups.
This essay is a project of Rethink Priorities. It was written by Peter Hurford with graphs by Neil Dullaghan. Thanks to David Moss, Neil Dullaghan, Michael Trzesimiech, and Marcus A. Davis for comments. Thanks to Michael Trzesimiech for compiling the table into a downloadable CSV. Also, additional thanks to everyone who helped provide the underlying data collected for this post. If you like our work, please consider subscribing to our newsletter. You can see all our work to date here.
Hi Peter, thanks for such a detailed post.
I think there could be a misunderstanding in the Founders Pledge numbers, as their pledge numbers are growing very quickly and are significantly higher than in 2015. (They have also been making great efforts to increase the impact of deployment, including into the far future.)
Also, in the commitment section there is Effective Giving, which is relatively new and making significant progress.
These could point to a somewhat different picture in the commitment section to the one described above.
Perhaps we can review this and provide an update if necessary?
Hi Luke. I reached out to Marie at Founders Pledge and found that the numbers I originally used were meant to be in the millions, not thousands. I have corrected the post and sent it to Marie to review again to triple check.
I agree that the growth at Founders Pledge, OFTW, and Effective Giving sounds impressive and I'll make a note to follow up next year to see if that changes the narrative.
It's worth keeping in mind that some of these rows are 10 or 100 times more important than others.
The most important to me personally is Open Phil's grantmaking. I hadn't realised that the value of their non-GiveWell grants had declined from 2017 to 2018.
Fortunately if they keep up the pace they've set in 2019 so far, they'll hit a new record of $250m this year. In my personal opinion that alone will probably account for a large fraction of the flow of impact effective altruism is having right now.
Regarding EA Newsletter statistics: I didn't see this mentioned in the piece, but the heavy growth in late 2016 and early 2017 mostly happened because the team who were working on the Newsletter at the time advertised it heavily on Facebook. After the 2017 campaign ended, there wasn't any further advertising (as far as I'm aware).
The number of subscribers roughly tripled during this time, which translated to a 1.5-2x increase in the number of people opening the emails and clicking links (since new subscribers from FB ads weren't as interested in the content). I don't know whether the campaign stopped because the ads stopped working as well, or for some other reason.
I'm considering trying another advertising campaign at some point, but over the last few months, my focus (with the time I actually have to spend on newsletter work) has been on improving the quality of our content and improving our measurement (both for objective open-rate data and subjective "what's our impact?" data).
Anyway, the Newsletter's patterns of growth were heavily "hacked" (albeit not in a bad way) and shouldn't be taken as a measure of organic interest in effective altruism.
When I asked what has caused EA movement growth to slow down, people answered that it seemed likeliest EA made the choice to focus on fidelity instead of rapid growth. That is a vague response. What I took it to mean is:
EA as a community, led by EA-aligned organizations, chose to switch from prioritizing rapid growth to fidelity.
That this was a choice means that, presumably, EA could have continued to rapidly grow.
No limiting factor has been identified that is outside the control of the EA community regarding its growth.
EA could make the choice to grow once again.
Given this, nobody has identified a reason why EA can't just grow again if we so choose as a community by redirecting resources at our disposal. Since the outcome is under our control and subject to change, I think it's unwarranted to characterize the future as 'bleak'.
Great work Peter, thanks so much for doing this! Super helpful to be able to see all these numbers aggregated in the same place. And I love the categorization of the metrics. Strong upvote.
A couple of thoughts on metrics to include next year:
· I agree with Michelle’s comment that traffic for EA.org is an important metric to look at, especially since that’s the top result when people google EA. I’d be interested in both organic search web traffic and overall traffic (excluding paid traffic).
· In general, I think it’s most helpful to look at numbers excluding paid traffic to give a better sense of organic growth rates. As Aaron notes, this helps explain the EA newsletter trajectory, and it’d be interesting to see how excluding paid traffic might affect the 80k traffic numbers as well.
· Total operational spending by EA orgs could be a helpful perspective on how inputs to EA are changing over time; the current metrics are all focused on outputs, and it would be nice to relate the two.
Quick answer for 80k: Paid traffic only comes from our free Google Adwords, which is a fixed budget each month. Over the last year, about 12% of the traffic was paid, roughly 10,000-20,000 users per month. This isn't driving growth because the budget isn't growing.
In general, I think it’s most helpful to look at numbers excluding paid traffic to give a better sense of organic growth rates. As Aaron notes, this helps explain the EA newsletter trajectory, and it’d be interesting to see how excluding paid traffic might affect the 80k traffic numbers as well.
I think this is a good idea but I'm unsure if this data is available. I can try to make a better effort to find more data sources that do not include paid traffic for next year.
Total operational spending by EA orgs could be a helpful perspective on how inputs to EA are changing over time; the current metrics are all focused on outputs, and it would be nice to relate the two.
This is a metric I actually put some effort into collecting for this year - I tried to make a basket of orgs that have been around since ~2014 and made publicly accessible budgets, but it proved very time consuming to collect and I felt like it was potentially misleading due to orgs being excluded or orgs not publishing clear budgets.
My hope is that including "EA Funds payouts" could help capture some of this. One thing I really ought to have included but for some reason didn't think to is also total OpenPhil grants (not just to GiveWell or excluding GiveWell) as this may also capture some of the growth in the broader EA space.
Re: non-paid traffic, it should be very easy (a few minutes) to pull traffic data ex-adwords for any sites that are set up on Google Analytics. Excluding other types of paid traffic/conversions (e.g. newsletter signups driven by FB ads) would be harder (though generally doable).
One thing I really ought to have included but for some reason didn't think to is also total OpenPhil grants (not just to GiveWell or excluding GiveWell) as this may also capture some of the growth in the broader EA space.
Agree OpenPhil grants would be a helpful perspective on this, both total grants and grants within their EA focus area (which would be a proxy for meta investment)
Thanks for producing this Peter, it's very helpful. I sent you some metric data on the 80,000 Hours Podcast, but now that I've seen the post, I can give you the best numbers for the table. I would suggest putting these figures in instead.
The podcast only started halfway through 2017; I'm not sure how you want to handle that.
Those are the maximum number of subscribers recorded at any point in the year. It's probably a few % too high in both cases, but I've found that's the measure most robust to random measurement variations. The overestimation should also be constant year to year.
Podcast downloads/plays don't correspond to actual times people listened to a full episode. They include people pressing play but only listening for a few seconds; bots downloading the show; automatic downloads by podcasting software that are never actually listened to; and so on. So they're massive overestimates of the number of times an episode was listened to, say, halfway to completion. However, the overestimate is likely to be a fairly constant fraction year to year, so you can still make relative comparisons.
[EA Forum traffic] data for 2017 and after is available but I am told that it would take too long to collect, so in the interest of publishing this post in a remotely timely manner, I will save collecting this data to next year.
I’m very surprised that pulling this data is non-trivial; I would have guessed it would take <5 minutes to get from Google Analytics. Is the EA Forum still set up on Google Analytics (which is where the pre-2017 data came from)? If not, why not, and how do those managing the platform measure usage and engagement?
The prior version of the Forum (before ownership was transferred to CEA) wasn't set up for Analytics. We began tracking this data one month ago. The Forum currently gets ~3000 pageviews per day from ~1000 users.
Other data I could have provided to Peter if he had asked: New posts and comments are appearing at a much faster rate than before the new Forum launched.
For example, there were 26 posts in the month of September and 71 posts in February (not counting posts written by CEA staff). The average number of comments per post doesn't seem to have changed much, but the most commented-upon posts get many more comments than they used to (at least two posts in the last two months got more than 100, more than double the count of any post in September). The "71" number is pretty typical -- we've been getting between two and three new posts per day in almost every week since the new Forum launched in November.
I haven't yet taught myself enough SQL to get these numbers from our database, but I found them within 10 minutes by clicking "load more posts" a bunch of times on the "All Posts" page. If anyone else does the same, they'll also see the dramatic uptick in new posts after the new Forum launched. (Though this doesn't mean that EA is more popular now -- the new interface and extra promotion are probably most of the reason we've seen more posts, rather than an increase in the number of people who care about the movement.)
Peter didn't request this kind of data; he only asked for "page views and [the] number of newly created accounts".
As far as I know, we don't have Analytics data for page views. Getting account creation data back to 2014 should be doable, but we didn't have time to get to that request before Peter published the post.
We are interested in this data for our own purposes and may collect it at some point; I offered to let Peter know if we did so he could update the post, and he replied positively to the offer.
I generated a graph of the number of EA Forum posts per year, as well as the number of new user registrations. I extracted the data using the GraphQL API.
The raw JSON data for all posts is here. I had to split the user data into two files due to upload limits. The raw JSON data for all unbanned (but otherwise unfiltered) users is here. The JSON data for all banned users is here.
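For anyone who wants to reproduce the posts-per-year counts from those JSON dumps, a minimal sketch of the aggregation might look like the following (the field name `postedAt` is an assumption about the export format; adjust it to whatever date field the dump actually uses):

```python
from collections import Counter

def posts_per_year(records, date_field="postedAt"):
    """Count records by the year prefix of an ISO-8601 date field."""
    years = Counter()
    for rec in records:
        date = rec.get(date_field)
        if date:                      # skip records missing the field
            years[date[:4]] += 1
    return dict(sorted(years.items()))

# Hypothetical usage with a downloaded dump:
# import json
# with open("posts.json") as f:
#     print(posts_per_year(json.load(f)))
```

The same function works for the user files by pointing `date_field` at the account-creation date.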
Two interesting questions that bounce off this are "how many members does EA have?" (obviously this is somewhat vague) and "how many members would be optimal?" (more members has clear benefits, but it's presumably possible to get too big). From e.g. Facebook group membership and survey responses, it seems like the answer to the first question is somewhere in the 1000-10000 range. I'm not sure what the best points of comparison for the second question are, but the Extinction Rebellion movement, most major British political parties, and Scientology all have significantly more members.
To get a better sense of “how many members does EA have?”, going forward I suggest asking organizations for data on unique website visitors rather than pageviews since the latter somewhat conflates number of people and degree of engagement (pages per session).
Great work! I wonder if there are any ways to track quality-adjusted engagement, since that's what we've mostly been optimizing for over the last few years. E.g., if low-quality pageviews/joins/listeners are going down, it seems hard to compensate with an equal number of high-quality ones, because they're harder to create. 80K's impact-adjusted plan changes metric is the only suitable metric I can think of.
Impact adjusting is fairly values sensitive and may differ dramatically even between EAs, which is why I'd prefer to report raw data and let other people attempt their own impact adjustments on top of it.
I don't think you get enough information from pageview tracking to be able to impact adjust each pageview, but perhaps you could track engagement hours (as 80K does) or engagement on particular target pages. Additionally, you could impact adjust based on source (e.g., weighting growth in 80K pageviews higher than growth in Future Perfect pageviews).
The pledge counts and donation totals also likely lend themselves to impact adjusting fairly well, as you could impact adjust based on charities donated to (most of the raw data to do this is available) or the kind of pledge taken.
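A toy sketch of that kind of charity-based adjustment might look like the following (the weights here are illustrative placeholders, not anyone's actual cost-effectiveness estimates):

```python
# Toy sketch of impact-adjusting donation totals by recipient charity.
# The multipliers below are made-up placeholders, not real estimates.
def impact_adjusted_total(donations, weights, default_weight=1.0):
    """Sum donations, scaling each by a per-charity weight."""
    return sum(amount * weights.get(charity, default_weight)
               for charity, amount in donations)

donations = [("Charity A", 1000), ("Charity B", 500), ("Charity C", 250)]
weights = {"Charity A": 1.0, "Charity B": 2.0}  # hypothetical multipliers

total = impact_adjusted_total(donations, weights)
# 1000*1.0 + 500*2.0 + 250*1.0 = 2250.0
```

Different readers could plug in their own weights, which is exactly why reporting the raw data and letting people layer adjustments on top seems like the right division of labor.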
One question: Was the last sentence of footnote 35 meant to be the last sentence of footnote 34? The sentence is "This would suggest that the EA Survey growth might stagnate or decline in the future as sources of people finding out about the EA Survey also stagnate", and the survey is the focus of footnote 34, but not of footnote 35.