I think having common knowledge of norms, ideas and future plans is often very important, and is better achieved by having everyone in the same place. If you split up the event into multiple events, even if all the same people attend, the participants of those events can now no longer verify who else is at the event, and as such can no longer build common knowledge with those other people about the things that have been discussed.
Interesting, this doesn’t fit with my experience for two reasons: a) attendance is so far past Dunbar’s number that I have a hard time knowing who attended any individual EA Global and b) even if I know that someone attended a given EA Global, I’m not sure whether they attended any individual talk/workshop/etc. (since many people don’t attend the same talks, or even any talks at all).
I’m curious if you have examples of “norms, ideas, or future plans” which were successfully shared in 2016 (when we had just the one large EA Global) that you think would not have successfully been shared if we had multiple events?
I have been to 3 EAGx events, all of which seemed to me generally much worse run than EAG, both in terms of content and operations.
We have heard concerns similar to yours about logistics and content in the past, and we are providing more support for EAGx organizers this year, including creating a “playbook” to document best practices, having monthly check-in calls between the organizers and CEA’s events team, and hosting a training for the organizers (which is happening this week).
At least in recent years, the comparison of the Net Promoter Scores of EAG and EAGx events indicates that the attendees themselves are positive about EAGx, though there are obviously lots of confounding factors.
The value of a conference does scale to a meaningful degree with n^2… I think there are strong increasing returns to conference size
Echoing Denise, I would be curious for evidence here. My intuition is that marginal returns are diminishing, not increasing, and I think this is a common view (e.g. ticket prices for conferences don’t seem to scale with the square of the number of attendees).
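As a toy illustration of why I would expect diminishing rather than quadratic returns (my own sketch, not from the thread; the 30-meetings-per-attendee figure is an assumption, not data):

```python
# Toy model: the number of *possible* pairwise meetings at a conference grows
# quadratically with attendance, but each attendee can only hold a bounded
# number of meetings during the event, so *realized* meetings grow roughly
# linearly once the event is large enough.

def possible_pairs(n):
    """Number of distinct attendee pairs: n choose 2."""
    return n * (n - 1) // 2

def realized_meetings(n, meetings_per_attendee=30):
    """Meetings actually held if each attendee manages ~30 meetings.

    Each meeting occupies a slot for two people, so the total is capped at
    n * meetings_per_attendee / 2 (and can never exceed the number of pairs).
    """
    return min(possible_pairs(n), n * meetings_per_attendee // 2)

for n in [50, 500, 5000]:
    print(n, possible_pairs(n), realized_meetings(n))
```

Under these assumed numbers, going from 500 to 5,000 attendees multiplies the possible pairs by roughly 100 but the realized meetings by only 10, which is one mechanism by which per-attendee value could plateau rather than scale with n².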
Group membership is in significant parts determined by who attends EAG, and not by who attends EAGx, and I feel somewhat uncomfortable with the degree of control CEA has over that
Do you have examples of groups (events, programs, etc.) which use EA Global attendance as a “significant” membership criterion?
My impression is that many people who are highly involved in EA do not attend EA Global (some EA organization staff do not attend, for example), so I would be pretty skeptical of using it.
To clarify my above responses: I (and the Events team, who are currently running a retreat with the EAGx organizers) believe that more people being able to attend EA Global is good, all other things being equal. Even though I’m less positive about the specific things you are pointing to here than you are, I generally agree that you are pointing to legitimate sources of value.
Thanks for writing this up despite all your other obligations Oli! If you have time either now or when you do the more in-depth write up, I would still be curious to hear your thoughts on success conditions for fiction.
I wanted to share an update: for the past month, our events team (Amy, Barry, and Kate) have been brainstorming ways to allow more people to attend EA Global SF 2020. Our previous bottleneck was the number of seats available for lunch: even with us buying out the restaurant next to the dome (M.Y. China), we only had space for 550 people. (Tap 415, another nearby restaurant which we had used in prior years, has gone out of business.)
We have now updated our agreements with the venue and contractors and brainstormed some additional changes that will allow more attendees in sessions and at lunch. This has increased our capacity by 70 (from 550 to 620).
(As a reference point: EA Global SF had 499 attendees in 2019.)
We don’t have any current plans to split EA Global into multiple sub-conferences. We have used the fact that not everyone attends talks to increase attendance (for example, at EA Global London 2019, we accepted more attendees than could fit in the venue for the opening talk on the assumption that not all of them would attend the opening).
We will keep the sub-conference idea in mind for the future.
Thanks for the questions. We have adjusted our promotion – for example: the application page and form list who we believe EA Global to be a good fit for, and we send group leaders an email with this set of criteria and some FAQs about why group members may not be admitted. Conversely, we send emails to people we expect to accept (e.g. Community Building Grant recipients), to encourage them to apply. We try to make community members aware when applications open and convey who the event is aimed at, but we don’t try to promote it as strongly as we did in some past years.
Despite this, we know that there are still many people who would be a good fit for EA Global who do not apply, and others who apply and feel disappointed when they are not accepted. We want to express our appreciation to everyone who applies.
Regarding themes: in 2017 EA Global Boston had a theme of “expanding the frontiers of EA”, EA Global London had an academic theme, and EA Global SF had a community theme and had looser admission standards than the other two. We found that people primarily applied to the conference they were geographically closest to and did not seem to have strong preferences about themes. We’ve also run smaller targeted retreats on specific topics like organizing EA groups or working in operations.
This blog post does some calculations and estimates 250,000 animals per year.
They don't share the details, but it sounds like a pretty noisy estimate, e.g.:
We have to make a number of other assumptions as well, for example, we take the word of business analyst, Andrew Charles, who is widely quoted as suggesting Burger King could sell 50 Impossible Whoppers per day, and apply that same figure to all the QSRs in our model
Thanks for taking the time to share this. I didn't read it as bitter. I read it as you sharing your experience with a disappointment (understandable) and then going on to share helpful suggestions for us and others who care about the event. We sincerely appreciate that.
My only point of feedback would be to write a short version of this into the waitlist/rejection emails for future EAGs (or even the next wave for this EAG).
This is a good suggestion - we will add more information to those emails.
I definitely had an immediate reaction of not feeling valued by the community after being involved for ~5 years (and was sad I won't see my EA friends/acquaintances this year), hopefully other people didn't feel that way.
Thanks for raising this – we really do want to reiterate that we were not able to accept many dedicated EAs doing valuable things, and hope that you can help us share this message.
will there be some in-between events for people who are older and more experienced but didn't get into EAG this year?
We are not currently planning anything “in-between” the two EA Globals and the 3-4 EAGx events in 2020.
I'd be interested to hear about future plans to accommodate the growth of having more "advanced" EAs across the US/world since right now it seems like if we don't live in a major EA city, EAG is our only chance to see other EAs and learn new things.
We plan to publish more information about our events strategy in early 2020. Until then you can check out my response to Denkenberger above for one example of a way we are trying to accommodate growth.
Hi David, one thing to note is that, since EA Global SF 2020 is in March, the dorms are unlikely to be available. Apart from that: you are correct that there are several ways in which the venue is less suited, e.g. because it is spread across three separate buildings, it’s worse for sparking “serendipitous” interactions between attendees, and the distribution of room sizes is a worse fit. (The venue we selected for EA Global SF 2020 [Bespoke] is very configurable, so it’s easier for us to have one big room for the opening talk, then split it into a bunch of small rooms for meet ups, etc.)
Regarding scaling the event: it’s hard for us to precisely estimate the cost of more attendees. One hypothesis we have is that improved matchmaking (either through formal matchmaking programs or through event apps which let attendees connect with each other) will let us increase the number of attendees at EA Global while preventing the “lost in the shuffle” feeling mentioned above. We piloted several programs like this last year and will continue iterating on and scaling them this year to see if that hypothesis is correct.
GDP will double in 6 months before it doubles in 24 months
Does anyone have the original version of this? The transcript says that Adam is (perhaps incorrectly) paraphrasing Paul Christiano.
I think I have some intuition about what this is getting at, but I don't think this statement is precisely correct: as long as GDP is non-decreasing and there are at least 24 months of data, any six-month window in which GDP doubles sits inside a 24-month window in which GDP also at least doubles, so GDP has to double in 24 months before (or at the same time as) it doubles in six months.
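To make the logical point concrete, here is a small sketch (my own construction; the trajectory is purely illustrative, not a forecast):

```python
# For any non-decreasing GDP series, a 6-month doubling window always sits
# inside a 24-month window that also doubles, so the first completed 24-month
# doubling can never come *after* the first 6-month doubling (given at least
# 24 months of prior data).

def first_doubling_end(gdp, window):
    """First month t (from `window` onward) at which gdp[t] >= 2 * gdp[t - window]."""
    for t in range(window, len(gdp)):
        if gdp[t] >= 2 * gdp[t - window]:
            return t
    return None

# A toy "fast takeoff": GDP is flat for 30 months, then grows 20% per month.
gdp = [100.0] * 30
for _ in range(30):
    gdp.append(gdp[-1] * 1.2)

t6 = first_doubling_end(gdp, 6)    # end of the first 6-month doubling
t24 = first_doubling_end(gdp, 24)  # end of the first 24-month doubling
print(t6, t24)  # prints: 33 33
```

Even in this abrupt toy trajectory, the month that completes the first 6-month doubling also completes a 24-month doubling, which illustrates why the paraphrase as stated is trivially true rather than a substantive prediction.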
Hey Peter, We considered having a third EAG in 2020 but decided to focus our efforts on EAGx instead (in addition to the two EA Globals and Leaders Forum). After getting feedback that the volunteer organizers of EAGx could use more support, we’re trying to spend more of our time and budget on those events to better prevent burnout among EAGx organizers. EAGx is also more oriented to students and other earlier-stage EAs who are less likely to be able to get into EA Global. We hope to get more information about both types of events this year and use that to help decide whether to have more of either type in the future.
But with all due respect to Wikipedia, I think that having a local wiki would allow us to focus on more action-related topics instead of general knowledge
I'm curious to hear more about your concerns with just using Wikipedia. I agree that there will be some topics which are outside the scope of Wikipedia, but it seems like many EA-relevant topics are within the scope of Wikipedia and do not have well-established pages. For example: there is no page on longtermism, cause neutrality, or the INT framework. Even the page on effective altruism itself is pretty short.
My guess is that someone could pretty easily just go through old Forum posts and copy facts into Wikipedia. E.g. the section on invertebrate sentience is two sentences long, and I would bet that a huge chunk of recent Forum posts on invertebrate sentience could be justifiably included in that Wikipedia article.
In general I have a lot of nervousness about trying to re-create an existing successful product (NIH syndrome), and my guess is that Wikipedia will be considered more trustworthy, get more views, and generally be more influential than a local wiki.
What do you see as the pros and cons of having an umbrella organization like RP which employs multiple researchers versus something like the EA Funds granting to independent researchers? (E.g. in what circumstances should a grantmaker prefer to grant to RP, who would then employ a researcher, versus granting directly to the researcher themselves?)
YC doesn't seem like a good example of avoiding geographic clustering effects. You are required to move your company to the San Francisco area while you are going through YC, and PG (cofounder of YC) has written about why founders should move to startup hubs. One of the weirdest parts of my personal YC experience was how they paid to fly me to Mountain View and stay a couple nights in a (ridiculously overpriced, IMO) Airbnb, just to have a short conversation with me because they think being in person is so important.
One thing which did differentiate YC when they started is that they offer "soft" value-adds: a lot of connections and advice, in addition to just money. Possibly more grantmakers should do this, but I'm not sure.
However, when that post is titled 'feedback for CEA', it looks like you believe that you're responsible for the friendliness of the EA community.
I think there may be a misunderstanding – the title of this post is “Feedback Collected by CEA”, not “for” CEA.
It would probably have been easiest to make the distinction between feedback on community health and feedback on CEA by posting two separate articles, but it could have also been accomplished in the introduction.
Thanks for the suggestion. I will keep this in mind for future articles.
(Along the same lines, I'd like more detail on specific positives and negatives about community health, especially in London. I feel like local community members are the ones who need to take the feedback forward, so we need to have access to as much quality information as possible.)
I agree that locale-specific information is important. You are probably already aware of this, but for other readers who are not: the EA Survey contains a bunch of data about geographic differences in EA. Your posts on London demographics come to mind as one example of local analysis that I would like to see more of.
Thanks for the question! There are different degrees and types of unusual-ness and riskiness. For example, as a reason why someone may choose not to donate to the Long-Term Future Fund we state:
First, they may prefer to support established organizations. The fund's most recent grants have mostly funded newer organizations and individual researchers. This trend is likely to continue, provided that promising opportunities continue to exist.
Established organizations focused on the long-term future such as the Future of Humanity Institute (FHI) are, in some ways, “unusual” and “risky”, but some donors may still prefer to donate to FHI instead of an independent researcher with a short track record, and those donors may not be a good fit for donating to the Long-Term Future Fund.
As a side note: we have been brainstorming internally about the correct adjective which would differentiate between e.g. FHI and an independent researcher. As you noted, “risk” is not exactly the dimension along which these two donation targets differ – if anyone has a better suggestion, we would appreciate hearing it.
Thanks for the question! 38% of confirmed speakers at EAG SF 2019 were female and 27% were people of color. (NB: The final numbers may have been slightly different than the confirmed count, due to last-minute cancellations.)
Thanks, Milan! I agree that one impact of the Forum is sharing new ideas, and there are other impacts as well – my colleagues JP and Aaron have written about some of these here.
I think our respondents were pointing out that, regardless of what the forum “should” be for, newcomers are going to assume that the content is representative of EA. One project we are working on is putting a new version of the handbook on the Forum, which we hope will provide newcomers with a more representative introduction to EA, while still keeping the existing aspects of the Forum for those who want new articles.
Thanks for asking about that. I agree that calls and emails are valuable, in addition to in-person visits. I think it's accurate that there are groups who would say they'd like more support from CEA, of various forms. The respondents here weren't current group leaders, though, so I believe the comment you're pointing to might not provide much data for your question. But it's a good question, and we've invested this year in building connections with more groups.
For example: Alex Barry, our Groups Associate, worked with LEAN on a group organizers survey this year, so we could get feedback from a wide range of groups and update our list of contacts. Alex has also had about 100 calls with group organizers this year, answered ~500 emails from group organizers, and had about 50 meetings during EA Globals. He can be booked by emailing email@example.com.
We also have a Slack channel for group organizers, as well as a Facebook group. This post on our groups support has more information – if you are a group organizer, please check out the resources listed there or let others know about them!
Sorry, fixed to be less jargony! CEA's former Individual Outreach (IO) team did a series of retreats on different topics in 2018. For example, the Ops Retreat brought together a group of people interested in finding EA operations roles; it included a workshop aimed at improving ops skills and chances to talk with different orgs that were planning to hire for ops positions.
Ah, I can see why that question would come up! I didn't see this document as “primarily about CEA’s successes and failures” – about half of the questions I asked were targeted towards things CEA directly does, but as you have noticed, about half were about the EA community in general.
As our goal is to grow and maintain the EA community, it’s important for us to understand how that community is functioning - even the aspects not directly related to CEA.
We have another post forthcoming that's focused more specifically on CEA, and will cover the kinds of issues noted on our “mistakes” page.
I'd be curious if you've thought any more about decentralizing some of what CEA does?
I'm glad you raised this, Peter. We have been thinking carefully about where our comparative advantage lies and which projects we are best placed to do. As mentioned above:
In 2019, we have made a concerted effort to be careful with our commitments and only agree to things we are confident we can deliver. This is reflected in internal processes, such as a commitments project in Asana where we record and regularly track progress on any commitments we have made, as well as external humility in scaling back the number of programs and promises we make.
This has resulted in us taking on fewer projects in 2019, and I expect the trend to continue in 2020. If people have particular opinions about which aspects of our work would be most valuable to “decentralize” (and what “decentralization” looks like), we would love to hear that.
Thanks for the suggestion, Ozzie! We agree it’s important that we understand community members’ experiences. I appreciate the pointer to Service Design.
We are working on an update to the mistakes page that will include more data and more recent issues, but I’m not certain if/when we’ll revise that particular item on the mistakes page. Still, I’ve noted the request - thanks.
Thanks for writing this – how to account for early mortality in r-selected species is something I've struggled to fully work through, and this way of looking at it seems extremely helpful to me. The graphs in particular are very concise explanations of the underlying idea.
Some prophets say the world is gonna end tomorrow
But others say we've got a week or two
The paper is full of every kind of blooming horror
And you sit wondering
what you're gonna do.
I got it.
Come. And be my baby.
I also realize that there are other fanfictions, e.g. Friendship is Optimal, that, in theory at least, seem well-placed to introduce concerns about AI alignment to the public. To the extent you can explain why these were less successful than HP:MoR (or any general theory of what success looks like here), I would be interested in hearing it!
Thanks! While I am making demands on your time, I would also be interested in understanding your opinion of Crystal Society (which seems like it might be similar to what Miranda is proposing?), if you think it was successful in accomplishing the goals you hope Miranda's work would accomplish, and why or why not.
As one example thing I am confused about: you list HP:MoR as "very likely the single most important recruitment mechanism for productive AI alignment researchers," and it is not clear to me why Crystal Society has been so much less successful, given that it seems better targeted for that purpose (e.g. it's pretty clearly about the alignment problem).
effectively diverting attention and funding from more effective risk-reduction measures
Yeah, if you count "may distract from an even better intervention" as a reason why something is "not obviously good", then I think that basically nothing is obviously good. (Which might be true, just pointing out that this criticism seems pretty general.)
Here are some examples of communities and institutions that I think used fiction very centrally in their function
Ender's Game is often on military reading lists (e.g. here). A metric which strikes me as challenging but exciting would be to create a book which gets onto one of these lists (or onto the list of some influential person, e.g. Bill Gates's list).
This would also help me understand the theory of change. I agree with your assessment that some fiction has had a significant impact on the world, but would also guess that most fiction has approximately zero impact on the world, so I would be curious to better understand the "success conditions" for this grant.
I like this quote from the beginning of Strangers Drowning:
There is one circumstance in which the extremity of do-gooders looks normal, and that is war. In wartime — or in a crisis so devastating that it resembles war, such as an earthquake or a hurricane — duty expands far beyond its peacetime boundaries… In wartime, the line between family and strangers grows faint, as the duty to one’s own enlarges to encompass all the people who are on the same side. It’s usually assumed that the reason do-gooders are so rare is that it’s human nature to care only for your own. There’s some truth to this, of course. But it’s also true that many people care only for their own because they believe it’s human nature to do so. When expectations change, as they do in wartime, behavior changes, too.
In war, what in ordinary times would be thought weirdly zealous becomes expected… People respond to this new moral regime in different ways: some suffer under the tension of moral extremity and long for the forgiving looseness of ordinary life; others feel it was the time when they were most vividly alive, in comparison with which the rest of life seems dull and lacking purpose.
In peacetime, selflessness can seem soft — a matter of too much empathy and too little self-respect. In war, selflessness looks like valor. In peacetime, a person who ignores all obligations, who isn’t civilized, who does exactly as he pleases — an artist who abandons duty for his art; even a criminal — can seem glamorous because he’s amoral and free. But in wartime, duty takes on the glamour of freedom, because duty becomes more exciting than ordinary liberty…
This is the difference between do-gooders and ordinary people: for do-gooders, it is always wartime. They always feel themselves responsible for strangers — they always feel that strangers, like compatriots in war, are their own people. They know that there are always those as urgently in need as the victims of battle, and they consider themselves conscripted by duty.
Thanks for the clarification! I agree that there are lots of ways that spending money on yourself can make you more productive, and a gym membership seems plausibly like one of those for you. I'm just pointing out that not all ways of spending money on yourself improve your productivity (the converse claim, that they all do, is one you might not endorse, but it seems to have gotten some traction in EA).