EA Forum Prize: Winners for August 2020

post by Aaron Gertler (aarongertler) · 2020-11-05T06:22:16.737Z

Contents

  What is the EA Forum Prize?
  About the winning posts and comments
  Donor Lottery Debrief
  The case of the missing cause prioritisation research
  Using Subjective Well-Being to Estimate the Moral Weights of Averting Deaths and Reducing Poverty
  More empirical data on 'value drift'
  Research Summary: The Subjective Experience of Time
  The winning comments
  The voting process
  Feedback

This post is arriving late — my fault, not that of any other judge. We’re catching up on a Prize backlog and expect to be current again by the time October prizes are given.

CEA is pleased to announce the winners of the August 2020 EA Forum Prize! 

The following users were each awarded a Comment Prize ($75):

See here for a list of all prize announcements and winning posts.

What is the EA Forum Prize?

Certain posts and comments exemplify the kind of content we most want to see on the EA Forum. They are well-researched and well-organized; they care about informing readers, not just persuading them.

The Prize is an incentive to create content like this. But more importantly, we see it as an opportunity to showcase excellent work as an example and inspiration to the Forum's users.

About the winning posts and comments

Note: I write this section in first person based on my own thoughts, rather than by attempting to summarize the views of the other judges.

Donor Lottery Debrief

This is the second time a donor lottery winner has written a post about where they gave, and I hope to see many more such posts.

Elements I liked from this writeup:

The post is also beautifully written; every sentence adds new information and flows logically from the last. (This is true of most winning posts, but for whatever reason, I really noticed it here.)

The case of the missing cause prioritisation research

Note: The author also received enough votes to win a prize for this post, but we only give out one prize per author per month. This makes them the second “multi-winner” we’ve had, after Tegan McCaslin in March 2019. Congratulations!

So my request to you: Either disagree with me, tell me that sufficient progress is happening, or change how you act in some small way. Be a bit more uncertain, a bit more willing to donate to fund or to go into cause prioritisation research. And if you work in an EA org please stop focusing so much on the cause areas you each believe are most important and increase the amount of cause-neutral work and funding that you do.

I can’t think of many better ways to sum up the EA Forum than: “Either disagree with me [...] or change how you act in some small way.” Magnificent!

I found that line especially elegant, but I also appreciated the rest of this post; I think it argues convincingly that less cause prioritization work is currently happening than would be ideal for our movement as a whole.

Another thing I liked: Rather than making a blanket claim about cause prioritization as a whole, the author splits out that concept into a set of smaller concepts (e.g. “empirical cause selection beyond RCTs”, “consideration of different views and ethics”). This allows for more nuanced judgments about progress in different areas.

Using Subjective Well-Being to Estimate the Moral Weights of Averting Deaths and Reducing Poverty

Our main purpose here is not to argue that the WELLBY method should be used, although we will briefly motivate it later. Rather, we want to show how it can be used.

I can’t recall many other Forum posts with this level of ambition. The authors don’t only propose a new metric we could use to estimate the value of different outcomes — they also make a detailed attempt to draw out what that metric would imply for certain real-world interventions. I’ve heard arguments before that EA should make more use of subjective well-being, but this is the first time I’ve had a concrete sense of what that might look like.

Other elements I liked about this post:

More empirical data on 'value drift'

My estimate for someone who’s highly engaged, enthusiastic, and socially integrated would be about ~10% over 5 years. I estimate there are ~500 people in the community who are at this level of risk for value drift.

I’m always interested to see more discussion about value drift, which seems like an important factor in the long-term flourishing of the EA community. Marisa Jurczyk’s qualitative analysis of the phenomenon won a Forum Prize. Now, Benjamin Todd has gone quantitative, combining a series of surveys and other data sources to produce an estimate of how often dedicated members stop participating actively in the community.

The post explains the context behind each source (good!), covers some considerations beyond those sources that might raise or lower our estimates (great!), and suggests additional work that others could do to further our knowledge in this area (splendid!!!).

Research Summary: The Subjective Experience of Time

If you’ve ever had the misfortune of being in a car accident or fighting in a war zone or being attacked by a wild animal, you may already be familiar with putative differences in the subjective experience of time. 

When confronted with life-threatening circumstances, humans often report that time seems to slow down. Events that are over in tens of seconds seem to stretch on for minutes, allowing rapid assessment of the scene and quickfire decisions that, in some cases, save one’s life. These types of differences are also sometimes induced artificially. An LSD trip might seem to extend for days when in fact it was over in an afternoon.

Jason Schukraft has a track record of publishing really interesting research on the Forum — now, he’s taken steps to make some of it even more accessible, through a summary that condenses two previous research posts to under 5% of their total word count. As a professional writer, I can attest that this is very hard to do, and I love Schukraft’s commitment to helping people actually read what he’s written. 

If you know of a long research post that you wish had gotten more engagement — whether you wrote it, or someone else did — try producing your own summary!

The winning comments

I won’t write up an analysis of each comment. Instead, here are my thoughts on selecting comments for the prize.

The voting process

The winning posts were chosen by four people (Rob Wiblin didn’t vote this month):

All posts published in August 2020 qualified for voting, save for those in the following categories:

Voters recused themselves from voting on posts written by themselves or their colleagues. Otherwise, they used their own individual criteria for choosing posts, though they broadly agreed with the goals outlined above.

Judges each had ten votes to distribute between the month’s posts. They also had a number of “extra” votes equal to [10 - the number of votes made last month]. For example, a judge who cast 7 votes last month would have 13 this month. No judge could cast more than three votes for any single post.

The winning comments were chosen by Aaron Gertler, though the other judges had the chance to nominate other comments and to veto comments they didn’t think should win.

Feedback

If you have thoughts on how the Prize has changed the way you read or write on the Forum, or ideas for ways we should change the current format, please write a comment or contact me.

3 comments

Comments sorted by top scores.

comment by abergal · 2020-11-06T01:40:53.590Z

Random thought: I think it would be kind of cool if there were EA Forum prizes for people publicly changing their minds in response to comments/feedback.

comment by Mark Xu · 2020-11-06T02:56:54.913Z

This creates weird incentives, e.g. I could construct a plausible-but-false view, make a post about it, then make a big show of changing my mind. I don't think the amounts of money involved make it worth it, but I'm wary of incentivizing things that are so easily gamed. 

comment by Aaron Gertler (aarongertler) · 2020-11-06T23:22:11.290Z

I think someone doing this in a post is likely to aid their chances of winning a prize, but that's not an official thing -- just based on how I'd expect judges to react (and how I might react, depending on the context). The "changed my mind" post/comment is one of several really good post/comment genres.