Posts

Run Posts By Orgs 2023-03-29T02:40:55.921Z
Books: Lend, Don't Give 2023-03-22T19:11:14.109Z
Write a Book? 2023-03-16T00:11:11.750Z
Milk EA, Casu Marzu EA 2023-02-27T14:14:50.002Z
Consent Isn't Always Enough 2023-02-24T15:43:08.049Z
Make Conflict of Interest Policies Public 2023-02-09T19:33:10.268Z
When Did EA Start? 2023-01-25T14:51:42.891Z
GWWC Pledge History 2023-01-24T15:54:46.988Z
GWWC's Handling of Conflicting Funding Bars 2023-01-11T20:31:04.973Z
GWWC Should Require Public Charity Evaluations 2023-01-09T20:11:18.854Z
CEA Disambiguation 2022-12-19T13:27:36.024Z
Historical Notes on Charitable Funds 2022-12-04T23:30:00.123Z
Our 2022 Giving 2022-12-04T15:35:04.424Z
If Professional Investors Missed This... 2022-11-16T15:00:00.125Z
Mildly Against Donor Lotteries 2022-11-01T18:10:00.130Z
Consider Applying to the Future Fellowship at MIT 2022-10-25T15:50:00.166Z
Research Deprioritizing External Communication 2022-10-06T12:20:00.126Z
Case Rates to Sequencing Reads 2022-09-21T02:00:00.158Z
Hiring Programmers in Academia 2022-07-24T20:20:00.248Z
Spending Update 2022 2022-07-19T14:10:00.259Z
Passing Up Pay 2022-07-13T14:10:00.157Z
Leaving Google, Joining the Nucleic Acid Observatory 2022-06-11T01:37:24.910Z
Revisiting "Why Global Poverty" 2022-06-01T20:20:00.193Z
Increasing Demandingness in EA 2022-04-29T01:20:00.132Z
US Taxes: Adjust Withholding When Donating? 2022-04-12T15:50:00.164Z
Responsible Transparency Consumption 2022-03-11T21:34:52.137Z
EA Dinner Covid Logistics 2021-12-11T21:50:00.669Z
Issues with Giving Multiplier 2021-09-29T21:40:00.637Z
What should "counterfactual donation" mean? 2021-09-23T12:59:09.842Z
GiveWell Donation Matching 2021-09-21T22:50:00.545Z
Limits of Giving 2021-03-04T02:20:00.618Z
When I left Google 2021-02-28T21:40:00.565Z
Giving Tuesday 2020 2020-11-30T22:30:00.575Z
EA Relationship Status 2020-09-19T01:50:00.599Z
Leaving Things For Others 2020-04-12T11:50:00.602Z
Why I'm Not Vegan 2020-04-09T13:00:00.683Z
Candy for Nets 2019-09-29T11:11:51.289Z
Long-term Donation Bunching? 2019-09-27T13:09:09.881Z
Effective Altruism and Everyday Decisions 2019-09-16T19:39:59.370Z
Answering some questions about EA 2019-09-12T17:44:47.922Z
There's Lots More To Do 2019-05-29T19:58:55.470Z
Value of Working in Ads? 2019-04-09T13:06:53.969Z
Simultaneous Shortage and Oversupply 2019-01-26T19:35:24.383Z
College and Earning to Give 2018-12-16T20:23:26.147Z
2018 ACE Recommendations 2018-11-26T18:50:57.764Z
2018 GiveWell Recommendations 2018-11-26T18:50:22.620Z
Donation Plans for 2017 2017-12-23T22:25:49.690Z
Estimating the Value of Mobile Money 2016-12-21T13:58:13.662Z
[meta] New mobile display 2016-12-05T15:21:22.121Z
Concerns with Intentional Insights 2016-10-24T12:04:22.501Z

Comments

Comment by Jeff Kaufman (Jeff_Kaufman) on Run Posts By Orgs · 2023-03-30T00:30:54.075Z · EA · GW

Edited to add something covering this, thanks!

Comment by Jeff Kaufman (Jeff_Kaufman) on Run Posts By Orgs · 2023-03-29T14:01:03.052Z · EA · GW

Just came across @Raemon saying something similar in 2017:

Running critical pieces by the people you're criticizing is necessary, if you want a good epistemic culture. (That said, waiting indefinitely for them to respond is not required. I think "wait a week" is probably a reasonable norm)

Reasons and considerations: [read more]

Comment by Jeff Kaufman (Jeff_Kaufman) on On what basis did Founder's Pledge disperse $1.6 mil. to Qvist Consulting from its Climate Change Fund? · 2023-03-28T20:00:47.353Z · EA · GW

I do think this is a hard balance, and as someone whose writing motivation is far stronger with the reward of immediate publication, it's one I've struggled with and probably one I've gotten wrong in both directions. A norm of sharing a draft ahead of time and giving them, say, a week to prepare a response if they want, though, seems pretty good?

[EDIT: expanded this into a post]

Comment by Jeff Kaufman (Jeff_Kaufman) on On what basis did Founder's Pledge disperse $1.6 mil. to Qvist Consulting from its Climate Change Fund? · 2023-03-28T15:18:57.218Z · EA · GW

you seem to have been able to put this together (I hope) relatively quickly.

Johannes is being polite, which is a good response to public criticism, but as someone unrelated I'm going to be more blunt. You looked into a decision an org made and ended up with a lot of questions. Instead of asking the org or running a draft by them so they could prepare a response you posted your questions publicly in a way that looks a lot like an accusation of corruption. Practically, the organization needs to respond as soon as possible or many people will see your post, some will downgrade their view of the org, and most will never see the follow-up. That he was able to assemble something so quickly while on leave mostly speaks to the (unnecessary) urgency you gave to this situation, and not to how easy the task was. His response probably also skipped some steps grantmakers commonly have in writing publicly about their decisions, like running it by the grantee for accuracy.

As a positive example, I think the recent critical post Why I don't agree with HLI's estimate of household spillovers from therapy handled this well: if James had published it publicly on a Sunday night with no warning then HLI would have been scrambling to put together a response. Instead, James shared it in advance and we got a much more detailed response from HLI, published at the same time as the rest of the piece, which is really helpful for outsiders trying to make sense of the situation.

Comment by Jeff Kaufman (Jeff_Kaufman) on Some Comments on the Recent FTX TIME Article · 2023-03-26T11:37:54.735Z · EA · GW

The threat model is still unclear

Maybe the threat was via Sam being influential and people checking in with him about past employees?

41/49 After I left FTX US and up until November, nearly every conversation I had with a venture capitalist about my new company eventually came around to the same kind of question: “Is FTX investing? Is Sam okay with you doing this? Do you mind if we confirm with him?”

Comment by Jeff Kaufman (Jeff_Kaufman) on Shutting Down the Lightcone Offices · 2023-03-23T17:20:21.682Z · EA · GW

I didn't vote either way on your comment, but I take the disagreement to be people thinking (a) Will's community building work was the right choice given what he and others knew then and/or (b) finding people "who are good at running organisations and willing to do the community-leadership/public facing stuff" is really hard.

Comment by Jeff Kaufman (Jeff_Kaufman) on Books: Lend, Don't Give · 2023-03-23T11:10:46.136Z · EA · GW

I like the idea of thinking of this as a spectrum! When I've done tabling, however (only a little, when I was at Google), I still found loaning worthwhile: since we're all regularly coming to the same campus, returning things wasn't that hard. And if someone had acted like they didn't think they'd be able to return it, I'd have told them not to stress about it and to pass it on to someone else when they were done.

(Not trying to pick on you or your group! And you're better placed to figure out what's working for you.)

Comment by Jeff Kaufman (Jeff_Kaufman) on Assessment of Happier Lives Institute’s Cost-Effectiveness Analysis of StrongMinds · 2023-03-22T20:49:53.445Z · EA · GW

A comment here is not a good place for this: it's barely related to the content of the post. A new top level post, your shortform, or a comment on a relevant post would be a much better fit.

(It's also very similar to a comment you wrote a week ago in another unrelated thread.)

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-22T17:22:50.766Z · EA · GW

How so? Aren't these both cases where Habryka has similar amounts of professional knowledge? If not, which case do you think he knew more about?

Comment by Jeff Kaufman (Jeff_Kaufman) on Design changes & the community section (Forum update March 2023) · 2023-03-22T17:12:08.560Z · EA · GW

Google Fonts who get fined from time to time for not being GDPR-compliant

I don't believe Google Fonts has been fined for GDPR violations? Are you thinking of European publishers being fined for using Google Fonts, because this meant sending user IP addresses to an American company?

Comment by Jeff Kaufman (Jeff_Kaufman) on Design changes & the community section (Forum update March 2023) · 2023-03-22T17:08:22.547Z · EA · GW

A/B testing in general is great. For UI-related changes you generally want to run the experiment sticky per user, to reduce confusion and allow the time for users to adapt to changes. This does add statistical complexity, though, because one heavy user in an experimental treatment can have a large impact on aggregate statistics like "total number of comments per category".
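As a minimal sketch of what "sticky per user" means here (the function and names are illustrative, not from any particular framework): hash the user ID together with the experiment name, so each user deterministically lands in the same variant every time they visit.

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  treatment_fraction: float = 0.5) -> str:
    """Deterministically assign a user to 'treatment' or 'control'.

    Hashing (experiment, user_id) makes the assignment sticky: the same
    user always sees the same variant for a given experiment, while
    different experiments get independent splits.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a float in [0, 1).
    fraction = int(digest[:8], 16) / 0x100000000
    return "treatment" if fraction < treatment_fraction else "control"
```

Because assignment is per-user rather than per-pageview, one heavy user's activity lands entirely in a single arm, which is exactly why aggregate metrics like "total comments per category" need user-level (clustered) analysis rather than naive pooling.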

Happy to talk more about this if you'd find it helpful; this is an area I used to work in.

Comment by Jeff Kaufman (Jeff_Kaufman) on Write a Book? · 2023-03-22T16:47:46.649Z · EA · GW

I like it!

Comment by Jeff Kaufman (Jeff_Kaufman) on Some Comments on the Recent FTX TIME Article · 2023-03-21T17:37:38.344Z · EA · GW

There are auditors that business-people know to trust less? Then why do they exist? I guess because most consumers don't know enough to downgrade trust?

Maybe the big 4 are enough more expensive that it's common for people to go with other firms for reasons other than "we're doing fraudulent stuff and hope to sneak it past auditors"? And so even if you would be able to afford one of the big 4 it doesn't send a strong signal by going with someone else?

Comment by Jeff Kaufman (Jeff_Kaufman) on Some Comments on the Recent FTX TIME Article · 2023-03-21T02:21:44.060Z · EA · GW

There are a lot of degrees of freedom in "worst". I'm not that familiar with Enron's accounting, but my impression is their finances were careful and relatively normal looking, just intentionally over-complex and with 'aggressive' choices that made things look much better than they actually were? So if you think sloppiness is a very serious issue in this business (a sensible position to hold!) then you might not need very much before the state of documentation can be 'worse' than Enron?

Comment by Jeff Kaufman (Jeff_Kaufman) on Some Comments on the Recent FTX TIME Article · 2023-03-20T19:41:47.544Z · EA · GW

Thanks for sharing your first-hand experience!

FTX’s new CEO has slammed their accounting practices, and I don’t understand why he would be incentivized to lie here

This is minor, but the more he paints a picture of FTX as a mess the harder a job it looks like he's taken on, the more slack people give him, and the better his efforts are likely to be judged in hindsight. I doubt he would lie as in "say things he knows are untrue" but it's very common for people to end up with somewhat self-serving beliefs.

EDIT: not saying their accounting practices were good! Just that I expect them to be less bad than if I took the CEO's statements literally.

Comment by Jeff Kaufman (Jeff_Kaufman) on Shutting Down the Lightcone Offices · 2023-03-19T12:37:10.793Z · EA · GW

That "floated" is so weasely!

Comment by Jeff Kaufman (Jeff_Kaufman) on How my community successfully reduced sexual misconduct · 2023-03-18T18:07:56.007Z · EA · GW

That's a neat approach! I think it only works for longer events, generally with pre-registration? You don't want to be requiring a class before you can attend, say, your first EA meetup.

(And within EA I think longer events maybe mostly already check in with the community health team?)

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-17T10:53:07.232Z · EA · GW

Sorry, edited my comment to clarify what I meant by being busy here. The idea isn't that people might not have time to respond to questions, it's that the CEO of a fast growing startup deciding they don't have time to be on the board of a foundation isn't likely to generate questions.

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-16T21:58:27.287Z · EA · GW

Would the board normally face questions if someone left? Especially if they [edit: as in "the person leaving"] clearly had other big things they were working on?

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-16T17:46:08.619Z · EA · GW

Can you say more about how this works with organizations like EA Germany? I don't know anything about SBF leaving the CEA board, but here's a plausible case:

  1. Some people have concerns about him.

  2. They talk quietly among each other.

  3. Someone respected quietly takes SBF aside and says they think he should resign.

  4. He resigns, looking externally like anyone else who leaves the board including for reasons like "I don't have time for this now that my company is growing rapidly".

In this case I think we probably wouldn't learn about 1-3 or the motivation for 4 even if there was community oversight, unless the people with the concerns or the people they shared them with decided to make them public. And this doesn't seem like a question of governance?

I do see how governance might affect what the community learned if he didn't want to step down and was voted out, but even people who don't want to leave projects often will do it without being officially forced to go if they can see how the official process would turn out and leaving voluntarily lets them save face or get concessions.

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-16T17:38:39.705Z · EA · GW

... no longer effective in comparison to alternatives we could counterfactually fund ...

I suspect a big part of the disagreement here is whether this aspect of the analogy holds?

Comment by Jeff Kaufman (Jeff_Kaufman) on Write a Book? · 2023-03-16T16:03:08.170Z · EA · GW

While people reading the book changing their altruistic behavior in ways that counterfactually improve the world is one way I see this book being valuable, I think a larger component of its value would be via people better understanding what EA is about, and what EAs are doing and why. As above:

Existing EA writing is also generally aimed at an elite audience. I see why some people have decided to take that approach, but I also think it's really important to have a presentation of these ideas grounded in common sense. If we ignore the general public we leave EA's popular conception to be defined by people who don't understand our ideas very well.

How people who don't decide to get into EA view EA approaches to the world matters, and I think we've been neglecting this. I'm concerned about a growing dynamic where we're increasingly misunderstood, people who don't actually substantively disagree with us reflexively counter EA efforts, and people who would find EA ideas helpful don't engage with them because their osmosis-driven impression of EA is mistaken.

Comment by Jeff Kaufman (Jeff_Kaufman) on How my community successfully reduced sexual misconduct · 2023-03-16T11:16:46.407Z · EA · GW

The original text was ambiguous between a description of policy and of outcomes. My reading now is that it was intended as the latter, though people are likely to interpret it as the former and think it's advocating not looking into accusation credibility?

Comment by Jeff Kaufman (Jeff_Kaufman) on GPT-4 is out: thread (& links) · 2023-03-15T19:17:51.823Z · EA · GW

I don't have GPT-4 access, but I was curious how GPT-3.5 got the first question wrong. I just tried it:

Me: can I drive from Boston to Portland, Maine without passing through New Hampshire?

GPT-3.5: Yes, you can drive from Boston to Portland, Maine without passing through New Hampshire by taking the coastal route. To do this, take I-93 North out of Boston and then take Exit 37A for I-95 North. Follow I-95 North until Exit 44 for I-295 North towards South Portland. Continue on I-295 North until Exit 7 for Route 1 North. Follow Route 1 North until you reach Portland. This route will take you along the coast and bypass New Hampshire. However, please note that this route may take longer than driving through New Hampshire on I-93.

It gets the bottom line correct, but the details are completely wrong.

Comment by Jeff Kaufman (Jeff_Kaufman) on Time Article Discussion - "Effective Altruist Leaders Were Repeatedly Warned About Sam Bankman-Fried Years Before FTX Collapsed" · 2023-03-15T17:53:15.725Z · EA · GW

Was this all private individual warnings, or were there also public components?

Comment by Jeff Kaufman (Jeff_Kaufman) on Faunalytics Analysis on Reasons for Abandoning Veg*n Diets · 2023-03-15T15:43:02.989Z · EA · GW

I sent an email to Faunalytics and they've now fixed this post and the linked report.

Comment by Jeff Kaufman (Jeff_Kaufman) on If EAs won't go vegan what chance do animals have? · 2023-03-15T15:37:33.654Z · EA · GW

https://github.com/rethinkpriorities/ea_data_public has "The actual code and data is in the EA-data private repo. A line in the main_2020.R file there copies the content to this repo in a parallel folder on one's hard drive, to be pushed here. ... No data will be shared here, for now at least."

Comment by Jeff Kaufman (Jeff_Kaufman) on If EAs won't go vegan what chance do animals have? · 2023-03-15T15:35:56.888Z · EA · GW

Looking at the chart henrith posted, it looks to me like the GWWC=yes line crosses 10% just below $300k/y, which is still high but well below $1M/y.

Additionally, eyeballing the points on the chart, it looks to me like there's an issue with the way the fit works, where lower earners donating less drags the curve down in a way that makes it look like people who earn more also donate less?

It looks like the chart came from Rethink Priorities EA Survey 2020 Series: Donation Data. Maybe the data is public and I can check this...

Comment by Jeff Kaufman (Jeff_Kaufman) on Shallow Investigation: Stillbirths · 2023-03-15T00:08:31.863Z · EA · GW

Thanks! I missed that it was absolute. A rate would make more sense, though I guess absolute numbers do tell us the size of the opportunity.

Comment by Jeff Kaufman (Jeff_Kaufman) on Shallow Investigation: Stillbirths · 2023-03-14T21:10:35.779Z · EA · GW

Why is the rate in Oceania so low? I would naively have expected it to be decent but not as good as in richer parts of the world? Is "no data" being coded as "zero occurrence"?

Comment by Jeff Kaufman (Jeff_Kaufman) on It's not all that simple · 2023-03-14T13:37:50.392Z · EA · GW

How much of this is explained by the proportion of in-person EA activity that is in the Bay Area?

Comment by Jeff Kaufman (Jeff_Kaufman) on How my community successfully reduced sexual misconduct · 2023-03-13T16:45:19.530Z · EA · GW

Posted: https://www.jefftk.com/p/decentralized-exclusion

Comment by Jeff Kaufman (Jeff_Kaufman) on Abuse in LessWrong and rationalist communities in Bloomberg News · 2023-03-13T14:46:41.649Z · EA · GW

Neither Scott's banning of Vassar nor the REACH banning was quiet.

I think these were relatively quiet. The only public thing I can find about REACH is this post where Ben objects to it, and Scott's listing was just as "Michael A" and then later "Michael V".

Comment by Jeff Kaufman (Jeff_Kaufman) on What does Silicon Valley Bank’s collapse mean for the funding situation of EA? · 2023-03-13T13:59:56.998Z · EA · GW

The FDIC, Federal Reserve, and Treasury announced that depositors will be fully protected, and have access to all of their money starting 2023-03-13, so any EA-destined funds at SVB should be fine.

Comment by Jeff Kaufman (Jeff_Kaufman) on How my community successfully reduced sexual misconduct · 2023-03-13T02:31:13.129Z · EA · GW

I'll probably expand this into a blog post. I'd like to include a bit about Michael Vassar, and am thinking about how to fit it in. I think his case is somewhere between Tsipursky/Williams/Dill and Reese's? There's Jax's 2018 thread, which is tricky for the purpose of (2) because it (not faulting her) mixes several categories of things (very serious, distasteful, annoyances) and brings in a lot of different people. There's also this LW thread where in 2019 people were explaining why he needed to be kicked out, but then in 2020/2021 influential people changed their minds and regretted their decision. I'm not sure how much of this is disagreement about whether (1) was met vs not having a clear (2), though it seems like both contributed?

Comment by Jeff Kaufman (Jeff_Kaufman) on How my community successfully reduced sexual misconduct · 2023-03-13T01:41:30.325Z · EA · GW

Large decentralized communities can kick people out, but it's hard. In the successful cases I've seen it has required:

  1. The behavior is clearly unacceptable, in a way that nearly everyone would agree if they looked into it.

  2. Detailed public accusations, so that people can look into it if they doubt the consensus.

The combination of these means that you can have an initial burst of 'drama' in which lots of people learn about the situation and agree that the person should be kicked out, and then this can be maintained whenever they show up again.

Some successful examples:

  • 2016: The EA community kicking out Gleb Tsipursky for a range of shady things and a pattern of apologizing and then continuing (details).

  • 2017: The contra dance community kicking out Jordy Williams after accusations of grooming and rape (details).

  • 2018: The rationality community kicking out Brent Dill after accusations of sexual abuse, gaslighting, and more (details).

The main cases I've seen where this has not been successful are ones where the community didn't have (1) or (2). For example, I've seen people try to apply this playbook to Jacy Reese, but because exactly what he apologized for is vague it doesn't entirely stick.

Unfortunately this approach depends on people making public accusations, which is really hard. We should support people when they do and recognize their bravery, but people will often have valid reasons why they won't: fear of retaliation, unwilling to have that level of public scrutiny, risk of legal action. In those cases it's still possible to make some progress privately, especially in keeping the person out of official events, but the issues of decentralized communities and defamation law make it hard to go further.

Comment by Jeff Kaufman (Jeff_Kaufman) on Against EA-Community-Received-Wisdom on Practical Sociological Questions · 2023-03-10T19:29:59.084Z · EA · GW

If the issue is that CEA communicated poorly or you misunderstood the rejection, I agree that's not necessarily worth getting into. But you've made a strong claim about how CEA makes decisions based on the contents of a message that its author is willing to make public. It looks to me like you essentially have two choices:

  • Agree to make the message public, or

  • Onlookers interpret this as an admission that your claim was exaggerated.

Comment by Jeff Kaufman (Jeff_Kaufman) on Racial and gender demographics at EA Global in 2022 · 2023-03-10T15:45:56.913Z · EA · GW

This is also an issue with "Asian" -- it is such a broad category (3/5 of people!) that it combines groups with very different experiences. For example, among Burmese Americans 25% are classified by the US as living in poverty, compared to 6% of Indian Americans.

The question of what "underrepresented in EA" means is also pretty tricky, especially when you're looking at conferences in multiple countries with different groups and histories. This summary seems to handle it by looking at the differences in breakdown between applicants, attendees, moderators, and speakers, but if people are left out of all of these groups that doesn't show up in these stats.

Comment by Jeff Kaufman (Jeff_Kaufman) on Make Conflict of Interest Policies Public · 2023-03-10T03:42:19.207Z · EA · GW

Thanks!

Comment by Jeff Kaufman (Jeff_Kaufman) on Against EA-Community-Received-Wisdom on Practical Sociological Questions · 2023-03-09T14:26:24.580Z · EA · GW

I think an exaggerated version of this is a big part of what went wrong with Leverage.

Comment by Jeff Kaufman (Jeff_Kaufman) on Suggestion: A workable romantic non-escalation policy for EA community builders · 2023-03-09T13:25:21.452Z · EA · GW

> Weekly events surely wouldn't be included

I don't think that's obvious from the post? It describes the AR policy as including "for one month after an evening event" and then recommends community builders adopt this policy.

But I'm not really sure that we disagree on anything -- I was trying to answer your "what makes you less convinced about applying it directly?"

Comment by Jeff Kaufman (Jeff_Kaufman) on Milk EA, Casu Marzu EA · 2023-03-09T13:19:19.976Z · EA · GW

The second objection does sound like saying it is harmful, thanks!

The first one is more mixed. My interpretation has always been that people were saying they didn't think it was very useful, not that it was harmful: I doubt the person making the objection thinks that all of the money will go to buy luxuries, and if some of the money goes to buy valuable things and some of it goes to buy luxuries that are essentially morally neutral then the effect is less positive than if it all went to buy valuable things. But maybe they think that providing luxuries is actually harmful, and not just neutral? (Which, conditional on thinking they spend lots of the money on drugs and alcohol, it could easily be, since it's funding people to buy addictive drugs they won't be able to continue consuming.)

Comment by Jeff Kaufman (Jeff_Kaufman) on Consent Isn't Always Enough · 2023-03-08T23:02:31.620Z · EA · GW

I mostly like and agree with Ozy's post (thanks for linking!) though it's a bit unclear which norm proposals it is objecting to. For example, if it's the "don't be poly" norm then totally, but the "sleep around less within the EA community on the margin" then it's less convincing. (Still may be right in this case, but it would help to address it specifically.)

Comment by Jeff Kaufman (Jeff_Kaufman) on FTX Poll Post - What do you think about the FTX crisis, given some time? · 2023-03-08T19:51:04.699Z · EA · GW

I think on priors we'd expect something like this in relation to an $8bn fraud to lead to the removal of someone from a random board.

In general, I think the traditional approach of sacking people when something they're associated with explodes is very harmful, and I'm glad that most places I've worked at explicitly don't do this.

Comment by Jeff Kaufman (Jeff_Kaufman) on Suggestion: A workable romantic non-escalation policy for EA community builders · 2023-03-08T19:18:55.459Z · EA · GW

Example issues with applying it directly:

  • If someone hosts weekly EA meetups for their community, mostly because they happen to be a local EA whose apartment/house has a good common space and is centrally located, they can't date anyone in their local EA community, since in most cases someone will have attended one of their events within the last month.

  • If someone speaks at an EAGx they can't date anyone who attended for 3m, even if the attendee is of equal or higher status/power within EA.

Comment by Jeff Kaufman (Jeff_Kaufman) on Suggestion: A workable romantic non-escalation policy for EA community builders · 2023-03-08T19:11:43.340Z · EA · GW

My confusion is actually over the intent, and not the implementation. Like, both of these are things I could see people proposing:

  1. Facilitators don't make any advances towards attendees.

  2. Additionally attendees don't towards facilitators.

I currently am guessing Severin is proposing (2), but the "non-escalation" phrasing meant I wanted to check whether they actually meant (1).

Comment by Jeff Kaufman (Jeff_Kaufman) on Suggestion: A workable romantic non-escalation policy for EA community builders · 2023-03-08T18:19:33.830Z · EA · GW

Thanks for sharing this relevant experience from another community! I think there's probably something we can learn from this, though I'm less convinced about applying it directly.

In the AR community it seems to me like "facilitator" vs "attendee" is a good proxy for power differentials and the differential fades as time since the event passes. This seems much less applicable here: someone influential in a field (org leader, senior researcher, community builder, etc) in that people listen to their views on what work should be done, who is good, where funding should go, etc, has most of their influence/power independent of any specific event or interaction.

This seems most applicable to small intensive workshops: if A leads a weekend retreat for fifteen people then I think this rule is great. But if A speaks on a panel at a small session at a 1,000 person conference or is influential without any specific event, then I think we mostly need to handle this with different norms?

Comment by Jeff Kaufman (Jeff_Kaufman) on Suggestion: A workable romantic non-escalation policy for EA community builders · 2023-03-08T18:04:02.548Z · EA · GW

I'm confused about whether this is proposing "non escalation" or "non participation". From the body of the post I was guessing the latter ("facilitators are prohibited from engaging romantically, or even hinting at engaging romantically, with attendees"), but the title had the former?

If an attendee attempts to escalate with a facilitator, is that allowed if the attendee makes all of the advances and the facilitator only responds by indicating whether those advances are welcome? Or does the facilitator reject the advances, perhaps citing the policy?

Comment by Jeff Kaufman (Jeff_Kaufman) on What are the EA movement's most notable accomplishments? · 2023-03-08T15:06:38.253Z · EA · GW

"Concrete problems in AI safety" was written by EAs and has 1.8k citations

Comment by Jeff Kaufman (Jeff_Kaufman) on Risks from solar flares? · 2023-03-07T12:16:48.680Z · EA · GW

In 2015 David Roodman looked into this for OpenPhil and wrote up a four-part report, but I don't know about anything more recent.

The Cause Prioritization Wiki only links that work which makes me think it's the only attempt within EA. There's probably non-EA work on this?