Posts

Establishing Oxford’s AI Safety Student Group: Lessons Learnt and Our Model 2022-09-21T07:57:13.294Z
Nick Bostrom - Sommar i P1 Radio Show 2022-06-12T18:04:14.432Z
Evidence, cluelessness, and the long term - Hilary Greaves 2020-11-01T17:25:47.589Z

Comments

Comment by juliakarbing on Think about EA alignment like skill mastery, not cult indoctrination · 2022-07-15T23:37:36.140Z · EA · GW

Hi :) I'm surprised by this post. Doing full-time community building myself, I have a really hard time imagining that any group (or sensible individual) would use these 'cult indoctrination techniques' as strategies to get other people interested in EA.

I was wondering if you could share anything more about specific examples or communities where you've found this happening? I'd find that helpful for knowing how to relate to this content as a community builder myself! :-) 
(To be clear, I could imagine repeated talking points and closed social circles arising as side effects of other things: individuals often aren't very good at recognising what makes a good argument and so repeat whatever seems salient to them, and people naturally form social circles with those they get along with. My point is that I find it hard to believe any of this is deliberate enough for this kind of criticism to really apply! Which is why I'd find examples helpful - to know what we're specifically talking about :) ) 

Comment by juliakarbing on Nick Bostrom - Sommar i P1 Radio Show · 2022-06-13T15:24:10.437Z · EA · GW

The transcript is here: https://docs.google.com/document/d/1l8-PEV0hVswDngYMtiJvoTZLACWeRzTreNHMFNxr0Ko/edit?usp=sharing

I'll add it to the post too :)