Posts

Acceptance and Commitment Therapy (ACT) 101 2022-09-24T19:07:06.623Z
Evie Cottrell's Shortform 2022-09-14T17:56:43.005Z
"Agency" needs nuance 2022-09-12T16:52:16.134Z
Networking for Nerds [linkpost] 2022-07-06T13:34:59.705Z
Seven ways to become unstoppably agentic 2022-06-18T09:30:22.132Z

Comments

Comment by Evie Cottrell on Any recommendation for how to explain EA funds grants to friend? · 2022-09-26T10:15:17.377Z · EA · GW

I’ve been considering writing a post about my experience of receiving a grant, and the downsides I didn’t anticipate beforehand. It would probably look similar to this comment but with more info.

Comment by Evie Cottrell on Giving opportunity: $50 'thank you' notes · 2022-09-26T10:07:48.941Z · EA · GW

General encouragement for having done something risky (a wacky title) and then deciding against it and changing it. The first sentence of the changed post made me laugh.

Comment by Evie Cottrell on Any recommendation for how to explain EA funds grants to friend? · 2022-09-26T10:02:38.321Z · EA · GW

I strong upvoted because I am still very confused about this as well (but for a slightly different reason).

I received EA funding earlier this year, and many people in my hometown found it confusing at best, and thought I was “stealing from a charity” at worst. I still find it very anxiety-provoking and stressful to talk about the grant. I’m not really sure how to present it or explain it to people outside the community. The impression that some people in my hometown had was kinda like: “why should a privileged white girl get money to do something so unnecessary as move to Oxford a few months early?? There are people who struggle to make ends meet here. There’s a war in Ukraine, goddamn it! And this money is to ‘make the world better’? Who are these people? Who does Evie think she is?” (I realise that this sounds exaggerated, but it’s unfortunately not. I received quite a lot of online hate from people from my hometown, which was pretty upsetting.)

My grant wasn’t for a specific project, which makes it harder to explain. It was primarily so that I could leave my school and move to Oxford (to self-study the remainder of high school and take the exams independently in Oxford). I’ve used it to pay exam fees, cover living expenses in Oxford (e.g. rent and food), buy textbooks, and get a tutor. I guess the grant was kind of like an individual version of the Atlas scholarship. There were a number of reasons why I wanted to leave my school (which I won’t discuss here), but I was not doing well there at the time.

I honestly sometimes feel pretty uncertain about whether this was a good use of money for EAIF. At times, I have definitely felt deep shame and guilt for having applied for and taken the money.

When I was living in Oxford, I think I was less willing to socialise with non-EAs, because I felt so anxious about explaining why I was living away from my family to do my A-Levels. As a result, my whole social circle consisted of EAs (not ideal for diversity/ development/ self-growth/ life-robustness).

I’m going to uni soon and I’m again not sure how to explain to people why I’ve lived in Oxford before and why I self-studied my A-Levels (which is very unconventional in the UK).

Anyway, I appreciate that you posted this, and think it raises some valid concerns.

Comment by Evie Cottrell on EA forum content might be declining in quality. Here are some possible mechanisms. · 2022-09-25T14:38:55.344Z · EA · GW

“Better” could mean lots of things here. Including: more entertaining; higher-quality discussion; more engagement; it’s surpassed a ‘critical mass’ of people to sustain a regular group of posters and a community; better memes; more intellectually diverse; higher frequency of high-quality takes; the best takes are higher quality; more welcoming and accessible conversations, etc.

The aims of EA Twitter are different from those of the forum. But I think the most important metrics are the “quantity of discussion” ones.

My impression is that:

  • There are more “high-quality takes” on EA Twitter now than a year ago (mostly due to more people being on it and people posting more frequently).
  • The “noise:quality ratio” is pretty bad on EA Twitter. To me, most of the space seems dominated by shitposting and in-group memes.

Obvs, shitposting is fine if that’s what you want. But I think it’s useful to be clear about what you mean when you say “better”. If someone is looking for high-quality discussion about important ideas in the world, I would personally not recommend EA Twitter to them.

Comment by Evie Cottrell on Acceptance and Commitment Therapy (ACT) 101 · 2022-09-25T07:09:40.110Z · EA · GW

Thanks for sharing! That's useful to know.

I'll look into adding to the post later today.

Comment by Evie Cottrell on Acceptance and Commitment Therapy (ACT) 101 · 2022-09-24T19:34:26.040Z · EA · GW

If I was going to spend longer on this post, I'd make it more empirical and talk through evidence for/against the effectiveness of ACT. 

As it is, I didn't want to spend significantly longer writing it, so I've gone for a summary of the core ideas -- so that readers can assess the vibe and see if it's something that sounds interesting to them.

This might have been the wrong call though.

Comment by Evie Cottrell on Case Study of EA Global Rejection + Criticisms/Solutions · 2022-09-23T13:33:52.512Z · EA · GW

I also wanna give general encouragement for sharing a difficult rejection story.

Comment by Evie Cottrell on Case Study of EA Global Rejection + Criticisms/Solutions · 2022-09-23T13:25:52.498Z · EA · GW

Sorry that your experience of this has been rough. 

Some quick thoughts I had whilst reading:

  • There was a vague tone of "the goal is to get accepted to EAG" instead of "the goal is to make the world better," which I felt a bit uneasy about when reading the post. EAGs are only useful insofar as they let community members do better work in the real world.
    • Because of this, I don't feel strongly about the EAG team providing feedback to people on why they were rejected. The EAG team's goal isn't to advise on how applicants can fill up their "EA resume." It's to facilitate impactful work in the world.
  • I remembered a comment that I really liked from Eli: "EAG exists to make the world a better place, rather than serve the EA community or make EAs happy."
  • [EDIT after 24hrs: I now think this is probably wrong, and that responses have raised valid points.] You say "[others] rely on EA grants for their projects or EA organizations for obtaining jobs and therefore may be more hesitant to directly and publicly criticize authoritative organizations like CEA." I could be wrong, but I have a pretty strong sense that nearly everyone I know with EA funding would be willing to criticise CEA if they had a good reason to. I'd be surprised if {being EA funded} decreased willingness to criticise EA orgs. I even expect the opposite to be true.
    • (Disclaimer that I've received funding from EA orgs)


Sorry that the tone of the above is harsh -- I'm unsure if it's too harsh or whether this is the appropriate space for this comment. 

I've erred on the side of posting because it feels relevant and important.

Comment by Evie Cottrell on Case Study of EA Global Rejection + Criticisms/Solutions · 2022-09-23T12:37:57.167Z · EA · GW

(Currently reading the post and noticing that many of the links go to the top of the same Google Doc. I assume this isn’t supposed to be the case. This could be because I’m on mobile, but it could also be an error with the links.)

(Also congrats on your first forum post! Go you :) )

Comment by Evie Cottrell on We’re still (extremely) funding constrained (but don’t let fear of getting funding stop you trying). · 2022-09-16T09:39:47.034Z · EA · GW

This poem really made me smile; thanks for writing it Luke :)

Comment by Evie Cottrell on Evie Cottrell's Shortform · 2022-09-14T18:02:14.236Z · EA · GW

This relates to a caveat in my recent post:

  • I’m concerned about too much EA meta conversation — about worlds where most of EA dialogue is talking about EAs talking about EAs (lots of social reality and not enough object-level). 
  • These sorts of convos are often very far removed from {concrete things that help the world}, and I worry about them taking away attention from more important ones. 
  • I think it’s probably much better (for the world) for conversations to stay focused on the real world, object-level claims and arguments.

Part of me wants to flesh this thought out properly soon. But even this conversation is meta! And I'm trying to encourage/ focus more on object-level ideas. So do I write it? I'm not sure. 

Comment by Evie Cottrell on Evie Cottrell's Shortform · 2022-09-14T17:56:43.105Z · EA · GW

A distinction I've found useful is "object-level" vs "social reality". They're both labels that describe types of conversation/ ideas.

Object-level discussions are about ideas and actions (e.g. AI timelines, the mechanics of launching a successful startup). Object-level ideas are technical, empirical, and often testable. Object-level refers to what ideas are important or make sense. It is focused on truth-seeking and presenting arguments clearly. 

Social reality discussions are about people and organisations (e.g. Will MacAskill, Open Philanthropy). Social reality is more meta, more abstract, and less testable than object-level. Social reality refers to which people are influential/powerful (and what they think), how to network with people, and how to persuade people.

Object-level: What's the probability of extinction this century?

Social reality: What does Toby Ord think is the probability of extinction this century?

I have found it very helpful to start labelling whether I'm in object-level conversation mode vs social reality conversation mode. It helps me notice when I'm deferring without having thought about it (e.g. "well, Will MacAskill says [x]" instead of asking myself what I think about [x]), or when I fall into a mode of chit-chatting about the who's-who of EA, instead of trying to truth-seek (of course, chit-chatting sometimes is fine -- I just want to be intentional about when I'm doing it).

And social reality isn't necessarily bad, but it's helpful to flag when a conversation enters "social reality mode."

I do think it's good for many/more/most conversations to centre around the object-level. I am personally trying to move my ratio more towards object-level.

(This was a core theme of an Atlas camp I attended, which I found extremely valuable. The above definitions are loosely based on a message from Jonas, but I didn't run them by him before posting.)

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-14T17:06:13.648Z · EA · GW

Thanks for your comment!

this could've been mostly avoided by a consideration of Chesterton's Fence

Meh, I don't think so. This taken to its extreme looks like "be normie."

I'm now worried about e.g. high school summer camp programs that prioritize the development of unbalanced agency.

I'm pretty confident that (for ESPR at least) this was a one-off fluke! I'm not worried about this happening again (see Gavin's comment above).

Comment by Evie Cottrell on Where are the EA Influencers? · 2022-09-13T18:48:53.571Z · EA · GW

Hey, thanks for writing. 

I also used to feel extremely confused about this (e.g. I thought that in-person university groups were "woefully inefficient" compared to social media outreach). I did not understand why there weren't EA YouTubers or social media marketing campaigns. Much of my own social conscience had been shaped by online creators (e.g. veganism and social justice ideas), and it felt like a tragedy that EA was leaving so much lying on the table.

I'm now less optimistic about short-form social media outreach. Mostly because:

  • It seems really hard to preserve epistemics in low-fidelity mediums like TikTok;
  • I don't see that much value in EA being a household name, if it's a meme-y, low-resolution version (but my mind could easily be changed on this);
  • I care about selecting for nerdiness and intellectual curiosity;
  • I'm cautious of EA being associated too much with specific influencers;
  • I don't want EA to become (or be perceived as) a social media trend.

All that being said, I do think there are versions of social media outreach that could be great (and aren't currently being done).

I'm excited about more long-form YouTube content (e.g. Rob Miles). It would be cool if one of the LEEP founders/ CE incubatees started vlogging about the experience of running a high-impact charity (or something similar).

Fwiw, YouTuber Ali Abdaal has some videos promoting longtermism, 80k, and GWWC. And 80k is currently ramping up their marketing and starting to pay influencers to promote 80k.

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T18:09:57.286Z · EA · GW

If I ask three people for their time, they don't know whether they're helping me get from 0 to 1 person helping with this, 1 to 2 or 9 to 10 for all they know.

Nice, yup, agreed that the information asymmetry is a big part of the problem in this case. I wonder if it could be a good norm to give someone this information when making requests. Like, explicitly say in a message: "I've asked one other person and think that they will be about as well placed as you to help with this", etc.

I do also think that there's a separate cost to making requests, in that it actually does impose a cost on the person. Like, saying no takes time/energy/decision-power. Obviously this is often small, and in many cases it's worth asking. But it's a cost worth considering.


(Now I've written this out, I realise that you weren't claiming that the info asymmetry is the only problem, but I'm going to leave the last paragraph in).

to avoid resenting EA in the medium to long term

This is great and only something I've started modelling recently. Curious about what you think this looks like in practice. Like, is it more getting at a mindset of "don't beat yourself up when you fall short of your altruistic ideals"? Or does it also inform real-world decisions for you?

 model EA as sort of an agent

Nice 

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:57:49.983Z · EA · GW

Hmm, interesting. Thanks for clarifying -- that does work better in this context (although it's confusing if you don't have the info above).

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:54:59.850Z · EA · GW

Yup, another commenter is correct in that I am assuming that the goals are altruistic.

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:50:21.918Z · EA · GW

Hey, thanks for asking.

On the first point:

  • Throughout both of my posts, I've been using a more niche definition than the one given by the Stanford Encyclopedia of Philosophy. In my last post, I defined it as "the ability to get what you want across different contexts – the general skill of coming up with ambitious goals [1] and actually achieving them, whatever they are."
  • But, as another commenter said, the term (for me at least) is now infused with a lot of community context and implicit assumptions. 
  • I took some of this as assumed knowledge when writing this post, so maybe that was a mistake on my part.

On the second point:

  • I'm a bit confused by the question. I'm not claiming that there's an ideal amount of agency or that it should be regulated. 
  • Saying that, I expect that some types of agency will be implicitly socially regulated. Like, if someone frequently makes requests of others in a community, other people might start to have a higher bar for saying yes. Ie, there might be some social forces pushing in the opposite direction. 
    • I don't think that this is what you were getting at, but I wanted to add it.

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:36:29.287Z · EA · GW

Thanks, that's useful to know! :)

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:34:14.316Z · EA · GW

Thanks for your comment Aaron! :)

I don't think I ever got the sense -- even intuitively or in a low-fidelity way -- that "agency" was identical to/implied/strongly overlapped with "a willingness to be social domineering or extractive of others' time and energy"

I wrote about this because it was the direction in which I noticed myself taking "be agentic" too far. It's also based on what I've observed in the community and conversations I've had over the past few months. But I would expect people to "take the message too far" in different ways (obvs whether someone has taken it too far is subjective, but you know what I mean).

But what you wrote seems to imply that there was no functional or causally attributable harm to anyone from your missing school.

Yeah, nobody was harmed, and I do endorse that I did it. It did feel like a big cost that my teachers trusted/liked me less though. 

Note that I was a bit reluctant to include the school example, because there's lots of missing context, so it's not conveying the full situation. But the main point was that doing unconventional stuff can make people mad, and this can feel bad and has costs.

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:21:45.564Z · EA · GW

Wait, I'm not actually sure I want to change the inside view thing, I'm confused. I was kinda just describing a meme-y version of hustling -- therefore the low-resolution version of "has inside views" is fine. 

has strong inside views which overrule the outside view

I'm not really sure what you mean by this.

Comment by Evie Cottrell on "Agency" needs nuance · 2022-09-13T17:19:12.305Z · EA · GW

Thanks :)

Your learned helplessness is interesting, because to me the core of agency is indeed nonsocial

Yeah, the learned helplessness is a weird one. I felt kinda sad about it because I used to really pride myself on being self sufficient. I agree that the social part of agency should only be a small part -- I think I leaned too far into it.

strong inside views which overrule the outside view

Thanks for the inside view correction. Changing that now, and will add that Owen originally coined "social agency".

ESPR 2021 went a little too hard on agency

Fwiw I did get a lot of value out of the push for agency at ESPR. Before that, I was too far in the other direction. Eg: I was anxious that others would think I was "entitled" if I asked for anything or just did stuff; felt like I had to ask for permission for things; cared about not upsetting authority figures, like teachers. I think that I also cared about signalling that I was agreeable -- and ESPR helped me get over this.

Comment by Evie Cottrell on My closing talk at EAGxSingapore · 2022-09-13T17:00:19.311Z · EA · GW

Ah nice, I was missing that context. Yup, the angle of confidence building seems good for this audience.

Comment by Evie Cottrell on My closing talk at EAGxSingapore · 2022-09-12T16:21:35.643Z · EA · GW

Congrats on organising EAGx -- that's huge! :)

Sorry for being a downer, but I want to push back on the subtext that it's (always) good for people to be willing to "lend a helping hand, whether it's sending a message, reviewing a draft or hopping onto a call?"

My rough thoughts:

  • Some people say yes to too many things and don't value their time highly enough. 
  • Sometimes, it's the right call for someone to say no to helping others in their immediate environment.
  • It's often hard to say no, even when it's the right call.
  • I'm worried about a culture where {saying yes to people's requests} --> {you're a nice and helpful person} --> {it's good that you're an EA because you're warm and welcoming}.
  • I'm worried about the message "EA is warm and welcoming because people are willing to give you their time" making it harder for people to say no.

This might not be super relevant -- especially if most of the audience would err on the side of not asking for help. 

But just wanted to comment because it came to mind. 

The overall message of "people are kind and not scary and probably willing to help" is a nice one though!

Comment by Evie Cottrell on Could it be a (bad) lock-in to replace factory farming with alternative protein? · 2022-09-11T12:47:06.077Z · EA · GW

Thanks! Also, this is a small point, but I find it easier to skim articles when they have formatted headings (so there's an overview of the article on the left-hand side). You can do this using the forum formatting features.

Comment by Evie Cottrell on Could it be a (bad) lock-in to replace factory farming with alternative protein? · 2022-09-10T16:35:24.743Z · EA · GW

What does PB/CM mean?

Comment by Evie Cottrell on Valuing lives instrumentally leads to uncomfortable conclusions · 2022-09-05T10:41:15.745Z · EA · GW

“C1: A person in a poor country whose life is saved experiences less welfare than a person in a rich country whose life is saved”

(Asking a dumb question here, but) is this true? Ie, does an increase in material wealth actually increase psychological wellbeing?

I have an intuition that psychological wellbeing is mostly affected by how wealthy you are compared to your peer group.

Maybe you’re talking about individuals in poor countries who are below the poverty line (in which case, I agree that they would experience much less psychological wellbeing).

But I would be surprised if individuals in rich countries are actually happier than individuals in poor countries (who have all their basic needs met).

Comment by Evie Cottrell on Young EAs should choose projects more carefully · 2022-09-02T16:24:59.963Z · EA · GW

Thanks for writing this post! It resonated and I feel like I've fallen into a similar mindset before.

It reminds me of a point made here: "like, will we wish in 5 years that EAs had more outside professional experience to bring domain knowledge and legitimacy to EA projects rather than a resume full of EA things?"


When reading the post, this felt especially true and unfortunate: "They get the reputation as someone who can “get shit done” but in practice, they’re usually solving ops bottlenecks at the cost of building harder-to-acquire skills."

Comment by Evie Cottrell on Digital Networking for Dummies · 2022-07-08T15:00:24.635Z · EA · GW

I like this, and think that networking as a teen is super useful and high ROI. (Maybe I'm biased because networking opened up opportunities for me.)

I really like this post as a starting guide! Thanks for writing

Comment by Evie Cottrell on Doom Circles · 2022-07-08T14:26:23.626Z · EA · GW

I feel concerned about versions of this where there is implicit social pressure to:

  • stay;
  • seem fine with the critiques given;
  • participate in the first place.

Like, if it's implicitly socially costly to opt out, it's pretty hard for an individual to do so.

I also think that it is hard to avoid these pressure-y dynamics in practice. Especially when people really want to be included in the social group.


I can imagine a scenario where there is a subtext of: 

"You can opt out. Of course. But, as we know, the real hard-core and truth-seeking people stay. And this social group values truth-seeking. So... you can leave. But that is an out-group thing to do. Come on guys, it's virtuous to seek-truth! And we are just providing you with an opportunity to do that! Don't tell me that you'd rather hide from the truth than get your feelings hurt."

(I'm overdoing this a bit to illustrate my point.)

Comment by Evie Cottrell on Fill out this census of everyone who could ever see themselves doing longtermist work — it’ll only take a few mins · 2022-06-22T11:35:35.807Z · EA · GW

Given that I was aiming to spend only a few mins on the census, I don't expect that I would have scrolled through the post to find the description of the cause area. 

But some people might, so could be useful. 

Comment by Evie Cottrell on Fill out this census of everyone who could ever see themselves doing longtermist work — it’ll only take a few mins · 2022-06-22T08:39:58.362Z · EA · GW

Pretty confused by what some of the cause areas are (e.g. epistemic institutions). I expect my responses were less helpful/ accurate bc of not knowing what some of them meant.

Comment by Evie Cottrell on How to become more agentic, by GPT-EA-Forum-v1 · 2022-06-20T15:04:14.614Z · EA · GW

Wow haha this is pretty cool! And also an entertaining read

Comment by Evie Cottrell on Seven ways to become unstoppably agentic · 2022-06-19T11:50:45.467Z · EA · GW

Thank you for the comments!

I agree with some of what you wrote. I don't want the subtext of the post to be "you should amass social capital so that senior people will do you favours."

Some thoughts:

  • It’s generally the case that ‘social domineeringness’ is a trait that is rewarded by society. Similar to intelligence, people who have this quality will probs be more likely to achieve their goals. (This makes me kinda uncomfortable, but I think it’s broadly true and it doesn’t seem good to ignore it).
  • Given that this is the case, I want to encourage this quality in EAs. 
  • However, I would rather see EAs have this quality when interacting with non-EAs. Like, if young EAs all start asking senior EAs for favours, the EA landscape will become competitive and zero-sum.
  • BUT it seems strictly good for EAs to be socially domineering in non-EA contexts. Like… I want young EAs to out-compete non-EAs for internships or opportunities that will help them skill-build. (This framing has a bad aesthetic, but I can’t think of a nicer way to say it.)

I’m curious about the specific parts that you think people would be allergic to.

Comment by Evie Cottrell on Seven ways to become unstoppably agentic · 2022-06-19T11:07:21.777Z · EA · GW

My guess is that it’s just very context-dependent — I’m not sure how generalisable these sorts of numbers are.

It also seems like the size of favours would vary a ton and make it hard to give a helpful number.

Comment by Evie Cottrell on Seven ways to become unstoppably agentic · 2022-06-18T16:45:11.716Z · EA · GW

[On the title -- you gotta have fun with these things haha]

Thanks Gavin! 

Yes, the laws of equal and opposite advice defo apply here. 

I also wonder whether this sort of thing becomes zero-sum within a small enough environment (e.g. if everyone starts lowering their bar for asking for help, people will raise their bar for saying yes, because they will be inundated with requests). Could lead to competitive dynamics (discussed in the comments of this post), which seems unfortunate.

I really like the point of spending years 'becoming yourself'. Like, I probs just want my younger siblings to chill out, spend a lot of time with their friends, and do stuff that feels hedonically good to them. I like the point about groundedness. I felt ungrounded and uncertain when I was first immersed in EA, and I think this could (?) have been less pronounced if I was older. I'm kinda unsure, and think it's maybe inevitable to feel unsettled when you are introduced to and immersed in a very new culture/worldview in a short space of time.

Where is Elizabeth's post on being a potted plant? Could you send it?

Comment by Evie Cottrell on Seven ways to become unstoppably agentic · 2022-06-18T16:26:35.994Z · EA · GW

Thanks Kevin :)

Comment by Evie Cottrell on What’s the theory of change of “Come to the bay over the summer!”? · 2022-06-18T15:40:26.485Z · EA · GW

This comment is great, and resonates with a lot of the stuff I found hard when I was first immersed in the community at an EA hub.

Comment by Evie Cottrell on Three Reflections from 101 EA Global Conversations · 2022-05-05T17:43:09.273Z · EA · GW

I really really loved section 2 of this post!! It articulates a mindset shift that I think is important and valuable, and I've not seen it written out like that before.