Nathan Young's Shortform

post by Nathan Young (nathan) · 2019-11-05T09:11:05.415Z · score: 3 (1 votes) · EA · GW · 25 comments

Comments sorted by top scores.

comment by Nathan Young (nathan) · 2020-09-25T22:28:06.125Z · score: 30 (11 votes) · EA(p) · GW(p)

Sam Harris takes Giving What We Can pledge for himself and for his meditation company "Waking Up"

Harris references MacAskill and Ord as having been central to his thinking, and talks about Effective Altruism and existential risk. He publicly pledges 10% of his own income and 10% of the profits from Waking Up. He will also create a series of lessons on altruism and effectiveness for his meditation and education app.

Harris has 1.4M Twitter followers and is a famed Humanist and New Atheist. The Waking Up app has over 500k downloads on Android, so I'd guess over 1 million overall.

https://dynamic.wakingup.com/course/D8D148

I like letting personal thoughts be up or downvoted, so I've put them in the comments.

comment by Nathan Young (nathan) · 2020-09-25T22:36:07.270Z · score: 6 (4 votes) · EA(p) · GW(p)

Harris is a marmite figure - in my experience people love him or hate him. 

It is good that he has done this. 

Newswise, it seems to me this is most likely to impact the behavior of his listeners, who are likely to be well-disposed towards him, as will the courses on his app. This is a significant but currently low-profile announcement.

I don't think I'd go spreading this around more generally: many don't like Harris, and for those who don't, it could be easy to see EA as more of the same (callous, superior progressivism).

In the low-probability (5%?) event that EA gains traction in that space of the web (generally called the Intellectual Dark Web - don't blame me, I don't make the rules), I would urge caution for EA speakers who might be pulled into polarising discussions which would leave some groups feeling EA ideas are "not for them".

comment by MichaelDickens · 2020-09-26T03:55:18.374Z · score: 6 (4 votes) · EA(p) · GW(p)

Harris is a marmite figure - in my experience people love him or hate him.

My guess is people who like Sam Harris are disproportionately likely to be potentially interested in EA.

comment by David_Moss · 2020-09-26T09:24:01.380Z · score: 16 (6 votes) · EA(p) · GW(p)

This seems quite likely given EA Survey data: amongst people who indicated they first heard of EA from a podcast and indicated which podcast, Sam Harris's podcast strongly dominated all the others.

More speculatively, we might try to compare these numbers to people hearing about EA from other categories. For example, by any measure, the number of people in the EA Survey who first heard about EA from Sam Harris' podcast specifically is several times the number who heard about EA from Vox's Future Perfect. As a lower bound, 4x more people specifically mentioned Sam Harris in their comment than selected Future Perfect, but this is probably dramatically undercounting Harris, since not everyone who selected Podcast wrote a comment that could be identified with a specific podcast. Unfortunately, I don't know the relative audience size of Future Perfect posts vs Sam Harris' EA podcasts specifically, but that could be used to give a rough sense of how well the different audiences respond.
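To make this concrete, here is a minimal back-of-envelope sketch (Python) of the kind of comparison described above: adjust the raw survey mentions for undercounting, then divide by audience size to get a rough responsiveness figure. Every number and name below is a hypothetical placeholder, not EA Survey or audience data.

```python
# Back-of-envelope sketch of the comparison described above. All numbers are
# hypothetical placeholders, NOT actual EA Survey or audience figures.

def respondents_per_audience_member(survey_mentions, undercount_factor, audience_size):
    """Estimate how many survey respondents each audience member 'produced'."""
    estimated_mentions = survey_mentions * undercount_factor
    return estimated_mentions / audience_size

# Hypothetical inputs only:
harris = respondents_per_audience_member(
    survey_mentions=40,       # respondents who named Harris's podcast in a comment
    undercount_factor=2.0,    # guess: many podcast respondents left no comment
    audience_size=1_000_000,  # guess at reach of his EA-relevant episodes
)
future_perfect = respondents_per_audience_member(
    survey_mentions=10,       # respondents who selected Future Perfect
    undercount_factor=1.0,    # selected directly, so little undercounting
    audience_size=500_000,    # guess at readership
)

print(f"Harris podcast: {harris:.2e} respondents per audience member")
print(f"Future Perfect: {future_perfect:.2e} respondents per audience member")
```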

comment by Aaron Gertler (aarongertler) · 2020-09-28T08:42:46.322Z · score: 2 (1 votes) · EA(p) · GW(p)

Notably, Harris has interviewed several figures associated with EA; Ferriss only interviewed MacAskill, while Harris has had MacAskill, Ord, Yudkowsky, and perhaps others.

comment by David_Moss · 2020-09-28T09:02:28.979Z · score: 3 (2 votes) · EA(p) · GW(p)

This is true, although for whatever reason the responses to the podcast question seemed very heavily dominated by references to MacAskill. 

This is the graph from our original post [EA(p) · GW(p)], showing every commonly mentioned category, not just the host (categories are not mutually exclusive). I'm not sure why MacAskill so heavily dominated the Podcast category, while Singer heavily dominated the TED Talk [EA · GW] category.

comment by Nathan Young (nathan) · 2020-09-25T22:37:21.101Z · score: 4 (3 votes) · EA(p) · GW(p)

The address (in the link) is humbling and shows someone making a positive change for good reasons. He is clear and coherent.

Good on him.

comment by Nathan Young (nathan) · 2020-07-23T13:25:53.266Z · score: 14 (9 votes) · EA(p) · GW(p)

I strongly dislike the following sentence on effectivealtruism.org:

"Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on."

It reads to me as arrogant, and epitomises the worst caricatures my friends make of EAs. Read it in a snarky voice (such as one might use if they struggled with the movement and were looking to do research): "Rather than just doing what feels right..."

I suggest it gets changed to one of the following:

  • "We use evidence and careful analysis to find the very best causes to work on."
  • "It's great when anyone does a kind action no matter how small or effective. We have found value in using evidence and careful analysis to find the very best causes to work on."

I am genuinely sure whoever wrote it meant well, so thank you for your hard work.

comment by Stefan_Schubert · 2020-07-23T14:26:46.542Z · score: 9 (7 votes) · EA(p) · GW(p)

Are the two bullet points two alternative suggestions? If so, I prefer the first one.

comment by Matt_Lerner (mattlerner) · 2020-07-23T14:42:06.207Z · score: 8 (7 votes) · EA(p) · GW(p)

I also thought this when I first read that sentence on the site, but I find it difficult (as I'm sure its original author does) to communicate its meaning in a subtler way. I like your proposed changes, but to me the contrast presented in that sentence is the most salient part of EA. To me, the thought is something like this:

"Doing good feels good, and for that reason, when we think about doing charity, we tend to use good feeling as a guide for judging how good our act is. That's pretty normal, but have you considered that we can use evidence and analysis to make judgments about charity?"

The problem IMHO is that without the contrast, the sentiment doesn't land. No one, in general, disagrees in principle with the use of evidence and careful analysis: it's only in contrast with the way things are typically done that the EA argument is convincing.

comment by Nathan Young (nathan) · 2020-07-25T12:12:27.305Z · score: 3 (2 votes) · EA(p) · GW(p)

I would choose your statement over the current one.

I think the sentiment lands pretty well even with a very toned-down statement. The movement is called "effective altruism". I think in-groups are often worried that out-groups will not get their core differences, when generally that's all out-groups know about them.

I don't think anyone who visits that website will fail to see that effectiveness is a core feature. And I don't think we need to be patronising (as EAs are caricatured as being in conversations I have) in order to make known something that everyone already knows.

comment by Nathan Young (nathan) · 2020-10-10T16:02:31.487Z · score: 8 (3 votes) · EA(p) · GW(p)

Is there a way to sort shortform posts?

comment by Nathan Young (nathan) · 2020-09-18T15:21:26.101Z · score: 8 (6 votes) · EA(p) · GW(p)

EA short story competition?

Has anyone ever run a competition for EA-related short stories?

Why would this be a good idea?
* Narratives resonate with people and have been used to convey ideas for thousands of years
* It would be low cost and fun
* Because submissions would be voted on like any other forum post, the risk of "bad posts" is no greater than usual

How could it work?
* Stories submitted under a tag on the EA forum.
* Rated by upvotes
* Max 5000 words (I made this up, dispute it in the comments)
* If someone wants to give a reward, then there could be a prize for the highest rated
* If there is a lot of interest/quality they could be collated and even published
* Since it would be measured by upvotes, it seems unlikely a destructive story would be highly rated (or no more likely than any other destructive post on the forum)

Upvote if you think it's a good idea. If it gets more than 40 karma I'll write one. 
 

comment by Nathan Young (nathan) · 2020-10-07T11:43:12.138Z · score: 7 (6 votes) · EA(p) · GW(p)

EA Book discount codes.

tl;dr: EA books have a positive externality. The response should be to subsidise them.

If EA thinks that certain books (Doing Good Better, The Precipice) have greater benefits than their price reflects, it could subsidise them.

There could be an EA website with Amazon coupons for EA books, so that you can buy them more cheaply for a friend, or advertise the coupon to friends to encourage them to buy the book.

From five minutes of research, the current best way would be for a group to buy EA books and sell them at list price but provide coupons, as described here: https://www.passionintopaychecks.com/how-to-create-single-use-amazon-coupons-promo-codes/

Alternatively, you could just sell them at the coupon price.
 

comment by Ozzie Gooen (oagr) · 2020-10-11T22:22:25.338Z · score: 2 (1 votes) · EA(p) · GW(p)

I think people have been taking up the model of open-sourcing books (well, making them free). This has been done for [The Life You Can Save](https://en.wikipedia.org/wiki/The_Life_You_Can_Save) and [Moral Uncertainty](https://www.williammacaskill.com/info-moral-uncertainty).

I think this could cost $50,000 to $300,000 or so depending on when this is done and how popular it is expected to be, but I expect it to be often worth it.

comment by Nathan Young (nathan) · 2020-10-12T16:16:28.994Z · score: 1 (1 votes) · EA(p) · GW(p)

It seems that the ebook/audiobook is free. Is that correct?

I imagine being able to give away a free physical copy would have more impact.

comment by SamiM · 2020-10-12T17:06:44.887Z · score: 1 (1 votes) · EA(p) · GW(p)

Yes, it's free.

comment by alexrjl · 2020-10-07T20:04:31.510Z · score: 2 (2 votes) · EA(p) · GW(p)

I like this idea and think it's worth you taking further. My initial reactions are:

  • Getting more EA books into people's hands seems great and worth much more per book than the cost of a book.
  • I don't know how much of a bottleneck the price of a book is to buying them for friends/club members. I know EA Oxford has given away many books; I've also bought several for friends (and one famous person I contacted on Instagram as a long shot, who actually replied).
  • I'd therefore be interested in something which aimed to establish whether making books cheaper was a better or worse idea than just encouraging people to gift them.
  • John Behar/TLYCS probably have good thoughts on this.

comment by Nathan Young (nathan) · 2020-10-10T15:58:05.955Z · score: 1 (1 votes) · EA(p) · GW(p)

Do you have any thoughts on what the next step would be? It's not obvious to me what you'd do to research the impact of this.

Perhaps have a questionnaire asking people how many people they'd give books to at different prices. Do we know the likelihood of people reading a book they are given?
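As a rough illustration of how questionnaire answers could feed into a decision, here is a minimal sketch (Python). All the names and numbers (books_gifted, p_read, value_per_read, subsidy_per_book) are hypothetical assumptions for illustration, not data.

```python
# A minimal sketch of the comparison being asked about. books_gifted would
# come from the proposed questionnaire; p_read and value_per_read are
# hypothetical placeholders, not measured figures.

def net_value(books_gifted, p_read, value_per_read, subsidy_per_book):
    """Expected benefit of books that actually get read, minus the subsidy cost."""
    benefit = books_gifted * p_read * value_per_read
    cost = books_gifted * subsidy_per_book
    return benefit - cost

# Hypothetical questionnaire answers: books people say they'd gift per year
# at full price vs. at a subsidised price.
no_subsidy = net_value(books_gifted=100, p_read=0.4, value_per_read=50, subsidy_per_book=0)
subsidised = net_value(books_gifted=250, p_read=0.4, value_per_read=50, subsidy_per_book=10)

print("No subsidy: net value", no_subsidy)   # 2000
print("Subsidised: net value", subsidised)   # 2500
```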

comment by Nathan Young (nathan) · 2020-09-18T12:13:58.117Z · score: 6 (4 votes) · EA(p) · GW(p)

This perception-gap site is a good format for learning and could be used for altruism. It reframes correcting biases as a fun prediction game.

https://perceptiongap.us/

It's a site which gets you to guess what other political groups (republicans and democrats) think about issues.

Why is it good:

1) It gets people thinking and predicting. They are asked a clear question about other groups and have to answer it.
2) It updates views in a non-patronising way - it turns out Democrats and Republicans are much less polarised than most people think (the stat they give is that people predict 50% of Republicans hold extreme views, when actually it's 30%). But rather than yelling this, or putting it in an annoying listicle, it gets people's consent and teaches them something.
3) It builds consensus. If we are actually closer to those we disagree with than we think, perhaps we could work with them.
4) It gives quick feedback. People learn best when given feedback close to the action. In this case, people are rapidly rewarded for thoughts like "most of group X are probably more similar to me than I first thought".

Imagine:

What percentage of neocons want institutional reform?
What percentage of libertarians want an end to factory farming?
What percentage of socialists want an increase in foreign direct aid?

Conclusion

If you want to change people's minds, don't just tell them facts; get them to guess trustworthy figures as part of a cutesy game.

comment by Nathan Young (nathan) · 2020-02-04T16:25:05.211Z · score: 5 (4 votes) · EA(p) · GW(p)

Does anyone know people working on reforming the academic publishing process?

Coronavirus has caused journalists to look for scientific sources. There are no journal articles yet because of the lag time, so they have gone to preprint servers like bioRxiv (pronounced "bio-archive"). These servers are not peer-reviewed, so some articles are of low quality. So people have gone to Twitter asking for experts to review the papers.

https://twitter.com/ryneches/status/1223439143503482880?s=19

This is effectively a new academic publishing paradigm. If there were support for good papers (somehow) you would have the key elements of a new, perhaps better system.

Some thoughts here too: http://physicsbuzz.physicscentral.com/2012/08/risks-and-rewards-of-arxiv-reporting.html?m=1

With Coronavirus providing a lot of impetus for change, those working in this area could find this an important time to increase visibility of their work.

comment by Sanjay · 2020-02-04T21:32:12.079Z · score: 2 (2 votes) · EA(p) · GW(p)

Hauke Hillebrandt has recommended supporting Prof Chris Chambers to do this: https://lets-fund.org/better-science/

comment by Nathan Young (nathan) · 2020-10-14T17:30:17.352Z · score: 4 (3 votes) · EA(p) · GW(p)

The UK government will pay for organisations to hire 18-24 year olds who are currently unemployed, for 6 months. This covers minimum wage and national insurance.

I imagine many EA orgs are people-constrained rather than funding-constrained, but it might be worth it.

Here is a data science org which will also train the hires: https://twitter.com/John_Sandall/status/1315702046440534017

Note: applications have to be for 30 jobs, but you can apply across a number of organisations or alongside a local authority etc.

https://www.gov.uk/government/collections/kickstart-scheme

comment by Nathan Young (nathan) · 2020-06-23T13:36:00.729Z · score: 3 (3 votes) · EA(p) · GW(p)

Plant-based meat. Fun video from a YouTuber which makes a strong case. Very shareable. https://youtu.be/-k-V3ESHcfA

comment by Nathan Young (nathan) · 2020-06-03T11:05:57.230Z · score: 1 (1 votes) · EA(p) · GW(p)

Mailing list for the new UK Conservative Party group on China.

Will probably be worth signing up to if that's your area of interest.

https://chinaresearchgroup.substack.com/p/coming-soon

Please comment with any other places people could find mailing lists or good content for EA-related areas.