Concrete project lists

post by Richard_Batty · 2017-03-25T18:12:50.765Z · score: 32 (34 votes) · EA · GW · Legacy · 44 comments

There are lots of important project ideas in EA that people could work on, and I’d like to encourage people to explore more. When I was looking for projects to work on, I had difficulty thinking of what needed doing apart from obvious projects like raising money for GiveWell-recommended charities. I even had a sense that all the organisations that needed to exist already existed, which is obviously not correct.

 

Fortunately many people have put together project ideas in important cause areas:

 

 

This is far from exhaustive, but it’s a start.

 

However, it’s not clear whether lack of ideas is actually what’s stopping people from working on new projects. So I’d be interested to know:

 

 

[This came out of this thread on why things don’t get done in the EA community. Thanks to John Maxwell for being a commitment device.]

 

44 comments

Comments sorted by top scores.

comment by lukeprog · 2017-03-26T00:14:24.081Z · score: 16 (15 votes) · EA · GW

Several additional suggestions in Technical and Philosophical Questions That Might Affect Our Grantmaking.

comment by RyanCarey · 2017-03-25T19:59:13.222Z · score: 15 (14 votes) · EA · GW

It would be good to have a longer list of important research areas, so that we can all walk around with a cache of important research topics in case we run into EAs working in nearby areas. Then it can become common knowledge that it is useful for such people to perform literature reviews or to settle into those fields. Personally, I'm interested in the domain of risky emerging technologies, so I can list some related areas:

comment by Daniel_Eth · 2017-03-26T02:04:37.711Z · score: 4 (4 votes) · EA · GW

Good list!

Since we're on the topic of brain emulation, I feel the need to plug my paper: https://www.degruyter.com/view/j/jagi.2013.4.issue-3/jagi-2013-0008/jagi-2013-0008.xml, which is a fair bit shorter than the Sandberg/Bostrom paper and, I think, presents a more feasible path to WBE. My paper suggests scanning the brain via nanotechnology, while they suggest scanning through methods that seem like simple extensions of current brain scanning techniques.

comment by Mati_Roy · 2018-07-17T08:59:35.432Z · score: 0 (0 votes) · EA · GW

Any resources to recommend on "Changes in lie detection (medical imaging analysis, neuroscience)"?

comment by Daniel_Eth · 2017-03-26T01:56:12.149Z · score: 11 (10 votes) · EA · GW

This makes me wish we had basic income - I feel like the need for some income to fulfill basic needs stops people from "taking risks" and pursuing these sorts of projects.

comment by Peter_Hurford · 2017-03-26T04:07:24.501Z · score: 9 (9 votes) · EA · GW

I think it could be possible to set up a general EA Fund for this sort of thing, similar to the one for political activism. That could be a missing step in our quest to turn money into talent.

How long do you think someone would have a basic income for before they could either "prove" their project and get actual donations / fundraising based on merit, or go back to a day job? How much funding do you think this would take?

comment by RyanCarey · 2017-03-26T06:33:01.593Z · score: 4 (6 votes) · EA · GW

You could do unconditional basic income, but why would you start with that when we haven't even created a facility for people to fund credible proposals yet? It seems better to reboot EA Ventures or Impact Certificates first (given that the EA community is a bit bigger, and that some of the reasons for the previous failures were circumstantial).

comment by Peter_Hurford · 2017-03-26T14:48:03.881Z · score: 18 (17 votes) · EA · GW

I guess another important next step would be learning from why similar things like EA Ventures, Impact Certificates, and the Pareto Fellowship didn't get more traction and were shut down.

comment by Zeke_Sherman · 2017-03-28T02:36:30.192Z · score: 4 (3 votes) · EA · GW

Pareto Fellowship was shut down? When? What happened?

comment by Peter_Hurford · 2017-03-28T02:52:21.109Z · score: 10 (9 votes) · EA · GW

We do not plan to continue the Pareto Fellowship in its current form this year. While we thought that it was a valuable experiment, the cost per participant was too high relative to the magnitude of plan changes made by the fellows. We might consider running a much shorter version of the program, without the project period, in the future. The Pareto Fellowship did, however, make us more excited about doing other high-touch mentoring and training with promising members of the effective altruism community.

From CEA's 2017 Fundraising Report.

comment by AGB · 2017-03-26T10:04:11.980Z · score: 12 (12 votes) · EA · GW

I'm sympathetic to this view, though I think the EA funds have some EA-Ventures-like properties; charities in each of the fund areas presumably can pitch themselves to the people running the funds if they so choose.

One difference that has been pointed out to me in the past is that for (e.g.) EA Ventures you have to put a lot of up-front work into your actual proposal. That's time-consuming and costly if you don't get anything out of it. That's somewhat different to handing some trustworthy EA an unconditional income and saying 'I trust your commitment and your judgment, go and do whatever seems most worthwhile for 6/12/24 months'. It's plausible to me that the latter involves less work on both donor and recipient side for some (small) set of potential recipients.

With that all said, better communication of credible proposals still feels like the relatively higher priority to me.

comment by RyanCarey · 2017-03-26T18:38:52.604Z · score: 2 (1 votes) · EA · GW

Agreed!

comment by Michael_PJ · 2017-03-26T18:35:05.810Z · score: 2 (2 votes) · EA · GW

One thing to remember is that important projects may not look very credible initially. Any early-stage EA funding body needs to ask itself "would we fund an early-stage 80k?".

comment by Benjamin_Todd · 2017-03-26T21:12:25.850Z · score: 5 (9 votes) · EA · GW

Bear in mind that before we spent any money we had: been involved in 10-20 important plan changes that would already justify significant funding; built a website with thousands of views per month; and received top level press coverage.

comment by Michael_PJ · 2017-03-27T04:37:46.267Z · score: 3 (3 votes) · EA · GW

Fair!

I think I'm thinking of funding at an even earlier stage than when 80k got money, though. 80k had presumably had very many hours of volunteer labour before it got to that point - we might want to fund things earlier than that.

comment by Daniel_Eth · 2017-03-26T05:28:45.039Z · score: 3 (3 votes) · EA · GW

Hmm, actually I like this idea. I'd assume that if someone's been working for 6 months, then they should have something to show for it. And maybe 2 years to actually get a project to the point where it's either succeeding or failing on its own. Since most EAs live in expensive cities, that could be around $2k per month minimum.

So that would be around $12k for someone to "try a project out," and then, if the project is doing well, around $50k per capita to see whether it can be "successful" or not. That's plausibly worth it.
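A minimal sketch of the arithmetic behind these figures (the $2k/month, 6-month, and 2-year numbers are the ones above; the variable and function names are purely illustrative):

```python
# Rough runway costs implied by the figures above: ~$2k/month living costs,
# a 6-month trial period, and ~2 years to prove a project out.
MONTHLY_COST = 2_000  # assumed minimum living cost in an expensive city (USD)

def runway_cost(months: int, monthly_cost: int = MONTHLY_COST) -> int:
    """Total funding needed to support one person for `months` months."""
    return months * monthly_cost

trial = runway_cost(6)       # $12,000 to "try a project out"
full_run = runway_cost(24)   # $48,000, i.e. roughly the ~$50k per capita mentioned
print(f"6-month trial: ${trial:,}; 2-year run: ${full_run:,}")
```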

So I guess we should add [EA fund for people to start new projects] to the list of projects that EAs maybe should start. One thing we'd have to consider is how to make sure we don't just get people conning us for free money.

comment by rochelleh · 2017-03-26T19:07:52.485Z · score: 0 (0 votes) · EA · GW

Some EA projects may fall within the scope of that existing political activist funding opportunity as well.

comment by Daniel_Eth · 2017-03-27T02:42:58.284Z · score: 1 (1 votes) · EA · GW

Any ideas of which projects in particular?

comment by remmelt · 2017-04-06T19:22:01.123Z · score: 2 (2 votes) · EA · GW

My approach here is to look for ways to help people in the EA community save money on basic needs. A pattern I'm noticing is that they often seem to be good for community building too.

Examples of this:

1) The EA Safety net project, which I've just started working on with dedicated others.

2) Shared housing for people involved with EA & rationality. An especially promising example is the Accelerator Project, I think. I've also found 19 rationality/EA houses around the world so far (I'm slowly working on getting one going in the Netherlands).

3) Even simpler: couchsurfing

I think that scaling cost-saving solutions like these is a more promising area to explore than funding basic incomes (depending on how many people take part relative to the time put into kickstarting the project). Whether spending time on starting a cost-saving project yourself is worth it does depend on your skills and opportunities.

For me, funding movement building/far future orgs generally makes more sense than a basic income (most of which goes to giving a coordinated group of people incomes so they can take risks) unless a basic income would target high-potential people only. Or perhaps you could fund someone to start a cost-savings project. :-)

comment by Greg_Colbourn · 2017-04-07T13:04:47.994Z · score: 2 (2 votes) · EA · GW

There is also the Kernel Project (Manchester, UK) - rationalist & rationalist-adjacent low cost living and community building. I would be happy to see more EAs involved.

comment by Raemon · 2017-03-25T22:32:07.056Z · score: 10 (9 votes) · EA · GW

Thanks for doing this!

My sense is that what people are missing is a set of social incentives to get started. Looking at any one of these, they feel overwhelming and like they require skills that I don't have. It feels like if I start working on one, then EITHER I'm blocking someone who's better qualified from working on it OR someone who's better qualified will do it anyway and my efforts will be futile.

Or, in the case of research, my bad quality research will make it harder for people to find good quality research.

Or, in the case of something like "start one of the charities GiveWell wants people to start", it feels like... just, a LOT of work.

And... this is all true. Kind of. But it's also true that the way people get good at things is by doing them. And I think it's sort of necessary for people to throw themselves into projects they aren't prepared for, as long as they can get tight feedback loops that enable them to improve.

I have half-formed opinions about what's needed to resolve that, which can be summarized as "better triaged mentorship." I'll try to write up more detailed thoughts soon.

comment by DonyChristie · 2017-03-27T18:48:24.349Z · score: 2 (3 votes) · EA · GW

Please do! Have you gotten started yet? :-) #humancommitmentdevice

comment by Zeke_Sherman · 2017-03-28T02:47:01.234Z · score: 1 (1 votes) · EA · GW

This is odd. Personally my reaction is that I want to get to a project before other people do. Does bad research really make it harder to find good research? This doesn't seem like a likely phenomenon to me.

comment by Raemon · 2017-03-29T23:09:06.589Z · score: 1 (1 votes) · EA · GW

How could bad research not make it harder to find good research? When you're looking for the research, you have to look through additional things before you find the good research, and whether a piece of research is good is fairly costly to ascertain in the first place.

comment by Leah_E · 2017-03-27T17:05:16.550Z · score: 5 (5 votes) · EA · GW

Animal Charity Evaluators also has a post of Charities We'd Like To See.

comment by lukeprog · 2017-06-12T19:57:21.319Z · score: 4 (4 votes) · EA · GW

Several additional research project ideas are now listed in section 5 of my new report on consciousness and moral patienthood.

comment by John_Maxwell_IV · 2017-03-27T02:23:23.508Z · score: 4 (4 votes) · EA · GW

Thanks for doing this!

Some more relevant links:


comment by John_Maxwell_IV · 2018-06-02T11:07:53.776Z · score: 2 (2 votes) · EA · GW

For someone interested in doing research, especially if they're comfortable formulating their own research question, I think just having a list of topics can be helpful. Here is a list of lists of EA topics:

Other directories of EA content:

comment by John_Maxwell_IV · 2018-06-02T07:52:38.224Z · score: 1 (1 votes) · EA · GW

Here are some more AI safety problem lists which don't appear in the main post (there is probably lots of redundancy between these lists):

I agree with Jessica Taylor that one should additionally aim to acquire one's own perspective about how to solve the alignment problem.

comment by John_Maxwell_IV · 2017-08-06T21:44:12.535Z · score: 0 (0 votes) · EA · GW

This comment also has some interesting links.

Here are a couple more:

https://www.lesswrong.com/posts/CmRxryEbvAHcuaPuR/information-generating-research-projects

https://guzey.com/personal/what-should-you-do-with-your-life/

comment by Tor · 2017-03-27T01:00:55.952Z · score: 4 (4 votes) · EA · GW

I wanted to dedicate myself to making YouTube videos at some point, but I have another project that I'm prioritizing instead, so I'm not spending much time on this at the moment. However, with enough outside help I think that together we might achieve a good impact.

Making videos for existing channels (e.g. Vox) without expecting payment from them is a possibility to look into, although convincing them would probably be hard and the quality requirements would be challenging. The most likely scenario, though, is publishing to a channel of our own.

Here is a google-document with some relevant thoughts (ideas and drafts for episodes, etc): https://docs.google.com/document/d/1iVRb85Dkb6e04M0mAkiCJwPg7Y29cjUQboXy-OdCcus/edit#

Video production is the kind of task where, once one has done the necessary learning, a large fraction of the work can be done while listening to podcasts and such, but many hours are required. If anyone might be interested in working on this, and could see themselves potentially making it a significant priority in their lives for several years, feel free to look up "Tor Barstad" on Facebook and get in touch for a video conversation or something :)

comment by Peter_Hurford · 2017-03-27T01:35:58.642Z · score: 2 (2 votes) · EA · GW

You and others considering prioritizing this may be interested in Charity Science Outreach's shallow review of content marketing.

comment by lifelonglearner · 2017-03-29T02:06:31.379Z · score: 1 (1 votes) · EA · GW

Just want to respond that I'd be interested in doing this sort of thing for a short period of time (a few months) to test the waters.

comment by Zeke_Sherman · 2017-03-28T02:45:09.491Z · score: 3 (3 votes) · EA · GW

I think we need more reading lists. There have already been one or two for AI safety, but I've not seen similar ones for poverty, animal welfare, social movements, or other topics.

comment by Benjamin_Todd · 2017-06-15T06:28:56.877Z · score: 0 (0 votes) · EA · GW

Here's a general purpose one: https://80000hours.org/articles/further-reading/

comment by casebash · 2017-03-27T04:44:12.305Z · score: 1 (1 votes) · EA · GW

Many of those seem like individual projects. Does anyone have any suggestions for projects that would be particularly good for EA groups?

comment by Richard_Batty · 2017-03-27T13:37:27.397Z · score: 3 (3 votes) · EA · GW

A lot of these would be good for a small founding team, rather than individuals. What do you mean by 'good for an EA group?'

comment by casebash · 2017-03-27T13:56:51.522Z · score: 1 (1 votes) · EA · GW

Like a local university group or local city meetup.

comment by Richard_Batty · 2017-03-27T14:33:33.616Z · score: 0 (0 votes) · EA · GW

Not sure; it's really hard to make volunteer-run projects work, and often a small core team does all the work anyway.

This half-written post of mine contains some small project ideas: https://docs.google.com/document/d/1zFeSTVXqEr3qSrHdZV0oCxe8rnRD8w912lLw_tX1eoM/edit

comment by BartCraft · 2017-10-25T11:13:42.168Z · score: 0 (0 votes) · EA · GW

Thanks for the info.