Some longtermist fiction 2021-08-10T10:59:42.596Z
Are mice or rats (as pests) a potential area of animal welfare improvement? 2021-04-27T11:30:33.489Z
[New org] Canning What We Give 2021-04-01T06:32:07.207Z
My job is available: Finance and Data Lead at CEA! 2021-02-05T17:23:55.186Z
Paris-compliant offsets - a high leverage climate intervention? 2020-10-05T16:55:32.163Z
Does using the mortality cost of carbon make reducing emissions comparable with health interventions? 2020-09-21T19:47:42.434Z
A few really quick ideas about personal finance 2020-09-03T10:29:16.427Z
[Linkpost] Global death rate from rising temperatures to exceed all infectious diseases combined in 2100 2020-08-15T17:04:26.959Z
Covid offsets and carbon offsets 2020-07-23T21:19:08.929Z
Scrutinizing AI Risk (80K, #81) - v. quick summary 2020-07-23T19:02:55.558Z
"Music we lack the ears to hear" 2020-04-19T14:23:07.346Z
[Notes] Climate Shock by Wagner and Weitzman 2020-04-13T09:41:11.459Z
[Review and notes] How Democracy Ends - David Runciman 2020-02-13T22:30:49.548Z
[Notes] Steven Pinker and Yuval Noah Harari in conversation 2020-02-09T12:49:14.466Z
[Notes] Could climate change make Earth uninhabitable for humans? 2020-01-14T22:13:41.919Z
What should EAs interested in climate change do? 2020-01-10T17:34:59.650Z
What are good options for giving later? 2019-12-25T11:53:04.846Z
How meditation helps me be a more effective altruist 2019-09-23T15:09:51.756Z
Does climate change deserve more attention within EA? 2019-04-17T06:50:12.308Z


Comment by Ben (bdixon) on I'm interviewing prolific AI safety researcher Richard Ngo (now at OpenAI and previously DeepMind). What should I ask him? · 2022-09-29T17:24:21.971Z · EA · GW

Sounds interesting! I'd be interested in:

  1. Could Richard give a summary of his conversation with Eliezer, and on what points he agrees and disagrees with him? 
  2. (Perhaps this has been covered somewhere else) Could Richard give a broad overview of different approaches to AI alignment and which ones he thinks are most promising?


Comment by Ben (bdixon) on The Threat of Climate Change Is Exaggerated · 2022-07-12T13:22:01.285Z · EA · GW

Thanks for sharing this piece and looking for constructive feedback. I'd agree with most of the points made by other commenters.  I would also suggest:

  • Engage more with primary sources and with work written by people outside of effective altruism. There are thousands of climate scientists with interesting things to say, and a relatively small number of people in EA thinking about this. 
  • Show general humility about this field - we don't have great data on what the climate and society will be like in 50, 100, 200, 500+ years' time, and it's hard to know what the limits for habitation / adaptation will be.
  • How would you define "existential threat"? I've heard David Wallace-Wells say that he thinks climate change is already an existential threat because it's already leading us to change how we live our lives. You seem to use Bostrom's definition. Why do you think it's better?  

Comment by Ben (bdixon) on What examples are there of (science) fiction predicting something strange/bad, which then happened? · 2022-04-27T20:52:56.074Z · EA · GW

Neuralink, and the Culture series also has voice-activated assistants that are a bit like Alexa 

Comment by Ben (bdixon) on Some longtermist fiction · 2021-08-19T16:32:55.327Z · EA · GW

Thanks! Did you think it was worth a read?

Comment by Ben (bdixon) on Some longtermist fiction · 2021-08-10T14:50:56.076Z · EA · GW

Interesting. OK, I added a link to this as an answer. Thanks for suggesting!

Comment by Ben (bdixon) on What novels, poetry, comics have EA themes, plots or characters? · 2021-08-10T14:50:23.827Z · EA · GW

I put down some fiction with a bit of a longtermist bent that I enjoyed here

Comment by Ben (bdixon) on Some longtermist fiction · 2021-08-10T14:20:19.069Z · EA · GW

I don't think any of the protagonists / characters in these books are "an EA" (whatever that means) in the way that question seems to be looking for. 

Comment by Ben (bdixon) on Some longtermist fiction · 2021-08-10T14:18:39.564Z · EA · GW

Fascinating, thanks for sharing!

Comment by Ben (bdixon) on MSc in Risk and Disaster Science? (UCL) - Does this fit the EA path? · 2021-05-26T11:07:06.570Z · EA · GW

Thanks - happy to help. 

1) You're right that pandemics and climate change are both part of the course. Taking the figures in The Precipice at face value, the biggest risks are unaligned AI and engineered pandemics. From the unit list at UCL, and the biographies of the leaders of the 'natural and anthropogenic risks' unit - Joanna Faure Walker (a geophysicist) and Punam Yadav (who focuses on humanitarian projects) - I couldn't see any specific content on weaponisation and conflict, which are the topics I'm more interested in. That's not to say the course isn't valuable - and there is no one EA route - but from my own perspective I think there's a lot of technical background I'd like to cover. I also couldn't see anything on risks from AI anywhere in the UCL course, which seems like an oversight given current advances in autonomous weapons.

2) Yes, I've heard good things about the course. Having worked at CEA, I think it's worth dispelling the myth that there are courses recommended by the whole EA community - it seems to me that EA is a lot more disparate than people think. And you might disagree with me and decide to do the course anyway - or do many other totally different things!

3) It took me about 2 years to gradually move my interests over from international relations and conflict theory to wanting to study computer science. 

Applying to masters courses is quite costly in time and fees - they require a personal statement, references from  your undergraduate degree, and something like a £150 application fee, per application. 

Bootcamps are typically rolling applications throughout the year, and if you did the masters you'd probably start Sep 2022, which means applying around Nov 2021 - June 2022 - quite a while away.

If you're submitting masters applications and considering coding bootcamps at the same time, it seems to me worth spending a few months first thinking about what you're interested in and what your comparative advantage might be. One way of doing this could be to try some online coding courses, and also read some IR books (e.g. Destined for War) and maybe write up a summary and review on the forum. Each of those would probably give you more information about each route and be a cheaper test than submitting applications. 

Comment by Ben (bdixon) on MSc in Risk and Disaster Science? (UCL) - Does this fit the EA path? · 2021-05-25T17:26:41.784Z · EA · GW

Hi there, 

[Some unofficial thoughts from my own research before considering whether I should do a course like this one to be a civil servant. Other people come from different perspectives which could change the conclusions for them]

I wanted to learn more about global risks, with the aim of working on security policy. I spent several months researching courses and speaking to people at the departments. There are quite a few that I think could be good - the ones below are all in London, but they seemed to be the best UK options I could find when I was looking last year.

For the UCL courses, I found that the UCL Institute for Risk and Disaster Reduction has a big focus on natural risks, and so the degrees have comparatively little content on anthropogenic (human-caused) risks - see the unit guide for the course above here. In agreement with Toby Ord and much of the EA community, I think that anthropogenic risks are a much larger risk factor, so I felt the UCL course was not well targeted. For example, there appeared to be more content on space weather (which is still important) than on nuclear security (which I think is far more important). I contacted the department at UCL to ask what they thought about their focus, given Toby Ord's arguments, and didn't get a response. Still, there could be something useful in that course. It seemed to me that this group at UCL was more focused on geography and physical risks than on conflict, which I saw as a weakness.  

I was much more impressed by the King's College London faculty of War Studies, which has at least a dozen different degrees and, as you might guess from the name, leans a lot more heavily on anthropogenic risks, looking at both state and non-state actors. Of all their courses, I was most interested in the Science and International Security MA. I've heard that one of the advisors at the Open Philanthropy Project did the course, and a friend I made at EA London did it too and recommended it. I found out more about the reading list, applied for the course, and was made an offer. But at the open day I didn't find the other prospective students that motivating - I'd been working for about six years and most of them had come straight from university, though there were exceptions. 

At the time I was (and still am) interested in lots of things: risks from new technologies, which included things like cybersecurity, encryption, autonomous weapons, and synthetic biology. 

I also thought more about what I'd be learning on the course, and I had a big update when I heard this in the 80K interview with Stuart Russell, a hugely influential figure in AI safety. 

Stuart Russell: But in all of these cases, the shortage is always people who understand AI. There’s no shortage of people who have degrees in international relations or degrees in political science or whatever.


Stuart Russell: So my view is I think you need to understand two things. You need to understand AI. I think that’s really important. Actually understand it. Actually take a course. Learn how to program. 

I realised that most of these courses would probably give me good exposure to the key concepts in these areas, but that - having lurked around the EA Forum, read a few books in these areas, and worked in consulting - I wasn't likely to develop any radically new skills that would differentiate me as a graduate. I also looked up graduates from the courses on LinkedIn (a great trick, this one) and found that they often went into roles more junior than the one I was currently in. So I decided against the program, and accepted a job offer to work at CEA instead. 

A few months ago, I stumbled across the Security and Resilience: Science and Technology course at Imperial College London. This looks like a really interesting course, but I don't have the technical knowledge to start on it. 

I still want to develop technical skills, and I decided the best way to learn them was to do some coding myself and take the UCL MSc in Computer Science, which I was very pleased to get on to and am looking forward to starting at the end of September. I think this course offers a lot more value: career capital that could be deployed across the many applications of ML, information security, pandemic modelling, and lots of other things. 

Other people will be in different positions to me, and I know several really smart people who've done these courses and got a lot of value out of them. For the position I was at in my career, I ultimately decided they weren't worth it for me personally, but other people in similar positions might come to different conclusions. 

Comment by Ben (bdixon) on Are mice or rats (as pests) a potential area of animal welfare improvement? · 2021-04-28T10:36:34.205Z · EA · GW

Great, thanks for sharing!

Comment by Ben (bdixon) on Are mice or rats (as pests) a potential area of animal welfare improvement? · 2021-04-27T17:58:36.843Z · EA · GW

Super useful, thanks!

Comment by Ben (bdixon) on Mundane trouble with EV / utility · 2021-04-08T11:06:31.338Z · EA · GW

Thanks for posting this Ben, and great to see the discussions. Wishing you all the best!

Comment by Ben (bdixon) on New Top EA Causes for 2021? · 2021-04-01T12:09:33.487Z · EA · GW

Strongly upvoted for the link to the Castle. Btw in one podcast I'm pretty sure I heard Wiblin say "the general vibe of the thing"

Comment by Ben (bdixon) on European Master's Programs in Machine Learning, Artificial Intelligence, and related fields · 2021-03-31T12:51:31.747Z · EA · GW

Ah ok, no worries. I'm considering the course - do you know anyone who's currently on it?

Comment by Ben (bdixon) on A ranked list of all EA-relevant (audio)books I've read · 2021-03-09T11:50:39.801Z · EA · GW

Thanks Aaron. Sure - "perhaps you're not aware" was not intended to be condescending at all. And yes, the later sentence you wrote was the tone I was hoping for. 

Comment by Ben (bdixon) on European Master's Programs in Machine Learning, Artificial Intelligence, and related fields · 2021-03-07T15:55:20.544Z · EA · GW

Thanks for putting this together. I'd be interested in the UCL write-up - is there an estimate on when that might be out please? 

Comment by Ben (bdixon) on A ranked list of all EA-relevant (audio)books I've read · 2021-02-17T16:48:49.057Z · EA · GW

This is a nice idea,  but I agree with Hauke that this risks increasing the extent to which EA is an echo chamber.  Perhaps you're not aware of the (over)hype around some of these books in EA.  

 I think  Rationally Speaking is particularly good at engaging with a range of people and perspectives.  

Comment by Ben (bdixon) on Books on authoritarianism, Russia, China, NK, democratic backsliding, etc.? · 2021-02-03T09:03:36.446Z · EA · GW

Yeah I haven't read any of his stuff, just mentioning that he works on totalitarianism and authoritarianism. Not having read The Road to Unfreedom, it looks like he identifies trends in several geographies which could be useful for the questions you're looking at. 

Comment by Ben (bdixon) on Books on authoritarianism, Russia, China, NK, democratic backsliding, etc.? · 2021-02-03T00:00:35.809Z · EA · GW

There's also a chapter in Global Catastrophic Risks on the topic, though I forget who wrote it

Comment by Ben (bdixon) on Books on authoritarianism, Russia, China, NK, democratic backsliding, etc.? · 2021-02-02T23:59:39.406Z · EA · GW

Timothy Snyder is an academic looking at this question who has written several books on the topic, e.g. this and this

Comment by Ben (bdixon) on Investing to Give Beginner Advice? · 2021-01-07T14:07:49.145Z · EA · GW

Someone shared this link with me which supports your view  that lump-sum is generally better, especially if you don't have diminishing utility.  

Comment by Ben (bdixon) on Donating to EA funds from Germany · 2021-01-06T22:08:03.982Z · EA · GW

Hey - I'm the finance lead at CEA, of which EA funds is one part. Anyone can donate to EA Funds, and you should be able to do this from Germany. Are you concerned that doing so means you're not donating tax-efficiently since the funds aren't registered charities in Germany? 

If so, my colleague Sam wrote this post arguing that donating effectively might mean ignoring tax efficiency, and I agree, depending on your alternative. 

For example, if you were planning to donate only to AMF, and if there was a German AMF you were planning to donate to (I don't know if there is) which would give you tax deductibility, then I think it's better to donate to the German AMF and get the tax deductibility. 

But if you were comparing against a general charity that would give you tax deductibility in Germany, and you thought the EA Funds option (e.g. the Founders Pledge climate fund, the Long-Term Future Fund) was >50% better value for money than your alternative in Germany, then my view is that it'd be better to lose the tax efficiency and donate directly. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-24T23:06:19.142Z · EA · GW

No idea - I think it mostly depends on the specifics of your situation. On average, I think people who start organisations later in life, using their experience and contacts, are likely to be more successful.  

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-16T22:54:26.304Z · EA · GW

I think staying where you are seems like a good option. There seems to be an assumption that just because you stay in the same job for the next few years you'll automatically be there for the next 15 - is that really true? Also maybe you could make a public commitment to leave after X years, or donate your income above a certain level to avoid getting lured in by the money side of it. 

Comment by Ben (bdixon) on fiction about AI risk · 2020-12-16T21:13:05.067Z · EA · GW

Interesting, thanks for posting about it! 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T18:27:04.896Z · EA · GW

It sounds like if you've been rejected from master's courses then that's useful feedback. Even for people who have done well on those courses, there are many more applicants than places to work in philosophy. 

And if you're already facing really steep odds - trying to get into academia after eight years of working independently - then this might not be the area you're best suited to. 

I don't know much about it, but there could be work in neuroscience or psychology that you might find interesting. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T18:00:35.630Z · EA · GW

Also have you seen this

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:57:16.825Z · EA · GW

Some jobs here

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:56:52.281Z · EA · GW

I don't think anyone can give you a direct answer - it'll depend on your own personal circumstances, but if you've got savings then I can vouch that option 2 could be good. Have you tried applying for any jobs in that space? 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:53:30.706Z · EA · GW

Hi Daniel, 

Great to hear from you. Here are some of my own thoughts (not official in any way at all, and people in the community have all sorts of different views). I share your interest in AI safety and climate change. Is there an AI safety community in Germany? If not people thinking about AGI, there's probably some big universities working on near-term control problems that could be interesting. It might be good to meet some other people working in the area. 

Also if you're interested in climate and AI, this a huge field of people working on everything from forecasting to flood prediction to increasing crop yields - have you tried applying for any jobs in that space?

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:50:35.287Z · EA · GW

Hi Jack! I worked in consulting at EY for four years before joining CEA in operations, and you might find a role in management/leadership/operations interesting. You might find a direct role that's more fun than ETG (I think I did!)

I don't really know anything about the best ETG routes, but one that strikes me that could be big business at the moment is insolvency and restructuring - lots of organisations will be unfortunately going through that so there could be quite  a few roles. 

Also a few friends of mine who were at PwC worked in deals/valuations and then left to get high-paying jobs doing (I think) the same thing at smaller  places where the partners took less of the margin. Maybe that could be an option.

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:47:23.172Z · EA · GW

That's a tricky one, sorry to hear it. It could just be random chance - plenty of jobs have churn, and I think a surprising amount of doing well in jobs is getting along with other people. I wonder if any managers, colleagues, or friends might have specific feedback for you? 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:45:45.080Z · EA · GW

I think this is an interesting area of research - I'm not aware of much writing by EAs, but bear in mind the EA community is pretty small compared to the total number of people researching this and related fields across the world - you might find some other organisations or researchers who've looked into this more.

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:43:12.022Z · EA · GW

That sounds like a  huge range of options. With an MPP-MBA you might do well in policy. Are there any government or related roles you think you'd be interested in? And is there a particular area you'd like to work in? E.g. if you were more passionate about animal welfare than nuclear security, that would suggest some pretty different career routes. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:40:02.470Z · EA · GW

Different people in the community will have different views, but my own take is that capitalism and markets can be great for growth and improving productive capacity, but you want to make sure the benefits are spread throughout society (see the book Why Nations Fail). 

I'm sorry to hear that you're feeling overwhelmed by things. I've felt the same way at times. It's important to look after yourself, take time off, and connect with other people. For me, I love watching the Simpsons, going for runs with my friends, and drinking coffee!

My own take on this is that the world is big and messy, and there are lot of bad things we each as individuals have to accept we can't control. But if you can find a niche doing something which hits the sweet spot of being both enjoyable and improving the world, then you can have a pretty good time! 

I suspect you might be able to find lots of ways to use AI to make things better - I've seen some great work in improving agricultural production using machine learning which seems pretty good. And I'm sure there are lots of businesses and charities that would be interested in someone with your skillset. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:35:12.446Z · EA · GW

Have you tried applying for any roles in clean energy? It's an absolutely booming sector, especially if Biden gets the US to rejoin the Paris Agreement and more things start happening there. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:33:57.000Z · EA · GW

Sounds interesting! I'd say it's worth doing the easy and reversible things first (e.g. trying out stuff within your company), and maybe put in a few applications to jobs like these. You could study international development, but you might get some job offers without needing the master's course. You can always apply to some master's courses anyway and see what happens. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-15T17:30:48.770Z · EA · GW

They seem like fairly different job offers - are there any other things you might prefer to do? This should depend on how much runway you have and how much income you think you need, but of those two roles it sounds like you're more interested in the health care one ("I'm just not passionate about being in a consumerism industry") and you might learn a lot there. Also if after two years you get bored, then you could always move on, and your role might be quite different if they go through an IPO. 

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:23:30.334Z · EA · GW

My rough guess is that option 2 would be more fun, and since a lot of these areas have quite a lot of funding, it might be more of a comparative advantage for you. You mention general management and operations, but I wonder if you have any health/lab-specific knowledge that could be used to work in these areas. I guess Covid has changed this a bit, but my guess is that pandemic preparedness, especially in the developing world, is still terribly neglected.

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:17:52.645Z · EA · GW

Yep - Lucius Caviola and Stefan Schubert, and also Joshua Greene at Harvard. Lucius and Stefan have a bunch of their videos on YouTube. Also have you considered applying to GPI?

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:13:20.715Z · EA · GW

I would say don't get an MBA unless you are really really sure, as they are mega-expensive and I think marketed very broadly to people who often don't benefit from them

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:12:18.235Z · EA · GW

Hey Jeremy! Joan Gass at CEA, Markus Anderljung at FHI, and I all use skills like the ones you mention above, from our consulting backgrounds, at non-profits. 

I sometimes look at this filter on the 80K job board and one example of a role you might like is this one. I also think that working in government is often a good thing to do, and so maybe there could be some US trade/aid organisations which you might find interesting, and also this talk. If you think that consulting means you can boost the productivity of companies and lead to economic growth overall, then that could be interesting.

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:04:17.708Z · EA · GW

It's great that you've been so persistent! It seems like you're fairly set on politics - what is it that motivates you to work on that, and are there any other routes to do something similar?

Comment by Ben (bdixon) on Careers Questions Open Thread · 2020-12-10T10:02:34.689Z · EA · GW

On a) I think it depends on how well suited you are to the role and on b) Have you tried applying for roles in emerging technologies or security? This could be a cheap test to see if you might like working there, and whether you'd actually need to do further study.

Comment by Ben (bdixon) on AMA: Jason Crawford, The Roots of Progress · 2020-12-07T12:44:42.974Z · EA · GW

R&D is a public good, so we'd expect it to be systematically underfunded by the private sector and provided at least in part by governments. Some economists, such as Mariana Mazzucato, argue that government plays a key role in both funding R&D and applying it for public benefit. Lant Pritchett argues that development comes through interlocking transformations, including the build-up of state capability.   

But in your comments below, and from having read through your blog, it seems like you're not such a fan of government or even alliances between the public and private sectors. 

A root-cause analysis on most human suffering, if it went deep enough, would blame government and cultures that don't foster science, invention, industry, and business. It seems that the most high-leverage long-term plan to reduce human suffering would be to spread global rationality and capitalism. 

Do you think governments have a role to play in improving human progress? And if not, why not?

Comment by Ben (bdixon) on Introducing High Impact Athletes · 2020-11-30T15:45:57.149Z · EA · GW

By the way EA Funds now includes the Founders Pledge climate fund which I think is a bit more straightforward than the animal welfare argument

Comment by Ben (bdixon) on If someone identifies as a longtermist, should they donate to Founders Pledge's top climate charities than to GiveWell's top charities? · 2020-11-26T22:29:25.244Z · EA · GW

In my view yes, for the reasons Ben Todd gives below. I also did some brief back-of-the-envelope calculations using Danny Bressler's mortality cost of carbon here. This is also something Will MacAskill has been talking about a lot more recently, and he discusses the long-term importance of climate change here. As Ben Todd and Max say below, I agree it's possible there's longtermist work, e.g. on GCBRs and maybe AI, that has a higher expected impact. But I think climate change is a fairly straightforward longtermist bet. We've recently added the Founders Pledge climate fund to EA Funds here.  

Comment by Ben (bdixon) on Progress Open Thread: October // Student Summit 2020 · 2020-10-21T13:42:26.007Z · EA · GW

I've been learning to code with Python and I did my first tiny bit of machine learning - I figured out how to do a polynomial regression to look at global average sea surface temperatures!
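For anyone curious, a fit like the one described above takes only a few lines of NumPy. This is a minimal sketch using synthetic anomaly values standing in for the real sea surface temperature series - the data and fitted coefficients are illustrative assumptions, not real measurements:

```python
import numpy as np

# Synthetic yearly sea-surface-temperature anomalies (illustrative only):
# a gentle quadratic trend plus random noise.
years = np.arange(1980, 2021)
rng = np.random.default_rng(0)
anomalies = (0.0002 * (years - 1980) ** 2
             + 0.01 * (years - 1980)
             + rng.normal(0, 0.05, years.size))

# Least-squares fit of a degree-2 polynomial, then evaluate the trend line.
coeffs = np.polyfit(years, anomalies, deg=2)
trend = np.polyval(coeffs, years)

residual = anomalies - trend
print("fitted coefficients:", coeffs)
print(f"RMS residual: {np.sqrt(np.mean(residual ** 2)):.3f}")
```

The same idea extends to higher degrees, though high-degree polynomials overfit noisy climate data quickly, so degree 2 or 3 is usually plenty for a trend sketch like this.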

Comment by Ben (bdixon) on Is there a positive impact company ranking for job searches? · 2020-10-15T15:41:39.695Z · EA · GW

What jobs are you thinking about? You could always post some of your thoughts in a forum post, and people might be keen to offer suggestions.