Should recent events make us more or less concerned about biorisk?

post by Linch · 2020-03-19T00:00:57.476Z · score: 20 (8 votes) · EA · GW · 4 comments

This is a question post.


On the one hand, I think we've seen enough evidence that governments and other institutions are surprisingly inadequate at dealing with even a natural pandemic, one that by all accounts has substantially less concerning properties than a severe bioengineered pandemic would.

On the other hand, a classic reason historically given for being less concerned about biorisk is that we'll see "warning shots" before the real thing (in a way that we're less likely to see with, e.g., AI). In a way, COVID-19 is one such "warning shot." So I expect governments, large non-EA donors, public health people, the current generation of smart young people, etc., to all update towards being much more concerned about pandemics and institutional resilience to them.

On balance, I weakly think current events should lead us to be less concerned about future biorisk. What do you guys think?

Answers

answer by willbradshaw · 2020-03-18T20:58:38.603Z · score: 6 (5 votes) · EA(p) · GW(p)

I think you're probably right that society is likely to respond by increasing our ability to respond to natural pandemics in various ways. There are a lot of great people who are now way more interested in pandemics than they were before.

(Come to think of it, putting some thought now into how to mobilise those forces to avert the next pandemic is probably warranted, since I think there's a pretty good chance all that energy dissipates without much to show for it within a few years of this pandemic ending.)

When it comes to biorisk as a whole, the picture is less clear (though my guess is still probably positive?). There does seem to be some danger that people neglect considerations around engineered pandemics (DURC, info hazards, etc.) in their rush to tackle natural pandemics. I think a lot of work done on the latter is still useful for preventing the former, but they don't always run in the same direction, and since engineered pandemics seem to be the greatest concern from a longtermist perspective, this could be a significant concern.

comment by MichaelA · 2020-03-19T05:57:29.588Z · score: 4 (4 votes) · EA(p) · GW(p)
(Come to think of it, putting some thought now into how to mobilise those forces to avert the next pandemic is probably warranted, since I think there's a pretty good chance all that energy dissipates without much to show for it within a few years of this pandemic ending.)

I agree with this. I generally suspect it's important to give people "things to do" when they're currently riled up/inspired/motivated about something, and that in the absence of things to do they'll just gradually revert to their prior sets of interests and focuses (or those of the people they're around). I suspect it would be very valuable for people to currently think of concrete things that a wide range of people (not just biorisk experts) can productively do in relation to biorisk after this pandemic has been handled, and be ready to spread the word about those things during and right after the pandemic, so we can capitalise on the momentum.

(I have no firm data or expertise to back this view up.)

answer by MichaelA · 2020-03-19T06:24:35.751Z · score: 3 (2 votes) · EA(p) · GW(p)

I think there are sort of four subquestions here:

1. Do these events provide evidence that we should've been more worried all along about pandemics in general (not necessarily from a longtermist/x-risk perspective)?

2. Do these events provide evidence that we should've been more worried all along about existential risk from pandemics?

3. Do these events increase the actual risk from future pandemics in general (not necessarily from a longtermist/x-risk perspective)?

4. Do these events increase the actual existential risk from future pandemics?

With that in mind, here are my wild speculations as to the answers, informed by very little actual expertise.

I'm fairly confident the answer to 3 is no. It seems quite likely to me that these events will at least somewhat decrease the actual risk from future pandemics in general, because of the "warning shot" effect you mention.

I think 4 is a very interesting question. I would guess that there's enough overlap between what's good for preparedness against pandemics in general and what's good for reducing existential risk from pandemics that these events will reduce those risks too, again due to the "warning shot" effect.

I would also guess that we'll see resources being added to the overall pool of pandemic preparedness, rather than resources being taken away from longtermist-style pandemic preparedness in order to fuel more "small scale" (by x-risk standards) or "short term" pandemic preparedness. This is partly informed by my second-hand impression that there are currently not many resources in specifically longtermist-style pandemic preparedness anyway (to the extent that the two categories are even separate).

But I could imagine being wrong about all of that.

I think the answers to 1 and 2 depend on what you previously believed. I think for most people, the answer to both should be "yes": most people seem to have very much dismissed, or mostly just not thought about, risks from pandemics, so a very real example seems likely to remind them that things that don't usually happen really do happen sometimes.

But it seems to me that what we're seeing here is remarkably like what I've been hearing from EAs, longtermists, and biorisk people since I got into EA, from various podcasts and articles and conversations. So for these people, it might not be "new evidence", just something that fits with their existing models (which doesn't mean they expected precisely this to happen at precisely this point).

4 comments


comment by Linch · 2020-03-19T03:22:01.581Z · score: 3 (2 votes) · EA(p) · GW(p)

One reason to believe otherwise is if you think existential GCBRs will look so radically different that any broader biosecurity preparatory work won't be useful.

comment by Rook · 2020-03-20T16:13:48.344Z · score: 3 (2 votes) · EA(p) · GW(p)

This was basically going to be my response -- but to expand on it, in a slightly different direction, I would say that, although maybe we shouldn't be more concerned about biorisk, young EAs who are interested in biorisk should update in favor of pursuing a career in/getting involved with biorisk. My two reasons for this are:

1) There will likely be more opportunities in biorisk (in particular around pandemic preparedness) in the near-future.

2) EAs will still be unusually invested in lower-probability, higher-risk problems than non-EAs (like GCBRs).

(1) means talented EAs will have more access to potentially high-impact career options in this area, and (2) means EAs may have a higher counterfactual impact than non-EAs by getting involved.

comment by willbradshaw · 2020-03-18T09:17:12.840Z · score: 1 (1 votes) · EA(p) · GW(p)

Is this a cunning scheme to ask private questions on the Forum, or is this actually going to go public at some point? :P

comment by Linch · 2020-03-18T18:24:53.481Z · score: 2 (1 votes) · EA(p) · GW(p)

It's going to go public! I want people to review it lightly first, in case this type of question leads to information-hazard territory in the answers.