Develop and deploy a system to protect Earth from impacts from large asteroids, etc.
comment by Ben_Harack · score: 14 (7 votes)
While I'm sympathetic to this view (since I held it for much of my life), I have also learned that there are very significant risks to developing this capacity naively.
To my knowledge, one of the first people to talk publicly about this was Carl Sagan, who discussed it in his television show Cosmos (1980) and in these publications:
Harris, A., Canavan, G., Sagan, C. and Ostro, S., 1994. The Deflection Dilemma: Use Vs. Misuse of Technologies for Avoiding Interplanetary Collision Hazards.
- Their central point is that a system built to defend humanity from natural asteroid impacts would actually expose us to more risk (of anthropogenic origin) than it would mitigate (of natural origin).
- Opportunities for misuse of the system depend almost solely on its capability to produce delta-V changes in asteroids (equivalently framed as “response time”). A system capable of ~1 m/s of delta-V would see roughly 100 opportunities for misuse for every opportunity to defend Earth from an asteroid.
- They say that a high capability system (capable of deflection with only a few days notice) would be imprudent to build at this time.
Sagan, C. and Ostro, S.J., 1994. Dangers of asteroid deflection. Nature, 368(6471), p.501.
Sagan, C., 1992. Between enemies. Bulletin of the Atomic Scientists, 48(4), p.24.
Sagan, C. and Ostro, S.J., 1994. Long-range consequences of interplanetary collisions. Issues in Science and Technology, 10(4), pp.67-72.
Two interesting quotes from the last one:
- “There is no other way known in which a small number of nuclear weapons can destroy global civilization.”
- “No matter what reassurances are given, the acquisition of such a package of technologies by any nation is bound to raise serious anxieties worldwide.”
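The scale of the misuse concern can be made intuitive with a back-of-the-envelope calculation. This is my own toy sketch, not a model from Harris et al. (1994): a small delta-V applied years in advance shifts an asteroid's arrival position by very roughly delta-V times lead time, so a "defensive" capability can also retarget many asteroids that would otherwise miss Earth by wide margins.

```python
EARTH_RADIUS_M = 6.371e6      # mean Earth radius, metres
SECONDS_PER_YEAR = 3.156e7

def displacement_m(delta_v_mps: float, lead_time_years: float) -> float:
    """Crude along-track displacement estimate: delta-V * lead time.

    Real orbital mechanics can amplify this further (resonant returns,
    along-track drift), so treat this as a rough lower bound.
    """
    return delta_v_mps * lead_time_years * SECONDS_PER_YEAR

# A 1 m/s nudge applied a decade in advance:
shift = displacement_m(1.0, 10.0)
print(f"Shift: {shift:.2e} m = {shift / EARTH_RADIUS_M:.0f} Earth radii")
# → Shift: 3.16e+08 m = 50 Earth radii
```

A shift of tens of Earth radii means such a system could, in principle, redirect asteroids that pose no natural threat at all, which is the asymmetry the "misuse vs. use" ratio is pointing at.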
More recently, my collaborator Kyle Laskowski and I have reviewed the relevant technologies (and likely incentives) and have come to a somewhat similar position, which I would summarize as follows: the advent of asteroid-manipulation technologies exposes humanity to catastrophic risk, and if left ungoverned, these technologies would open the door to existential risk. If governed, this risk can be reduced to essentially zero. However, other approaches, such as differential technological development and differential engineering projects, do not seem capable of entirely closing off this risk; governance seems to be crucial.
So we presented a poster at EAG 2019 SF, Governing the Emerging Risk Posed by Asteroid Manipulation Technologies, where we summarized these ideas. We're currently expanding it into a paper. If anyone is keenly interested in this topic, reach out to us (contact info is on the poster).
comment by MichaelA · score: 1 (1 vote)
You may already be aware of this, and/or the window of relevance may have passed, but just thought I'd mention that Toby Ord discusses a similar matter in The Precipice. He seems to come to roughly similar conclusions to you and to Sagan et al., assuming I'm interpreting everyone correctly.
E.g. he writes:
There is active debate about whether more should be done to develop deflection methods ahead of time. A key problem is that methods for deflecting asteroids away from Earth also make it possible to deflect asteroids towards Earth. This could occur by accident (e.g. while capturing asteroids for mining), or intentionally (e.g. in a war, or in a deliberate attempt to end civilization). Such a self-inflicted asteroid impact is extremely unlikely, yet may still be the bigger risk.
This seems like an interesting and important point, and an example of how important it can be to consider issues like downside risks and the unilateralist’s curse, perhaps especially in the area of existential risk reduction. And apparently this applies even to what we might see as one of the rare "obviously" good options!
Something I find slightly odd, and that might conflict with your or Sagan et al.'s views, is that Ord also wrote:
One reason [such a self-inflicted asteroid impact] is unlikely is that several of the deflection methods (such as nuclear explosions) are powerful enough to knock the asteroid off course, but not refined enough to target a particular country with it. For this reason, these might be the best methods to pursue.
I don't really know anything about this area, but it seems strange to hear that the option involving nuclear explosions is the safer one. And I wonder whether the increased stockpile of explosives, the development of tech for delivering them to asteroids, etc., could increase risks independently of asteroid deflection, such as if they could be repurposed for directly harming countries on Earth. Or perhaps it could reduce the safety benefits we'd get from having colonies on other moons, planets, asteroids, etc.?
Again, though, this is a field I know almost nothing about. And I assume Ord considered these points. Also, obviously there are many nuclear weapons and delivery mechanisms already.