Monday, August 27, 2012

Radical Moral Ideas: X-Risks

How much do you care about future people?

In my previous post I argued that the sheer ease of helping people through certain charities suggests we should give them most of our money. Surely, though, some consideration should also be given to the fate of future people. You might think their lives deserve somewhat less moral weight than those of people alive today; to me, preventing an entire lifetime's worth of experiences seems as bad as killing someone now living. Either way, a future with many happy people seems better than one with fewer, sadder ones.

If you give non-trivial moral weight to the lives of future persons, however, you may find you have to take into account a new set of concerns: X-risks, or existential risks. Many species have gone extinct in the past. When photosynthesising cyanobacteria first filled the atmosphere with oxygen, the pre-existing anaerobic bacteria had to adapt or die. Notoriously, the dinosaurs were (probably) finished off by a deadly asteroid impact, with the mammals' small size likely helping them survive. Unfortunately, mammals are no longer tiny, and the risk of a catastrophic asteroid impact has not gone away.

If humans were wiped out by an asteroid, the several billion people currently alive on Earth would die. But if we value the lives of future persons, that is not the end of the catastrophe: all of humanity's potential descendants would also be prevented from living. That number could run into the trillions. Indeed, if you think humanity would otherwise survive for billions of years (perhaps colonising other planets), the number of beings whose existence would be cut off by human extinction is unfathomably large.
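To make the scale concrete, here is a rough back-of-envelope calculation. Every number in it (future population, remaining years, lifespan) is an illustrative assumption of mine, not a prediction:

```python
# Rough, illustrative estimate of how many future lives an extinction event
# would prevent. All inputs are assumptions chosen only to show the scale.

avg_population = 10e9        # assumed average future population
avg_lifespan_years = 80      # assumed average lifespan

# Modest scenario: humanity lasts another million years on Earth.
modest_years = 1e6
modest_future_lives = avg_population * modest_years / avg_lifespan_years
print(f"Modest scenario: ~{modest_future_lives:.1e} future lives")      # ~1.2e14

# Optimistic scenario: a billion years (e.g. spreading beyond Earth).
optimistic_years = 1e9
optimistic_future_lives = avg_population * optimistic_years / avg_lifespan_years
print(f"Optimistic scenario: ~{optimistic_future_lives:.1e} future lives")  # ~1.2e17
```

Even the modest scenario gives over a hundred trillion future lives; the optimistic one gives a thousand times more, and both dwarf the billions alive today.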

Any other moral consideration surely pales by comparison. Ensuring that the human race is not wiped out in the foreseeable future becomes a moral imperative, even if you can only contribute to this cause by the tiniest of amounts. A number of factors that might exterminate the human race come to mind. A genocidal lunatic might, for example, manufacture a weaponised virus that is exceptionally infectious and deadly. At the extreme tail of predictions about global warming are temperatures that might render the Earth uninhabitable to humans. A particularly interesting possibility, which I hope to discuss in a future post, is artificial intelligence.
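The "tiniest of amounts" point can be put in expected-value terms. The sketch below is hypothetical arithmetic, not data: it simply multiplies an assumed (tiny) reduction in extinction probability by an assumed number of future lives at stake.

```python
# Expected value of a tiny reduction in extinction risk.
# Both inputs are assumptions for illustration only.

future_lives_at_stake = 1e14   # roughly the "modest scenario" estimate above
risk_reduction = 1e-9          # assume an action shaves one-in-a-billion
                               # off the probability of extinction

expected_lives_saved = future_lives_at_stake * risk_reduction
print(f"Expected future lives saved: ~{expected_lives_saved:,.0f}")  # ~100,000
```

On these made-up numbers, even a one-in-a-billion nudge to the odds is worth around a hundred thousand lives in expectation, which is the force of the argument.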

In short, when the future is at stake, nothing else matters. Read Nick Bostrom's essay for more.
