An existential risk is a risk that threatens irrecoverable damage to humanity. In his foundational paper Existential Risks, Nick Bostrom defines an existential risk as a calamity that “would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.”

Potential existential risks include severe nuclear war, weaponized biotechnology, a runaway greenhouse effect, asteroid impacts, the creation of a superintelligence, and the development of self-replicating Drexlerian nanotechnology.