Cryonics is the practice of preserving people who have just died, by cooling them in liquid nitrogen soon after their hearts stop. The idea is that most of your brain's information content is still intact right after you've "died". If humans invent molecular nanotechnology or brain emulation techniques, it may be possible to reconstruct the consciousness of cryopreserved patients.

Cryonics-associated issues commonly raised on Less Wrong

Advanced reductionism/physicalism (because of the issues associated with personal identity and the continuity of brain information).

Whether an extended healthy lifespan is worthwhile (relates to Fun Theory, religious rationalizations for 70-year lifespans, "sour grapes" rationalizations for why death is actually a good thing).

The "shut up and multiply" aspect of spending roughly $300/year (Eliezer Yudkowsky quotes his costs as Cryonics Institute membership at $125/year plus term life insurance at $180/year) for a probability (its size widely disputed) of obtaining many more years of lifespan. For this reason, cryonics advocates regard the failure to sign up as an extreme case of failure at rationality: a low-hanging fruit by which millions of deaths per year could be prevented at low cost.
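The expected-value reasoning above can be sketched numerically. A minimal sketch: the annual costs are the figures quoted above, but the years of paying, the revival probability, and the lifespan gained are purely illustrative assumptions, not figures from the source.

```python
# Hedged "shut up and multiply" sketch. Only the two cost figures come
# from the text; every other number is an illustrative assumption.

annual_cost = 125 + 180      # CI membership + term life insurance, $/year (quoted)
years_paying = 40            # assumed years of paying dues before deanimation
p_success = 0.05             # assumed revival probability (widely disputed)
years_gained = 1000          # assumed extra lifespan if revival works

total_cost = annual_cost * years_paying          # lifetime outlay, $
expected_years = p_success * years_gained        # expected extra years of life
cost_per_expected_year = total_cost / expected_years

print(f"Total cost: ${total_cost:,}")
print(f"Expected extra years: {expected_years}")
print(f"Cost per expected year: ${cost_per_expected_year:,.2f}")
```

Under these assumed inputs the cost per expected year of life is a few hundred dollars, which is the sense in which advocates call it low-hanging fruit; the conclusion is only as good as the disputed probability estimate.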

Blog posts

We Agree: Get Froze by Robin Hanson. "My co-blogger Eliezer and I may disagree on AI fooms, but we agree on something quite contrarian and, we think, huge: More likely than not, most folks who die today didn't have to die! ... It seems far more people read this blog daily than have ever signed up for cryonics. While it is hard to justify most medical procedures using standard health economics calculations, such calculations say that at today's prices cryonics seems a good deal even if you think there's only a 5% chance it'll work."

You Only Live Twice by Eliezer Yudkowsky. "My co-blogger Robin and I may disagree on how fast an AI can improve itself, but we agree on an issue that seems much simpler to us than that: At the point where the current legal and medical system gives up on a patient, they aren't really dead."

The Pascal's Wager Fallacy Fallacy - the fallacy of Pascal's Wager combines a high payoff with a privileged hypothesis, one with low prior probability and no particular reason to believe it. Perceptually seeing an instance of "Pascal's Wager" just from the high payoff, even when the probability is not small, is the Pascal's Wager Fallacy Fallacy.

Normal Cryonics - On the shift of perspective that came from attending a gathering of normal-seeming young cryonicists.

That Magical Click - On the unexplained process whereby some people grasp cryonics, or other frequently-derailed chains of thought, in a very short time.

Quantum Mechanics and Personal Identity by Eliezer Yudkowsky. A shortened index into the Quantum Physics Sequence describing only the prerequisite knowledge to understand the statement that "science can rule out a notion of personal identity that depends on your being composed of the same atoms - because modern physics has taken the concept of 'same atom' and thrown it out the window. There are no little billiard balls with individual identities. It's experimentally ruled out." The key post in this sequence is Timeless Identity, in which "Having used physics to completely trash all naive theories of identity, we reassemble a conception of persons and experiences from what is left" but this finale might make little sense without the prior discussion.

A survey of anti-cryonics writing by ciphergoth - an attempt to find quality criticism of cryonics, with a surprising result that "there is not one person who has ever taken the time to read and understand cryonics claims in any detail, still considers it pseudoscience, and has written a paper, article or even a blog post to rebut anything that cryonics advocates actually say".