Entries tagged with transhumanism

Here's a link to a fascinating article in the New Scientist. The entire article is somewhat interesting, but most of it is about the current frontiers of human knowledge and what we can learn and the few things that relativity and similar external limitations may prevent us from learning. However, the truly fascinating part of the article is this bit from the beginning:

YOU might not expect the UK's Astronomer Royal to make too many pronouncements about what chimpanzees think, but that is one of Martin Rees's favourite topics. He reckons we can learn a lesson from what they understand about the world - or, rather, what they don't. "A chimpanzee can't understand quantum mechanics," Rees points out.

That might sound like a statement of the obvious. After all, as Richard Feynman famously said, nobody understands quantum mechanics. The point, though, is that chimps don't even know what they don't understand. "It's not that a chimpanzee is struggling to understand quantum mechanics," Rees says. "It's not even aware of it." The question that intrigues Rees is whether there are facets of the universe to which we humans are similarly oblivious. "There is no reason to believe that our brains are matched to understanding every level of reality," he says.

We live in an age in which science enjoys remarkable success. We have mapped out a grand scheme of how the physical universe works on scales from quarks to galactic clusters, and of the living world from the molecular machinery of cells to the biosphere. There are gaps, of course, but many of them are narrowing. The scientific endeavour has proved remarkably fruitful, especially when you consider that our brains evolved for survival on the African savannah, not to ponder life, the universe and everything. So, having come this far, is there any stopping us?

The answer has to be yes: there are limits to science. There are some things we can never know for sure because of the fundamental constraints of the physical world. Then there are the problems that we will probably never solve because of the way our brains work. And there may be equivalents to Rees's observation about chimps and quantum mechanics - concepts that will forever lie beyond our ken.

This statement lies at the heart of some of my beliefs about humanity and the universe. As I mentioned before, I often describe my worldview as somewhat Lovecraftian, in ways that are to me not remotely nihilistic or even pessimistic. I do not see humanity as a microcosm of the universe, or as special or wondrous in any way except the ways in which all life, and all intelligent life, is inherently special and wondrous.

To me, we are limited beings, as are all beings. Like chimps, dogs, brine shrimp, and blue whales, there are things we are capable of understanding, and then there is so much of the complex universe that we cannot understand, and may never even know that we cannot, because we can't even conceive of it. Of course, one of the wonders of this age is that we are on the edge of technologies that may be able to change these limits and allow us to understand more than we otherwise could. I believe that doing this would make someone something other than human, but I also embrace the idea of this change and very much look forward to seeing it happen, and preferably to experiencing it myself.

Alternately, we may never achieve this ourselves, but may create beings who can. I suspect that improving ourselves will prove easier than creating new and greater digital intelligences. However, the answer is at this point impossible to know, as is what the result of either might be like – it's not as if chimps can understand how we think, either.

When I think about such topics, two quotes come to mind. The first is by David Zindell, from his novel The Broken God: "'What is a human being, then?' 'A seed.' 'A... seed?' 'An acorn that is unafraid to destroy itself in growing into a tree.'" The second is my own: "I have no interest in worshiping gods; what I want is for more people to play god." That "playing god" is thought by some to be something bad or wrong both baffles and saddens me.

Charles Stross wrote an interesting short essay on why the Fermi paradox may be completely illusory. I agree with his analysis, in large part because it makes absolutely no sense to me that humans are the only intelligent technological species in the galaxy. Like Stross, I find that the argument that, if other older civilizations existed, at least one would already have colonized the galaxy reveals vastly more about the beliefs and prejudices of the person holding it than any actual truths.

In any case, Stross's essay also has several fascinating links. Astronomer Milan M. Cirkovic's article Against the Empire is an interesting discussion of why highly advanced expansionist civilizations may well be vanishingly rare or non-existent. Since only highly advanced civilizations have any hope of accomplishing interstellar travel, the fact that the galaxy has not already been colonized by one or more of them likely means absolutely nothing about the existence of such civilizations.

Calling either of these works science is stretching that term well beyond all usefulness; they are a mixture of thought experiments and statements of belief. However, what impresses me about them, and especially about the second piece, is that it so closely sums up my own beliefs about intelligence, complexity, and the inner workings of the universe. Something remarkably close to the developmental singularity hypothesis is one of the core tenets of my highly idiosyncratic personal spirituality.

Seeing what amounts to a description of the core of my spiritual beliefs detailed and analyzed in this fashion was truly fascinating, especially when the various implications of the idea are explored. It will likely eventually be possible to test the developmental singularity hypothesis, and (unsurprisingly) I believe it will be proven correct. However, for now it is merely a statement of belief, but it is one that I fervently share.

Essentially, I consider humanity (like all life-forms) to be a deeply flawed species. This is an obvious truism known to theologians, philosophers, biologists, and social scientists alike. To roughly paraphrase something byzantine_ruins wrote on Usenet many years ago, we are insufficiently smart apes that live a pathetically short time.

On a personal front, I have absolutely no interest in ever dying. I believe in souls, reincarnation, and suchlike, but it is clear to me that retaining even a fraction of one's memories after death is exceptionally rare, and I firmly believe that my personality without my memories is in no useful way me, so I believe that death is a final ending for me. As such, I have no desire for it to ever occur. Also, the world is full of many wonders and joys [1], and it seems to me that I would require a minimum of several thousand years to experience them (at which point I would likely be very different than I am now, and I am almost certain that the me I will be then will find many other interests to pursue). Also, while all test scores indicate that I am smarter (or at least better at taking standardized tests) than 99.9% of people in the US, I am continually frustrated by the fact that I forget facts I've read and minute details of my life. I would strongly prefer to be able to do complex math in my head and to never forget anything unless I wanted to. Given various advances being worked on now (like an artificial hippocampus), all this may soon be possible, in addition to genetic therapy to introduce genes for improved memory.

However, my fascination with transhumanism is more than simply a desire to be smarter and live forever. I think this is clearly the best path for our species. Far too many people do not sufficiently consider the consequences of their actions, and short-term thinking is disturbingly widespread, even among the heads of governments and transnational corporations. Increased intelligence and increased lifespan would almost certainly improve both problems. Our world is changing increasingly rapidly, and without some sort of horrid disaster, these changes aren't going to slow down anytime soon. I do not believe in paradise or utopia, and in fact find the entire concept stifling, since nothing is ever perfect and everything can be improved and tinkered with in some useful fashion, but the world can clearly be far better than it is now, and transhumanism seems the obvious path.

What puzzles me is that even with something that is as obvious a good as increased longevity, there are people who disapprove of the concept, including, to an extent, the author of the Popular Science article:

If people don’t exit the stage for 5,000 years or so, there’s not much room for babies, not unless you want to contemplate a population bomb of massive proportions. Human life would become something like a union closed shop or a Senate subcommittee, where seniority rules and newcomers aren’t welcome. De Grey has thought hard about this, and his answer is unflinching: "We have a long tradition of prioritizing the rights of people who are alive over [those of] the unconceived."

Personally, I cringe at the cultural stasis of a world without a steady infusion of young people. (I believe John Archer, the bioremediation expert, put it best when he said, "If we’re still listening to Britney Spears in 5,000 years, we really will be buggered.") I go on for a bit about the importance of finitude, how we are meaningfully defined by the things we don’t get a chance to do—plumping for a more wistful, tragically tinged view of life, one that would be right at home in a college literature essay. “Death is the mother of beauty,” the poet Wallace Stevens wrote, and so on. (Later I discover that my rap on choices and the human condition has been advanced more skillfully by Leon Kass, the University of Chicago professor turned head of President George W. Bush’s Council on Bioethics.)

Such thinking both baffles and appalls me. As I have said before, I see absolutely no innate virtue in any sort of "natural" methods or ways of life. If we all lived in a more natural environment, most of us older than 35 would already be dead, and all of us would have gone hungry more than once.

In short, I am a transhumanist because I firmly believe that if we can change or improve something in a way that would aid both ourselves individually and our species as a whole, then we should do so.

Even if there turn out to be absolute limits to human intellectual improvement, we might at least be able to create hyper-intelligent A.I.s (current research on quantum computers seems like one possible way to create such beings), and I'd far rather be governed by a hopefully benign A.I. hundreds of times smarter and more knowledgeable than I am, especially when the alternative is a human ruler who is very likely dumber and less knowledgeable than me, and who is no more likely to be benign.

[1] The idea, found in almost all older religions from Buddhism to Christianity, that the world is either innately evil or a place innately filled with suffering seems exceedingly outdated and no longer useful or applicable (at least to most residents of the First World). For people who are not in constant danger of starving or being murdered, the world obviously (to me at least) seems to be a rich and wonderful place that no one should have any desire to leave.