Hey, let’s spend $400 billion researching “the singularity”

Does the singularity actually have a 1 percent chance of happening?

Should the singularity arrive after all, this Cylon model will be the future home of my consciousness.

Battlestar Galactica

Someday soon, say tech optimists, humans might be able to upload their consciousness to machines. There it can live forever, get backed up in the cloud, replicated across the planet, downloaded into new hardware whenever needed. Boosters call such a moment "the singularity," since it would represent a point beyond which the human race would be forever and unpredictably altered. Critics, on the other hand, just roll their eyes.

But if, by some miracle, humanity does manage to turn itself into and/or build a host of Cylons, that would be a Pretty Big Change, and things that create Pretty Big Changes should be studied. But even at a cost of $150 billion?

That's the argument of Max Tegmark, an MIT physicist, writing for "big questions" site Edge.org. He's not convinced the singularity will arrive, and he's not convinced its arrival would even be a good thing. But he is convinced the singularity would have absolutely stunning consequences for humanity.

On one hand, it could potentially solve most of our problems, even mortality. It could also open up space, the final frontier: unshackled by the limitations of our human bodies, such advanced life could rise up and eventually make much of our observable universe come alive. On the other hand, it could destroy life as we know it and everything we care about...

Objectively, whoever or whatever controls this technology would rapidly become the world's wealthiest and most powerful, outsmarting all financial markets, out-inventing and out-patenting all human researchers, and out-manipulating all human leaders. Even if we humans nominally merge with such machines, we might have no guarantees whatsoever about the ultimate outcome, making it feel less like a merger and more like a hostile corporate takeover.

Subjectively, these machines wouldn't feel like we do. Would they feel anything at all? I believe that consciousness is the way information feels when being processed. I therefore think it's likely that they too would feel self-aware, and should be viewed not as mere lifeless machines but as conscious beings like us—but with a consciousness that subjectively feels quite different from ours.

And, if there's even a tiny chance that the singularity could arrive, he says, we had better get a research program going to think about the best ways to deal with the coming immortality/cyborg apocalypse/colonization of the universe. That research program may be expensive, however. Tegmark has a modest proposal:

[The singularity] could be the best or worst thing ever to happen to life as we know it, so if there's even a one percent chance that there'll be a singularity in our lifetime, I think a reasonable precaution would be to spend at least one percent of our GDP studying the issue and deciding what to do about it. Yet we largely ignore it, and are curiously complacent about life as we know it getting transformed. What we should be worried about is that we're not worried.

Let's assume that "our GDP" here refers solely to the United States. In 2011, US gross domestic product hit approximately $15 trillion; one percent of that money would come to a whopping $150 billion. If the EU did its own singularity research at one percent of its GDP, that would add another $170 billion to the pot. Should China and other states contribute at similar levels, this singularity research project could approach the $400 billion range.
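If you want to check the math, here's a back-of-the-envelope sketch. The US and EU figures are the rough 2011 numbers cited above; China's roughly $7.3 trillion 2011 GDP is my own addition for the sake of the total:

```python
# Back-of-the-envelope: one percent of GDP per contributor, rough 2011 figures.
# China's GDP (~$7.3 trillion in 2011) is an assumption, not from the article.
gdp_trillions = {
    "US": 15.0,
    "EU": 17.0,
    "China": 7.3,
}

share = 0.01  # Tegmark's proposed one percent

for region, gdp in gdp_trillions.items():
    print(f"{region}: ${gdp * share * 1000:.0f} billion")

total = sum(gdp_trillions.values()) * share * 1000
print(f"Total: ~${total:.0f} billion")  # ~$393 billion, i.e. "the $400 billion range"
```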

That's a lot of cash. As to the question of whether a singularity research project would be worth the money, it would seem to depend on the likelihood of the singularity becoming a reality. Say, for the sake of argument, that we accept Tegmark's "one percent chance" threshold as the proper one—does the singularity have a greater than one percent chance of happening this century?
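Tegmark's one percent rule is, at bottom, an expected-value wager. As a toy illustration only, with an entirely invented figure for what's at stake:

```python
# Toy expected-value framing of Tegmark's wager. The stakes figure is
# purely hypothetical; nobody knows what a singularity is worth (or costs).
p_singularity = 0.01    # Tegmark's one-percent-in-our-lifetime threshold
stakes = 100e12         # hypothetical dollar value of getting it right vs. wrong
research_cost = 150e9   # one percent of 2011 US GDP

# Crude decision rule: the research pays for itself in expectation if the
# probability-weighted stakes exceed the cost of the program.
expected_stakes = p_singularity * stakes
print(f"expected stakes ${expected_stakes / 1e9:,.0f}B vs cost ${research_cost / 1e9:,.0f}B")
print("worth studying:", expected_stakes > research_cost)
```

The wager turns entirely on that probability, which is exactly why the "one percent chance" question matters.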

As a perennial skeptic of most ideas that involve "uploading our consciousness" or "superhuman artificial intelligence," I'm more than a little doubtful. Bruce Sterling, who writes sci-fi and authored the nonfiction classic The Hacker Crackdown, is with me.

"It's just not happening," Sterling wrote in his own Edge.org commentary. "All the symptoms are absent. Computer hardware is not accelerating on any exponential runway beyond all hope of control. We're no closer to 'self-aware' machines than we were in the remote 1960s. Modern wireless devices in a modern Cloud are an entirely different cyber-paradigm than imaginary 1990s 'minds on nonbiological substrates' that might allegedly have the 'computational power of a human brain.' A Singularity has no business model, no major power group in our society is interested in provoking one, nobody who matters sees any reason to create one, there's no there there."

On the other hand, if the singularity does arrive despite my skepticism, I've already picked out the machine I'd like to house my future consciousness: the Number Six Cylon.

214 Reader Comments

How many such events with a 1% chance of happening are there? More than 100? This kind of thinking came up in the discussions after Katrina and other low-probability disasters. If we spend billions to anticipate every possible disaster, we may have no money left; a hundred such precautions at one percent of GDP apiece would consume the entire GDP.

The problem with the idea of singularity is that our fear of death is a useful motivator. Many people who have made the greatest impact on our society have done so because they want to make their mark on society before they are gone. If there's no perceived "end date" on our existence, then there is no strong motivator to "get things done before it's too late."

Quote:

The problem with the idea of singularity is that our fear of death is a useful motivator. Many people who have made the greatest impact on our society have done so because they want to make their mark on society before they are gone. If there's no perceived "end date" on our existence, then there is no strong motivator to "get things done before it's too late."

But that is a problem for after the singularity; before the singularity, the fear of death is quite a good motivator to help usher it in. The very definition of the singularity precludes us from making many assumptions about what problems or issues might arise afterwards. Not that I think it is happening anytime soon, but as long as we are not killed off somehow and continue to advance technologically, there do not seem to be any physical laws we would violate by creating minds that are at least somewhat smarter than our own.

The singularity will happen. It's not an if, it's a when. The only thing that could prevent it is our own destruction.

That said, the whole "it's really just a copy" problem leads to many interesting ethical, moral, and philosophical questions, with "is there a soul?" and "what happens to it if you copy your consciousness into a machine?" being the most obvious from a metaphysical standpoint. From there, you also have questions about the morality of only some people having access to the technology (as was mentioned in the article), and the practical matter of what happens if everyone does.


Proponents of the idea of the Singularity seem to have mistaken metaphor/analogy for reality.

When you upload, download or "move" a file, you aren't actually moving a file around. You are creating copies.

Similarly, even if we create machines which are capable of storing and continuing to execute human consciousness, and you "move" or "upload" your consciousness into that machine, it will be a copy of your consciousness. It won't be you. You will still die.
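(To make the analogy concrete: even at the operating-system level, a cross-filesystem "move" really is a copy followed by a delete. A minimal Python sketch of those semantics, file names hypothetical:)

```python
# A "move" across filesystems is copy-then-delete: the original bytes are
# destroyed and a duplicate survives. File names here are hypothetical.
import os
import shutil

def move(src: str, dst: str) -> None:
    shutil.copy2(src, dst)  # create a copy (contents + metadata) at the destination
    os.remove(src)          # ...then destroy the original

# After move("me.bin", "/mnt/cloud/me.bin"), the file at the destination
# has never "been" the original file; it merely has identical contents.
```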

EDIT:

With regards to the likelihood of creating machines that can even simulate human consciousness (let alone execute a copy of one, which is its own challenge), I don't think it's likely until we've cracked quantum computing. I buy the proposal that our brains are quantum computers, and I have begun to think that they cannot be accurately simulated on current hardware.

As a software engineer who dabbles in AI: the problem is in the programming. A faster computer with dumb programming will just make mistakes quicker. Until we can logically construct (or teach) an AI, we can't replicate intelligence in a computer.

The idea becomes much more plausible if you think of the singularity starting when the human brain becomes "augmented" by technology. If one could increase a person's memory capacity by 50%, in a way that makes it as accessible as natural memory, for the price of a new iMac... then we'd really be starting something interesting. Then all those people may eventually find a way to increase other functions of the brain faster, and so on and so forth. It wouldn't have to even be computer technology. Drugs could be used.

That said, there's no precedent for tech like this. We've got external enhancements to our abilities like smartphones, the internet, etc that make a lot of information rapidly accessible but it would be hard to argue that people are actually smarter or that innovation is happening more rapidly as a result. If anything, I'd say human technological progress is approaching an asymptote.

I think that uploading our brain information to a robot body that can process that information much the way our brain does seems very hard, and very far away. But I believe that we will be able to make a backup of our brain in our lifetime, and that cloning human bodies won't be far away, giving us something very similar to the singularity.

Quote:

When you upload, download or "move" a file, you aren't actually moving a file around. You are creating copies.

And the copies are just as good as the original, so people don't care.

Quote:

Similarly, even if we create machines which are capable of storing and continuing to execute human consciousness, and you "move" or "upload" your consciousness into that machine, it will be a copy of your consciousness. It won't be you. You will still die.

Again, people don't care. This topic is extensively discussed in the uploading literature.

DragonTHC wrote:

"Altered Carbon" by Richard K. Morgan. Read it. This is probably my favorite book on the subject.

A great book, but it doesn't have much to do with the Singularity. Now that you mention it, the possibility of minds running at very high speed ratios economically outcompeting meat-humans was not explored that much in the book.

Quote:

The idea becomes much more plausible if you think of the singularity starting when the human brain becomes "augmented" by technology. If one could increase a person's memory capacity by 50%, in a way that makes it as accessible as natural memory, for the price of a new iMac... then we'd really be starting something interesting. Then all those people may eventually find a way to increase other functions of the brain faster, and so on and so forth. It wouldn't have to even be computer technology. Drugs could be used.

That said, there's no precedent for tech like this. We've got external enhancements to our abilities like smartphones, the internet, etc that make a lot of information rapidly accessible but it would be hard to argue that people are actually smarter or that innovation is happening more rapidly as a result. If anything, I'd say human technological progress is approaching an asymptote.

The idea as I understand it has always been: what happens if we build something "smarter" than us? If we can, then presumably the "smarter" thing will build something smarter than it, and it will probably not take as much time as the first "upgrade." Continue this cycle until you've built god. That is the core idea here.
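(That loop is also why the word "singularity" stuck: if each upgrade takes, say, half as long as the last, infinitely many upgrades fit into a finite span. A toy sketch; the starting time and speed-up factor are pure assumptions:)

```python
# Toy model of recursive self-improvement: each generation of machine
# designs its successor faster than it was itself designed. The speed-up
# factor r is a pure assumption; any r < 1 makes the total time converge.
t0 = 10.0   # years to build the first smarter-than-human machine (made up)
r = 0.5     # each upgrade takes half the time of the previous one (made up)

elapsed = 0.0
for generation in range(1, 31):
    elapsed += t0 * r ** (generation - 1)

# The loop approaches the geometric-series limit t0 / (1 - r) = 20 years:
# an unbounded number of "upgrades" packed into a finite span.
print(f"time for 30 generations: {elapsed:.6f} years (limit: {t0 / (1 - r):.1f})")
```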

Quote:

Proponents of the idea of the Singularity seem to have mistaken metaphor/analogy for reality.

When you upload, download or "move" a file, you aren't actually moving a file around. You are creating copies.

Similarly, even if we create machines which are capable of storing and continuing to execute human consciousness, and you "move" or "upload" your consciousness into that machine, it will be a copy of your consciousness. It won't be you. You will still die.

EDIT:

With regards to the likelihood of creating machines that can even simulate human consciousness (let alone execute a copy of one, which is its own challenge), I don't think it's likely until we've cracked quantum computing. I buy the proposal that our brains are quantum computers, and I have begun to think that they cannot be accurately simulated on current hardware.

This is something I don't think many people get. The best illustration of this is an episode of a sci-fi anthology series, The Outer Limits maybe, where aliens gave humanity teleportation technology to let us cross the vast distances of space easily. What the public was never told is that the device isn't true teleportation. It didn't transport anyone anywhere. What it did was make an exact copy of a person at the other end. The original was killed and the copy allowed to go on, oblivious to the fact that they are a copy. We have no idea what consciousness is, so we have no idea if it can even be moved from body to body, let alone replicated.

Quote:

The idea becomes much more plausible if you think of the singularity starting when the human brain becomes "augmented" by technology. If one could increase a person's memory capacity by 50%, in a way that makes it as accessible as natural memory, for the price of a new iMac... then we'd really be starting something interesting. Then all those people may eventually find a way to increase other functions of the brain faster, and so on and so forth. It wouldn't have to even be computer technology. Drugs could be used.

That said, there's no precedent for tech like this. We've got external enhancements to our abilities like smartphones, the internet, etc that make a lot of information rapidly accessible but it would be hard to argue that people are actually smarter or that innovation is happening more rapidly as a result. If anything, I'd say human technological progress is approaching an asymptote.

Quote:

The idea as I understand it has always been: what happens if we build something "smarter" than us? If we can, then presumably the "smarter" thing will build something smarter than it, and it will probably not take as much time as the first "upgrade." Continue this cycle until you've built god. That is the core idea here.

Classifying the Singularity as just consciousness upload is not very accurate. It's about technology advancing fast enough that there's no way we can even conceive of what's about to happen. Uploading consciousness is just one of the interesting things that would probably happen along the way.

There are also good arguments to be made that we're already well into the singularity advancement curve. Could people 100 years ago even comprehend the internet and the changes it has brought? People 50 years ago? Even 25? Technological advancement shows no signs of slowing down or becoming more predictable, so I think disregarding the idea of the Singularity is pretty ignorant.

Why spend $150 billion? Head down to your local bookstore and you'll find several Michael Crichton or Dean Koontz books on this.

(But seriously, Koontz's Frankenstein series is a very interesting read on the ethical implications of the singularity. Not sure about Crichton. If he didn't write one, it's a shame, because it would have been excellent.)

People will always be dicks, so it doesn't matter whether "you" are running on wetware or hardware. This idea that "uploading" is somehow equal to "enlightenment" is just a form of cyber-religion. Yes, we know what the brain looks like, but show 10,000 people a picture of a red apple and you will get 10,000 different neural pattern responses. It's like everyone being born with MySQL pre-installed, but how and when things are stored and linked is going to be different from person to person.

I would LOVE to replace congress with a dozen or so self-aware cyber systems, but even that doesn't "fix" anything because politics and human nature are what they are. *Sigh* Maybe enslavement by robot overlords isn't such a bad thing.

No. Not robots. I want my own body. It's already becoming pretty clear that the aging process is artificial (thank you mother nature and evolution /sarcasm) to prevent us from living forever and breeding out of control and destroying the habitat in the process. There's no reason a biological entity could not live forever, barring a violent death, in theory. We are self repairing. Until we are not.

Quote:

No. Not robots. I want my own body. It's already becoming pretty clear that the aging process is artificial (thank you mother nature and evolution /sarcasm) to prevent us from living forever and breeding out of control and destroying the habitat in the process. There's no reason a biological entity could not live forever, barring a violent death, in theory. We are self repairing. Until we are not.

I doubt robots will be made from metal and silicon in the future. We will most likely move to making organic machines. They are more efficient, adaptable, and durable than the kind of machines we are familiar with today. In all this time we have been trying to learn how to recreate ourselves and improve upon our design. We will likely either merge with our machines or be replaced by them.


Quote:

This is something I don't think many people get. The best illustration of this is an episode of a sci-fi anthology series, The Outer Limits maybe, where aliens gave humanity teleportation technology to let us cross the vast distances of space easily. What the public was never told is that the device isn't true teleportation. It didn't transport anyone anywhere. What it did was make an exact copy of a person at the other end. The original was killed and the copy allowed to go on, oblivious to the fact that they are a copy. We have no idea what consciousness is, so we have no idea if it can even be moved from body to body, let alone replicated.

I remember the first time I saw that episode. It gave me a new outlook on Star Trek that I have never been able to shake. I wonder if that isn't the reason Bones hates transporters: he just can't get over the knowledge that he is being destroyed. Having never really gotten into the hardcore technical understanding of Star Trek, I've also grown to wonder why anyone dies of anything other than old age. Just go through a teleporter once a week, and if something happens to you, have your pattern pulled from memory and rebuilt. It doesn't matter if the red shirt dies; he'll just be reconstructed right back on the ship. "Oh, uh, we cancelled the mission because the scanners picked up some poison gas. Yeah, that's it." Also, is that Earl Grey tea made from energy that was reclaimed from someone's matter just a few weeks ago? There are so many places to go with this.

Also, The Prestige gives a bit of a nod to this idea as well.

[Edit] No spoiler tag as the movie is over 5 years old. Anyone that wants no spoilers on stuff that old (like me for gaming) knows how to avoid and forget.

Quote:

This is something I don't think many people get. The best illustration of this is an episode of a sci-fi anthology series, The Outer Limits maybe, where aliens gave humanity teleportation technology to let us cross the vast distances of space easily. What the public was never told is that the device isn't true teleportation. It didn't transport anyone anywhere. What it did was make an exact copy of a person at the other end. The original was killed and the copy allowed to go on, oblivious to the fact that they are a copy.

This was also the case in a movie from a few years back; the specifics are slightly different, but it's a similar scenario. I can't really say the name of the movie without spoiling it (obviously...), but it was an interesting idea to explore. Each time the person walked into the machine, he didn't know if he was going to be the one that survived or the one that was killed.

[edit] Or, you know, the post right above mine could give it away too.

The whole Singularity thing is something that's caught my attention lately. I used to be much more of a skeptic, but now I do believe it's possible. Certainly a lot of the steps are still hand-waving, but it's not necessary to write an actual "consciousness program." The most believable model to me involves improving our scanning technology to get much more information about how the brain is wired, then cloning that. Alternately, we might just augment our biological brains bit by bit, with extra processing and storage capacity, etc., until we progress into something different, probably somewhat seamlessly.

As for questions of whether you can create consciousness, transfer consciousness, or whether a potential "backup" would just be another copy or really *you*... I don't think we can know for sure until we do it. We might find out some very interesting things about the structure of the universe itself (which, when you go down to the quantum level, or even when you think through all the implications of the space-time world we find ourselves in, is all very counter-intuitive to day-to-day life).

I also don't understand Sterling's statement that nobody wants this. The next steps to the singularity are quantum computers, biotech, AI, robots... and powerful nations and big corporations would kill to have any of that stuff one generation ahead of their competition.

Anyway, I mostly wanted to comment on how crazy the question in the article is, if it's 1% likely in this century. I'd say it's more like 50/50.

Quote:

No. Not robots. I want my own body. It's already becoming pretty clear that the aging process is artificial (thank you mother nature and evolution /sarcasm) to prevent us from living forever and breeding out of control and destroying the habitat in the process. There's no reason a biological entity could not live forever, barring a violent death, in theory. We are self repairing. Until we are not.

While DNA repair is not as efficient as it physically could be (there are mutations in DNA polymerase that seem to increase replication accuracy), there's no basis to assert that perfect repair and replication are even possible. I'd also like to point out that, as far as I'm aware, there is no evidence for genetically mediated self-regulation of human population size, and further there's no overwhelming consensus in the field on the "purpose" of or reason for aging. I've never heard anyone working in the field support the "make room for the young" theory.

Quote:

The problem with the idea of singularity is that our fear of death is a useful motivator. Many people who have made the greatest impact on our society have done so because they want to make their mark on society before they are gone. If there's no perceived "end date" on our existence, then there is no strong motivator to "get things done before it's too late."

That's silly. I don't know anyone in their prime working age worrying about death.

Quote:

I was promised flying cars!!!! Where are my flying cars!!???

We could have flying cars. Do you really want drunk people flying cars around? Hell, do you really want your mom flying cars around?

Quote:

Why spend $150 billion? Head down to your local bookstore and you'll find several Michael Crichton or Dean Koontz books on this.

Don't look at those hacks for sci-fi. If you want singularity fiction check out Charles Stross and Vernor Vinge.

Obviously. I don't think anyone considers them (Koontz in particular) sci-fi authors. But sci-fi is not the only genre that could address the subject. I believe Crichton in particular would have treated the subject well, regardless of what genre you want to stick him in.

Quote:

This is something I don't think many people get. The best illustration of this is an episode of a sci-fi anthology series, The Outer Limits maybe, where aliens gave humanity teleportation technology to let us cross the vast distances of space easily. What the public was never told is that the device isn't true teleportation. It didn't transport anyone anywhere. What it did was make an exact copy of a person at the other end. The original was killed and the copy allowed to go on, oblivious to the fact that they are a copy. We have no idea what consciousness is, so we have no idea if it can even be moved from body to body, let alone replicated.

Yeah, I was thinking about this one day. If they "moved" my brain into a computer, the thing that woke up would believe everything was just fine, but, in reality, I would actually have died. This works out fine for everyone (including my copy) still living.

I then really freaked myself out when I realized that this could basically be happening every night when I go to sleep. I'm pretty sure I'm the same entity I was last night, but I'm just not 100% sure.