Hey, let’s spend $400 billion researching “the singularity”

Does the singularity actually have a 1 percent chance of happening?

Should the singularity arrive after all, this Cylon model will be the future home of my consciousness.

Battlestar Galactica

Someday soon, say tech optimists, humans might be able to upload their consciousness to machines. There it can live forever, get backed up in the cloud, replicated across the planet, downloaded into new hardware whenever needed. Boosters call such a moment "the singularity," since it would represent a point beyond which the human race would be forever and unpredictably altered. Critics, on the other hand, just roll their eyes.

But if, by some miracle, humanity does manage to turn itself into and/or build a host of Cylons, that would be a Pretty Big Change, and things that create Pretty Big Changes should be studied. But even at a cost of $150 billion?

That's the argument of Max Tegmark, an MIT physicist, writing for "big questions" site Edge.org. He's not convinced the singularity will arrive, and he's not convinced its arrival would even be a good thing. But he is convinced the singularity would have absolutely stunning consequences for humanity.

On one hand, it could potentially solve most of our problems, even mortality. It could also open up space, the final frontier: unshackled by the limitations of our human bodies, such advanced life could rise up and eventually make much of our observable universe come alive. On the other hand, it could destroy life as we know it and everything we care about...

Objectively, whoever or whatever controls this technology would rapidly become the world's wealthiest and most powerful, outsmarting all financial markets, out-inventing and out-patenting all human researchers, and out-manipulating all human leaders. Even if we humans nominally merge with such machines, we might have no guarantees whatsoever about the ultimate outcome, making it feel less like a merger and more like a hostile corporate takeover.

Subjectively, these machines wouldn't feel like we do. Would they feel anything at all? I believe that consciousness is the way information feels when being processed. I therefore think it's likely that they too would feel self-aware, and should be viewed not as mere lifeless machines but as conscious beings like us—but with a consciousness that subjectively feels quite different from ours.

And, if there's even a tiny chance that the singularity could arrive, he says, we had better get a research program going to think about the best ways to deal with the coming immortality/cyborg apocalypse/colonization of the universe. That research program may be expensive, however. Tegmark has a modest proposal:

[The singularity] could be the best or worst thing ever to happen to life as we know it, so if there's even a one percent chance that there'll be a singularity in our lifetime, I think a reasonable precaution would be to spend at least one percent of our GDP studying the issue and deciding what to do about it. Yet we largely ignore it, and are curiously complacent about life as we know it getting transformed. What we should be worried about is that we're not worried.

Let's assume that "our GDP" here refers solely to the United States. In 2011, US gross domestic product hit approximately $15 trillion; one percent of that money would come to a whopping $150 billion. If the EU did its own singularity research at one percent of its GDP, that would add another $170 billion to the pot. Should China and other states contribute at similar levels, this singularity research project could approach the $400 billion range.
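The figures above are simple back-of-the-envelope math. A quick sketch (the 2011 GDP values are approximate, and the "China and others" figure is my own assumption chosen to illustrate how the total approaches $400 billion):

```python
# Back-of-envelope check on the article's figures. GDP values are
# rough 2011 numbers in trillions of USD; "China and others" is an
# assumed remainder, not a figure from the article.
gdp_trillions = {"US": 15.0, "EU": 17.0, "China and others": 8.0}
share = 0.01  # Tegmark's "one percent of GDP"

contributions = {k: v * 1e12 * share for k, v in gdp_trillions.items()}
total = sum(contributions.values())

print(f"US share: ${contributions['US'] / 1e9:.0f} billion")
print(f"EU share: ${contributions['EU'] / 1e9:.0f} billion")
print(f"Total:    ${total / 1e9:.0f} billion")
```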

That's a lot of cash. As to the question of whether a singularity research project would be worth the money, it would seem to depend on the likelihood of the singularity becoming a reality. Say, for the sake of argument, that we accept Tegmark's "one percent chance" threshold as the proper one—does the singularity have a greater than one percent chance of happening this century?

As a perennial skeptic of most ideas that involve "uploading our consciousness" or "superhuman artificial intelligence," I'm more than a little doubtful. Bruce Sterling, the sci-fi writer and author of the nonfiction classic The Hacker Crackdown, is with me.

"It's just not happening," Sterling wrote in his own Edge.org commentary. "All the symptoms are absent. Computer hardware is not accelerating on any exponential runway beyond all hope of control. We're no closer to 'self-aware' machines than we were in the remote 1960s. Modern wireless devices in a modern Cloud are an entirely different cyber-paradigm than imaginary 1990s 'minds on nonbiological substrates' that might allegedly have the 'computational power of a human brain.' A Singularity has no business model, no major power group in our society is interested in provoking one, nobody who matters sees any reason to create one, there's no there there."

On the other hand, if the singularity does arrive despite my skepticism, I've already picked out the machine I'd like to house my future consciousness: the model six Cylon.

How many such events with a 1% chance of happening are there? More than 100? This kind of thinking came up in the discussions after Katrina or any other low-probability disaster. If we spend billions to anticipate every possible disaster, we may have no money left.

The problem with the idea of singularity is that our fear of death is a useful motivator. Many people who have made the greatest impact on our society have done so because they want to make their mark on society before they are gone. If there's no perceived "end date" on our existence, then there is no strong motivator to "get things done before it's too late."

But that is a problem for after the singularity; before the singularity, the fear of death is quite a good motivator to help usher it in. The very definition of the singularity precludes us from making many assumptions about what problems or issues might arise afterwards. Not that I think it is happening anytime soon, but as long as we are not killed off somehow and continue to advance technologically, there do not seem to be any physical laws we would violate by being able to create minds that are at least somewhat smarter than our own.

The singularity will happen. It's not an if, it's a when. The only thing that could prevent it is our own destruction.

That said, the whole "it's really just a copy" issue leads to many interesting ethical, moral, and philosophical questions. The whole "is there a soul?" and "what happens to it if you copy your consciousness into a machine?" questions are the most obvious from a metaphysical standpoint. From there, you also have questions about the morality of only some people having access to the technology (as was mentioned in the article), and the practical matters if everyone did.

Plugging my friend's book The Erased. It also deals with this subject.

From a pragmatic point of view, we're going to need devices we can control via thought impulse before we even get close to a singularity moment. And while we're starting to get there, we're still a long ways away. Still, if it is possible, it is inevitable.

I'm just curious why we're even listening to a physicist on this. He waxes philosophical very nicely, but it all sounds like he'd be better off pursuing a career in science fiction if he really wants to do something on this topic.

Quote:

I think a reasonable precaution would be to spend at least one percent of our GDP studying the issue and deciding what to do about it.

Sounds like a figure he literally pulled out of his ass with no concept of what it entails or what it would actually go toward. Not to mention no idea of what we already spend on the topic in various other ways. Studying what issue? There isn't anything to study right now. It may as well be left to the realm of science fiction, because we're so far from the type of AI able to even rudimentarily simulate consciousness that we barely have any idea of how to pursue it.

And, as anyone who has been involved with machine learning should immediately know, there is a huge gap between teaching/writing a program to do something, and a program actively learning and in turn growing outside of its originally envisioned scope. We are still a number of fundamental breakthroughs away from anything even remotely resembling true machine consciousness.

Not only are we not there, but I question whether it will ever be practical, particularly on a wide scale. Which is, after all, why we still don't have flying cars en masse. We do at least have the technology to make those, or something very similar. But, alas, the flying car revolution remains an ever-distant dream.

Why spend $150 billion? Head down to your local bookstore and you'll find several Michael Crichton or Dean Koontz books on this.

Don't look at those hacks for sci-fi. If you want singularity fiction check out Charles Stross and Vernor Vinge.

Sometimes I think I'm the only one in tech who is aware of Vinge. It's difficult to suppress the eyeroll every time someone tells me about the concepts that Neal Stephenson invented. Vinge and Gibson did virtually all of it first, and much better. And with actual believable characters. Even in the case of Gibson, who isn't known for a particularly realistic touch with humans, his work is superior to Stephenson's. Snow Crash and The Diamond Age are exceedingly painful reads if one has any experience around actual humans.

The problem with the idea of singularity is that our fear of death is a useful motivator. Many people who have made the greatest impact on our society have done so because they want to make their mark on society before they are gone. If there's no perceived "end date" on our existence, then there is no strong motivator to "get things done before it's too late."

That's silly. I don't know anyone in their prime working age worrying about death.

You're kidding, right? We worry about it all the time. We have a set cycle to our lives. We can afford to screw around and be immature for a while, but eventually we settle down, think about marriage, think about kids, think about providing for them, and maybe think about working a bit harder. This is because we are acutely aware of when we will most likely exit stage left.

The fact that we will eventually die sets the timing for the rest of our lives.

We could have flying cars. Do you really want drunk people flying cars around? Hell, do you really want your mom flying cars around?

That was, in part, my point. The rhetorical question was meant as a commentary on how our vision frequently exceeds our grasp, especially when we happily wear blinders to the deeper realities of the situation. That, and we always overestimate where we will be in the future at any given point in time (of course, we also often end up with breakthroughs no one expected, which bring their own organic changes in direction as a consequence).

I would upload myself into the singularity right now if I could. I'm not even kidding.

And all you would really be doing in all likelihood is making a copy of yourself. The original you would still die.

It doesn't matter. In fact, it would probably be better if the "original" was destroyed in the upload process. You go to sleep and wake up in a new body/machine. The fact that the "original" is now fertilizer is irrelevant.

While I'm not sure it is possible to make this happen... the research would be worth the effort. The computing advances would be enormous, and the studies of the brain that would need to be done in order to make the 'transfer' would benefit all of humanity.

Just like the advances made in WWII that allowed us to land on the moon... the same potential advances would be possible with this much money flowing into the scientific world.

I would upload myself into the singularity right now if I could. I'm not even kidding.

And all you would really be doing in all likelihood is making a copy of yourself. The original you would still die.

It doesn't matter. In fact, it would probably be better if the "original" was destroyed in the upload process. You go to sleep and wake up in a new body/machine. The fact that the "original" is now fertilizer is irrelevant.

But it wouldn't be YOU. You are the original and your existence ends. Your copy/twin/whatever would go on but YOU are dead.

That's the argument of Max Tegmark, an MIT physicist, writing for "big questions" site Edge.org...

On one hand, it could potentially solve most of our problems, even mortality.

I fail to see why mortality is a problem. Out-with-the-old, in-with-the-new is a crucial part of our success as a species.

Quote:

It could also open up space, the final frontier: unshackled by the limitations of our human bodies, such advanced life could rise up and eventually make much of our observable universe come alive.

Uh-huh. As a species, we are also "unshackled by the limitations of our human bodies" because we can... dun, dun, DUN!... reproduce. If we had the technology to travel the stars, and really wanted to, we could do so without turning into cyborgs.

Quote:

Objectively, whoever or whatever controls this technology would rapidly become the world's wealthiest and most powerful, outsmarting all financial markets, out-inventing and out-patenting all human researchers, and out-manipulating all human leaders.

When you upload, download or "move" a file, you aren't actually moving a file around. You are creating copies.

And the copies are just as good as the original, so people don't care.

Quote:

Similarly, even if we create machines which are capable of storing and continuing to execute human consciousness, and you "move" or "upload" your consciousness into that machine, it will be a copy of your consciousness. It won't be you. You will still die.

Again, people don't care. This topic is extensively discussed in the uploading literature.

And why don't people care? They should care. Most people, if you explained this to them, would lose all interest in being uploaded, I think. Or at least most of their interest. If *you* still die, then what's so great about a copy of you living on in a computer? How is this any better than having kids?

I guess potentially it's less messy and maybe less expensive, but it seems even more narcissistic than having kids, and without the benefit that having kids offers: society maybe getting a version of you that's better than yourself.

When you upload, download or "move" a file, you aren't actually moving a file around. You are creating copies.

And the copies are just as good as the original, so people don't care.

Quote:

Similarly, even if we create machines which are capable of storing and continuing to execute human consciousness, and you "move" or "upload" your consciousness into that machine, it will be a copy of your consciousness. It won't be you. You will still die.

Again, people don't care. This topic is extensively discussed in the uploading literature.

And why don't people care? They should care. Most people, if you explained this to them, would lose all interest in being uploaded, I think. Or at least most of their interest. If *you* still die, then what's so great about a copy of you living on in a computer? How is this any better than having kids?

I guess potentially it's less messy and maybe less expensive, but it seems even more narcissistic than having kids, and without the benefit that having kids offers: society maybe getting a version of you that's better than yourself.

People don't care because it's not here and they aren't forced to think of this logically. If you were to ask the person offering you this path whether it is actually you being moved to a new body, or just a copy while the real you stays in your current body and dies, and they said "We really don't know," would you go through with it? How many people would? I for one wouldn't.

The singularity will happen. It's not an if, it's a when. The only thing that could prevent it is our own destruction.

That said, the whole "it's really just a copy" issue leads to many interesting ethical, moral, and philosophical questions. The whole "is there a soul?" and "what happens to it if you copy your consciousness into a machine?" questions are the most obvious from a metaphysical standpoint. From there, you also have questions about the morality of only some people having access to the technology (as was mentioned in the article), and the practical matters if everyone did.

Supposing we do reach the stage where we can duplicate our consciousness, should we not also consider the fact that maybe we don't have to kill the original? Could we not accept that maybe a consciousness has the right to exist multiple times?

People don't care because it's not here and they aren't forced to think of this logically. If you were to ask the person offering you this path whether it is actually you being moved to a new body, or just a copy while the real you stays in your current body and dies, and they said "We really don't know," would you go through with it? How many people would? I for one wouldn't.

But it's obvious that we do know the answer now. That was the point of my original post. There's no need for speculation. It would be a copy. People who believe otherwise are falling for the metaphor of "moving a file" or "uploading a file". You don't actually move anything; you make another copy in a remote location. Even if you delete the local copy, the file on the other machine is still a copy.

It would be the exact same for any kind of consciousness executing machine. There's no way for it to be otherwise. You cannot move your mind out of your brain any more than you can physically move a file off of one hard drive onto another. All you can do is copy.

EDIT:

If you want to be pedantic, in theory you COULD move the molecules making up the file on one hard drive onto another, but this would be very tedious, and if you extrapolate it to the brain, you'd have to move neurons into the computer, which has all kinds of weird implications. At that point you're talking cyborg, not a computer executing consciousness.
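The file-move metaphor in this comment can be made literal with a minimal Python sketch (note the caveat: on a single filesystem `shutil.move` is an atomic rename, but across devices it really is copy-then-delete, which is the case the comment is arguing about):

```python
# "Moving" a file, made literal: across devices, a move is a copy at
# the destination followed by deletion of the source. Nothing
# physically travels; the surviving bytes are a new copy.
import os
import shutil
import tempfile

src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()

src = os.path.join(src_dir, "mind.txt")
with open(src, "w") as f:
    f.write("memories")

dst = shutil.move(src, os.path.join(dst_dir, "mind.txt"))

print(os.path.exists(src))  # the "original" path no longer exists
print(open(dst).read())     # the content survives, as a copy
```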

I would upload myself into the singularity right now if I could. I'm not even kidding.

And all you would really be doing in all likelihood is making a copy of yourself. The original you would still die.

It doesn't matter. In fact, it would probably be better if the "original" was destroyed in the upload process. You go to sleep and wake up in a new body/machine. The fact that the "original" is now fertilizer is irrelevant.

But it wouldn't be YOU. You are the original and your existence ends. Your copy/twin/whatever would go on but YOU are dead.

You all seem to be inferring something that should not be inferred. 'Singularity' only speaks to the advent of "super-intelligence", not the mechanism by which it is attained. It might be attained through GMO techniques, cyber-neural melding (read any Iain Banks novel), or through something resembling "personality transference" as you suggest (see Old Man's War by John Scalzi). It's even possible (likely?) that we could reach the singularity by pure accident (a seriously disruptive technology).

One of the problems with all of this is nobody's bothered to define exactly what 'super-intelligence' even looks like. What would a super-intelligent being be capable of that a mere human would not? Play chess better? That would qualify, it seems to me, although weakly.

Clarke's law seems to apply here (except in reverse): if the technology seems magical, it must be sufficiently advanced. In my mind, grandmaster-level chess play is already magical...

A tangentially related book series that sort of deals with this topic, but more in the manner of mortal humans vs the entities representing a kind of singularity: the Hyperion Cantos series by Dan Simmons.

People don't care because it's not here and they aren't forced to think of this logically. If you were to ask the person offering you this path whether it is actually you being moved to a new body, or just a copy while the real you stays in your current body and dies, and they said "We really don't know," would you go through with it? How many people would? I for one wouldn't.

But it's obvious that we do know the answer now. That was the point of my original post. There's no need for speculation. It would be a copy. People who believe otherwise are falling for the metaphor of "moving a file" or "uploading a file". You don't actually move anything; you make another copy in a remote location. Even if you delete the local copy, the file on the other machine is still a copy.

It would be the exact same for any kind of consciousness executing machine. There's no way for it to be otherwise. You cannot move your mind out of your brain any more than you can physically move a file off of one hard drive onto another. All you can do is copy.

And my point is people don't care now because it's not a real option for them at present. They are pretending it's not just a copy because they REALLY want to believe they can cheat death and live forever.

The singularity will happen. It's not an if, it's a when. The only thing that could prevent it is our own destruction.

What do you base this on? We are already running into some hard barriers in physics that have no apparent resolution. Given that, why is it 'inevitable' that a singularity event will occur? It's entirely possible, even probable, that one or more of the required steps flat out cannot occur under the laws of physics.

To be clear, we're assuming that uploading would be like going to sleep and waking up; you'd have all the same personality and memories. Your kids don't have your memories.

Just google for mind uploading continuity; there's already been a ton of debate on this topic.

I know what you're talking about and my first post made the point that it's not going to be like going to sleep and waking up for *you*. Anyone who believes that is indulging in pure fantasy. Logic dictates that it will be a copy of you. And for the copy it will be like waking up, because the copy will have all of your memories. But you'll still be in your decaying body. So from your perspective, again, how is it better than having kids?

Kids aren't born with your memories, but you can pass them on. And you can minimize passing on your own flaws as much as possible. Whereas your copy is going to have your flaws unless you engage in selective editing, at which point it's no longer a copy.

Also, if we're just talking about any form of super intelligence and not people attaining immortality by moving their consciousness into a computer; I think clearly the super intelligence has already arrived and it is me.

If *you* still die, then what's so great about a copy of you living on in a computer? How is this any better than having kids?

To be clear, we're assuming that uploading would be like going to sleep and waking up; you'd have all the same personality and memories. Your kids don't have your memories.

Just google for mind uploading continuity; there's already been a ton of debate on this topic.

No, it just sounds like you want to ignore the fact that all that is done is copying information from one container into another. The original stays where it is. What is being taken out of your physical body? Nothing. What is being duplicated? Information.

1). A technological society could eventually achieve the capability of creating a computer simulation that is indistinguishable from reality to the inhabitants of the simulation.

2). Such a society would not do this once or twice. They would create many such simulations.

3). Left to run long enough the societies within the simulations would eventually be able to create their own simulations, also indistinguishable from reality to the sub-simulations inhabitants.

As a result, you have billions of simulations, with a nearly infinite number of cascading sub-simulations, all of them perfectly real to their inhabitants. Yet there is only a single ultimate progenitor society. The math is actually pretty simple: the odds are nearly infinity to one that we are all living in a computer simulation.
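The counting argument above can be sketched in a few lines (the uniform-observer assumption, that you are equally likely to be any inhabitant across base reality and its simulations, is my gloss, not something the comment states):

```python
# A sketch of the simulation argument's arithmetic: with one base
# reality and n equally populated simulations, a randomly chosen
# observer sits in base reality with probability 1 / (n + 1).
def p_base_reality(n_simulations: int) -> float:
    """Probability that a random observer is in the one base reality."""
    return 1 / (n_simulations + 1)

for n in (0, 100, 10**9):
    print(n, p_base_reality(n))
```

As n grows (especially with cascading sub-simulations), the probability of being in the original reality approaches zero, which is the "infinity to one" odds the comment describes.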

The singularity will happen. It's not an if, it's a when. The only thing that could prevent it is our own destruction.

What do you base this on? We are already running into some hard barriers in physics that have no apparent resolution. Given that, why is it 'inevitable' that a singularity event will occur? It's entirely possible, even probable, that one or more of the required steps flat out cannot occur under the laws of physics.

Because there are no hard barriers to us, and there are numerous ways to create the singularity. For one, physics barriers have been worked around by using different methods. Furthermore, mind-machine interfaces, the simplest step toward the singularity, are already on their way. Slowly, but on their way.

Understanding how the human brain works and how to extract things like knowledge is far more a limiting factor than hardware.

I will definitely grant I have no clue what form the singularity will take, but we're actively working towards it every day. Intentionally, or no.

[EDIT] DARPA has been investing millions into true AI and into designing computer processors that work more like neurons. It's a goal actively being worked toward.

Why spend so much? The whole idea of the singularity is that it only becomes possible because the price/performance ratio keeps getting better and better. So, I'd wait until the price comes down a few orders of magnitude.

I would upload myself into the singularity right now if I could. I'm not even kidding.

And all you would really be doing in all likelihood is making a copy of yourself. The original you would still die.

It doesn't matter. In fact, it would probably be better if the "original" was destroyed in the upload process. You go to sleep and wake up in a new body/machine. The fact that the "original" is now fertilizer is irrelevant.

But it wouldn't be YOU. You are the original and your existence ends. Your copy/twin/whatever would go on but YOU are dead.

You all seem to be inferring something that should not be inferred. 'Singularity' only speaks to the advent of "super-intelligence", not the mechanism by which it is attained. It might be attained through GMO techniques, cyber-neural melding (read any Iain Banks novel), or through something resembling "personality transference" as you suggest (see Old Man's War by John Scalzi). It's even possible (likely?) that we could reach the singularity by pure accident (a seriously disruptive technology).

One of the problems with all of this is nobody's bothered to define exactly what 'super-intelligence' even looks like. What would a super-intelligent being be capable of that a mere human would not? Play chess better? That would qualify, it seems to me, although weakly.

Clarke's law seems to apply here (except in reverse): if the technology seems magical, it must be sufficiently advanced. In my mind, grandmaster-level chess play is already magical...

We aren't talking about the Singularity, since we aren't talking about machine intelligence created by man which surpasses our own. People seem to think making a copy of human intelligence is what the singularity is about; it's not. It's about Skynet. It's about Colossus from The Forbin Project. It's about humanity creating sentient, intelligent machines.