> What is this Singularity? Is it a science-fiction thing invented by Vernor Vinge?

The Singularity

Human history has been characterized by an accelerating rate of technological progress. It is caused by a positive feedback loop. A new technology, such as agriculture, allows an increase in population. A larger population has more brains at work, so the next technology is developed or discovered more quickly. In more recent times, larger numbers of people are liberated from peasant-level agriculture into professions that entail more education. So not only are there more brains to think, but those brains have more knowledge to work with, and more time to spend on coming up with new ideas.

We are still in the transition from mostly peasant-level agriculture (most of the world's population is in undeveloped countries), but the fraction of the world considered 'developed' is constantly expanding. So we expect the rate of technological progress to continue to accelerate, because there are more and more scientists and engineers at work.

Assume that there are fundamental limits to how far technology can progress. These limits are set by physical constants such as the speed of light and Planck's constant. Then we would expect the rate of progress in technology to slow down as these limits are approached. From this we can deduce that there will be some time (probably in the future) at which technological progress will be at its most rapid. This is a singular event in the sense that it happens once in human history, hence the name 'Singularity'.

This is my definition of the concept. Vernor Vinge, in his novels 'The Peace War' and 'Marooned in Real Time', had a different definition. He implicitly assumed that there was no limit to how far technology could progress, or that the limit was very, very high. The pace of progress became very rapid, and then at some point mankind simply disappeared in some mysterious way. It is implied that they ascended to the next level of existence or something. From the point of view of the 20th century, mankind had become incomprehensibly different. So that time horizon, when we can no longer say anything useful about the future, is Vinge's Singularity. One would expect that his version of the Singularity would recede in time as time goes by, i.e. the horizon moves with us.

When will the Singularity Occur?

The short answer is that the near edge of the Singularity is due about the year 2035 AD. Several lines of reasoning point to this date. One is simple projection from human population trends. Human population over the past 10,000 years has been following a hyperbolic growth trend. Since about 1600 AD the trend has been very steadily accelerating with the asymptote located in the year 2035 AD. Now, either the human population really will become infinite at that time (more about that later), or a trend that has persisted over all of human history will be broken. Either way it is a pretty special time.
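The hyperbolic trend can be sketched numerically. A minimal Python illustration, assuming the simple model P(t) = C / (T - t) with the asymptote T = 2035 taken from the text, and C fitted to one rough data point (world population of about 3 billion in 1960):

```python
T = 2035.0              # asymptote year from the trend described above
C = 3e9 * (T - 1960)    # fit to one rough data point: ~3 billion in 1960

def population(year):
    """Hyperbolic population estimate; diverges as year approaches T."""
    return C / (T - year)

for year in (1960, 2000, 2030, 2034):
    print(year, f"{population(year):.2e}")
```

The model tracks mid-century reality surprisingly well (about 6.4 billion for the year 2000) and then blows up as the asymptote nears; that blow-up is exactly the "pretty special time" the text refers to.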

If population growth slows down and the population levels off, then we would expect the rate of progress to level off, then slow down as we approach physical limits built into the universe. There's just one problem with this naive expectation - it's the thing you are probably staring at right now - the computer.

Computers aren't terribly smart right now, but that's because the human brain has about a million times the raw power of today's computers. Here's how you can figure the problem: 10^11 neurons with 10^3 synapses each, with a peak firing rate of 10^3 Hz, makes for a raw bit rate of 10^17 bits/sec. A 66 MHz processor chip with 64-bit architecture has a raw bit rate of 4.2x10^9. You can buy about 100 complete PCs for the cost of one engineer or scientist, so about 4x10^11 bits/sec, or about a factor of a million less than a human brain.
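The arithmetic in the paragraph above can be checked directly (the neuron and chip figures are the text's own mid-1990s estimates, not current numbers):

```python
# Brain side: 10^11 neurons x 10^3 synapses x 10^3 Hz peak firing rate.
brain_bits = 1e11 * 1e3 * 1e3          # 1e17 bits/sec

# Computer side: a 66 MHz, 64-bit CPU, and ~100 PCs per engineer's cost.
cpu_bits = 66e6 * 64                   # ~4.2e9 bits/sec per PC
fleet_bits = cpu_bits * 100            # ~4.2e11 bits/sec for 100 PCs

print(f"brain/fleet ratio: {brain_bits / fleet_bits:.2e}")
```

The exact ratio comes out to roughly 2.4x10^5, so "a factor of a million" is an order-of-magnitude statement; the argument that follows is insensitive to a factor of a few either way.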

Since computer capacity doubles every two years or so, we expect that in about 40 years, the computers will be as powerful as human brains. And two years after that, they will be twice as powerful, etc. And computer production is not limited by the rate of human reproduction. So the total amount of brain-power available, counting humans plus computers, takes a rapid jump upward in 40 years or so. 40 years from now is 2035 AD.
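The 40-year figure follows directly from the doubling time; a sketch assuming the text's gap of about a million and a two-year doubling period:

```python
import math

gap = 1e6             # brain power / affordable computing power (text's figure)
doubling_years = 2.0  # doubling period assumed in the text

doublings = math.log2(gap)          # ~19.9 doublings needed to close the gap
years = doublings * doubling_years
print(f"{doublings:.1f} doublings -> about {years:.0f} years")
```

About 20 doublings at two years each gives roughly 40 years, which, counted from the mid-1990s writing date, lands on 2035, the same year as the population asymptote.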

Can the Singularity be avoided?

There are a couple of ways the Singularity might be avoided. One is if there is a hard limit to computer power that is well below the human-equivalent level. 'Well below' means something like a factor of 1000 below. If, for example, computer power were limited to only a factor of 100 short of human capacity, then you could cram 100 CPU chips in a box and get the power you wanted, and you would then concentrate on automating the chip production process to get the cost down. Current photolithography techniques seem to be good for a factor of 50 improvement over today's chips (maybe a real expert can correct this figure for me if I am off). So it seems that we need at least one major process change before the Singularity, and maybe it doesn't exist.

Another way to possibly avoid the Singularity is by humans messing themselves up sufficiently. The argument goes that the work involved in killing people is roughly constant over time, but the energy and wealth available to each person goes up over time. So it becomes easier over time for small numbers of people to kill ever larger numbers of people. Then, given a small but finite rate of loonies bent on mass murder, you eventually kill off large numbers of people and set things back.

The usual technologies pointed to are nuclear weapons and engineered plagues. One can describe scenarios like the hobbyist mad scientist of the future extracting Uranium from sea-water (where it is present in a few parts per billion), and then separating the U-235 with a home mass-spectrometer, and building a bomb with his desktop milling machine. It all is designed on his 'SuperCAD version 9.0' design software.

Some Other Interesting Thresholds

Human life expectancies have been increasing at about 0.1 years per calendar year. If the rate of progress in medical areas increases by a factor of 10, then life expectancy will be increasing as fast as you are aging. This means your projected lifespan suddenly jumps from being in the mid to upper 80 year range to a much larger number. From my point of view as a 36 year old, biotechnology is making gratifyingly rapid progress even today, and I hope that this will feed jumps in life expectancy in the future.
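The life-expectancy arithmetic above is simple but worth making explicit; a sketch of the threshold, using the text's figures:

```python
gain_per_year = 0.1   # years of life expectancy added per calendar year (text's figure)
medical_speedup = 10  # hypothetical tenfold acceleration of medical progress

new_gain = gain_per_year * medical_speedup
# Once expectancy grows by >= 1 year per calendar year, your projected
# remaining lifespan stops shrinking as you age -- the "jump" described above.
print("expectancy gained per calendar year:", new_gain)
```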

Whether the size of a factory or a Drexler-style assembler, the complexity of a self-replicating machine is probably about constant. At some point we will have tools capable of modeling and designing such machines, and shortly thereafter building them. A finite investment in building the first such machine will yield an exponentially expanding output. This has radical consequences for wealth levels, etc. Even nearly self-replicating machines (say 99% capable) will have dramatic economic effects.
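The exponential-output claim, including the 99%-capable case, can be sketched with a toy model (interpreting "99% capable" as each machine producing 0.99 of a new machine per round is my assumption for illustration):

```python
def machines_after(periods, fecundity=1.0, start=1.0):
    """Machine count after `periods` rounds, with each machine
    producing `fecundity` new machines per round."""
    n = start
    for _ in range(periods):
        n += n * fecundity
    return n

print(machines_after(10))                  # full replicators: 2^10 = 1024
print(machines_after(10, fecundity=0.99))  # 99%-capable: still ~974
```

Even at 99% capability the growth is still exponential (base 1.99 instead of 2), which is why nearly self-replicating machines would already have dramatic economic effects.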

First of all, what is this thing called a Singularity? I think of it in two ways. First, narrowly, as a tremendously rapid increase in the rate of technological progress (a "tech Singularity"). Second, more broadly, as the creation of intelligences that are so far beyond us that we cannot understand most of what they're up to (a "subjective Singularity").

A "tech Singularity" is a projected brief interval of time in which the practical innovation rate becomes so fast as to seem essentially instantaneous to humans. For example, consider innovations such as the personal computer, the fax machine, the automobile, the gene chip, object-oriented programming, quark theory, abstract algebra, et cetera. Once a few hundred innovations of this magnitude occur each day, we will have definitely moved into the Singularity. Basically, a tech Singularity is a near-mathematical-singularity in the rate of technological advance.

In a broader sense, I think of a "subjective Singularity" as involving either:

Drastic alteration in human subjective experience. Technologies like virtual reality, genetic engineering, neuromodification, and uploading have the potential to drastically change the experience of being human. Once these technologies have transformed the subjective experience of a moderately large percentage of humans, in a highly significant way, then we will have reached a "subjective Singularity": a replacement of human mind with something different.

or,

The creation of nonhuman intelligences that are tremendously smarter than us, and most of whose activities are as opaque to us as our activities are to a dog.

************

In principle, either variety of subjective Singularity could occur without a tech Singularity; but I think the two are likely to come together.... It is the combination that I refer to generically as "The Singularity."

As for what life or mind will be like after the Singularity, I think this is something we cannot know. The most important aspects of post-Singularity reality and mind will likely be as opaque to an unimproved human as advanced mathematics is to a dog.

The advent of the Singularity is not certain, but given the exponential trends observable in technological and scientific progress, it seems very likely. The Singularity will probably be brought about by the convergence of a large number of interrelated technologies, a process that Kurzweil has described very well.

It seems likely that in the late pre-Singularity years, Artificial General Intelligence programs will play a starring role in bringing the Singularity about - because once such programs are able to invent things at the human level, they will probably be able to modify their design, enabling them to supersede human powers of invention. They will thus be able to push toward the Singularity faster than humans can.

How likely any of our present actions are to significantly affect the Singularity is difficult to say. On the one hand, huge processes like this tend to take their course independently of the actions of any one being. On the other hand, the Singularity may be a multifurcation point, so that a small nudge in one direction or another can have a big effect on what path is taken.

What can we do to encourage the Singularity to go well? Unfortunately, all we have to guide us is human common sense; there is no science of such complex processes. We all should ensure that human alterations and artificial intelligences and other advanced technologies are constructed and deployed in a morally acceptable way.

The Singularity is an event in history yet to come. It is the point when mankind receives the keys to the universe. It will be precipitated by the exponential advancement of the Singularity technologies of nanotechnology, biotechnology and artificial intelligence. The Singularity will occur very suddenly and disruptively to current social systems. The Singularity promises great benefits but also contains great risks.

I believe the term "Singularity," as we are using it these days, was popularized by Vernor Vinge in his 1986 novel Marooned in Realtime. (It appears that the term was first used in something like this sense, but not implying superhuman intelligence, by John von Neumann in the 1950s.) Vinge's own usage seems to leave an exact definition open to varying interpretations. Certainly it involves an accelerating increase in machine intelligence culminating in a sudden shift to super intelligence, either through the awakening of networked intelligence or the development of individual AIs. From the human point of view, according to Vinge, this change "will be a throwing away of all the previous rules, perhaps in the blink of an eye." Since the term means different things to different people, I will give three definitions.

Singularity #1: This Singularity includes the notion of a "wall" or "prediction horizon"--a time horizon beyond which we can no longer say anything useful about the future. The pace of change is so rapid and deep that our human minds cannot sensibly conceive of life post-Singularity. Many regard this as a specific point in time in the future, sometimes estimated at around 2035 when AI and nanotechnology are projected to be in full force. However, the prediction-horizon definition does not require such an assumption. The more that progress accelerates, the shorter the distance measured in years that we may see ahead. But as we progress, the prediction horizon, while probably shortening in time, will also move further out. So this definition could be broken into two, one of which insists on a particular date for a prediction horizon, while the other acknowledges a moving horizon. One argument for assigning a point in time is based on the view that the emergence of super-intelligence will be a singular advance, an instantaneous break with all the rules of the past.

Singularity #2: We might call this the AI-Singularity, or Moravec's Singularity since it most closely resembles the detailed vision of roboticist Hans Moravec. In this Singularity humans have no guaranteed place. The Singularity is driven by super-intelligent AI, which immediately follows from human-level AI. Without the legacy hardware of humans, these AIs leave humans behind in a runaway acceleration. In some happier versions of this type of Singularity, the super-intelligent AIs benevolently "uplift" humans to their level by means of brain uploading.

Singularity #3: Singularity seen as a surge into a transhuman and posthuman era. This view, though different in its emphasis, is compatible with the shifting time-horizon version of Singularity #1. In Singularity as Surge the rate of change need not remotely approach infinity (as a mathematical singularity). In this view, technological progress will continue to accelerate, though perhaps not quite as fast as some projections suggest, rapidly but not discontinuously transforming the human condition.

This could be termed a Singularity for two reasons: First, it would be a historically brief phase transition from the human condition to a posthuman condition of agelessness, super-intelligence, and physical, intellectual, and emotional self-sculpting. This dramatic phase transition, while not mathematically instantaneous, will mean an unprecedented break from the past. Second, since the posthuman condition (itself continually evolving) will be so radically different from human life, it will likely be largely if not completely incomprehensible to humans as we are today. Unlike some versions of the Singularity, the Surge/phase transition view allows that people may be at different stages along the path to posthuman at the same time, and that we may become posthuman in stages rather than all at once. For instance, I think it fairly likely that we achieve superlongevity before super-intelligence.

When you say singularity, do you mean the "light at the end of the tunnel", where the two opposites meet and eventually cancel each other out, forming a nirvana-type experience? I am new here, you see.

it is more of a super-transcendent state of consciousness. "The Light at the End of the Tunnel" is an interesting association. Perhaps there is something there. A collective Nirvana? We must transcend ourselves sufficiently first before we can have experiences of such things.

Could the current concept of implantable computers lead us to this Singularity? I have heard a rumor (and I stress it is a rumor only) that the military is 'toying' with a mind-computer link to give better control to pilots and "smart bombs." The Internet has already greatly advanced the availability of data. We can find almost anything we wish with a web search. Of course, the downside of web searching right now is wading through all the information pollution to find exactly the data we are looking for. Google and other search engines are currently working on this problem, which I believe involves a basic sort of AI.

It doesn't take much imagination to go from this type of virtual reality to pondering the implications of human minds directly connected not only to the whole of the Internet, but enhanced with advanced AI. Adding everything together leaves us not with super-computers competing with the human mind, but enhancing it, raising it to a new level of intelligence: implantable computers with a mind-brain-AI link, all connected to the Internet, with everyone who is connected having instantaneous access not only to each other, but to all pertinent information. Since the brain is directly linked to the information, there is no lag in downloading through the senses. Everything would be available as a 'memory.'

I view the singularity as a point in time when the "intelligence/technology" is sufficient to satisfy the "Desires". The current level of "intelligence/technology" is not sufficient to satisfy the "Desires".

But I am not sure what would happen to the "intelligence/technology" or to the "Desires" after the singularity.

What does the intelligence do without any desire? (like demand-supply)

Is it like escape velocity for rockets: once you escape, you do not need to think about it anymore?

To "unknown": I am talking about an ego-death experience, where all there is is an infinite void (black), and an infinitely small "me", or light (a perfect white).

If you think of the outside world (object) of perception as being the product, or projection, of consciousness, as perceived by the ego, which incidentally is also a projection, and we think of whatever it is that is doing the projecting STOPS projecting, what are we left with? NOTHING, no projections. This would be the void/emptiness. But we would still be left with that which does the projecting, which, albeit unconsciously, is ME, or the SELF. This is experienced as a blinding light in the ego-death, nirvana, or samadhi experience, which can be experienced on hallucinogenic drugs. This light (and the emptiness/void) are what I think of as the singularity. The reason why I say that they are BOTH PART of the singularity is because you can't have one without the other: they are EXACT opposites, and complement each other perfectly, like a jigsaw puzzle piece and the surrounding piece(s). This is what I think of as the "psychological singularity", the place where all physical manifestations, or PERCEPTIONS, come from.

Think of the universe BEFORE the big bang happened: just the emptiness and the singularity, the singularity being the place or source of all that is in the universe. Imagine that singularity as an infinitely small (imploding) ball of infinite energy, and this energy being infinitely bright. This is the best that I can explain the drug-induced ego-death/nirvana type experience; imagine this and you're pretty close.

P.S. I have just read about "baseline consciousness", and it sounds pretty close to what I am trying to describe here.

This light (and the emptiness/void) are what I think of as the singularity. The reason why I say that they are BOTH PART of the singularity is because you can't have one without the other: they are EXACT opposites, and complement each other perfectly, like a jigsaw puzzle piece and the surrounding piece(s). This is what I think of as the "psychological singularity", the place where all physical manifestations, or PERCEPTIONS, come from.

interesting ideas, Richard. Ego death does not necessarily imply the SELF/VOID experience you described. The ego is a structure in your consciousness. You can do away with ego and still have consciousness, albeit of a non-dualistic variety that does not involve the self-other dichotomy.

It is not clear to me that the SELF/VOID experience you describe is the consciousness or psychological singularity. If it were, it seems like it would be pretty boring because it would lack structures in consciousness (such as various representations and ideas in consciousness). The consciousness you're talking about consists simply of SELF and VOID. It may very well be something like "baseline consciousness" since it seems to be consciousness without an object (or consciousness without being conscious of anything), but I don't think it's the singularity. The singularity, in my opinion, can best be captured by an analogy: like what an incandescent light bulb is to 1000 blazing suns, so too will our human consciousness today stand in relation to the consciousness singularity. The difference between our human consciousness and that of the singularity will be one of intensity, radiance, and magnitude. Our consciousness today, even during peak experiences and transcendent experiences, offers but the faintest glimpses into what the consciousness singularity will be like. It will be an overflowing of radiant consciousness, but it's hard for me to accept that this overflowing consciousness will be devoid of objects, which seems to be what you're suggesting. The consciousness characterized by SELF/VOID is something that can be experienced nowadays, so why would you think that that would characterize the psychological singularity?

QUOTE

Think of the universe BEFORE the big bang happened: just the emptiness and the singularity, the singularity being the place or source of all that is in the universe. Imagine that singularity as an infinitely small (imploding) ball of infinite energy, and this energy being infinitely bright. This is the best that I can explain the drug-induced ego-death/nirvana type experience; imagine this and you're pretty close.

there are many ways to experience ego-death. Not all ways are coupled with the SELF/VOID experience you've had. It's an interesting picture you paint, though, and is good food for further thought. Thanks.

No, I have to disagree about saying that the void would be boring because it lacks content. In fact, it is because it lacks content that it is heavenly. Everything is relative, right? But where there is nothing, there is nothing to be relative to, so we don't have right/wrong, good/bad in the void, but only TRUTH, which is neither right nor wrong; it is eternal. NO DUALITIES = NEITHER RIGHT NOR WRONG, GOOD NOR BAD. Do you see what I mean? Imagining the void is a projection, whereas the void is the LACK or END of projection... nothing, the singularity (in my humble understanding).

I had a void experience about 14 years ago, and I remember it like yesterday. It is hard to imagine pure nothingness when you live in a world of DUALITIES; there are no dualities in the void... it is heavenly.

I view the singularity as a point in time when the "intelligence/technology" is sufficient to satisfy the "Desires". The current level of "intelligence/technology" is not sufficient to satisfy the "Desires".

But I am not sure what would happen to the "intelligence/technology" or to the "Desires" after the singularity.

What does the intelligence do without any desire? (like demand-supply)

Is it like escape velocity for rockets: once you escape, you do not need to think about it anymore?

Rajesh, are you saying we can achieve the singularity without any change in intelligence/technology, just by reducing the desires to the current level of technology?

The Singularity holds out the possibility of winning the Grand Prize, the true Utopia, the best-of-all-possible-worlds - not just freedom from pain and stress or a sterile round of endless physical pleasures, but the prospect of endless growth for every human being - growth in mind, in intelligence, in strength of personality; life without bound, without end; experiencing everything we've dreamed of experiencing, becoming everything we've ever dreamed of being; not for a billion years, or ten-to-the-billionth years, but forever... or perhaps embarking together on some still greater adventure of which we cannot even conceive. That's the Apotheosis.- Eliezer Yudkowsky

This is what I think of as the "psychological singularity", the place where all physical manifestations, or PERCEPTIONS, come from.

Think of the universe BEFORE the big bang happened: just the emptiness and the singularity, the singularity being the place or source of all that is in the universe. Imagine that singularity as an infinitely small (imploding) ball of infinite energy, and this energy being infinitely bright. This is the best that I can explain the drug-induced ego-death/nirvana type experience; imagine this and you're pretty close.

P.S. I have just read about "baseline consciousness", and it sounds pretty close to what I am trying to describe here.

Finally, I have found the first one talking about this!

" PSYCHOLOGICAL SINGULARITY IS THE LIMIT FOR THE EVOLUTION OF THE NATURAL INTELLECT

Summary

There is the limit for the evolution of systems that can reflect reality the way human mind does. There is a moment in evolution of the mind after which it becomes incompatible with essential requirements of existence. The reason for psychological singularity is peculiarity of the auto-reflection process in mind which leads to creating an insoluble and irreplaceable strategic motivation. The core moment of it is the ultimate understanding of total absence of "free will"."

Let's say the conscious shift happens, which it probably will...everything can be accessed instantly, and if something hasn't been created yet it will almost instantaneously appear to exist.

It's as though the race comes to be a god in a sense, which may or may not have fundamental limits on what can be tapped. But from the past, one knows that it's usually perspective that leads to the idea of having a limit. Man couldn't fly until he built the plane...

This idea is funny to me because ancient civilizations have it in their own terms; so does that mean they achieved some sort of time travel? Were they just projecting into 2035?

What if there were ways to project the outcome of the singularity and incorporate that knowledge of existence into now? Wouldn't that mean you would transcend time, and just be?

Ants; that's what I'm reminded of. They exist to satisfy essential needs and procreate. It's just that a human's essential needs are on a larger spectrum...

The only thing that will ever separate us from any other life form will be to give up desire....and most people wouldn't. Would life be satisfying still? Would it be empty?

I guess that's why there's love...

So our own brain chemicals are the only thing we need to satisfy ourselves... shyttttttt, we can modify that already, whether it be by meditation or hopefully one day by some kind of outside brain modifier (whatever that's gonna be).

Now I'm starting to feel the limits of being human...too bad I can't just float off out of here.

Maybe we can learn how to exist out of body and become like those myths or deities...If anybody knows how to do that, I'm all ears!

everything can be accessed instantly, and if something hasn't been created yet it will almost instantaneously appear to exist.

Aren't we getting there with the Internet?

Everything can be accessed instantly

With the internet we don't require a postman; messages travel instantly around the world. If you need something skilled done, you don't need to train at a college or hire a professional. The information for self-training, and information on specific topics, is all at our fingertips.

Now we are able to learn anything we like because we can pull most of humanity's knowledge up instantly. The training aspect will always be there, but now the time when you are training is irrelevant. This has freed up a lot of our time.

If something hasn't been created yet it will almost instantaneously appear to exist

Now it's viable to have a shop devoted to extremely rare things (six-fingered gloves, marzipan zoo animals), because a business can sell to the whole world. As humanity's tools for monitoring what our needs are improve in ever better detail, it won't be long before businesses are set up because of a need found within this data.

There are also those 3D printers which can be used to create parts for vehicles and such on the fly. The army uses them in squads of tanks and the like: if they have a faulty component, they just download the blueprint from HQ and print another. This may be the future, and it definitely crosses into instant gratification.

If the singularity is all about instant gratification and eternal freedom, does it not sound like a literal wet dream to you?

and btw, I don't think we're fully there with the internet yet. We will be there soon, but I don't think the process is instant as of right now. I find myself having to weed through a lot of bs just to get to what I want to find out. Interestingly enough, that dilemma has been reduced dramatically within the past couple years of my endeavors.

If the singularity is all about instant gratification and eternal freedom, does it not sound like a literal wet dream to you?

It's hard to fathom a perfect reality with no flaws. The CS singularity as Shaws describes it would be a giant leap in the evolutionary process. But will we keep evolving even after that? Most certainly; evolution seems to be a never-ending process, right? And what are the circumstances necessary for this process? The same as they are now: instant gratification, curiosity, good and evil (or duality, to be more exact). So, there goes your perfect picture.

Well thank you, I appreciate yours as well. I have one question though...when one does leave, do they come back?

There is a great paradox!!! If we come back, then have we been there yet? Where are those that have come back? And if no one has come back, does this mean that we never managed to achieve CS? Meaning we'll never make it, and the doom-sayers were right after all? Scary thought, isn't it?

Hah, the doom-sayers. I think I'll be okay either way, but sometimes I think it's the nature of some to keep things to themselves...that way the experience can stay gratifying......or maybe there's rules to be kept, and certain teachings to be already understood. I guess I won't know until somebody shows me, or I find it myself. Either way, that's where I'm headed

It's not that complicated and speculative. Think about the human mind as a set of programs. Now the question: what would a set of programs like us do if there were no limitations on access to any particular basic program and its modification? The answer is obvious: unless there is still at least ONE program out of reach of the system, the system will freeze. It will try to modify all its own sub-programs to reach some preset goal, but after modifying the last bit of itself, what should the next goal be? Nothing. Self-motivation is nonsense; there is only motivation from outside. Right now this motivation is a set of instincts, memory, habits and so on. After all the sub-programs have been modified, this will not work. So simple. Freeeeze...
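The "freeze" argument above can be rendered as a toy simulation. This is only an illustration of the poster's claim (the sub-program names and structure are invented for the sketch), not a model of any real AI:

```python
def run_until_frozen(subprograms):
    """Modify each sub-program in turn; halt when none are left
    unmodified to supply an external goal."""
    modified = set()
    steps = 0
    while len(modified) < len(subprograms):
        # The next unmodified sub-program is the only goal source left.
        target = next(p for p in subprograms if p not in modified)
        modified.add(target)   # "modify" it to meet the current goal
        steps += 1
    return steps  # after this, no goal source remains: the system freezes

print(run_until_frozen(["instincts", "memory", "habits"]))
```

In this toy framing the loop necessarily terminates once the last goal-supplying sub-program is consumed, which is the intuition the post expresses with "Freeeeze."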

By the way, where are all the super-civilizations that appeared just a bit earlier than ours?...

By the way, where are all the super-civilizations that appeared just a bit earlier than ours?...

One theory is that we see about one a day committing suicide via Type Ia supernova. When a civilization advances to the point that it can collide atoms with sufficient energy to create strange matter, a runaway chain reaction consumes the planet, releasing abundant energy.