The next stage in human technological evolution is a single thinking machine/web/computer that is planetary in dimensions. This planetary computer will be the largest, most complex, and most dependable machine we have ever built. It will also be the platform on which most business and culture will run. The web is the initial OS of this new global machine, and all the many gadgets we possess are windows into its core. Future gizmos will be future gateways into the same One Machine. Designing products and services for this new machine requires a unique mind-set.

What are the dimensions of this global Machine?

Today it contains approximately 1.2 billion personal computers, 2.7 billion cell phones, 1.3 billion land phones, 27 million data servers, and 80 million wireless PDAs. The processor chips of all these parts are feeding the computation of the internet/web/telecommunications system. So how many transistors are powering the Machine?

An Intel Pentium processor circa 2004 has 100 million transistors in it, while an Itanium processor inside a server has had over 1 billion transistors since 2005. More current models have more transistors, of course, but these older models are closer to an average count.

One thing to note is that there are just as many processing chips in the Machine (one billion, from the one billion online PCs) as there are transistors in an Itanium chip. The Machine is a supercomputer in which each “transistor” is itself a computer. A very rough estimate of the computing power of this Machine, then, is that it contains a billion times a billion, or one quintillion (10^18), transistors. Since only the newest servers have a billion transistors, the figure is probably an order of magnitude smaller. When we add the transistors in cell phones and handhelds, it works out to about 170 quadrillion (10^17) transistors wired into the Machine.
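The back-of-envelope arithmetic above can be sketched in a few lines of Python. The per-device transistor counts are round-number assumptions chosen to match the text's reasoning, not measured figures:

```python
# Rough estimate of transistors in the Machine, using the device counts
# from the text; per-device transistor counts are assumptions.
pcs = 1.2e9            # personal computers
cell_phones = 2.7e9    # cell phones
servers = 27e6         # data servers

transistors_per_pc = 1e8        # Pentium-class chip, ~100 million
transistors_per_phone = 1e7     # modest embedded chip (assumed)
transistors_per_server = 1e9    # Itanium-class chip, ~1 billion

total = (pcs * transistors_per_pc
         + cell_phones * transistors_per_phone
         + servers * transistors_per_server)
print(f"{total:.2e}")  # about 1.7e17, i.e. ~170 quadrillion transistors
```

The PCs dominate the total; even generous assumptions about phones and servers only nudge the exponent, which is why the estimate lands near 10^17 rather than the naive 10^18.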

There are about 100 billion neurons in the human brain. Today the Machine has roughly six orders of magnitude more transistors than you have neurons in your head. And the Machine, unlike your brain, is doubling in power every couple of years at minimum.

In 2003 alone, a total of one quintillion transistors were produced, but not all of them were wired into the Machine. Many transistors made their way into cameras, TVs, GPS units, and the like, few of which are currently online. One day they will be: every chip will eventually connect to the web in some fashion. At that point we would be adding as many transistors to the Machine each year as exist in it right now.

If the Machine has 100 quadrillion transistors, how fast is it running? If we include spam, 196 billion emails are sent every day. That’s 2.2 million per second, or 2 megahertz. Every year 1 trillion text messages are sent, which works out to 31,000 per second, or 31 kilohertz. Each day 14 billion instant messages are sent, at 162 kilohertz. Searches run at 14 kilohertz. Links are clicked at a rate of 520,000 per second, or 0.5 megahertz.
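The "clock rates" above are simple unit conversions from the traffic figures in the text, as a quick sketch shows:

```python
# Convert the text's daily and yearly traffic counts into rough rates in hertz.
seconds_per_day = 24 * 60 * 60          # 86,400
seconds_per_year = 365 * seconds_per_day

emails_per_sec = 196e9 / seconds_per_day   # ~2.27 million/s, i.e. ~2 MHz
texts_per_sec = 1e12 / seconds_per_year    # ~31,700/s, i.e. ~31 kHz
ims_per_sec = 14e9 / seconds_per_day       # ~162,000/s, i.e. ~162 kHz
```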

There are 20 billion visible, searchable web pages and another 900 billion dark, unsearchable, or deep web pages (for instance, pages behind passwords, or the kind of dynamic page Amazon produces when you query it). The average number of links found on each searchable web page is 62. Assuming the same count for dynamic pages, that means there are roughly 55 trillion links in the full web. We could think of each link as a synapse: a potential connection waiting to be made. There are roughly between 100 billion and 100 trillion synapses in the human brain, which puts the Machine in the same neighborhood as our brains.
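Multiplying out the page and link counts confirms the order of magnitude, though the raw product comes out nearer 57 trillion than the text's rounded 55:

```python
# Estimate total links ("synapses") in the full web from the text's figures.
visible_pages = 20e9       # searchable pages
deep_pages = 900e9         # dark / deep web pages
links_per_page = 62        # measured for searchable pages; assumed for deep pages

total_links = (visible_pages + deep_pages) * links_per_page
print(f"{total_links:.3e}")  # 5.704e+13, i.e. about 57 trillion links
```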

There were more than 5 exabytes (10^18) of information stored in the world in 2003, but most of this was kept offline on paper, film, CDs, and DVDs. Since then online storage has mushroomed. Today the Machine’s memory totals some 246 exabytes of information (246 billion gigabytes!). This storage is expected to grow to 600 exabytes by 2010.

But not all the information flowing through the Machine is stored. An increasing amount is generated and pushed through the net with nothing more than temporary copies. One study estimates that the information stream in 2007 was 255 exabytes while storage was only 246 exabytes, and that this gap between generation and storage will widen by 20% per year. We might think of this total amount of information as “movage,” or even think of the 9-exabyte difference as RAM. The total movage estimated for the Machine in 2010 is one zettabyte (10^21 bytes).
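The RAM analogy falls out of a one-line subtraction on the study's 2007 figures:

```python
# Information flow vs. storage in 2007 (exabytes, figures from the text).
stream_2007 = 255    # exabytes generated or moved during the year
stored_2007 = 246    # exabytes retained in storage
ram_like_gap = stream_2007 - stored_2007
print(ram_like_gap)  # 9 exabytes of "RAM": data that exists only in transit
```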

To keep things going the Machine uses approximately 800 billion kilowatt hours per year, or 5% of global electricity.

One of the problems we have in discussing this Machine is that its dimensions so far exceed the ordinary units we are accustomed to that we don’t have a way to reckon its scale. For instance, the total international bandwidth of the global machine is approximately 7 terabytes per second. We used to talk about one Library-of-Congress worth of information (10 terabytes), but that volume seems absolutely puny now; in ten years, terabytes will fit on your iPod. Keeping that metric for the moment, one Library-of-Congress worth of information is zipped around the Machine every second. These are very deep cycles of processing. What will we use to measure traffic in another 15 years?

We could start by saying the Machine currently has 1 HB (Human Brain) equivalent. That measure might hold up for a decade or so, but after it gets to 100 HB, or 10,000 HB, it begins to feel like using inches to measure galactic space.

While personal computers are increasing in power roughly at Moore’s rate, doubling every few years, the Machine can advance in power even faster because its total power is some exponential multiple of all the computers comprising it. Not only are its “transistors” doubling in power, but the number of them is doubling, and the connections between them are increasing exponentially. Computer chip manufacturers talk about making chips in 3D, rather than the conventional flat 2D of today, in order to gain another dimension in which to expand the number of transistors. The Machine offers more than this: it can expand in all its many dimensions at once, so its power may be rising at a rate that exceeds the rates of its components.

Somewhere between 2020 and 2040 the Machine should exceed 6 billion HB. That is, it will exceed the processing power of humanity.

Kevin, I wonder if the Machine could eventually grow and evolve without control from the human brain. Is it becoming a true Artificial Intelligence (AI)?

Breaker

Great article,

Although I think what you haven’t mentioned is that the complexity of connections in the human brain, at the neuronal (and ultimately quantum) level, far outweighs that of the ‘world brain’ at the present time.

I guess my question would be, does complexity define sentience/intelligence more so than capacity alone?

Stefan

I so much see this possible in association with *cloud computing* …

Nathan Hangen

I can’t always keep up, but your blog up to and including this article, is fascinating. Can’t wait to buy the book!

Andrew

Hanii Puppy seems to be using “american” billions in saying there are approximately 8,000,000,000,000 people on earth, which I would read as a trillion, not 1 billion.

Kevin, what a great article! I was nearly put off at first because of the use of the term “One Machine.” I have no claim to it in terms of trademark or copyright, but it’s the name that I have chosen to use to describe a theoretical machine of my “creation.”

It doesn’t exist in complete form today, but I am almost certain that it will in the near future — and probably without being connected to me in any way whatsoever.

I’d be very interested to get your thoughts on my version of the machine. In particular, what might the two “One Machines” (now there’s irony for you) do if coupled together? An all-knowing brain attached to an all-powerful tool of production-slash-destruction. Almost scary, almost exciting, to me anyway.

I know you’re probably a very busy man, but any input regarding my machine would be much appreciated. My article can be found at the URL provided.

Thanks!

JMK

Dr. Goulu

I noticed a glitch in the sentence “Since only the newest servers have a billion processors” … you probably meant transistors.

I also think a neuron can hardly be compared to a transistor. The brain’s transistors are more likely to be the synapses, and they are about 10^13 in a brain. However, they work much slower than transistors. Some studies suggest 1 brain has a global processing power of about 10 Petaflops, which is about the total power of the 500 most powerful supercomputers today.

Firstly, “The Machine”, as you put it, cannot operate as one single entity. It can share information between different parts and can even have clumps of it working together, but at no time in the future is the entire internet going to be able to work as one.

Secondly, “a billion times a billion, or one quintillion” – That’s 1,000,000,000,000 * 1,000,000,000,000, which would be 1,000,000,000,000,000,000,000,000. 1,000,000,000,000 is a billion, 1,000,000,000,000,000,000 is a trillion, and 1,000,000,000,000,000,000,000,000 – a billion squared – is a Quadrillion, not a Quintillion. Also, 1 Quadrillion = 10 ^ 24, and 1 Quintillion = 10 ^ 30.

Thirdly, I’m doubting the validity of your numbers. For instance, 170 Quadrillion computers (by 2040) would mean 170,000,000,000,000,000,000,000,000 ÷ 56 (the time that GUI computers will have been available by 2040) = (approx) 3,035,714,286,000,000,000,000. That would mean approximately 3,035,714,286,000,000,000,000 computers bought per year, or approx 8,311,332,746,651,000,000. Since there’s approximately 8,000,000,000,000 people on earth (approx 8 billion), that would be assuming that every single person on earth would buy approximately 1,038,916,593 (approximately 1 milliard) computers -every single day-, without ever throwing one out or deactivating it.

Fourthly, many chip manufacturers already produce chips in “3D”, either by layering “2D” chips over the top of each other or by making genuinely “3D” ones in which the best use of space is maintained by making the design follow a 3-dimensional grid, encased in plastic. (This is, incidentally, one of the main reasons why it’s practically impossible to manually solder modern chips.)

You’re leaving out the most important part of the evolving global superbrain: the billions of human brains connecting with each personal computer, workstation, cell phone,and PDA.

When the superbrain achieves transcendental mind (if it hasn’t already), we won’t be capable of comprehending it any more than one of our neurons is able to comprehend the brain of which it is a part.

suncat

This Machine, and what it is evolving into, IS us!

Jai

I saw your TED talk on this – it’s just a fantastic insight. I just note: you call the internet, in its current state, the equivalent of one human brain, and say the web will continue to expand at this rate. But a brain has an imagination, while the internet alone does not, so we as people are the imagination of the brain. The growth of running computers would be irrelevant unless there were people using them or they were running software, so would the growth of the One be limited to the growth of the overall human population?

richardb

There were more than 5 exabytes (10^18) of information stored in the world in 2003, but most of this was kept offline on paper, film, CDs, and DVDs.

I don’t understand how exactly one can measure the existing amount of data in the world on paper, film, CDs and DVDs.

For example – a Hollywood movie exists as film in a can, so it can only be measured once converted to a digital format (let’s say as close to original quality as it can be). But then there’s all the other unquantified information about that film… not to mention the encoding/decoding algorithms, which change and improve all the time.

fatcat1111

You state, “There are about 100 billion neurons in the human brain. Today the Machine has as 5 orders more transistors than you have neurons in your head.” I note the careful wording, but it still seems as though neurons and transistors are being conflated here. In reality a neuron can do far more than a transistor, and is much better connected.

Interesting but flawed. Yes, there are numerous connections in the internet, but the subsystems are limited in scope and redundant in function; whereas the subsystems of the brain are highly specialized and are part of an overall architecture. In memory retrieval there may be similarities and the internet almost certainly exceeds the brain, but in cognitive function the brain far exceeds the internet. I’d look to Watson more than the internet for any hint of cognition.