In case anyone was wondering, the 5.6 billion mile stack is about 6.6 million Library of Congress Distance Units right now (exact amount is subject to uncertainty in the measurement and drift of the Bookshelf Inflationary Constant, consult your local Librarian of Congress for more details).

"By the year 2100, old had become so scarce that it was worth more than an ounce of silver, creating an energy drought. Citizens could barely afford to turn-on a 10 watt lightbulb..... forget the high expense of a computer and internet network."

Surely there must be some science-based fiction that deals with this negative future?

LEDs use a lot more power. They would not be used during an energy drought. Probably e-ink would be used (like the Kindle), although it would cost $50 per battery charge (ouch). Maybe society would revert to paper, since it requires no energy to use a book.

Maybe society would revert to paper, since it requires no energy to use a book.

LOL good luck reading paper books without gas for the chainsaws, diesel for the cranes and trucks to the mill, hundreds of megawatts of electricity for the paper mills, diesel for the trucks to the printers, oil for the printing press ink, propane for the pallet forklifts, diesel for the trucks to the store, gasoline to drive to and from the store...

It would be easier and probably more ecologically sound to stick to wireless Kindles.

Also, I find it unlikely a Kindle charge would cost $50 at present value...
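For what it's worth, here's a quick sketch of that charge-cost math. The battery size (~3 Wh for an e-ink reader) and the electricity price (~$0.12/kWh) are my own rough assumptions, not numbers from the thread:

```python
# Rough cost of one e-reader charge at today's rates, and how expensive
# electricity would have to get before a single charge cost $50.
# Both input figures below are assumptions, not numbers from the thread.
battery_wh      = 3.0                    # assumed e-ink reader battery capacity
price_per_kwh   = 0.12                   # assumed present-day electricity price, USD
cost_per_charge = battery_wh / 1000 * price_per_kwh
price_for_50    = 50 / (battery_wh / 1000)
print(f"~${cost_per_charge:.5f} per charge today")                      # fractions of a cent
print(f"electricity would need to cost ~${price_for_50:,.0f}/kWh for a $50 charge")
```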

"By the year 2100, old had become so scarce that it was worth more than an ounce of silver, creating an energy drought. Citizens could barely afford to turn-on a 10 watt lightbulb..... forget the high expense of a computer and internet network."

Surely there must be some science-based fiction that deals with this negative future?

There's a seniors home near my place that's full of old. Come and get it.

I agree that it's silly to think that electricity will disappear forever. However, when you sit down and think seriously about the problem involved in, say, powering your home through alternative energy, it dawns on you what a huge amount of power we consume with our basic household appliances. I just have to look at that 125 amp breaker: with 240 V, that is 30 kW that can be sucked through it. Of course I never get close, even with the water heater, clothes dryer, fridge, stove and computer running at the same time.
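As a quick sanity check on those numbers (breaker rating and voltage from the comment above, the 10 W bulb from the quoted story):

```python
# Maximum draw through a 125 A main breaker at 240 V, and how many of the
# story's 10 W bulbs that would power.
amps, volts = 125, 240
max_watts   = amps * volts        # 30,000 W = 30 kW
bulbs_10w   = max_watts // 10     # 3,000 bulbs
print(max_watts, bulbs_10w)
```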

"I have travelled the length and breadth of this country, and have talked with the best people in business administration. I can assure you on the highest authority that data processing is a fad and won't last out the year." — Editor in charge of business books at Prentice-Hall publishers, responding to Karl V. Karlstrom (a junior editor who had recommended a manuscript on the new science of data processing), c. 1957

It's been hardly more than fifty years. Where will we be in another fifty years, say b

I have travelled the length and breadth of this internet, and have talked with the worst people in web 2.0. I can assure you on the highest authority that people's hunger for unoriginal content is a fad and won't last out the year.

Most people cannot imagine the distance to Neptune, so that is a bad visual. Here is a better one:

9.57 ZB is approximately 10^22 bytes. A typical laptop HDD can hold a terabyte, so you would need 10^10 of them, or about 10 billion. A laptop HDD is about 3 cubic inches. A standard shipping container (40x8x8 ft^3) would hold about 1.5 million if they were packed tightly. So you would need about 6,800 containers. That would be a train about 75 miles long.
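Here's a quick script re-running that chain of estimates. The only number I've added is an assumed ~58 ft of train length per container slot (roughly one well car), which is what gets you from ~6,800 containers to a ~75 mile train:

```python
# Re-deriving the HDD / shipping container / train comparison.
total_bytes    = 1e22                     # 9.57 ZB rounded up, as in the comment
hdd_bytes      = 1e12                     # 1 TB per laptop drive
hdd_vol_in3    = 3                        # ~3 cubic inches per 2.5" drive
container_in3  = 40 * 8 * 8 * 1728        # 40x8x8 ft container, 1728 in^3 per ft^3

drives         = total_bytes / hdd_bytes          # ~10 billion drives
per_container  = container_in3 / hdd_vol_in3      # ~1.5 million drives, packed solid
containers     = drives / per_container           # ~6,800 containers

feet_per_slot  = 58                               # assumption: one well car per container
train_miles    = containers * feet_per_slot / 5280
print(f"{drives:.0f} drives, {containers:.0f} containers, ~{train_miles:.0f} mile train")
```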

If each byte in 9.57 ZB were a water molecule, they would add up to only a few drops of water.
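And checking the water version with Avogadro's number and the molar mass of water (standard physical constants, not figures from the article):

```python
# Volume of 9.57e21 water molecules: one molecule per byte.
molecules   = 9.57e21
avogadro    = 6.022e23
molar_g     = 18.02                             # grams per mole of water
grams       = molecules / avogadro * molar_g    # ~0.29 g
millilitres = grams                             # water is ~1 g/mL
teaspoons   = millilitres / 4.93                # US teaspoon is ~4.93 mL
print(f"~{millilitres:.2f} mL, about {teaspoons:.2f} of a teaspoon")
```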

> Most people cannot imagine the distance to Neptune, so that is a bad visual.
> ...
> If each byte in 9.57ZB were a water molecule, they would add up to only a few drops of water.

Most people can't visualize the size of a water molecule either. ;-)

Good HDD analogy, though. I agree that the original "stack of books" one is dumb. Though I wouldn't even break it down to hard drives. Just say "it would take X many containers full of laptops to hold all that data."

The vast majority of this data isn't stored. The vast majority of it is streaming porn and Netflix. Why did we pay some "scientist" for 3 years (read the summary, it says "three years ago") to calculate this, so we can all be amused by it on /. for 10 minutes? Part of the reason nobody's working in science anymore is that most of our government- and university-backed science is fluff like this to get your soundbite, rather than stuff that makes a difference in our world. Figure out how to GET to Neptune, not how to stack virtual books that high with 30-second free trials of every porn site in Russia.

FTFA "Most of this information is incredibly transient: it is created, used and discarded in a few seconds without ever being seen by a person," said Bohn, a professor of technology management at UC San Diego.

XML overhead, HTTP headers, page reloads instead of AJAX/DOM updates. And much of it is identical, just served to different people, such as the dynamically generated static pages of slashdot.

There is no point to this number other than illustrating how much data goes over the pipes. And even then, it

The vast majority of this data isn't stored. The vast majority of it is streaming porn and Netflix. Why did we pay some "scientist" for 3 years (read the summary, it says "three years ago") to calculate this, so we can all be amused by it on /. for 10 minutes? Part of the reason nobody's working in science anymore is that most of our government- and university-backed science is fluff like this to get your soundbite, rather than stuff that makes a difference in our world. Figure out how to GET to Neptune, not how to stack virtual books that high with 30-second free trials of every porn site in Russia.

Who cares? I'll tell you who cares -- Copyright holders. I may have a website, but I did not authorize you or all the intermediary routers to copy my work multiple times per view! Just because I put my HTML e-book on my web server doesn't give you or your ISP the right to make so many duplications!

I'm positive if you further analyzed the data that was transmitted you would realize that there are Billions and Billions of illegal reproductions in that dataset!

I mean you could also make it seem really small by saying it was equivalent to the size of a 1-second clip of the beating of a fruit fly's wing recorded as uncompressed 4096x4096 video at 71.3 terahertz.
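If anyone wants to check that comparison, here's the arithmetic under an assumed 64 bits per pixel for the uncompressed frames (the bit depth is my assumption; everything else follows from it):

```python
# Frame rate needed for a 1-second uncompressed 4096x4096 clip to hold 9.57 ZB.
total_bytes     = 9.57e21
bytes_per_frame = 4096 * 4096 * 8          # 64 bits per pixel -> ~134 MB per frame
frames_per_sec  = total_bytes / bytes_per_frame
print(f"~{frames_per_sec:.3e} frames/s, i.e. roughly 71.3 THz")
```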

Zettabyte already is a relevant term. It means 10^21 bytes. Using "billion terabytes" is more confusing because you're using two different scaling factors. It would make more sense to say "billion trillion bytes".

It just seems that most people who use the internet know what a terabyte is. Terabyte-size hard drives are pretty common, so it's easy to understand how much space that is. And although 10 billion is hard to imagine, "billion" is a commonly used and understood term.

Whereas people aren't familiar with a zettabyte, or 10^21 bytes, or a 1 with 21 zeros, any more than they are familiar with the space required to hold a 4096x4096 1-second video running at 71.3 terahertz, or how much data is in a stack of books 5.6 billion miles high.

Most would have a better idea of what a terabyte is compared to a zettabyte. And we are talking about network traffic, a subject where the people who would be interested and the people who know what a terabyte is probably overlap somewhat, I'm guessing.

Europe, right... and not the UK, though, because they use billion as the US does... so people speaking other languages. I apologize, I should have thought about them... I can't be English-centric, regardless of whether we are talking about terms in th

Using "billion trillion" is at least using two terms from the same nomenclature. Using "billion terra" is mixing SI nomenclature with standard counting nomenclature - which just seems silly.

And the only reason some number of people know what a terabyte is? Because people started using it when the term was appropriate, instead of using "one million megabytes". Now zettabyte is the relevant term and people need to learn it. Fortunately they can drop kilobyte from their memory to make room; no one cares about those any more.

Using "billion trillion" is at least using two terms from the same nomenclature. Using "billion terra" is mixing SI nomenclature with standard counting nomenclature - which just seems silly.

You didn't blink when I pointed out you just contradicted yourself, and then you hit me up with that?

Is "billion trillion" less silly? Why not thousand million million? How about gigaterabyte? Does that sound better?

Anyway, the term "billion trillion bytes" is confusing to more people than "billion terabytes", because terabytes is a more familiar term. When was the last time you saw a hard drive advertised that it held "one trillion bytes"? You even wrote,

Fortunately they can drop kilobyte from their memory to make room; no one cares about those any more.

Why the hell would they measure the data in zettabytes? That comes out to an unwieldy 9.57 ZB.

Books between planets? Common folk don't comprehend global scales, much less interplanetary scales... Want proof? Did anyone ask at what time of year the measurement was taken? An exact date would be required, and even then, most common folk don't know if we are closer or farther from the planets mentioned on that date -- it's a ridiculously obtuse measure since the unit (planetary distance) wildly varies by date.

Someday we'll carry that much information in our cellphones, look back at this article, and laugh at this puny attempt to impress our future selves with the amount of data we once processed. I remember back in the late 80s people were talking about the amount of data that could be stored on a 3.5" floppy, and were impressed that they could fit the text of an encyclopedia onto it.

That would presumably cause a wobble in the sun's rotation, which could be detected at interstellar distances. We finally have a way of finding intelligent civilisations (even if they don't broadcast radio or TV signals).

As stated in the summary: while the figure of 9.57 ZB implies that number, it does not inherently mean that exact number, especially in a situation like this where that precise a measurement is pretty well impossible.

I know the zettabyte is really hard to conceptualize, but does it help to convert it to some equally ludicrous measurement? Neptune and back, 20 times? Is that when Neptune is closest to Earth, furthest, or some sort of average? I mean, there is the possibility of a 2 AU difference in EVERY LEG of the stack! 2 AU x 2 legs per round trip x 20 round trips is 80 AU of uncertainty! That's enough for at least a round trip to Neptune!
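A rough version of that uncertainty math, using ballpark figures for the Earth-Neptune distance at its nearest and farthest (roughly 28.9 and 30.9 AU; those two numbers are my own approximations, not from the article):

```python
# Spread in total stack length for "Earth to Neptune and back, 20 times".
near_au, far_au = 28.9, 30.9          # approximate min/max Earth-Neptune distance
legs            = 2 * 20              # there and back, 20 times over
uncertainty_au  = (far_au - near_au) * legs   # ~80 AU
one_round_trip  = near_au + far_au            # ~60 AU, give or take
print(f"~{uncertainty_au:.0f} AU of spread vs ~{one_round_trip:.0f} AU for one round trip")
```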

(Assuming, of course, that all that data actually transferred over the internet. Which is actually not all that likely: much of that data would be generated in-house and transferred - if it isn't processed on the same server that generates it - over local networks. After all, if you generated a few TB of data every day that needed to be processed, why spend money to send it someplace if you don't have to?)