This is most interesting when considered in conjunction with plateaus in the bandwidth needed for different purposes. For instance, the bandwidth required to each home will likely plateau once people are satisfied with the bitrate of video streamed simultaneously to one or two devices for every member of the family.

I predict that 1920x1080 will be an extremely long-lived video standard. Of course there will be higher-def standards, but it is hard to imagine those being needed on mobile devices. Videophile, entire-room immersion will need more bandwidth, but it seems likely that most people will be using smaller screens most of the time. We really are that close to the "good-enough" threshold for the perception of video quality on most common screen sizes.

Once a large percentage of users reach that level of throughput/bandwidth-consumption, we should expect the slope of the graph to flatten significantly.

This article has a bogus-looking graph with projected exponential growth. Who to believe?

You're talking about different sets of numbers measuring different things. The graph in the article I think you're referring to shows the capacity deployed by service providers, which surely is increasing.

And the article you linked to has the following: "High broadband adoption increased by 17 percent on a year-over-year (YoY) basis, while the average connection speed globally grew by 19 percent YoY." It reports a decline in quarter-to-quarter adoption, although I wonder if it's simply a temporary slowing in growth rather than an actual drop in the number of users. I haven't read the full Akamai report.

I've always asked myself how undersea cables are deployed, how much they cost, and how they can last so long (with all the earthquakes and the like). To me it seems like a titanic task, yet it's as if they've always existed. I'd be glad to see an article on the subject on Ars.

Read the link that tormeh provided. It's a great article by Neal Stephenson on just that subject and should be required reading for any network geek.

I don't think the net in general is suffering from a shortage of overall bandwidth, but rather from poor locality of data relative to end users. For example, the economical thing for a content provider is to have only two installations on the net: one primary for serving content and the other for disaster recovery. What that means in terms of bandwidth is that all traffic converges on one or two locations, which likely crosses the backbone at some point. Increasing locality reduces the load on the backbone for the same amount of network traffic. The downside is significantly higher cost to content providers. Until the cost of dedicated/guaranteed bandwidth gets too high, increasing locality won't be favorable in the business sense. This has happened for some of the larger players, but the economics prevent it from being adopted by the medium and smaller companies on the scale necessary to have an impact.

This is affecting our production at most of our sites as our networks were built for low latency and not throughput. We are now taxing our meager networks to death and will need to increase our bandwidth exponentially to keep up with our application demand.

Some judicious local (county-level) caching should alleviate bottlenecks for modern media. I see no justification for going above 2D 1080p @60fps, as I regard 3D as something of a gimmick. Home cinemas cannot hope to compete with the apparent "borderlessness" of IMAX, so any attempt to replicate the iffy 3D experience in the home runs into problems where objects made to appear in front of the screen are thrown back by the actual screen border of the TV. This cognitive dissonance is jarring, and I expect we will need to wait for 3DS-style lenticular technologies for 3D to be viable in the home, and then only for gaming, as it can only benefit one viewer sitting directly in front of it.

My eyes aren't good enough to benefit from resolutions higher than 1080p, and the specs for the PS4 strike me as ridiculous. I also don't expect instant streaming from my media. Sky+HD is fine for getting hold of a copy of a film via satellite to watch at a later time. Who decides their evening's entertainment at the eleventh hour? Sky+HD lets you obtain a film but only pay for it when you start watching it, so you needn't worry about wasting money on something you won't see if you are unexpectedly invited out that evening.

I agree with the general gist but we must remember that everyone is different in regard to expectations for TV/home-theater performance.

People have consistently shown that they are willing to spend much of their discretionary income on large TVs, and the same goes for mobile devices. To see examples of this, ask someone you know who works a low-paying job like janitorial work or fast food. Chances are they have a 30- to 50-plus-inch screen at home. It is a surprisingly cheap luxury compared to home ownership, fine furniture, and dining out. I wouldn't be surprised if it eventually becomes common for everyone to have an 8-foot screen and for video geeks to have even larger screens for their home theaters. When that happens, more than 1080p will be common among AV enthusiasts. For general purposes, though, yeah, 1080p60 is sufficient.

You ask who decides their evening's entertainment at the eleventh hour. The answer should be obvious: a good percentage of all humans. Some people plan their evenings out beforehand; some don't.

As for not expecting instant streaming of media, that may be true now. However, people will soon become accustomed to being able to watch anything, anywhere, anytime, immediately. Waiting is just pointless and a waste of time. Imagine if it took six seconds to change between channels when channel surfing. We wouldn't tolerate that now, and soon we won't tolerate non-instantaneous streaming. I already don't.

“Networking is about to be reinvented and Cisco will do that reinvention of networking,” said Cisco CTO Padmasree Warrior during an interview at the company’s annual business partner conference in San Diego.

By 2015, on-demand video traffic will be the equivalent of three billion DVDs per month, and one million minutes worth of video will cross global IP networks every second.

That already sounds a bit too low. One million minutes of video per second is just a more complex way of saying "at any given moment, 60 million people are streaming on-demand video": 1 million minutes/second × 60 seconds/minute = 60 million minutes of video per minute, and each real-time viewer consumes exactly one of those minutes per minute.
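
To make the arithmetic explicit, here's the back-of-the-envelope check in Python (the only assumption is that viewers watch in real time):

    # Cisco's claim: 1 million minutes of video cross IP networks every second.
    video_minutes_per_second = 1_000_000
    seconds_per_minute = 60

    # Minutes of video delivered per minute of wall-clock time. Each real-time
    # viewer consumes exactly one video-minute per minute, so this count equals
    # the number of concurrent streams.
    concurrent_streams = video_minutes_per_second * seconds_per_minute
    print(f"{concurrent_streams:,}")  # 60,000,000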

I find it hard to believe we aren't already hitting that worldwide, with all the people who have cut the cord and are watching YouTube and Netflix.

Technology aside, the fact that there are zip-ties all over that cable amuses me.

That made me chuckle as well. I don't know enough about underwater cabling to know what they are for, but it looks like they are holding on some sort of spiral protective sheath, maybe as part of a patch to a damaged section. Anyone here know?

By 2015, the top one percent of households worldwide are on pace to need one terabyte of data each per month, four times the amount generated by the top one percent in 2010.

And yet most homes in the US have caps of 250GB or less, with no foreseeable increase in the near future. Admittedly, one percent isn't much, but I'm assuming that the next 10 percent will also exceed those "so big you won't even notice" caps.
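
For scale, here's a rough sketch of what a 250GB cap actually buys; the 5 Mbps average bitrate for 1080p streaming is my assumption, not a figure from the article:

    # Hours of HD streaming that fit under a 250 GB monthly cap.
    cap_gb = 250
    bitrate_mbps = 5  # assumed average bitrate for 1080p streaming

    gb_per_hour = bitrate_mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/h -> GB/h
    hours_per_month = cap_gb / gb_per_hour
    print(round(gb_per_hour, 2), round(hours_per_month))  # 2.25 GB/h, ~111 hours

That's under four hours a day for a single stream, so a household with several simultaneous viewers gets to the cap quickly.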

Most homes?

Citation? Not to be rude but I'm interested in seeing numbers on this and am a bit skeptical of the assertion.

It must take far fewer bytes to store a black-and-white movie or a movie with no sound. I doubt this was included in the 4-minute figure.

No, it doesn't. In fact, it takes exactly the same amount of storage for a B&W film, and a silent film is only 0.1% smaller.

The B&W storage is really strange. You'd think you could push a button on the encoder and encode just the Y signal (luma), or in RGB, just encode one of the three channels, because R=G=B for B&W. But in the end, it's always stored as YUV or RGB.
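
You can check the R=G=B intuition with the standard BT.601 conversion (a sketch; real encoders work on offset, quantized values, but the principle holds):

    # BT.601 RGB -> YUV. For any gray pixel (R == G == B), Y equals the gray
    # value and both chroma components come out (essentially) zero.
    def rgb_to_yuv(r, g, b):
        y = 0.299 * r + 0.587 * g + 0.114 * b
        u = 0.492 * (b - y)
        v = 0.877 * (r - y)
        return y, u, v

    print(rgb_to_yuv(128, 128, 128))  # approximately (128.0, 0.0, 0.0)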

Additionally, most of the information in visual imagery is contained in the luma, which is why most compression technologies subsample the chroma.

This is because our eyes discern detail with the intensity of light, not the colour.

In fact, it takes exactly the same amount of storage for a B&W film, and a silent film is only 0.1% smaller.

I'm positive that that is not true. You are correct that movies are stored as YUV. You are probably correct that there is no special format for B&W, so U and V are always stored, even though they are always zero for a B&W film.

But consider compression. Many extremely bright people have labored for years to come up with elaborate compression schemes, and they would *not* miss a chance to remove the redundancy that having U and V always zero implies.

Now, while B&W will be smaller, I do agree that it will not be *much* smaller. Typically U and V are chopped to half resolution in each direction (YMMV), which means that two-thirds of the samples are Y and one-sixth each are U and V. Still, that suggests to me that B&W should be about two-thirds the size of color.
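
A quick check of those fractions for a 1920x1080 frame with 4:2:0 subsampling (raw sample counts only; actual compressed sizes depend on the codec's entropy coding):

    # Raw sample counts with 4:2:0 chroma subsampling
    # (U and V at half resolution in each direction).
    width, height = 1920, 1080
    y_samples = width * height
    u_samples = (width // 2) * (height // 2)
    v_samples = u_samples
    total = y_samples + u_samples + v_samples

    print(y_samples / total)  # 2/3 of the samples are luma
    print(u_samples / total)  # 1/6 each for U and V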

I don't think the net in general is suffering from a shortage of overall bandwidth, but rather from poor locality of data relative to end users. For example, the economical thing for a content provider is to have only two installations on the net: one primary for serving content and the other for disaster recovery.

All big web properties use Content Distribution Networks, such as Akamai. The claim of one primary and the other for disaster recovery does not match what I have heard.

I always wondered how my data travels through the network. If I send an email (Hotmail) from Australia to my parents' Polish email account, where will the signal travel? What cables, what servers? Are there any simulations of how data travels through satellites, cables, and other networks?

Citation? Not to be rude but I'm interested in seeing numbers on this and am a bit skeptical of the assertion.

Comcast alone controls 80% of the broadband market, and they have caps. I'm sure it's not hard to figure out that "most" (greater than 50%) of all USA broadband users have caps.

If you don't have broadband, then a 250GB cap is moot.

Thanks for the explanation of what "most" means.

But seriously, where is this data coming from? I'm interested in the broadband market but couldn't find any stats on market share, etc.

The article is very loose with terminology. Bandwidth is the capacity of a connection in data units per second; the amount of data sent through the connection is *traffic*. The issue is that traffic is increasing, and therefore more bandwidth (capacity) is needed, but this gets muddled by the way the terms are used.
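
A concrete illustration of the difference, with made-up numbers:

    # Bandwidth is capacity (bits per second); traffic is what actually flows.
    bandwidth_mbps = 100                       # hypothetical link capacity
    seconds_per_month = 30 * 24 * 3600

    # Upper bound on monthly traffic if the link ran flat out:
    max_traffic_gb = bandwidth_mbps / 8 * seconds_per_month / 1000
    actual_traffic_gb = 250                    # e.g. a capped household's usage

    print(round(max_traffic_gb))               # ~32,400 GB possible
    print(actual_traffic_gb / max_traffic_gb)  # utilization well under 1%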

That narrow/specific definition of "bandwidth" is not used by everyone.

In everyday and scientific usage, it can refer to either available or consumed capacity.

I agree that the ambiguity can be confusing if the term is used without ample context.

But seriously, where is this data coming from? I'm interested in the broadband market but couldn't find any stats on market share, etc.

... hence my original query about the assertion that most broadband users have a data cap. It could be true. But given the inconsistent figures being thrown about, it would be interesting to find a direct source or at least one that indicates where the stats are coming from.