So how fast is Google Fiber? Really fast... and it may be far ahead of its time.

That said, I can’t say that I noticed a significant difference in my day-to-day use of the Internet. Even the people in the Kansas City Startup Village (KCSV)—a small strip of homes on the edge of Kansas City, Kansas, that was among the first to get Google’s gigabit connection—agreed with me.

“Right now, we don’t do anything that requires fast Internet,” admitted Adam Arredondo of LocalRuckus, a new local startup, when I stopped by to learn about their experience.

As many Ars commenters have noted, the rest of the Internet is still really slow in comparison to a gigabit home broadband pipe; there are many other obstacles getting in the way of a full-bore Internet experience. After all, once my Web request leaves the fiber-connected house, makes its way out of the city, and starts talking to the rest of the Internet, there are all kinds of routers, switches, boxes, firewalls, and quality of service (traffic shaping) issues that make it so I can’t actually download an album from iTunes in the blink of an eye.

But my brief taste of what promised to be the craziest, fastest Internet for the lowest amount of money may not have been enough. I may have to stay longer next time to really appreciate it.

Another guy in the same work house as Arredondo, Andy Kallenbach (also known as Ars reader “Andreas_kc”) of FormZapper, said that what he really notices is the high upload speed.

“The best connections that I get personally are to several of our servers colocated in downtown [Kansas City],” he wrote in a comments thread, echoing what he told me in person. “It is just as good as walking into the data center and jacking in. Our nightly deployments take around 10 seconds now. [The only] bottleneck now being somewhat drive speed.”

Other Ars readers have suggested that I’d really start to appreciate it once I spent more time with the connection. Malor, who has a 250Mbps connection in Chattanooga, commented:

“This is something you need to live with, as opposed to just experience briefly, to start to understand how it matters. The biggest thing is that you never, ever have to think about what you are doing with your connection. It doesn't matter. No matter how crowded either direction is, with that kind of packet throughput, you don't really get latency issues. You don't have the problems with cable modems of saturating the upload link and then seeing your download link hosed. At least, I don't think you ever have that problem; I've never been able to completely saturate my upload, even when it was merely 100Mbps.”

Still, I would definitely throw down $70 a month for a 1Gbps connection, happily tossing aside the $30 basic Internet package (20Mbps) that I get from Comcast in Oakland. I think we can all agree that most Americans would definitely enjoy even a small speed bump on the order of 50Mbps to 100Mbps for around the same price as they’re paying now.

“We have a lot to do in Kansas City”

Google Fiber technician Ben Estes worked on the Hacker Home connection on Wednesday. (Photo: Cyrus Farivar)

When I first arrived on Tuesday afternoon, I ran a test on Google’s own speed test page and got 460Mbps down. The two denizens of the Hacker Home (Mike Demarais and Andrew Evans) told me that such speeds were slow, so I ran countless tests, ranging from BitTorrent to multiple Hulu and YouTube streams to traceroutes, plus the tests provided by SpeedTest.net and the Google Fiber speed tool. But there wasn’t anything that could really, adequately test what such high speeds mean beyond clicking Google’s little “begin test” button.

Many commenters suggested things that I couldn’t do, like run a server. (Google’s terms of service forbid that.) Accessing Usenet proved a bit difficult, as I didn’t want to go to the trouble of setting up a new account and paying for it. I also tried running a Tor bridge relay—but even then my connection only tended to hit around 23.5KBps, far lower than what I would have expected. Later on Wednesday, Kallenbach helped me use Go2PC between two machines in different fiber-enabled houses to send a Linux ISO, and that only topped out at about 2MB per second.

The only real test that I could do on a single machine with minimal hassle was to run a script written by Ars’ own Lee Aylward that downloads an Ubuntu ISO from all US-based mirrors (45, in total) at once. When I ran that, I got the best performance I had so far—a whopping 50MB per second—but still far short of what Google should be serving me.
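
For the curious, the general shape of such a test is easy to sketch. Here is a minimal, hypothetical Python version of the idea (not Aylward's actual script): fetch the same ISO from several mirrors at once and report the aggregate throughput. The mirror URLs below are placeholders, not the 45 US mirrors the real script hit.

    # A rough sketch of the parallel-mirror idea, not Aylward's actual script.
    # The mirror URLs are placeholders; substitute real Ubuntu mirror paths.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    MIRRORS = [
        "http://mirror-one.example.com/releases/ubuntu.iso",   # hypothetical
        "http://mirror-two.example.edu/releases/ubuntu.iso",   # hypothetical
    ]
    CHUNK = 1 << 20  # read 1 MiB at a time

    def pull(url):
        """Download one full copy of the ISO; return bytes received."""
        received = 0
        with urllib.request.urlopen(url) as resp:
            while True:
                block = resp.read(CHUNK)
                if not block:
                    break
                received += len(block)
        return received

    start = time.time()
    with ThreadPoolExecutor(max_workers=len(MIRRORS)) as pool:
        totals = list(pool.map(pull, MIRRORS))
    elapsed = time.time() - start
    total = sum(totals)
    print(f"{total / 1e6:.0f} MB in {elapsed:.1f} s "
          f"({total / elapsed / 1e6:.1f} MB/s aggregate)")

The exact script matters less than the principle: opening enough simultaneous connections to well-connected hosts gets you much closer to filling the pipe than any single download can.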

Another thing to keep in mind is that running Fiber over Wi-Fi, of course, slows down the connection noticeably. Google techs who came to visit the Hacker Home Wednesday (we’ll get to that later) told me that they regularly see Wi-Fi speeds of around 100Mbps compared to wired speeds of around 800Mbps.

Other readers asked about the $120-per-month Google Fiber package that includes TV service (the basic Internet-only option the house has is $70 a month). The Hacker Home doesn’t have a TV and doesn’t subscribe to that tier, so unfortunately I couldn’t test it. In fact, a better test than the one I ran might be to record a bunch of TV shows and then run some bandwidth-heavy, latency-sensitive tests (games, Skype calls) simultaneously across multiple machines to see how that stresses network performance.

Ben Estes, a local Google fiber technician who was over at the house Wednesday, estimated that “over half” of the approximately 300 homes in Kansas City, Kansas that have the gigabit installed already have gone for the $120-per-month TV option.

He and his colleague Brett Neal came to the Hacker Home to check out why our gigabit connection had fallen to about 40Mbps. After over an hour of inspections, he determined that the electrical power to the house wasn't sufficient and that once he replaced the fiber jack in the house, everything should be fine. (The Hacker Home hadn't been lived in for about two years and had trouble supplying enough power for its new residents and all their devices, including the Google Fiber box. An electrician working at the house that same day expanded its capacity from 40 amps to 200 amps.) By the time Estes and Neal left, they were pulling down 900Mbps-plus speeds. Later Wednesday evening my tests were still returning speeds in the neighborhood of 400Mbps.

“Right now there’s only six of us [technicians],” he said, noting that he and his girlfriend were considering moving to this neighborhood just to get fiber access. “By six months, we should be over 100 [technicians]. We have a lot to do in Kansas City.”

"Still, I would definitely throw down $70 a month for a 1Gbps connection, happily tossing aside the $30 basic Internet package (20Mbps) that I get from Comcast in Oakland."

$30 for 20mbit? Wow. What kind of data cap does it have? At my ISP, it's $72 (after tax) for a 20mbit connection without a cap (caps don't kick in 'til the next tier up - 100mbit). For $30 ($34.50, after tax) around here, all you can get is 256kbit.

> The only real test that I could do on a single machine with minimal hassle was to run a script written by Ars’ own Lee Aylward that downloads an Ubuntu ISO from all US-based mirrors (45, in total) at once.

ummm....iperf anyone (or you can use jperf if the OS doesn't have a native iperf package)? Plus you may need to tweak your TCP window to fine-tune it for the fat.....delicious....*droooool* pipe....
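
For anyone who wants to try something in that spirit without installing iperf, a bare-bones raw-TCP throughput check takes only a few lines of Python. This is an illustrative sketch (arbitrary port, ten-second run), not a replacement for iperf's window tuning and reporting.

    # Bare-bones TCP throughput check: run "server" on one machine, then
    # "client <server-ip>" on another. The port and duration are arbitrary.
    import socket
    import sys
    import time

    PORT = 5201
    CHUNK = b"\0" * 65536  # 64 KiB writes

    def server():
        with socket.create_server(("", PORT)) as srv:
            conn, addr = srv.accept()
            received, start = 0, time.time()
            with conn:
                while True:
                    data = conn.recv(len(CHUNK))
                    if not data:
                        break
                    received += len(data)
            secs = time.time() - start
            print(f"received {received * 8 / secs / 1e6:.0f} Mbps from {addr[0]}")

    def client(host, seconds=10):
        with socket.create_connection((host, PORT)) as conn:
            sent, deadline = 0, time.time() + seconds
            while time.time() < deadline:
                conn.sendall(CHUNK)
                sent += len(CHUNK)
        print(f"sent roughly {sent * 8 / seconds / 1e6:.0f} Mbps")

    if __name__ == "__main__":
        server() if sys.argv[1] == "server" else client(sys.argv[2])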

I'm confused and maybe you can clarify. I was excited to move a startup to KC from Wichita instead of Austin when Google Fiber was announced. But since GF is just for residential we scrapped it and moved to Austin.

Did I miss something that says you are allowed to run your business on GF? Because the last TOS I saw said no, especially to home servers.

Give me a fast connection and I will make home servers and host my own content on them. I can't wait.

Except Google’s terms of service expressly forbid that.

It would be interesting to understand what Google will or will not allow in terms of "servers." Media Streaming or checking on your security cameras while on the road? My guess is they will expect your media to be in the cloud, where you can be incrementally charged for storage. I doubt personal servers fit into Google's cloud vision, which may be a cause of concern.

I'm confused and maybe you can clarify. I was excited to move a startup to KC from Wichita instead of Austin when Google Fiber was announced. But since GF is just for residential we scrapped it and moved to Austin.

Did I miss something that says you are allowed to run your business on GF? Because the last TOS I saw said no, especially to home servers.

No you can't host servers off it. Google also won't let a commercial property order Google Fiber. The reason we have it as a business though is because we are in a house... and this is one reason why we are moving our startups into houses in KC.

Another thing is we don't really have access to the router box either to twiddle with settings or open up any ports. In the near future Google will have a customer facing web page called "My Fiber" that is billed as a super easy way to configure your router and access your home from anywhere.

On hosting servers on Google Fiber... Google *WILL* have a business offering to commercial spaces and likely provide static ip's with the ability to host. They are working on what this will look like right now. Although, that business offering does not excite me much. I'd much rather our servers be locked away, with the redundancy and power that a datacenter can give.

Many commenters suggested things that I couldn’t do, like run a server. (Google’s terms of service forbid that.)

Well, that takes the shine off it, making it just like every other provider out there. If you can't run a server (assuming you aren't a complete idiot at security), having a fast upload really doesn't matter too much once you are beyond 5Mb/s, unless you are trying to upload your BR rips, err I mean your hi-def home movies.

running a "server" seems a little vague, wouldn't you technically be hosting a server if the host migrated to you in a round of Black Ops 2?

I find it hard to believe that the electrical power was making your connection slow...

First, an overly used power circuit will blow the fuse or trip the breaker. There will not be less juice to the devices until this happens and when it happens, the devices won't power up, obviously.

Second, a fiber connection is immune to electrical interference that **might** be present in old houses. The local Cat-5 cables also won't be affected as they are low-power cables independent of the main power supply.

Last, if you were benchmarking over wireless, this is most likely your problem. Unless you were trying to benchmark the possible WiFi speeds in that particular house within that particular neighbourhood and with that particular equipment, you should not have done that. Always benchmark an internet connection with as few variables as possible. Wired.

Oh also a simple peer to peer transfer with another Google Fiber user would have made it possible to see if the local routing is good or not.

In all, I'm sorry but this makes it very difficult to trust this review.

Many commenters suggested things that I couldn’t do, like run a server. (Google’s terms of service forbid that.)

Well, that takes the shine off it, making it just like every other provider out there. If you can't run a server (assuming you aren't a complete idiot at security), having a fast upload really doesn't matter too much once you are beyond 5Mb/s, unless you are trying to upload your BR rips, err I mean your hi-def home movies.

running a "server" seems a little vague, wouldn't you technically be hosting a server if the host migrated to you in a round of Black Ops 2?

Yup. Say you wanted to run a private Minecraft or whatever game server for you and your buddies. Technically, that's a TOS violation. Probably would never be enforced, but the sheer fact that this legalese is in all these AUP/TOS documents from ISPs is a big stinking pile.

As part of my strata fee, I have high-bandwidth internet in my condo. It's not gigabit-class, but I consistently see 150Mbit U/D and can get as much as 300Mbit down, off-peak (it's delivered, incidentally, via microwave from one of the big local trunks.)

It's absolutely true that there is almost *nothing* that can saturate my connection, but that's essentially the point. I can be backing up major changes to CrashPlan while my fiancee streams something off of Netflix to her laptop, and when I click an HD movie on iTunes, it's still ready to go in a few seconds, and never hitches or stutters.

After a certain point, it's no longer about completing an operation faster -- after all, once I can live-stream an HD movie, how much more pipe needs to be devoted to that task? -- but about not having to think about the impact of Transaction A on Transaction B. And that's a powerful thing indeed.

I find it hard to believe that the electrical power was making your connection slow...

I was there for a few minutes at the Hacker Home while the technicians were there; I know they replaced the Google router box. That's close enough to being an electrical problem >:) There is not much info you can get out of the technicians. They know to be careful talking to us; we have already gotten a technician in trouble for talking.

I think you have to think of this Gigabit line as the preliminary steps of a railroad.

The best way that I could describe it is to look at it the way roads and cities evolve. We have streets and roads to travel, but when your city grows your roads need to get wider (20Mbps). And as your city grows, your roads are getting crowded. You need subways to get around (300Mbps). And as your city continues to grow, you eventually need highways to get from one city to another (T1 connection speeds). Eventually you travel so far that driving doesn't even make sense anymore--build a railroad/train tracks (Google Fiber).

If you look at how the railway system boomed in the late 1800's, people eventually built great businesses from a few initial lengths of track, and even more so in the following 100 years. If Google fiber resembles this trend in history, it could be an essential establishment for the future of the internet here in North America--and very likely--the world.

Why does it matter? If you don't live in KC, you will never get this. Google has a history of trying things out and never expanding beyond their test area. Admit it, ars technica...you wrote articles 6 or 7 years ago when Google announced plans to bring "regular" speed broadband to the country for free, saying how great it was. Well... over half a decade later and not one single house that didn't have it available originally has it now. They never expanded beyond their test area.

This one will be even worse when they realize that no one gives a crap about those speeds. The vast majority of takers in KC will be for the free one, which is not nearly as profitable for Google. So they will never expand.

And what is with their TV lineup apparently not offering HBO, at least according to their site? HBO has more subscribers than all the other premium channels combined. Why wouldn't they offer it?

But again, the difference is that the people making the railroad didn't have a history of starting projects and then abandoning them. Google does. I know the internet doesn't like to look at Google with Reality Glasses...but in the real world, they rarely follow through on their grand plans.

No you can't host servers off it. Google also won't let a commercial property order Google Fiber. The reason we have it as a business though is because we are in a house... and this is one reason why we are moving our startups into houses in KC.

Another thing is we don't really have access to the router box either to twiddle with settings or open up any ports. In the near future Google will have a customer facing web page called "My Fiber" that is billed as a super easy way to configure your router and access your home from anywhere.

On hosting servers on Google Fiber... Google *WILL* have a business offering to commercial spaces and likely provide static ip's with the ability to host. They are working on what this will look like right now. Although, that business offering does not excite me much. I'd much rather our servers be locked away, with the redundancy and power that a datacenter can give.

So you are skirting the TOS by being in a residence and possibly breaking some zoning regulations as well - that's not how I want a start-up to begin.

I didn't realize you can't even configure port forwarding yet. That seems rather limiting.

And where are you getting this WILL have a business offering? Everything I've read says that isn't set in stone yet.

As others have noted, the real benefit of this sort of broadband speed is in concurrent use.

Sure, your download of a particular file may be no faster, but when your son is doing some multiplayer gaming in his room, your daughter is watching Netflix in her room, you're downloading a new ISO in the background while checking out an MMO and your wife's streaming something else in the living room... And nobody has bandwidth or latency issues.

Try just managing two of those things with a standard broadband connection these days.

I doubt I'd saturate a 1gbps fibre connection, but I could easily use a goodly portion of it. And isn't that the goal, from the providers' angle? They sell you a connection at a given speed, with the assumption that you won't be running it 100% saturated all the time.

So you are skirting the TOS by being in a residence and possibly breaking some zoning regulations as well - that's not how I want a start-up to begin.

And where are you getting this WILL have a business offering? Everything I've read says that isn't set in stone yet.

Hrmph, first we are not skirting the TOS or breaking any zoning regulations. Google Fiber is super supportive of us. We have support from KCK and all the other regulatory entities which we engaged very early on.

I've heard it said several times that they are working on a business offering at the fiber space. When that happens I don't know, but it is talked about here locally in KC.

I can’t say that I noticed a significant difference in my day-to-day use of the Internet. Even the people in the Kansas City Startup Village (KCSV)—a small strip of homes on the edge of Kansas City, Kansas, that was among the first to get Google’s gigabit connection—agreed with me.

“Right now, we don’t do anything that requires fast Internet,” admitted Adam Arredondo of LocalRuckus, a new local startup, when I stopped by to learn about their experience.

Admittedly, this is the farthest I've made it into the article, so sorry if this point was made later on; however, I am of the mindset that this is a good thing.

Could you imagine a world where we didn't care about bandwidth usage? The internet was just always "there", with no consideration to speed. This is what the long term plan should be, in a perfect world.

Unfortunately, that is not going to be the case. Telcos such as AT&T, Rogers, Comcast, Cablevision, T-Mobile, Sprint, etc, etc would naturally be against such a solution. Those exact same types of companies will always be against that sort of world, because that would affect the bottom line, which would affect profits, which would in turn affect stock price, and eventually would result in less growth in the company.

It's sort of a "chicken-or-the-egg" scenario. They have to give up a little bit to make a lot, but by making a lot they will get little in return.

For those of us who have had 500Mbps or 1Gbps Internet outside the US, the slowest part of the connection is going out of the country. Since most of the Internet (servers and datacenters) lives inside the US, when surfing within the US you don't experience the Internet as slow at all. While there may be inefficiency in routing around the US, the difference would be minimal, something you shouldn't feel even if everything were operating at 100% efficiency, unless you are playing network games that demand constant sub-100ms latency.

Once you are out of the country, say visiting a website in the EU, you are effectively back to your good old Comcast speeds. Google once suggested that differences in website loading speed start to fade beyond a 3Mbps connection, and that latency takes on much greater importance.

But it really makes a difference in online data storage and in downloading and uploading. You get faster Internet storage than some cheap NAS sitting right next to you. And you can start watching 1080p clips on the Internet while downloading a few software updates and surfing the net with a Skype video chat turned on.

I find it hard to believe that the electrical power was making your connection slow...

First, an overly used power circuit will blow the fuse or trip the breaker. There will not be less juice to the devices until this happens and when it happens, the devices won't power up, obviously.

Second, a fiber connection is immune to electrical interference that **might** be present in old houses. The local Cat-5 cables also won't be affected as they are low-power cables independent of the main power supply.

Last, if you were benchmarking over wireless, this is most likely your problem. Unless you were trying to benchmark the possible WiFi speeds in that particular house within that particular neighbourhood and with that particular equipment, you should not have done that. Always benchmark an internet connection with as few variables as possible. Wired.

Oh also a simple peer to peer transfer with another Google Fiber user would have made it possible to see if the local routing is good or not.

In all, I'm sorry but this makes it very difficult to trust this review.

The article stated that the electrician upgraded the service in the home from a total of 40 Amps(!) to (modern code required) 200 Amps. If you only had a total capacity of 40 Amps, the hot water heater turning on would make the lights dim, and deprive the Fiber Box of much-needed power to drive the fiber connection.

In other words, the whole house was suffering the equivalent of a prolonged brown-out.

deet wrote:

Quote:

…he determined that the electrical power to the house wasn't sufficient

Explain.

Quote:

(The Hacker Home hadn't been lived in for about two years and had trouble supplying enough power for its new residents and all their devices, including the Google Fiber box. An electrician working at the house that same day expanded its capacity from 40 amps to 200 amps.)

The article stated that the electrician upgraded the service in the home from a total of 40 Amps(!) to (modern code required) 200 Amps. If you only had a total capacity of 40 Amps, the hot water heater turning on would make the lights dim, and deprive the Fiber Box of much-needed power to drive the fiber connection.

In other words, the whole house was suffering the equivalent of a prolonged brown-out.

I must disagree with you. First, a water heater is on 220/240. The Google box is on 110/120. They don't share the same phases...

Next, a 40 amp power panel is able to provide 40 amps. Not 41 amps at a reduced voltage... If you have too much stuff on the same panel, the fuses or breakers will trip.

I'm curious what this will mean for the future of web development standards. For a long time, the only thing that has kept those writing web pages "honest" and made them write efficient code is bandwidth constraints. If you have gigabit fiber, it would almost be like loading content from your HDD. We shall see.

Any software or web developer worth a darn will always strive for the fastest and most efficient method possible so I wouldn't worry about it too much.

That's a joke, right? If software developers really strived for efficiency, then tell me why things like the latest MS Office, or Photoshop, or anything that has been around for 15+ years require like 100-200 times more storage and memory...often even more. Yes...the new versions ARE more advanced...but not NEARLY that much more.

Coders are lazy. IF they think you have more storage and memory, they will require it.

Could you imagine a world where we didn't care about bandwidth usage? The internet was just always "there", with no consideration to speed. This is what the long term plan should be, in a perfect world.

Unfortunately, that is not going to be the case. Telcos such as AT&T, Rogers, Comcast, Cablevision, T-Mobile, Sprint, etc, etc would naturally be against such a solution. Those exact same types of companies will always be against that sort of world, because that would affect the bottom line, which would affect profits, which would in turn affect stock price, and eventually would result in less growth in the company.

It's sort of a "chicken-or-the-egg" scenario. They have to give up a little bit to make a lot, but by making a lot they will get little in return.

Could you imagine a world where rainbows were edible and we all sing around campfires all day long!

I can imagine it...but it is stupid since things cost money. So no...I don't expect others to lose money just so I could have 1 gig internet. Nor do I think taxpayers should pay for it since the VAST majority of people have no need for it.

It is called HAVING A LIFE. Try it some time. Here is something to think about...No one (but you, apparently) will EVER say on their deathbed "I really wish I spent MORE time on the internet when I was alive instead of socializing with actual people!"

People who moved there for Fiber must have at least entertained the basic idea of upgrading their consumer sh*t routers to custom ones running several stacked NICs and enterprise switches. Quite stupid to think your Netgear will support it without bottlenecking it.

I'm pretty sure that you can't use your own router. Google provides a special one that handles routing and I believe also helps with the TV portion (presumably that is delivered on a separate VLAN or through a non-routable subnet).

That's not to say you couldn't stick another router behind it, but that would be idiotic of you unless said other router was reconfigured to just act as an AP...

I suspect that many KC users will now find out just how fast their consumer level network adapters, routers, hard disks, and wireless routers actually are [or aren't]. Can't say I'm not jealous though. We pay near $50 a month for our 12/1.5 connection.

Anyone care to post ping times to major sites and services? What's the latency like? What kind of bandwidth is in place to interface to other ISPs once you get out of the "Fibre Hood"? Is Google co-locating their services in KC on the Fibre LAN so that they will be faster? Are they going to let other service providers do the same? Or is that not necessary?

I think cloud companies should lobby broadband providers to increase upload speeds. I'm currently backing up my personal data, MP3s, photos, and videos to my 1TB cloud host and it will take me 2 months to upload 500GB of data. This is unacceptable.

Has anyone experienced a cap with Steam downloads? My measly 50 Mbps connection is easily saturated by Steam. Would I see any appreciable benefit going to 100 Mbps or higher?

Steam's kind of weird in how much the download speeds can vary. This is probably controlled most by your local Steam server's connection, and by how fat a pipe your ISP has to that server, but I'll often get 12 megabytes/sec on Steam, which is 100Mbit throughput, roughly. Even after they upgraded me to 250Mbit, I rarely see speeds much faster, and it's quite common for it to be much slower, down in the 5-7meg/second range. 25 megabytes a second (roughly 200Mbit) seems to be about the max I ever get from any one source.

Keep in mind that, after a point, TCP/IP itself becomes the slowdown; you can only have so many packets in flight, and latency starts to matter a very great deal. It can be tuned to deliver better throughput on very high-bandwidth links, but both sides have to do it. IIRC, the biggest things are the transmit window on the send side, and the receive window on the recipient -- this determines how many packets can be outstanding.

Say you're at 80ms latency, the typical amount for crossing from the US east to west coast. A gigabit connection sends 1 billion bits every second... with typical 1,500 byte packets, that means about 83,333 packets per second. 80 milliseconds is .08 seconds, so each side would need to allow send and receive windows of 6,666 packets, under perfect line conditions, to reach full gigabit throughput. That's 10 megabytes of memory that needs to be reserved on both sides, and remember that the server has to reserve 10 megs for every simultaneous connection. The client won't care about 10 measly megs, but a server that's trying to support a thousand clients at once, when any of them could be gigabit class, would need roughly 10 gigs of RAM just to hold state for every client. And that's above and beyond the RAM for the OS and any running application. And if latency support goes up to, say, 160ms, then the amount of RAM needs to double.
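
Spelled out as a quick Python sanity check, with the same nominal figures, the arithmetic above works out like this:

    # Back-of-the-envelope check of the bandwidth-delay product figures above:
    # a 1 Gbps link, 80 ms of latency, 1,500-byte packets.
    link_bps = 1_000_000_000      # 1 gigabit per second
    delay_s  = 0.080              # 80 ms coast to coast
    pkt_B    = 1500               # typical packet size in bytes

    pkts_per_sec = link_bps / (pkt_B * 8)      # ~83,333 packets per second
    pkts_in_flight = pkts_per_sec * delay_s    # ~6,666 packets outstanding
    window_B = pkts_in_flight * pkt_B          # ~10 MB of window per side

    print(f"{pkts_per_sec:,.0f} pkt/s, {pkts_in_flight:,.0f} in flight, "
          f"window ~{window_B / 1e6:.0f} MB")
    print(f"1,000 such clients ~{1000 * window_B / 1e9:.0f} GB of buffer state")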

This can obviously be done, but it takes careful setup and attention, and eats a ton of resources on a server that will hardly ever see gigabit clients, especially not ones that have been tuned properly.

One way around this is to set up many small TCP/IP connections, each carrying only a small fraction of the total download, but I'm not aware that any software stacks are really designed for this. Kinda by accident, Usenet happens to fit into this paradigm perfectly, which is why it tends to be extremely fast, and sort of an ideal test candidate for total network throughput. You can open, say, twenty connections to a Usenet server, and because the stuff you're downloading has been chopped into small chunks already, farming out the many small downloads across your many small receive windows allows for massive throughput.
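
As a concrete, purely hypothetical illustration of the many-connections idea, here is one way to pull a single file in parallel byte-range slices over HTTP, which is roughly what download accelerators and multi-connection Usenet clients do. The URL is a placeholder, and the server has to report Content-Length and honor Range requests.

    # Sketch of splitting one download across many connections via HTTP Range
    # requests. Placeholder URL; assumes the server supports Range requests.
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://mirror.example.com/big-file.iso"   # hypothetical
    CONNECTIONS = 20

    def fetch_slice(byte_range):
        start, end = byte_range
        req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    # Ask the server for the file size, then carve it into contiguous slices.
    size = int(urllib.request.urlopen(urllib.request.Request(URL, method="HEAD"))
               .headers["Content-Length"])
    step = size // CONNECTIONS
    ranges = [(i * step, size - 1 if i == CONNECTIONS - 1 else (i + 1) * step - 1)
              for i in range(CONNECTIONS)]

    with ThreadPoolExecutor(max_workers=CONNECTIONS) as pool:
        data = b"".join(pool.map(fetch_slice, ranges))   # map preserves order
    print(f"downloaded {len(data) / 1e6:.0f} MB over {CONNECTIONS} connections")

Each connection only needs a modest window, but together they can fill a much fatter pipe.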

The article stated that the electrician upgraded the service in the home from a total of 40 Amps(!) to (modern code required) 200 Amps. If you only had a total capacity of 40 Amps, the hot water heater turning on would make the lights dim, and deprive the Fiber Box of much-needed power to drive the fiber connection.

In other words, the whole house was suffering the equivalent of a prolonged brown-out.

I must disagree with you. First, a water heater is on 220/240. The Google box is on 110/120. They don't share the same phases...

Next, a 40 amp power panel is able to provide 40 amps. Not 41 amps at a reduced voltage... If you have too much stuff on the same panel, the fuses or breakers will trip.

Just to clarify, the Google box would share one of the phases. The power feed going into a residence has two phases, or two hot leads if you will, that are 180 degrees out of phase in relation to each other. Each individual line's AC swings from +110 to -110 with ground as a reference. When hooking up a 220 you're running both hots 180 degrees out of phase, making the voltage potential 220 volts. That is why if you look at your breaker box you see two hot leads. Each 110V breaker bridges one of the hot leads and ground. The 220 breakers are larger and, if you look carefully, bridge both 110 volt leads, giving the circuit a 220V potential.
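
If the phasing is hard to picture, a tiny numeric check makes it concrete. This is just an illustrative sketch that treats the comment's nominal 110 V figure as the RMS voltage of each leg:

    # Numeric check of the split-phase description above: two 110 V (RMS) legs,
    # 180 degrees out of phase, measure about 220 V RMS leg to leg.
    import math

    def rms(samples):
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    t = [i / 1000 for i in range(1000)]   # one second sampled at 1 kHz
    leg_a = [110 * math.sqrt(2) * math.cos(2 * math.pi * 60 * x) for x in t]
    leg_b = [110 * math.sqrt(2) * math.cos(2 * math.pi * 60 * x + math.pi) for x in t]

    print(f"leg A to neutral: {rms(leg_a):.0f} V RMS")
    print(f"leg A to leg B:   {rms([a - b for a, b in zip(leg_a, leg_b)]):.0f} V RMS")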

Offering something free or low cost in a test, temporary, controlled environment is one thing.

Developing a long term, extensive, business model is another.

The kinds of things users here are contemplating depend a good amount on the terms of any future rollout. There's already been significant discussion of the TOS, of the ability to run servers, and of tailoring the service to a business and its needs. For the end user, what will the restrictions be? What will the competition have to say? Will it be tied up in court?

Say it's expansive. So now MORE users are creating, accessing, posting HIGHER definition (read LARGER) video, audio, on the internet. More data requires LARGER capacity server farms to accommodate the volume. Clouds become LARGER taking up more space, requiring more resources to maintain optimum efficiency, peak end user productivity.

Will it be WIDELY accessible or will it be tiered giving the greatest access to those with the most money? Back to that business model.

Looking at the progression of use from 300kbps to the current Mbps speeds, it seems natural that if we can do MORE we'll do MORE, as much as is allowable. Look at what's currently going on with peer to peer and torrents - larger files, changing models - restrictive, tiered, restrained.

MORE hasn't necessarily gotten us, gotten MORE of us, gotten more widely distributed access to all that is promised.

Stoplights, detours, traffic controls, limitations--all that makes a GOOD thing so much less than desired--seem to be there, to have followed along whether the connection was 28.8, 56k, ISDN, DSL, cable and, to listen to the USERS here, fibre as well.

I think it COULD be a good thing, COULD be a better thing, but it comes down to how it all shakes out. IF / WHEN it all shakes out, what it looks like, what it costs, and what limitations there are or aren't will make the difference between something worth looking forward to and what could've been but wasn't, for whatever reasons.