Ars asks: Help us max out Google Fiber

Ars editor Cyrus Farivar is on Google Fiber right now.

I’ve been in Kansas City for about 15 hours now. Yes, I’ve already made the requisite stop at Arthur Bryant’s for barbecue (and I’m going to Oklahoma Joe’s today)—but we all know that’s not why I’m here.

I’m here to experience Google Fiber. You know, the service that promises 1Gbps for $70. The one that could potentially be incredibly disruptive if fully deployed across America. And yes, the one that prompted one Kansas City Web developer to pony up to buy a four-bedroom house and turn it into the Homes for Hackers—aka, the HackerHome.

I’m sitting, as I type this, at the Homes for Hackers, in Kansas City, Kansas. I can touch the Google Fiber box. I know I’ve become the envy of American geeks everywhere. My Twitter followers and Facebook friends have expressed all the myriad forms of speed lust.

I’ve posted the requisite speedtest pics (see below). My first test was on Google’s own testing page, which returned 464Mbps down and 835Mbps up—the guys at the house had told me there had been reported speed issues earlier today and that these results are slow. As of right now, Wednesday morning, that same testing page is only returning about 30-50Mbps down—obviously far slower than it should be. (We have an e-mail to Google about this issue and will report back.)

Ars editor Cyrus Farivar's very first Google Fiber test returned these numbers.

Cyrus Farivar

A gigabit to where?

I have to say, though, I’ve yet to see stuff load crazy fast. I’ve tried all kinds of tests, traceroutes, normal Web use, Hulu/Netflix, and BitTorrent. I downloaded 1.2GB worth of data on BitTorrent (more than 7,000 seeds) in about 15 minutes. But another 25GB torrent has been going for nearly 10 hours, with 17 seeds at around 200kB per second.

In other words, so far, it seems like a gigabit connection really only gets close to such high speeds if you have something on the other end to serve it adequately and not throttle or otherwise slow it down. Even major websites like Microsoft were only serving me with a Windows 8 download at about 1-2MB per second, comparable to what my Ars colleagues on non-Fiber connections were getting.

As Ars staffer Lee Hutchinson pointed out, "You've in essence removed a bottleneck that the Internet isn't yet structured to deal with being removed. Having that much pipe means you're basically plugging your computer directly into the thing you're downloading from. Your own bandwidth is so great that it becomes immaterial. It becomes a question of how much bandwidth the other side has available."

So, Ars readers, what is a simple, meaningful test that I can do to best understand how fast this connection truly is (or isn’t)? What downloads or uploads can I run to really push this network to its limits? I’m in town for about another 20 hours and will have a full report on Fiber and the Kansas City Hacker Village up before I head out.
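While suggestions come in, the crudest meaningful number comes from simply timing a bulk download and converting bytes to bits. A minimal sketch in Python (the URL is whatever large file on a fast server you point it at):

```python
import time
import urllib.request

def measure_download(url, chunk_size=1024 * 1024):
    """Time a bulk download and report average throughput in Mbps."""
    start = time.time()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)  # count the bytes, then throw them away
    elapsed = max(time.time() - start, 1e-9)  # guard against a zero timer delta
    mbps = total * 8 / elapsed / 1_000_000
    print(f"{total:,} bytes in {elapsed:.2f}s = {mbps:,.1f} Mbps")
    return mbps
```

Run it against several different servers; as the results above suggest, the number will say as much about the far end as about the fiber.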

Promoted Comments

Dunno if you're old enough to remember this, Cyrus, but this is kind of the same problem that a friend of mine had when he brought in a T1 line, back when we were mostly on modems. (he was working as a high-end graphics developer, on video codecs, back when full screen video on the computer was The Holy Grail... he was making so much money that he could actually afford one.)

We were all insanely jealous, but he said that almost every site out there was tuned for 56K, so it didn't matter nearly as much as you might think. Years later, I was one of the very first people to get fast DSL to my house: when Pacific Bell rolled out the original cheap DSL service (1.5M/128K for like $75/mo), I signed up instantly, and then saw the same thing -- everything was tuned for 56K.

These days, I'm on a 250Mbit connection here (Chattanooga) and there aren't that many services that really take advantage of that much speed. Downloading Steam games will very occasionally hit 25 megs a second, though 7 to 12 is much more common. And I'll occasionally see 25meg/second downloads from Giganews, the Usenet provider, but most of the time get about 15. And whatever service GOG uses for downloads is very fast as well, though most of their games are small enough that it's hard to tell how fast the peak speed would end up being. (you don't really hit full stride on one of these babies until the first gig has gone by.)

This is something you need to live with, as opposed to just experience briefly, to start to understand how it matters. The biggest thing is that you never, ever have to think about what you are doing with your connection. It doesn't matter. No matter how crowded either direction is, with that kind of packet throughput, you don't really get latency issues. You don't have the problems with cablemodems of saturating the upload link and then seeing your download link hosed. At least, I don't think you ever have that problem; I've never been able to completely saturate my upload, even when it was merely 100Mbit.

At this kind of network speed, local storage of stuff becomes less important, because you can just go get a new copy. When I installed Dishonored from CD, that was slower than installing it from Steam. Slower! A CD install was slower than pulling it from Atlanta! And, if you have the technical chops, you can be your own cloud, run your own VPN service for when you're remote, mirror your own files for your laptop, be your own web point of presence.

I have a colo'd machine that hosts my domain, and then my firewall registers its name whenever it comes up or the DHCP address changes (at least, I think it does, because my IP hasn't changed in the months since I wrote the script). It's like DynDNS, except I'm hosting it myself. And then my laptop is set up to use OpenVPN into my Atom firewall, so if I'm on good remote bandwidth, it feels almost exactly the same as sitting at home. So I don't care about wireless encryption or remote networks that are hacked; as long as I can get a connection to my firewall, all my traffic is subsequently encrypted, and probably can't be tampered with, intercepted, or sniffed. (Of course, it re-originates from my firewall as regular traffic, so that part can still be snooped on by the government minders, unless I've got another layer of encryption to the remote site.)
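A self-hosted DynDNS updater like the one described above might look something like this sketch; every endpoint, hostname, and path here is a hypothetical stand-in (the real script would talk to whatever API your colo'd box actually exposes):

```python
import json
import urllib.request
from pathlib import Path

STATE = Path("/var/tmp/last_ip.json")          # hypothetical state file
UPDATE_URL = "https://dns.example.com/update"  # stand-in for your colo'd box's API

def current_ip(lookup_url="https://api.ipify.org"):
    """Ask an external service for our public address (any such service works)."""
    with urllib.request.urlopen(lookup_url) as resp:
        return resp.read().decode().strip()

def push_update(ip):
    """Tell the colo'd DNS host about the new address (hypothetical endpoint)."""
    urllib.request.urlopen(f"{UPDATE_URL}?host=home&ip={ip}")

def register_if_changed(ip, update=push_update, state=STATE):
    """Only bother the DNS host when the address actually changes."""
    last = json.loads(state.read_text())["ip"] if state.exists() else None
    if ip == last:
        return False  # record is already current
    update(ip)
    state.write_text(json.dumps({"ip": ip}))
    return True
```

Dropped into cron, this is the whole of "being your own DynDNS": check, compare, and update only on change.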

Basically, as a guest, all you can do is marvel at your download speed on the few sites that support it. It's not until you actually live with it that it becomes transformative. Rent a tiny virtual machine from somewhere like Linode, to give yourself a fixed address for DNS, and you can be your own web point of presence.

We do have multiple houses here in the first fiberhood that Cyrus can test out. My office in the first kcsv.org house is just 4-5 houses down from the hacker home. I talked to Cyrus this morning about heading over here after some Oklahoma Joe's to test some more, if he doesn't need an afternoon nap to sleep off the BBQ.

We have some other stuff here, like a few 802.11ac routers for a bit better WiFi... The best connections that I get personally are to several of our servers colocated in downtown KC. It is just as good as walking into the datacenter and jacking in. Our nightly deployments take around 10 seconds now, the only bottleneck at this point being drive speed.

Upload speed might be where much of the innovation is. We can stream HD or upload files of sizes you would never consider before... there was some mention this morning of a KCPostalService of sorts: mail us a giant device and we will upload it for you >:)

Our datacenter is looking at sponsoring a hacker house just to allow customers a VLAN to reach their boxes without having to go downtown, deal with parking etc...

Someone may have touched on this already, but I think the true benefit is missed here, and that's home use scenarios.

Being able to stream Netflix, play video games, use VoIP, and perhaps have someone else in the house upload a batch of photos to your favorite social network without completely terrorizing others in the house is absolutely why Google Fiber, or any super-fast fiber connection, is sought after.

Watching 1080p video from YouTube while my wife is streaming high-quality HD Netflix means that I'm usually dropping my video quality down to 720p, because my 15/2 cable line can't handle both at the same time. Forget uploading ANYTHING while someone is trying to do anything else, because that would completely bottleneck everything else going on in the house.

It isn't the individual connection speeds that will see a benefit, it's the fact that everyone in your house using the connection will have the luxury of using each service like they have their own slice of internet love pie.

I have to say, though, I’ve yet to see stuff load crazy fast. I’ve tried all kinds of tests, traceroutes, normal Web use, Hulu/Netflix, and BitTorrent. I downloaded 1.2GB worth of data on BitTorrent (more than 7,000 seeds) in about 15 minutes. But another 25GB torrent has been going for nearly 10 hours, with 17 seeds at around 200kB per second.

The flaw with this is that a torrent connection (like any other Internet connection) is ONLY going to be as fast as the slowest choke point between A and B. [weakest-link syndrome]

So no shit your torrents are slow, and no shit your Netflix connection is "normal": once the connection leaves Google's fiber and hits hardline copper wire, and so on. (What is the upload speed of the torrent connection from the file originator? What is the upload speed of the Netflix server?) Not to mention the dozens of other factors involved in true connection speeds.

Just because you (and the rest of Kansas City) have access to 1Gbps throughput does not mean you will get those speeds in every direction. You only will when you are connecting to another point where the throughput matches.

Access a torrent file hosted on a server that is in Kansas City using Google's 1Gbps connection (or a streaming service doing the same) and you will get different results. And no shit you will get a nice number to look at pinging Google's own speed test on Google's optimized servers, using Google's fiber connection. I get pretty numbers when I ping Comcast or Cox (not three-digits pretty, but much nicer than real-world use).

Any more rhetorical nonsense you want to throw at us? (Yes, it will be great when a larger percentage of the nation has the option of a 1Gbps connection, but until then, it's simply shiny candy.)

Obligatory car analogy: what you've got is a much, MUCH bigger driveway than before. And maybe a 20-lane offramp from the freeway directly to your house. Lots more people can now arrive simultaneously to drop off or pick up stuff than before. But the freeway to get to your part of the world is no bigger than before.

181 Reader Comments

Start hosting media servers and their requisite functions. That's one thing that can be done which will be noticeably significant. Doing this can enable people to host their own backups at home and easily access them as well. Amusing things that can be done? Download your entire Spotify account to be usable offline. Use Google Drive as a live host for a document in some way.

Basically, things involving large files become easier, so think along those lines and you'll come up with ideas. A BitTorrent file itself is actually the tiniest of files, by definition. Bandwidth usage, sure, but not file size.

Actually, I like the idea of a Tor node, as suggested, too.

Except Google Fiber's ToS specifically rules out running any kind of server. I tried asking for clarification on the term server but Google refused. I specifically asked if P2P clients such as Bittorrent would fall under that rule and they said that it did not but I was not to use the service to download stuff illegally (well.. doh, I kinda figured that much).

In other words running a server is not a way to max out the service, it is a way to get kicked off the service. Sorry.

Regardless I want this in Brazil so badly, here I can hardly get a decent connection, let alone a nice 1GBit low latency fiber connection like the one Google is offering. I am sure they could get a sweet deal with the government if they wanted to offer it here, and labor is dirt cheap.

This. This is the best way for you to test. Subscribe to an unlimited bandwidth usenet server with a massive amount of allotted connections and then download an NZB file using sabnzbd.

The other thing to consider, honestly, is that you might be bottlenecking your machine at those speeds too. Make sure that your hard drive is fast enough to write the data you're pulling down, or if you're on a *nix machine, pipe your downloads out to /dev/null.
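The /dev/null trick can be wrapped up in a few lines of Python; this sketch streams a download straight into the bit bucket so the disk never enters the picture and only the network path is exercised:

```python
import os
import urllib.request

def download_to_devnull(url):
    """Stream a download straight into the bit bucket so the hard drive can't
    be the bottleneck; returns the number of bytes discarded."""
    total = 0
    with urllib.request.urlopen(url) as resp, open(os.devnull, "wb") as sink:
        while chunk := resp.read(1024 * 1024):
            sink.write(chunk)
            total += len(chunk)
    return total
```

Pair the byte count with a timer and you have a disk-independent throughput number.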

I'm actually more interested in the TV experience. How's the Nexus 7 remote + On Demand screen, TV Guide, the search capabilities and integration with Netflix.

As "cheap" as their TV service is, it's by far the least appealing thing to me. I just don't care for it at all. You have 1 gigabit; no need for walled gardens with speeds that high. Bring on the next-gen Internet-based TV successor, IMO.

The TV service will also have a lot of development opportunities. Android can be used as a remote. I'm told the STB/DVR has fairly open APIs with rich two way messaging to Android remote. This opens up all kinds of opportunities for sophisticated control of STB/DVR by 3rd party developers, way beyond Tivo.

And if Google can get advertisers on board, there are opportunities to do things like have an ad show up in frame on remote simultaneous to ad on live TV. Click ad on remote, buy product with Google Wallet (GWallet is already setup to bill GFiber service). Advertisers love impulse buys. The Cable/Sat players do not have the entire ecosystem to pull off something like this.

FTP a large file to someone else on gigabit. The limiting speed factor might well be your hard disk.

A spinning HD would almost certainly be the limiting factor. If both ends had SSDs it might be a more useful/interesting test, but even then you'd probably be limited.

Two or three high-end SSDs in RAID0 would probably eliminate disk writes as a factor.

Last I knew, consumer-grade SSDs don't get a significant speed benefit from RAID, though enterprise SSDs can, possibly even more so over PCI Express. But I'm not expecting Cyrus to have that sort of kit handy.

EDIT: Same site has another review that suggests consumer SSDs in RAID can benefit from an upgrade from SATA 3.0 Gbps to SATA 6.0 Gbps, so perhaps I was wrong. We're talking storage speeds up to 340 MBps (or 2.66 Gbps, note bytes vs bits), easily enough to outpace a full fiber pipe.
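The bytes-versus-bits arithmetic above is easy to trip over, so spelling it out: multiply MB/s by 8 to get Mbps (the 2.66 Gbps figure quoted comes from then dividing by 1024 rather than 1000):

```python
def storage_to_network_rate(megabytes_per_sec):
    """Convert a storage benchmark (MB/s, bytes) to network units (Mbps, bits)."""
    return megabytes_per_sec * 8

# The 340 MBps RAID figure above, in network terms:
print(storage_to_network_rate(340))  # 2720 Mbps; dividing by 1024 gives ~2.66 "Gbps"
```

Either way, the punchline stands: that much sequential write speed comfortably outruns a 1 Gbps pipe.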

From my own anecdotal evidence you are not going to be able to max it out on a single transfer. I attempted this very thing once I got gigabit networking in my house. The maximum rate of transfer I could achieve on my local network was 360 megabits/sec, with the reading and writing occurring from ram disks.

Average person here. I think it would be cool to see the top download speeds from various sites. Like the example from Microsoft in the article. I always wonder if having super fast internet really makes that big of a difference if sites can only push so much data.

I'm one of the few lucky people to live in a condo in Toronto that has 100mbit fibre available. The bottleneck is definitely an issue on the other side, and if you want to take full advantage of the pipe you start "mitigating". If you want to get full advantage of the pipe, you very quickly dump torrents and switch to newsgroups. I pony up about $25/month for two different newsgroup providers. While each CLAIMS they can push at gigabit speeds, it isn't proving to be so. I'm only getting 40-60mbit if connected to just one, but when I'm connected to both I max out the connection at 95mbit, letting me download a 720p movie from newsgroups in about 5 minutes.

Oftentimes if you're trying to download a file, it's worth the time to cancel and find an alternate source. I had a 1GB download that claimed it would complete in 10 minutes. Cancel, Google search the filename, click the first few links till you find one that can push at max speed, and you're done in 1-2 minutes.

The biggest perk for me of having a fast pipe is running an SSH proxy on 443, letting me bypass a very aggressive firewall at work. Everyone always wonders how I'm able to access YouTube and Facebook at work. I just smile. While running SSH is not the pinnacle of coolness, being able to watch HD video over it while at work is.

In other words, so far, it seems like a gigabit connection really only gets close to such high speeds if you have something on the other end to serve it adequately and not throttle or otherwise slow it down. Even major websites like Microsoft were only serving me with a Windows 8 download at about 1-2MB per second, comparable to what my Ars colleagues on non-Fiber connections were getting.

I'd say that this is a no-brainer. This has always been the case for the Internet. Just because you have a gigabit connection doesn't mean that an FTP server with a 128kb link is going to feed it to you any faster.

See how many simultaneous Netflix connections you can maintain, or how many YouTube videos you can run at 1080p at the same time.


Download managers will do multiple HTTP/FTP download streams and are smart enough to find multiple sites to download from if available. You should definitely install a download manager.

I'm actually more interested in the TV experience. How's the Nexus 7 remote + On Demand screen, TV Guide, the search capabilities and integration with Netflix.

As "cheap" as their TV service is, it's by far the least appealing thing to me. I just don't care for it at all. You have 1 gigabit; no need for walled gardens with speeds that high. Bring on the next-gen Internet-based TV successor, IMO.

The TV service will also have a lot of development opportunities. Android can be used as a remote. I'm told the STB/DVR has fairly open APIs with rich two way messaging to Android remote. This opens up all kinds of opportunities for sophisticated control of STB/DVR by 3rd party developers, way beyond Tivo.

And if Google can get advertisers on board, there are opportunities to do things like have an ad show up in frame on remote simultaneous to ad on live TV. Click ad on remote, buy product with Google Wallet (GWallet is already setup to bill GFiber service). Advertisers love impulse buys. The Cable/Sat players do not have the entire ecosystem to pull off something like this.

I'm assuming they're using IPTV, right? What are the chances of them publishing an API so support can be baked into programs like MythTV and MediaPortal?

Stage a game server in one part of the house and connect to it from another machine via the internet, not lan. While doing this sync a few 2-3 gig files to Dropbox, stream some music from Spotify, and try running a Netflix video in the background. I agree with others here about testing concurrent items that are bandwidth hogs and see how they work.

While you're doing this have someone else do the same thing within that network. See if it impacts your speeds.
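One way to script that household scenario is to fire off several downloads at once and look at the aggregate rate; a rough sketch (the URLs are whatever large files or streams you pick):

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """One simulated household user: pull the whole resource, count the bytes."""
    total = 0
    with urllib.request.urlopen(url) as resp:
        while chunk := resp.read(1024 * 1024):
            total += len(chunk)
    return total

def household_test(urls):
    """Run every download simultaneously; return aggregate throughput in Mbps."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=len(urls)) as pool:
        totals = list(pool.map(fetch, urls))
    elapsed = max(time.time() - start, 1e-9)  # guard against a zero timer delta
    return sum(totals) * 8 / elapsed / 1_000_000
```

Run it once with one URL and once with ten; on a gigabit line the aggregate should scale where a cable line would flatline.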

As has been mentioned, you're going to have a hard time maxing it out without also maxing out your hard drive(s). As others have mentioned, a Tor node should involve little disk activity, and yet potentially huge amounts of network traffic.

Echoing the recommendations of Usenet for downstream testing. It was the only thing that reliably maxed out my 50mbit cable. As for upstream testing, start seeding a ton of legal torrents (say for example every major Linux distro at the same time) off a SSD or ramdisk.

Yeah, definitely usenet. Subscribe to a couple servers - one *should* do, but multiples avoid the chance of error due to an underperforming server.

Torrents are a bad way to go, as torrent health, overhead, and the individuals seeding the torrent limit its potential speed. It's largely a non-issue with standard broadband, but at gigabit speeds it certainly comes into play.

Even with my (in comparison horrendously slow) 25down/5up mbit connection, I've found torrents rarely max it out, but Usenet always does - and often will pull down substantially faster than 25mbit.

Along the same lines as other posters have said, but to suggest a very simple tool that I didn't see mentioned, partner up with somebody else on Google Fiber and use PCATTCP. Repeat with another PC on the LAN in your house as a control. Better still if you can repeat with somebody on FiOS.
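PCATTCP is essentially the classic ttcp memory-to-memory test for Windows; the same idea can be sketched in a few lines of Python sockets, with one side blasting zeroes from RAM and the other counting them, so no disk is involved on either end:

```python
import socket
import time

CHUNK = b"\0" * (1 << 20)  # 1 MiB of zeroes, sent from RAM -- no disk involved

def serve_once(port):
    """Receiver: accept one connection, count bytes until the sender hangs up."""
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        total = 0
        with conn:
            while data := conn.recv(1 << 16):
                total += len(data)
        return total

def blast(host, port, seconds=5):
    """Sender: stream zeroes for a fixed time; return achieved Mbps."""
    sent = 0
    start = time.time()
    with socket.create_connection((host, port)) as sock:
        while time.time() - start < seconds:
            sock.sendall(CHUNK)
            sent += len(CHUNK)
    elapsed = time.time() - start
    return sent * 8 / elapsed / 1_000_000
```

Run `serve_once` on the partner's machine and `blast` on yours (then swap) to measure each direction of the path without any disk or HTTP overhead.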

FTP a large file to someone else on gigabit. The limiting speed factor might well be your hard disk.

A spinning HD would almost certainly be the limiting factor. If both ends had SSDs it might be a more useful/interesting test, but even then you'd probably be limited.

A good mechanical hard drive can come very close to maxing out a 1Gb/s network connection. Many SSDs could max out a 3 or 4Gb/s connection.

Assuming no contention, better than very close, especially if you're not streaming UDP packets directly from disk. And while SSDs do saturate 3Gb/s connections (read: SATA-II) on a regular basis, /dev/null is both slightly faster and significantly cheaper (as are small stripe sets of mechanical drives in sequential I/O applications where the reliability guarantees of write-only storage devices are insufficient).
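Whether the drive really is the wall is easy to check directly; this sketch times a sequential write with an fsync so the OS page cache doesn't flatter the number (a 1 Gbps link needs roughly 119 MiB/s of sustained writes to keep up):

```python
import os
import time

def write_speed(path, mebibytes=256):
    """Sequentially write `mebibytes` of zeroes and report MiB/s, with an
    fsync so the OS page cache doesn't flatter the number."""
    chunk = b"\0" * (1 << 20)  # one MiB per write
    start = time.time()
    with open(path, "wb") as f:
        for _ in range(mebibytes):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data to actually hit the platters
    elapsed = max(time.time() - start, 1e-9)
    os.remove(path)  # clean up the benchmark file
    return mebibytes / elapsed
```

Point `path` at the drive the downloads would land on; if the result is under ~119 MiB/s, the disk, not the fiber, sets the ceiling.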


The TV service will also have a lot of development opportunities. Android can be used as a remote. I'm told the STB/DVR has fairly open APIs with rich two way messaging to Android remote. This opens up all kinds of opportunities for sophisticated control of STB/DVR by 3rd party developers, way beyond Tivo.

And if Google can get advertisers on board, there are opportunities to do things like have an ad show up in frame on remote simultaneous to ad on live TV. Click ad on remote, buy product with Google Wallet (GWallet is already setup to bill GFiber service). Advertisers love impulse buys. The Cable/Sat players do not have the entire ecosystem to pull off something like this.

I'm assuming they're using IPTV, right? What are the chances of them publishing an API so support can be baked into programs like MythTV and MediaPortal?

They haven't published the APIs yet but are planning to put out an SDK at a later time. Expect a paradigm shift by allowing 3rd parties to develop for it.

FTP a large file to someone else on gigabit. The limiting speed factor might well be your hard disk.

A spinning HD would almost certainly be the limiting factor. If both ends had SSDs it might be a more useful/interesting test, but even then you'd probably be limited.

A good mechanical hard drive can come very close to maxing out a 1Gb/s network connection. Many SSDs could max out a 3 or 4Gb/s connection.

True to a point. If you're downloading multiple files (because of slow speeds on the OTHER end), then chances are you're writing to multiple locations on the disk, and then you fall WAY down from the theoretical maximum, because the read head has to keep moving between the different areas on disk.

That's why I think a Tor node might be better, as there should be little disk activity for a whole lot of network traffic.

*Disclaimer* I am not an expert, and I reserve the right to be wrong. I can gracefully accept being wrong!

Download JDownloader, and in the settings, set the maximum number of open connections to something like 10-20. Go to YouTube and find a very long video available in 1080p or its original form. Download it using JDownloader. By increasing the maximum number of connections, YouTube should see the transfer as coming from different sources and allocate you more speed than one connection would get. I may be misunderstanding how YouTube and/or JDownloader work, but using YouTube with JDownloader and 5 open connections always maxes out my 10mbps download. One open connection results in much slower speeds for me. Tweak the number of open connections as needed.
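What a download manager like JDownloader is doing under the hood is (roughly) parallel HTTP Range requests; whether YouTube allocates speed per connection is the commenter's guess, but the general technique looks like this sketch, which only works against servers that honor Range headers:

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch_range(url, start, end):
    """Grab bytes [start, end] of the resource with an HTTP Range request."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def parallel_download(url, size, connections=10):
    """Fetch `connections` slices of the file at once, then reassemble them.
    Only works against servers that honor Range headers."""
    step = -(-size // connections)  # ceiling division
    ranges = [(i, min(i + step, size) - 1) for i in range(0, size, step)]
    with ThreadPoolExecutor(max_workers=connections) as pool:
        parts = list(pool.map(lambda r: fetch_range(url, *r), ranges))
    return b"".join(parts)
```

Each slice rides its own TCP connection, which is exactly why per-connection throttles stop mattering.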