Yes... I just bought my AirPort Extreme this January too (I'm happy with the router).

Am I a fool for doing so?

I thought internet speed depends on the cable provider, not the actual device...
am I wrong here?

I don't know jack about what the Extreme can do; I'm just happy it 'just worked' and I don't have to deal with any technical difficulties.

My cable provider has speeds up to 20 Mbps... my two-year-old Time Capsule cannot get close to those speeds over the wireless connection it provides. It is wireless "n", but even when the new standard comes out with new products, you will still be limited by what your laptop, iPad, or iMac can connect at. If they still have a wireless g or n card installed, then that is the speed they will connect at, no matter what the router is capable of producing. So along with a router that can connect at the new speeds, you will have to upgrade all of your other devices to get the increase in speed.

Tallest Skil:

"Eventually Google will have their Afghanistan with Oracle and collapse" "The future is Apple, Google, and a third company that hasn't yet been created."

Assuming any of this report is true, I'm looking at this not as a new technology for iPads and iPhones (at least not in 2012) but for the upcoming Apple TV. If you think about it, an Apple TV is going to be a device designed to deliver online 1080p content quickly, efficiently, and with high quality. Since most homes don't have wired ethernet available, you need quality wireless networks with more speed than ever. 3-antenna 802.11n is good, but if you can get double or triple the speed out of it, you can solve a lot of problems in a modern home.

Let's say you have two Apple TVs in your house, both watching online streamed content in 1080p. Add to that someone doing the same on an iPad 3 and maybe a Macbook surfing the web at the same time and all of a sudden you have a pretty congested network. In order to provide consistent streams without frame dropping, you need something like 802.11ac.
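A rough back-of-envelope for the scenario above; the per-stream bitrates and the usable-throughput figure are assumptions for illustration, not measurements:

```python
# Back-of-envelope total for the scenario above.
# All bitrates are assumptions for illustration, in Mb/s.
streams_mbps = {
    "Apple TV #1, 1080p stream": 8,
    "Apple TV #2, 1080p stream": 8,
    "iPad 3, 1080p stream": 8,
    "MacBook, web browsing": 2,
}
total = sum(streams_mbps.values())  # 26 Mb/s of sustained demand

# Real-world 802.11n throughput is far below the 450 Mb/s link rate,
# especially at range and with contention; assume ~50 Mb/s usable here.
usable_n_mbps = 50

print(f"Demand: {total} Mb/s of ~{usable_n_mbps} Mb/s usable")
```

Even with generous assumptions, four concurrent clients eat half the usable airtime; add range, interference, and retransmits and dropped frames follow.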


Yep. Bingo. And whether you think the Apple TV means a plausible tiny box or a laughable HDTV, this is exactly what you need to succeed with a multi-location household. You'll need 11ac to push the 1080p around.

I'm pretty sure there are more people than the population of the US who would strongly disagree with your very obnoxious statement about "anyone else in the world". There are several reasons why I wouldn't want to raise my kids in the US, even though it would be relatively easy for me to come and live there. Canada, for example, is much more desirable from my point of view.

Regs, Jarkko

Canadians come to America much more than Americans go to Canada. Just sayin...

Yes, there are always 'rules' that the 'purists' put out and expect people to follow. In reality, everyone is different and has different goals. I have a 55" TV that I watch from about 12 feet - and I enjoy it just the way it is. Moving forward to about 9 feet would make me feel claustrophobic. And for all the people who brag about how great 1080p is and how easy it is to see the difference, I say that you should be watching movies that make you more excited about the movie than about the number of pixels on the screen. The difference between DVD and Blu-Ray is quite small - even on my 55" set and even if I sit closer. Can I see a difference? Sure. But Avatar is every bit as enjoyable on DVD as on Blu-Ray. Content is more important than specs.

It's really no different than the home audio stuff that used to be the big bragging rights thing. Could you actually hear the difference between a $1000 cable and a $20 cable? Maybe. Barely. Under precisely controlled conditions. Is it something that mattered in the real world? Not a bit.

Cables are proven to have no difference. But the difference between component and HDMI is noticeable. That's an equal comparison to 720 vs 1080. True, content makes a movie good or bad. But quality of picture and sound can make a good movie better and a more immersive experience.

Something about sitting in a media room with lossless 7.1 audio and 100" of 1080p at 12' is fantastic. It feels like you're at the movies. A 55" can still create that at the appropriate distance, but if that isn't for you, that's cool. But it's not a placebo effect that we (who prefer it) don't notice in "real life", as you related it to overpriced cables.

Unless you have a 55 inch tv and sit too close, nobody can tell the difference.

Video Pros certainly can tell the difference between 720 and 1080. There are a lot of us on Macs, at least as long as they keep making Mac Pros. A lot of consumers can tell as well. They have to get to 1080 to be competitive, eventually.

And Thunderbolt currently runs at 10 gigabits per second. This goes to show that contrary to what many Mac users are thinking, a single Thunderbolt cable is nowhere near ready to replace expandable tower computers. Thunderbolt would need to run at hundreds or even thousands of gigabits per second in order to support the full bandwidth of a Mac Pro with every slot being utilized.

Uhm... no. Not even close. If you think the Mac Pro moves hundreds or thousands of Gb/s over its 3 PCIe slots you are smoking something. A Mac Pro has one PCIe 2.0 x16 slot and two 2.0 x4 slots. That is 24 lanes total, or 12 GB/s aggregate, and that is one way, not bi-directional. Thunderbolt is 10 Gb/s, i.e. 1.25 GB/s each way or 2.5 GB/s simultaneous. Thunderbolt could replace the expansion slots of the Mac Pro when it is 5 times faster than current speeds, or 10 times faster if you insist on matching that bandwidth one way.
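The arithmetic in the post above can be sanity-checked in a few lines, using the lane counts stated there and the standard PCIe 2.0 per-lane rate:

```python
# Sanity-check of the arithmetic above, using the lane counts from the post.
PCIE2_GBPS_PER_LANE = 4          # PCIe 2.0: ~4 Gb/s usable per lane, one way
lanes = 16 + 4 + 4               # one x16 slot plus two x4 slots = 24 lanes

pcie_one_way_GBs = lanes * PCIE2_GBPS_PER_LANE / 8   # 12.0 GB/s aggregate
tb_one_way_GBs = 10 / 8                              # Thunderbolt: 1.25 GB/s

print(f"Mac Pro slots, one way: {pcie_one_way_GBs:.2f} GB/s")
print(f"Thunderbolt, one way:   {tb_one_way_GBs:.2f} GB/s")
print(f"Shortfall: {pcie_one_way_GBs / tb_one_way_GBs:.1f}x")
```

The ratio comes out to roughly 9.6x, which lines up with the "10 times faster" one-way figure in the post, not the "hundreds or thousands" claim.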

Also, some of the TB controllers use 4 lanes, which get turned into 2 channels of 10 Gb/s bi-directional, so those controllers are already pushing double the standard speed. I believe the MBA uses the 2-lane/1-channel design and the MBPs have the 4-lane/2-channel one. So really, all we need to do is produce TB interfaces with more channels at current speeds, or boost the speed per channel. I'm sure both will happen over the next couple of years.

What's the rush to wirelessly connect your phone to an access point at gigabit speeds when the Internet behind it is probably 5-10 Mb/s real-world, unless you're at a Starbucks with a bunch of other people, which will drag it down even more?

While improvements are always welcome, it cracks me up reading how people want this technology now when, in the real world, they will notice little or no improvement.

But we gotta have it now.

What do you plan on doing on that tiny phone that necessitates gigabit speeds of bandwidth?

Your internet speed is determined by the slowest hop you have to pass through to get to the webpage or data storage you are trying to reach. The bottleneck could be your wifi or router or your Internet provider or any other router along the way to wherever you are going.
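The "slowest hop" point above is literally a `min()` over the path; a sketch with made-up hop names and rates for illustration:

```python
# End-to-end throughput is capped by the slowest link on the path.
# All hop names and figures below are illustrative, in Mb/s.
path_mbps = {
    "laptop -> wifi router": 300,    # 802.11n link
    "router -> cable modem": 1000,   # gigabit ethernet
    "modem -> ISP": 20,              # cable plan cap
    "ISP -> remote server": 100,
}

bottleneck = min(path_mbps, key=path_mbps.get)
print(f"Effective speed: {path_mbps[bottleneck]} Mb/s, limited by '{bottleneck}'")
```

With numbers like these, upgrading the wifi link does nothing for internet traffic; only LAN-to-LAN transfers would feel the difference.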

More importantly, 802.11 is a shared medium, and there are only a small finite number of non-overlapping channels available to run "physically separate" networks.

What I've always wondered about is why nobody has yet introduced "coordinated peer-to-peer" extensions to Wi-Fi to improve both bandwidth efficiency and latency. Except in certain cases, there's no real reason every packet sent between my MacBook and Mac mini needs to pass through my AirPort Extreme, and the latency introduced by doing so can be fatal to applications like network MIDI.

Sure, I can set up a peer-to-peer network manually, but this means I must disconnect from the shared network. Automating this seems like it could be a huge win, especially for a company like Apple who so actively promotes "unmanaged" Wi-Fi applications beyond mere Internet connection sharing.

Nice for some future-proofing, I guess, but I've never been dissatisfied with a 3x3 antennae Wireless N connection. What I really want them to adopt at long last is USB 3.

I think better WiFi options are needed, most importantly in the form of more robust AirPort routers, but I would expect USB 3.0 in the next Mac updates that include the Ivy Bridge microarchitecture. Currently their only option is an additional chip to support USB 3.0, and they are already low on space due to their design choices. I'm sure some will argue that Apple is pushing Thunderbolt blah blah blah, but the two aren't really competing for the same peripheral connections, so I think it's unlikely that is the reason they haven't added it yet.

This bot has been removed from circulation due to a malfunctioning morality chip.

As great as WiFi is, if you want speed, you have to go wired. And it doesn't help that the MacBook Air Ethernet adapter is limited to USB 2.0 speeds, which is maybe 30MB/s in the best case scenario. Hopefully we'll get something better.

I think better WiFi options are needed, most importantly in the form of more robust AirPort routers, but I would expect USB 3.0 in the next Mac updates that include the Ivy Bridge microarchitecture. Currently their only option is an additional chip to support USB 3.0, and they are already low on space due to their design choices. I'm sure some will argue that Apple is pushing Thunderbolt blah blah blah, but the two aren't really competing for the same peripheral connections, so I think it's unlikely that is the reason they haven't added it yet.

True, Panther Point chipsets will support USB 3 natively, so I don't see why Apple would not use USB 3 then. But still, unless you're using bottom-of-the-barrel Wireless N gear, I've never had problems with it, even in large houses. Even with that Apple HDTV rumour, a 1080p video stream doesn't saturate an N connection, even at reasonable ranges. What will ac do, let me use my connection down the street?

Wow, it didn't take someone long to bring out that old, tired, chart, there is nothing to backup those figures other than they are the authors ideas, that's all.

You're wrong. Again.

Facts are facts. You can bury your head in the sand and pretend to be an ostrich all you want, but you're wrong.

Those charts are all almost identical and use the following:

Quote:

These distances are calculated based on the resolving power of the human eye (reference), or visual acuity. The human eye with 20/20 vision can detect or resolve details as small as 1/60th of a degree of arc. These distances represent the point beyond which some of the detail in the picture is no longer able to be resolved and "blends" with adjacent detail.
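The 1/60-degree figure in that quote converts directly into the chart's viewing distances; a sketch, using the small-angle approximation and assuming 16:9 panels (the screen sizes are just examples):

```python
import math

# Distance beyond which one pixel subtends less than 1/60 degree (one
# arc-minute), the 20/20 acuity threshold described in the quote above.
def max_useful_distance_m(diagonal_in, horizontal_px):
    width_in = diagonal_in * 16 / math.hypot(16, 9)    # 16:9 panel width
    pixel_pitch_in = width_in / horizontal_px
    one_arcmin_rad = math.radians(1 / 60)
    # Small-angle approximation: angle ~= pitch / distance
    return (pixel_pitch_in / one_arcmin_rad) * 0.0254  # inches -> metres

for diag in (50, 55):
    d = max_useful_distance_m(diag, 1920)
    print(f'{diag}" 1080p: pixel detail blends beyond ~{d:.1f} m')
```

The output lands in the ~2 m range for a 50" 1080p set, the same ballpark as the distances argued over later in this thread; exact chart values differ slightly depending on the geometry assumptions used.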

How the hell do you think Steve came up with "retina display"? Or, because Steve said it, he doesn't have to have any evidence?

That's like me getting on here touting that retina display is all a marketing scheme and no one can tell the difference because I can't. And anyone who disagrees with me is just making up numbers and not having anything factual.

Eventually, it's all going to be cloud-based. So again, my argument stands. I think most people will not notice a difference whatsoever since their internet provider will not have that kind of bandwidth going into the house.

LAN speeds are a different matter. But then, most folks again will not wirelessly sync to their Macs/PC's but will continue to use their dock/USB connection.

My point is that the community here that is clamoring for gigabit wireless once again represents such a small minority. Most folks in the real-world will not notice a single bit of difference.

I'm all for making things smaller, faster, and more efficient. However listening to the community here about how important it is they have it now is just funny. The infrastructure right now is just not there, and will not be there for a long time.

But hey, if the 1% of you who have to stream your HD videos or sync your 64GB of data at gigabit speeds every time must have it, then it's obvious everyone else will find it useful too.

Really? You are saying those charts exactly match my eyesight, and my room conditions, and those of my family etc? Is that a fact? No, they are based on a particular eyesight measure.

Quote:

Originally Posted by Andysol

How the hell do you think Steve came up with "retina display"? Or, because Steve said it, he doesn't have to have any evidence?

Hate to tell you this, but Steve didn't come up with "retina display"; it is a measure based on a particular eyesight standard from a certain distance, etc, etc, etc. Apple makes a phone based on a certain dpi which meets that measure.

Also, Steve used to stretch the truth quite a lot on those speeches, don't take them all at face value.

Quote:

Originally Posted by Andysol

That's like me getting on here touting that retina display is all a marketing scheme and no one can tell the difference because I can't. And anyone who disagrees with me is just making up numbers and not having anything factual.

Based on what you are claiming, Retina display is a marketing scheme.

I don't know why I am responding to you because you are obviously clueless. But here goes nothing.

Quote:

Originally Posted by jfanning

Really? You are saying those charts exactly match my eyesight, and my room conditions, and those of my family etc? Is that a fact? No, they are based on a particular eyesight measure.

Nope. I didn't say that. Those charts replicate 20/20 vision. If you have bifocals and are cross-eyed, I can't help you. I won't even get into the fact that 720p/1080p has nothing to do with room conditions. What I said was that you were wrong when you said the following:

Quote:

Originally Posted by jfanning

Wow, it didn't take someone long to bring out that old, tired, chart, there is nothing to backup those figures other than they are the authors ideas, that's all.

You're the one that said those numbers are nothing more than the author's ideas. And that's wrong. They're proven facts, based on the quote and link I provided that you just conveniently ignored.

Quote:

Originally Posted by jfanning

Hate to tell you this, but Steve didn't come up with "retina display"; it is a measure based on a particular eyesight standard from a certain distance, etc, etc, etc. Apple makes a phone based on a certain dpi which meets that measure.

Hello? Are you thinking you're educating me? That's what the charts were that you said mean nothing to anyone!

Quote:

Originally Posted by jfanning

Based on what you are claiming, Retina display is a marketing scheme.

Seriously? You must be very very very foreign, because what you just quoted was me saying that is what YOU claim: that your interpretation of the charts is nothing more than a marketing scheme. The same charts you said mean nothing. Although they're also the same charts that are used to determine retina display based on distance and 20/20 vision! You're unbelievable. Just say "I'm wrong" and be over it. You're wrong. Period.

Verizon runs an ugly semi rigid black pipe (trees help) from pole to pole, then a fiber optic "wire" to a plastic box attached to the house.

That black pipe to enclose the fiber is probably cheaper than armored fiber. Squirrels love to chew on fiber. We had to replace a long run of fiber with armored fiber for the county because of the little rodents.

I don't know why I am responding to you because you are obviously clueless. But here goes nothing.

Personal insult; great start. It indicates you don't have an argument.

Quote:

Originally Posted by Andysol

Nope. I didn't say that. Those charts replicate 20/20 vision. If you have bifocals and are cross-eyed, I can't help you. I won't even get into the fact that 720p/1080p has nothing to do with room conditions. What I said was that you were wrong when you said the following:

I don't have bifocals, and I'm not cross-eyed, but if all those charts are for 20/20, you're missing the large number of people that don't have that particular level of eyesight.

There are a number of environmental conditions that will change how you see an image; light is one of them, hence room conditions will affect things.

Quote:

Originally Posted by Andysol

You're the one that said those numbers are nothing more than the authors ideas. And that's wrong. They're proven facts based on the quote and link I provided that you just conveniently ignored.

I haven't ignored anything; they are based on one particular level of eyesight in certain conditions, a fact you constantly want to ignore.

Quote:

Originally Posted by Andysol

Hello? Are you thinking you're educating me? That's what the charts were that you said mean nothing to anyone!

What?

Quote:

Originally Posted by Andysol

Seriously? You must be very very very foreign, because what you just quoted was me saying that is what YOU claim: that your interpretation of the charts is nothing more than a marketing scheme. The same charts you said mean nothing. Although they're also the same charts that are used to determine retina display based on distance and 20/20 vision! You're unbelievable. Just say "I'm wrong" and be over it. You're wrong. Period.

Very foreign? What the hell is that meant to mean? Are you trying to say that if someone is of a different nationality to you they are wrong? Hmm, great attitude you have there.

You have your theory, based on certain circumstances; you are more than welcome to believe it. I am not doubting those results based on the inputs listed.

But to try and claim what I can see (or anyone in general) based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.

What's the rush to wirelessly connect your phone to an access point at gigabit speeds when the Internet behind it is probably 5-10 Mb/s real-world, unless you're at a Starbucks with a bunch of other people, which will drag it down...What do you plan on doing on that tiny phone that necessitates gigabit speeds of bandwidth?

This is all about internal home networking bottlenecks. Wireless gaming, 1080p video and larger photo files displayed using AirPlay, all from iPhones, make this a welcome upgrade.

Even now, I regularly move 50-200MB photo files between machines on our gigabit ethernet home network (and also via WiFi). I network DirecTV DVRs to play HD movies between rooms and stream DirecTV HD video to iPads (though I expect DirecTV will be slower to deploy this; they still need to add gigabit Ethernet). I also stream local HDTV channels to MacBook Pros using an Elgato EyeTV ethernet network tuner. All this sucks up lots of bandwidth.

It also will ensure the success of wireless Apple (iOS) gaming (which is coming) to challenge consoles like Xbox and PlayStation with a new Apple TV, because it will help eliminate the ever-so-slight lag that is evident with current WiFi technology when using iPhones as game pads. Of course, we're hoping this technology will migrate down to iPhones, iPods and iPads too, but the antenna challenges may throttle that expectation.

But to try and claim what I can see (or anyone in general) based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.

I'm not going to claim anything about your eyesight, but I just had to ask: are you saying that your vision is better than the assumed "perfect" 20/20 vision that the conclusions in the articles are based on (i.e. biophysics and scientific measurements)? My understanding was that they test the human eye's best-case resolution and draw the conclusions from that.

My final response. Maybe this time you will see what I quote in bold again.

Quote:

Originally Posted by jfanning

Wow, it didn't take someone long to bring out that old, tired, chart, there is nothing to backup those figures other than they are the authors ideas, that's all.

Except 20/20 vision. This is what I'm saying you're wrong about: that the charts have, as YOU put it, "nothing to back them up". Not "they don't match my eyesight". Not "those aren't what my eyesight is". No... You said "there is NOTHING to backup those figures". And you are wrong wrong wrong wrong wrong wrong wrong.

Are we on the same page yet?

Quote:

Originally Posted by jfanning

I don't have bifocals, and I'm not cross-eyed, but if all those charts are for 20/20, you're missing the large number of people that don't have that particular level of eyesight.

I haven't ignored anything; they are based on one particular level of eyesight in certain conditions, a fact you constantly want to ignore.

That's like saying we take the crash impact ratings of car seats and throw them out. "I never go 70mph, so those crash ratings mean nothing". In fact, there is nothing to back up those figures other than the car-seat rating person's ideas. See how dumb that sounds?

You have to have a standard to measure by. That standard is 20/20. If you don't wear glasses, that's no one's fault but your own.

Quote:

Originally Posted by jfanning

There are a number of environmental conditions that will change how you see an image; light is one of them, hence room conditions will affect things.

And lighting doesn't impact resolution, which is the only thing we are arguing. There's no debate that if you have a room with the sun on the screen it will affect the picture, but in terms of contrast, NOT resolution. 720p vs 1080p is a distance measurement only. Period, end of argument.

Quote:

Originally Posted by jfanning

Very foreign? What the hell is that meant to mean? Are you trying to say that if someone is of a different nationality to you they are wrong? Hmm, great attitude you have there.

Nope- just curious why you have a hard time following the English language. You either are foreign because you don't understand what "there is nothing to backup those figures" means, you have short term amnesia, or you just can't admit you said something that simply wasn't true as you were telling someone else what they said and quoted had no bearing.

Quote:

Originally Posted by jfanning

You have your theory, based on certain circumstances; you are more than welcome to believe it. I am not doubting those results based on the inputs listed.

But to try and claim what I can see (or anyone in general) based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.

I never claimed anything. I said if you have 20/20 vision and a 1080p tv, that the chart is exactly right on how resolution is determined. It's a fact, proven by research and sciences. Not an opinion. If you are one of the millions and millions of people with either 20/20 vision or wear corrective lenses, the chart is 100% fact. Yours will always be an opinion on the difference you can see. So your argument is fact vs opinion.

There is no chart for you, because you and your family won't go get glasses, or have such poor lighting conditions that you can't see any resolution due to your contrast being off. That's no one's fault but your own. But again, lighting conditions have no bearing on resolution. If you turned your brightness to 0 and made your screen black, you couldn't argue that your 1080p TV has poor resolution. The resolution would still be 1080p!

Wow, it didn't take someone long to bring out that old, tired, chart, there is nothing to backup those figures other than they are the authors ideas, that's all.

Quote:

Originally Posted by jahonen

I'm not going to claim anything about your eyesight, but I just had to ask: are you saying that your vision is better than the assumed "perfect" 20/20 vision that the conclusions in the articles are based on (i.e. biophysics and scientific measurements)? My understanding was that they test the human eye's best-case resolution and draw the conclusions from that.

Regs, Jarkko

Jarkko, you are exactly correct. The article is based on biophysics and scientific measurement.

Which, in jfanning's own words, has nothing to back up those numbers except the author's ideas (opinion), that's all.

But even when you point out it's not an opinion and is all based on biophysics and scientific measurement, he'll argue for a full page that he is right and the chart is just opinion and not fact.

My final response. Maybe this time you will see what I quote in bold again.

You can bold all you like, but the charts you posted didn't contain any data to back them up; the article may have, but the charts didn't, not even a reference.

Quote:

Originally Posted by Andysol

Are we on the same page yet?

That you can't accept another opinion? That you can't accept the possibility that you are missing a point? Yes, we are on the same page then

Quote:

Originally Posted by Andysol

That's like saying we take the crash impact ratings of car seats and throw them out. "I never go 70mph, so those crash ratings mean nothing". In fact, there is nothing to back up those figures other than the car-seat rating person's ideas. See how dumb that sounds?

No, that comparison is totally different; it was about as relevant as you posting a user-edited wiki page as proof.

Quote:

Originally Posted by Andysol

You have to have a standard to measure by. That standard is 20/20. If you don't wear glasses, that's no one's fault but your own.

You've lost me; have we ever met? How do you know if I wear glasses or not? In fact, how do you know what my eyes are like at all?

Quote:

Originally Posted by Andysol

And lighting doesn't impact resolution, which is the only thing we are arguing. There's no debate that if you have a room with the sun on the screen it will affect the picture, but in terms of contrast, NOT resolution. 720p vs 1080p is a distance measurement only. Period, end of argument.

Actually, the article you posted talks about light being important in relation to sight. Did you actually read the document?

Quote:

Originally Posted by Andysol

Nope- just curious why you have a hard time following the English language. You either are foreign because you don't understand what "there is nothing to backup those figures" means, you have short term amnesia, or you just can't admit you said something that simply wasn't true as you were telling someone else what they said and quoted had no bearing.

Let's see, according to Wikipedia (since you like wikis as proof), there are 328 million native English speakers in the world, which accounts for around 5% of the world's population. Since 95% of the world speaks a non-English language as their native language, there is a high chance I am foreign, but your definition of foreign would have to be the strangest one I have heard.

Quote:

Originally Posted by Andysol

I never claimed anything. I said if you have 20/20 vision and a 1080p tv, that the chart is exactly right on how resolution is determined. It's a fact, proven by research and sciences. Not an opinion. If you are one of the millions and millions of people with either 20/20 vision or wear corrective lenses, the chart is 100% fact. Yours will always be an opinion on the difference you can see. So your argument is fact vs opinion.

Go back and read what you linked. I'm not an ophthalmologist; I can't challenge the work of Kalloniatis or Luu, or the document they wrote, and the chart may be 100% fact, but they haven't referenced it one bit, not even to this article, or any other.

Quote:

Originally Posted by Andysol

There is no chart for you, because you and your family won't go get glasses, or have such poor lighting conditions that you can't see any resolution due to your contrast being off. That's no one's fault but your own. But again, lighting conditions have no bearing on resolution. If you turned your brightness to 0 and made your screen black, you couldn't argue that your 1080p TV has poor resolution. The resolution would still be 1080p!

You make strange assumptions. I say you don't know anything about the environmental conditions in people's rooms, and you assume that everything is bad? Again, did you even read the article you linked to? I really don't think you did.

Now, since you like your chart so much, let's go back to the earlier posts, where this discussion started, and the chart you referenced.

I replied to a user who said you had to sit within 2x the diagonal measure of the screen to enjoy Full HD. They used 50" as an example, stating no further than 2.8m, but according to the chart you must sit 2.13m or closer to benefit. Which is right, the person who claimed I was wrong, or the chart?

And at the end of the day, the best place to sit is personal preference, as I said originally.


I'm not saying it couldn't be better but I still like living here more than any other country and apparently so does anyone else in the world who can figure out a way to live here. There is more to life than Internet speed. When the rest of the world downloads movies at lightning speed there is a good chance it is a US made movie and probably pirated since the US media doesn't actually provide much content outside of the states as far as I have heard or experienced in my travels abroad.

The US media doesn't import much content from outside the states either, which is probably warping your perspective.

In my experience of travelling, which is mainly in Western Europe, most other places you'd care to go have a rich balance of domestic and foreign produced content, including TV and movies from the states.

The US is a big country, and produces a lot of TV and movies, but it's far from a monopoly on good content. You also don't need to live there to enjoy it.

Since when are Thunderbolt or USB 3.0 wireless standards??? And most of you don't know your bits from your bytes.

Anyway, if anyone wants to know when Apple will support USB 3.0, I have the answer for you: now, if you have an ExpressCard slot in your MacBook Pro or a PCIe card for your Mac Pro; the rest of us will have to buy a new computer this April or May, when the new Intel Ivy Bridge CPUs are released. Intel will support USB 3.0 at the chipset level as part of Ivy Bridge.

As Mac users, we don't have to worry about Intel not including Thunderbolt at the chipset level. We use Macs. All future Macs will have Thunderbolt (10Gbps), USB 3.0 (5Gbps), SATA III (6Gbps) and hopefully 802.11ac WiFi. I love my 3x3 MIMO (450Mbps) MacBook Pro / Time Capsule setup, and yes, I'll buy the 802.11ac-based Time Capsule the day it's released!

I'm in the same camp, but since my ISP only runs at 8 Mbps, a faster AirPort Extreme really wouldn't help much. I'm getting 7.8 Mbps on my n-speed wifi devices, so an 802.11ac router would currently be overkill.

The 802.11x specs aren't about ISP speed; they're about device-to-device speed, where you could DEFINITELY use the extra speed.

Originally Posted by Sevenfeet
Let's say you have two Apple TVs in your house, both watching online streamed content in 1080p. Add to that someone doing the same on an iPad 3 and maybe a Macbook surfing the web at the same time and all of a sudden you have a pretty congested network. In order to provide consistent streams without frame dropping, you need something like 802.11ac.

For streaming, I'd think most homes would be limited by their internet speed first, though. You may get gigabit wifi link speeds, but what good is that when your provider feeds you 20 Mb/s? Now if you had localized content, say full 1080p video on your computers streaming to the ATVs, plus file transfers and whatnot, that speed would sure help.

While we're off topic, please allow me to correct some more inaccuracies: USB 2 is not 12Mbps; that's USB 1.1. USB 2.0 has a maximum theoretical speed of 480Mbps, and for the sake of completeness, we now have USB 3.0, which is 5Gbps (approx. a 10x speed increase over USB 2.0).

So, mstone, when someone gets speeds like 20Mbps or 30Mbps through their ISP, which is a high number, although becoming more common every day, it's still less than, say, 54Mbps a.k.a. 802.11g. Using your theory, most people would not even need 802.11n, right? Besides, when you try to explain 5Gbps vs 5GB/s, you haven't accounted for GiB/s (gibibytes per second); a lot of people write Gbps when they actually mean GB/s. Add to that the fact that you can even come close to gigabit WiFi with 3x3 MIMO tech on both the router and the client, using the 2.4 and 5GHz bands simultaneously at 450Mbps each for a combined 900Mbps, or 0.9Gbps.
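Since bits and bytes keep getting mixed up in this thread, here is a quick conversion of the figures mentioned above (all theoretical link maxima; divide by 8 to go from bits to bytes):

```python
# Bits vs bytes: divide by 8. All figures are theoretical maxima in Mb/s.
links_mbps = {
    "USB 1.1": 12,
    "802.11g": 54,
    "USB 2.0": 480,
    "3x3 802.11n, both bands (450 + 450)": 900,
    "USB 3.0": 5000,
    "Thunderbolt (per channel)": 10000,
}

for name, mbps in links_mbps.items():
    print(f"{name}: {mbps} Mb/s = {mbps / 8:.1f} MB/s")
```

So USB 2.0's 480 Mb/s is only 60 MB/s on paper, which is why real-world USB 2.0 disks top out around the 30 MB/s mentioned earlier once protocol overhead is counted.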