
13 thoughts on “Is there a theoretical limit to wireless bandwidth?”

It’s a good question. I might have been able to answer that right out of college, but that was a while ago now. In the meantime, it does seem like engineers keep figuring out more and more clever ways to jam more information into the same bandwidth. Perhaps quantum physics will somehow, someday render this a moot question?

High-def video is not a problem at all. Yes, video is the most bandwidth-intensive medium, but an H.264 / MPEG-4 stream easily compresses to around 5 Mb/s with minimal “loss”. That bandwidth is child’s play for a single video stream being transmitted to CS / handheld devices. As an example, 802.11g (yesterday’s technology) can in theory transmit 54 Mb/s (in practice closer to 20). 802.11n, which is mainstream today, can handle much, much more. And LTE / 4G will bring mobile broadband up to 50 Mb/s or more in practice.
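To make the arithmetic concrete, here is a quick back-of-the-envelope count of how many 5 Mb/s video streams fit into the practical throughputs quoted above (the 802.11n figure is an assumed round number, not a measurement):

```python
# Rough stream-count arithmetic; throughputs are the practical
# figures mentioned in the comment (802.11n value is an assumption).
STREAM_MBPS = 5  # typical H.264 / MPEG-4 high-def stream

links_mbps = {
    "802.11g (practical)": 20,
    "802.11n (practical, assumed)": 100,
    "LTE / 4G (practical)": 50,
}

for name, rate in links_mbps.items():
    print(f"{name}: {rate // STREAM_MBPS} simultaneous streams")
```

Even yesterday's Wi-Fi comfortably carries several such streams; the crunch comes from many users sharing one cell, not from a single stream.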

I get your analogy. Practically, congestion in the carrier network would break down before everyone could receive full video on their mobile, but networks aren’t engineered for full capacity anyway: try having everyone place a phone call at once. Your questions appear to be more oriented to spectrum. It’s really an issue of the available (licensable) spectrum, which goes all the way up to visible light; how many bits can be packed into the channel (Nyquist’s theorem), limited by noise; latency; how clever your engineering is; etc.
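The “how many bits can be packed into the channel” part has a classical noiseless answer, Nyquist’s formula C = 2·B·log2(M) for M distinguishable signal levels. A minimal sketch, with hypothetical numbers:

```python
import math

def nyquist_capacity(bandwidth_hz: float, levels: int) -> float:
    """Noiseless channel capacity: C = 2 * B * log2(M)."""
    return 2 * bandwidth_hz * math.log2(levels)

# Hypothetical example: a 20 MHz channel with 16 signal levels.
# 2 * 20e6 * log2(16) = 160e6 bits/s
print(nyquist_capacity(20e6, 16) / 1e6, "Mb/s")  # 160.0 Mb/s
```

In a real (noisy) channel, noise limits how many levels M you can reliably distinguish, which is where Shannon’s formula takes over.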

The theoretical limit is the Shannon capacity: C = B*log2(1 + S/N), where C is the information capacity, B the bandwidth, S the signal power, and N the noise power. Power dissipation and interference are the major limiting factors in wireless. Mesh, cognitive, and collaborative networking approaches, among others, are popular ways to push practical capacity toward the Shannon capacity.
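Plugging in some illustrative numbers (a 20 MHz channel at 30 dB SNR; both are hypothetical, not figures from the discussion) shows how the formula behaves:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = B * log2(1 + S/N), with S/N as a linear ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: 20 MHz of bandwidth at 30 dB SNR.
snr = 10 ** (30 / 10)  # 30 dB -> linear ratio of 1000
print(shannon_capacity(20e6, snr) / 1e6, "Mb/s")  # ~199 Mb/s
```

Note that capacity grows only logarithmically with SNR but linearly with bandwidth, which is why grabbing more spectrum beats shouting louder.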

Short answer is that we should be more worried about hardware processing speed than about the information capacity of the electromagnetic spectrum. In theory we could be communicating via gamma rays, which oscillate at 10^24 Hz. If we could modulate them at 1 bit per period, that’s 10^24 bits/s. Trouble is, in order to modulate at 10^24 bits/s you have to sample the waveform at twice that (the Nyquist rate), which means processing data at 2×10^24 samples per second. Even if Moore’s law holds, it will take us over 100 years to get there. Granted, the signal-to-noise ratio of gamma rays is very low because we’re bombarded by cosmic rays at that frequency, but even if it’s 1/1000, just change the above math to 10^21 and we’re still a long way away.
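The “over 100 years” estimate can be checked by counting Moore’s-law doublings. Assuming a ~3 GHz processing rate today and a doubling every 2 years (both round-number assumptions), the arithmetic looks like this:

```python
import math

# How long until hardware could sample at the Nyquist rate for gamma rays?
# Assumptions: ~3 GHz processing today, one Moore's-law doubling
# every 2 years. Both are illustrative round numbers.
target_hz = 2e24    # Nyquist rate: 2 x the 10^24 Hz carrier
current_hz = 3e9
doubling_years = 2

doublings = math.log2(target_hz / current_hz)
print(f"~{doublings * doubling_years:.0f} years")  # roughly a century
```

Even shaving three orders of magnitude off the target (the 10^21 case above) only removes about 10 doublings, i.e. about 20 years.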

hey chris, your question spurred me to do some thinking about next generation wireless and how it must evolve to fit the mobile world – both bandwidth but also how mobile apps are impacting usage and creating other issues… Next Gen Wireless – Will Twitter, FourSquare, et al. Spur Innovation? (or Weigh us Down) – let me know if you have any thoughts http://stevecheney.posterous.com/next-gen-wireless-will-twitter-foursquare-et