Extending WiFi to one mile, thanks to empty TV channels

Rice University grad student Ryan Guerra is on a mission to extend the range of WiFi signals from a few hundred feet to a mile—and beyond. This month, he succeeded, thanks to some nifty engineering and a few empty TV channels.

The first beneficiary of his work is Houston resident Leticia Aguirre, 48, who lives at the very edge of a local free WiFi network run by Technology for All. The high frequencies (2.4 GHz and 5 GHz) used by WiFi mean that signals don't easily penetrate the tree branches and leaves that surround Aguirre's home.

On the "deadest, stillest day of winter" the signal might be reliable, Guerra told me, but most of the year, it has been frustratingly intermittent. Even though the WiFi connection was free and Aguirre can't afford DSL or cable, her experience was so poor she considered canceling the service.

Guerra decided to use Aguirre's home as the first location for new "Super WiFi" test gear being developed and tested at Rice. Instead of relying on traditional WiFi frequencies, the Super WiFi project downshifts the signal into an empty TV channel—in this case, channel 29, which has the additional virtue of having empty adjacent channels, as well.

Guerra was put in charge of creating the Super WiFi gear for Aguirre's home, which was one mile from the Technology for All transmission tower. He began with an off-the-shelf 2.4GHz WiFi card on a computer running Linux. The card's output is piped through a frequency translator prototype from Alcatel Lucent, which shifts the signal down to Channel 29's 563MHz—far better for plowing through trees and walls.
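For reference, that 563 MHz figure falls straight out of standard US TV channelization: UHF channels 14-69 occupy consecutive 6 MHz slots starting at 470 MHz, so channel 29 spans 560-566 MHz with a 563 MHz center. A minimal sketch of the mapping (the formula is standard FCC channelization, not something described by the project):

```python
# US UHF TV channels 14-69 occupy consecutive 6 MHz slots starting at 470 MHz.
def uhf_channel_edges(ch):
    """Return (lower, center, upper) frequencies in MHz for a UHF TV channel."""
    if not 14 <= ch <= 69:
        raise ValueError("UHF TV channels run from 14 to 69")
    lower = 470 + (ch - 14) * 6
    return lower, lower + 3, lower + 6

print(uhf_channel_edges(29))  # -> (560, 563, 566): the 563 MHz center used here
```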

Guerra installing the Super WiFi antenna

While WiFi channels at 2.4GHz are 20MHz wide, TV channels only have 6MHz of bandwidth, so this setup also has to squeeze the incoming signal down to 5MHz of bandwidth in order to stay safely within channel boundaries (in the future, techniques like channel bonding should increase the available bandwidth). Output from the translator then goes to a small TV antenna on Aguirre's home, where it's sent to the local Pecan Park transmission tower and patched into the fiber backbone connection there.

The setup "actually works very well," says Guerra. The longest link he could make with existing point-to-point WiFi connections was 400-500 feet; with the new Super WiFi gear in the TV band, he can reach a mile—and it's not a point-to-point signal. Instead, the transmitter serves up a directional 60 degree beam, and anyone in its path can receive broadband service.

The wider signal beam also has a side benefit: smaller antennas. Aguirre used to require an antenna mounted on a 30 foot pole in order to get a line-of-sight, point-to-point connection; the newer setup doesn't need this kind of tight alignment, and so Aguirre now uses a much more discreet TV antenna.

Results so far have been good. Rice has worked with mesh WiFi networks for years, but researchers have noted that the quality of channels varies dramatically over time. The new connection has been quite stable, even with leafy trees and through bad weather, and has yet to show any interference or connection problems, even at a mile from the transmitter.

From left to right: UHF-band filter, frequency translator, PC motherboard and WiFi card, power supply

The current approach has some bandwidth limitations. Because it runs existing WiFi protocols over only 25 percent of WiFi's usual bandwidth, Guerra's setup will never get more than 25 percent of WiFi's maximum throughput—and indeed, that's what he's seeing.
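A back-of-envelope check of that 25 percent figure, assuming an 802.11g-style 54 Mbps peak rate (the article doesn't name the PHY, so the 54 Mbps number is an assumption):

```python
# OFDM throughput scales roughly with channel width, so a 5 MHz channel
# carries about a quarter of what a 20 MHz WiFi channel can.
wifi_channel_mhz = 20.0
tv_channel_mhz = 5.0      # squeezed down to fit inside a 6 MHz TV channel
peak_rate_mbps = 54.0     # assumed 802.11g peak PHY rate

fraction = tv_channel_mhz / wifi_channel_mhz
print(f"{fraction:.0%} of the bandwidth -> at most {peak_rate_mbps * fraction:.1f} Mbps")
# -> 25% of the bandwidth -> at most 13.5 Mbps
```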

The greater range of Super WiFi gear means that the current setup will also run into problems as it scales up. WiFi uses a Carrier Sense Multiple Access (CSMA) approach to sending data, which means that WiFi transmitters try to detect other nearby transmitters and wait for them to fall silent before sending data of their own. This works pretty well when only a few devices coexist, but it can become chaotic when hundreds of nodes are involved.
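The scaling problem can be illustrated with a toy model (a simplified slotted random-access model, not the actual 802.11 DCF): if each of n stations transmits in a given slot with probability p, the slot carries useful data only when exactly one station transmits.

```python
# Toy slotted random-access model: a slot succeeds only when exactly one
# of n stations transmits. Success probability is n * p * (1 - p)**(n - 1),
# which peaks at p = 1/n and tends toward 1/e (~37%) as n grows.
def slot_success(n, p):
    return n * p * (1 - p) ** (n - 1)

for n in (2, 10, 100):
    print(n, round(slot_success(n, 1 / n), 3))
# -> 2 0.5
#    10 0.387
#    100 0.37
```

Real CSMA/CA does better than this worst case thanks to carrier sensing and backoff, but the qualitative trend—declining per-slot efficiency as contenders multiply—is the same.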

As Guerra rolls out his Super WiFi solution to other Pecan Park residents, he anticipates running into CSMA congestion, which will lower the efficiency of the network. Eventually, whole new protocols may be required, probably relying on scheduled access mechanisms in which transmitters are assigned specific time slices wherein they can transmit without having to detect other signals (cell networks often work this way, and companies like Microsoft are already researching new protocols for use in the TV white spaces).

But the work has shown that empty TV channels will be a huge boon for broadband. Urban areas will likely rely on wireline networks, but Guerra sees terrific potential for people like his rural relatives, who need better last-mile connections. If Super WiFi can easily reach a mile in urban conditions, it's likely to go further in rural locations.

Guerra's work doesn't just involve time at the lab bench or with computer models, which has made for an unusual grad school experience. "I didn't realize how unique it was until I started going out in the field and installing equipment," he says; he would return from each trip to find other grad students looking up in jealousy from their computers.

How does this compare to WiMAX? My limited understanding of it was that it allowed you to build a network of relatively inexpensive stations to blanket a large area with a (relatively) high speed wireless connection.

Great idea, but mickey-mousing around with 5 MHz bandwidths serving customers over a 1-mile radius would get old real fast in urban areas. Rural, you bet.

A simple, dirt-cheap solution: a fiber-to-the-block network could easily give most of us 1 Gbps broadband access for a few dollars a month.

The cost of a fiber-to-the-block network would be less than $20 a household/business to the block-level wired/wireless-N access point, plus subscriber connect costs: $100 for Cat 6 copper to most subscribers (and fiber to the rare more distant ones), $50 for phoneline/powerline, $50 for a WiFi mesh repeater, or zilch to use the customer's own WiFi card. A buck or two a month would suffice for O&M.

The FCC now recognizes that low-speed smart meters would be a component of the broadband network they envision. The small incremental cost of a high-speed network over the low-speed smart-meter net the power companies are planning would pay for the broadband network for an extra few dollars a month per subscriber.

As Time Warner brags in their annual report, Big Telecom makes 3000% profits on broadband with their ancient, antiquated equipment. They could cut their fees to 3% of the current level—$1 a month for ADSL—and still make money. Lots of room for a nonprofit to provide service for a few bucks a month.

Keeping that in mind, and as an alternative, we could pass legislation—state, federal, or muni—requiring Big Telecom, as a condition of license, to install bulk-purchased $200-a-unit outdoor dual-band wireless units on every street block in every neighborhood in the USA, served by the top-of-the-line $50-a-month, average-30-Mbps, highest-speed Internet service Big Telecom offers. Each unit would supply roughly 50 households.

There are 110 million households in the USA, so the total cost would be about $500M—less than Big Telecom spends annually buying booze for compliant FCC members, "journalists," and politicians. $500M financed at 10%, plus $50 a month for the network, at 50% broadband penetration works out to about two bucks a month per subscriber.
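Taking the comment's own inputs at face value (110M households, 50 homes per $200 unit, 10% annual financing, $50/month of service per unit, 50% penetration), the arithmetic roughly checks out; the hardware alone comes to about $440M:

```python
households = 110e6
per_unit = 200.0          # bulk hardware cost per wireless unit
homes_per_unit = 50

units = households / homes_per_unit
capex = units * per_unit
print(f"{units:,.0f} units, ${capex / 1e6:.0f}M capex")
# -> 2,200,000 units, $440M capex

# Monthly cost per subscriber at 50% penetration:
subscribers = households * 0.5
interest = capex * 0.10 / 12    # 10% annual financing, per month
backhaul = units * 50.0         # $50/month service per unit
print(f"${(interest + backhaul) / subscribers:.2f}/month per subscriber")
# -> $2.07/month per subscriber
```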

95% of the population centres of the country would be wirelessly connected, with the rest covered by current 3G/4G Big Telecom offerings or white-space tech.

Only corrupt politicians stand between the people and dirt cheap universal broadband access. Give them a call and ask why.

So basically we spend $1.8 million to see how far a PTMP WiFi-based 700MHz signal will go. I ran the same test in 900MHz about 3 weeks ago, used a 2x2 MIMO which has twice the throughput, and it cost a whopping $800 and 4 hours of my time. Good to see that junior is getting an education on my taxpayer dime that could have easily been duplicated by hanging out with a WISP installer for a couple of days. I said this test was a huge boondoggle when it was funded, and at least that has been confirmed. It was a huge waste of taxpayer funding and is typical of academia making stuff up to get government grants.

They're referring to the old analog broadcast channels, which were freed up due to the digital transition a few years ago. Those two channels definitely exist, but they are no longer being transmitted on the relevant frequencies.

There is no difference. Analog channel 29 is 560-566 MHz, and digital channel 29 is 560-566 MHz.

The analog channels that were freed up are 52-69, 698 MHz to 806 MHz, which are now auctioned off.

I'm struggling to understand how hooking a frequency converter to off the shelf equipment is even newsworthy, especially for such a phenomenally short link.

I regularly send point to point 5.8GHz networks 15-20 miles, and utilize 2.4GHz distribution networks for customer based links of up to 15 miles (even more in corner cases where the customer has a tower), so "a mile" registers as just short of nothing in my mind.

The 5MHz channel width is a standard feature in all Atheros based wifi cards (and most others, as well). I would guess the only reason that they didn't use a wider channel is because they were stuck on channel 29 for whatever reason, and couldn't overlap into channels 27 & 28 or 30 & 31. The channel size is a trivially configurable setting.

Looks like they've used a standard issue WRAP or Soekris single board computer with an Engenius 8603 (which is a terrible card, signal quality wise).

They probably could have saved quite a bit of money by using a Mikrotik RB411 ($49) with a CM9 or R52 radio ($39 either way, vs the $59 8603).

Real expectations from those of us in the WISP industry (who are in dire need of these frequencies, given how ridiculously expensive licensed spectrum is) are that we should have 2.5-3.5 bits per hertz efficiency by the end of 2012 with ranges of something like 15 to 25 miles, depending on what the max antenna height and HAAT rules end up being. Right now they are a little low, but several industry groups are trying to get them raised up a bit so that the range will be better.
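For scale, the projected spectral efficiency over a single 6 MHz TV channel translates to raw throughput as follows (simple multiplication of the commenter's own figures):

```python
channel_hz = 6e6  # one TV channel
for eff in (2.5, 3.5):  # projected spectral efficiency, bits/s per Hz
    print(f"{eff} b/s/Hz -> {eff * channel_hz / 1e6:.0f} Mbps per 6 MHz channel")
# -> 2.5 b/s/Hz -> 15 Mbps per 6 MHz channel
#    3.5 b/s/Hz -> 21 Mbps per 6 MHz channel
```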

There is also the very real, very existent WiDOX which is simply DOCSIS mapped to 6MHz wireless channels in this sort of band. Mostly deployed in Lower 700MHz areas and produced by Arris. It exists, and you can buy it today - albeit not in white spaces bands just yet.

I'm quite looking forward to it, as most of my service area has 100 to 200MHz of available TV white space!

Actually, ALL analog TV channels were freed up in the sense that analog TV broadcasts are no longer allowed. The 700 MHz spectrum was auctioned off for commercial use, while 54-698 MHz is available for several other potential uses including "white space" devices. Isn't that what this article was all about?

By that definition, that's true. But that's not what TK said; he was implying that channel 29 no longer uses 560-566 MHz, which is not the case. KUGB-CD is definitely on 28 (554-560 MHz) and KCVH-LD is definitely on 30 (566-572 MHz).

While this experiment proves well enough that WiFi extends over a mile at those frequencies, the downside is licensing for everyday consumer use of said frequencies—and knowing the spectrum auctions, how many of those are going to be Big ISPs squatting on the spectrum? Were spectrum squatters addressed?

I guess there's no official cut off point, but it's definitely a long way from omni directional. It's all semantics, but usually when you have to aim the antennas, you call it point to point.

It's more than point to point. Directional? Yes. But not point to point. A 60 degree beam over a mile can cover a fair area. I forgot the trigonometry to figure this out (shame, I know...) but that would cover lots of households. Plus you can probably have several beams "fan out" to cover larger areas.

Trig not required. Simple arithmetic.

Area of a circle is pi * r^2 (as in "pie are squared"). There are 360 degrees in a circle, so a 60 degree beam is a pie-shaped one-sixth of a circle. Given a 1 mile radius, we get 0.524 square miles or 335 acres. With an urban single-housing density of 6 houses to the acre (6000 sq ft lots), that is 2010 houses (this is so last year), so obviously this approach to internet connections is for very rural areas.

As for what the beam width should be, consider the following. Narrow the beam of the hot spot and increase the number of sectors (pie shapes). Use two white channels with overlapping sectors and a dead band between sectors on each white channel. So, for example, a 60 degree beam width, five sectors, with 12 degree dead bands, on two frequencies, would give 10 beams in a 3.14 square mile area (2010 acres).

For those challenged by, or simply beyond, the use of miles and pieds: 1 mile = 1609.344 m, 1 sq mile ~ 259 hectares, 0.524 sq mi ~ 135.6 hectares, 3.14 sq mi ~ 813.7 hectares (assuming I didn't lose a digit somewhere).
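A quick script confirms the sector arithmetic (same 6-houses-per-acre assumption as the comment; the exact house count is 2011 before the comment's intermediate rounding gives 2010):

```python
import math

radius_mi = 1.0
beam_deg = 60.0
sector_sq_mi = math.pi * radius_mi ** 2 * beam_deg / 360.0  # one-sixth of a circle
acres = sector_sq_mi * 640        # 640 acres per square mile
houses = acres * 6                # 6 houses per acre (6000 sq ft lots)
print(f"{sector_sq_mi:.3f} sq mi, {acres:.0f} acres, ~{houses:.0f} houses")
# -> 0.524 sq mi, 335 acres, ~2011 houses
```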

I guess there's no official cut off point.... It's all semantics, but usually when you have to aim the antennas, you call it point to point.

As far as the FCC is concerned, it's not a matter of semantics; it's very clear. FCC 15.247 defines the difference between an EIRP-limited PtMP emitter and a more liberal EIRP PtP emitter. I suspect at some point "mesh" will enter the picture and all the mesh node transmitters will be neutered to the PtMP limitation.

Back in the day we built home-brew downconverters (2150/6 MHz to channel 7) to watch free movies. Trust me, 2 GHz will go to the moon and back; it's all in the line-of-sight issues, power, and noise levels. The modulation index limits of the lower channels are going to keep it low... slow.

Mate, I'd sooner see money spent on that than on bogus bullshit university papers, such as one person getting a grant to research the 'bogan subculture' ( http://en.wikipedia.org/wiki/Bogan ). BTW, it's only a waste if the government doesn't run with it and decides instead to shelve it.

Yeah, we should just close all the schools. Everyone can go to ITT Tech instead. We've probably invented everything we need already, all we need now is technicians to buy stuff off the shelf and do normal things with it.

Deet, I think you missed Rory's point. If what that guy did was a truly unique and innovative project, then it was worth it. But all it did was confirm that 700 MHz has better tree penetration than 802.11b/g/a, which is not a game-stopping discovery, especially as 802.11b/g/a is a dead-end technology compared to emerging 2- and 3-chain 802.11n MIMO.

From a product development standpoint, there are already manufacturers working on software-defined radio gear that could be in the commercial supply chain by the fourth quarter, while that guy is still languishing in the proof-of-concept stage.

15.4.6.8 Slot time: The slot time for the DSSS PHY shall be the sum of the RX-to-TX turnaround time (5 µs) and the energy detect time (15 µs, specified in 15.4.8.4). The propagation delay shall be regarded as being included in the energy detect time.

If you read the original grant request, there were all sorts of concepts bandied about. In reality, and I pointed this out on the website, all they were doing was testing propagation. They spent more money making stuff up and writing the grant than it cost me to do an actual field test. Extrapolating attenuation between 900MHz and 700MHz isn't really rocket science and can be done by any Algebra 1 student who understands logarithms. Of course, wrap it in academia-speak and give them $1.8 million, and you've just duplicated, the hardest way possible, what I train in a couple of days. This grant was garbage when it was written, and the results are the typical taxpayer waste of money. I know companies that are already way ahead of this stupid grant in terms of design and field testing. There are hundreds of WISP and wireless guys out there who could have built and tested a 700MHz AP in 8 hours or less. All the equipment is already available off the shelf. This kid basically got the taxpayers to pay for his education in designing a board that is literally already done by the chipset manufacturers when they produce the chip. Totally ridiculous. I applaud the professor for finding a way to save his job by stealing taxpayers' money so he can tell the university he has value. They probably took 50% right off the top. Tell you what, pay me $5K and I'll give you far better data on 700MHz propagation than the university gets out of making a simple PTP link work. Geez.

CSMA systems since coaxial Ethernet have used time slots for backoff, and CSMA/CA systems like Wi-Fi use them for prioritization and collision avoidance, WHT. Coax Ethernet was known as a "slotted Aloha" system.