In the 1970s, as cable television took root in metropolitan areas, I was working for ABC-TV. It had just dropped a significant chunk of change to build a new transmitter facility on the Sears Tower in Chicago, replacing the old Marina Towers site.

Even then, there was talk of the time when cable would make over-the-air TV obsolete. We speculated as to how many remaining viewers it took to justify the rent, upkeep, electricity and salaries for the site we were working to complete. Meanwhile, my friends in radio engineering simply said, "I'll worry when they wire the beach for cable."

To many who report on media and analyze the stocks of media companies, wireless Internet in all its forms is looking a lot like they've finally wired the beach — and this beach has video, audio, games, text, phone calls and more.

There are dire predictions that radio's best days have come and gone. Who can blame the pundit who sees only a simple consumer choice between listening to what some radio program director predicts that I (and 20,000 other people) want to hear, and choosing for myself exactly what I want when I want it?

As if to underscore this inescapable radio-is-obsolete reality, the radio sector's digital revenues are actually growing while overall sales results shrink.

Frank McCoy

Should we all be concerned that the days of the 1,000-foot tower are gone and that anyone with a computer and an Internet connection is a possible new competitor? Will radio as we know it become just another feature of cell phones? Will in-car Internet give commuters millions of station choices?

The answer is no.

Highway narrows ahead

The problem for such platforms isn't consumer demand. It's bandwidth.

Radio works efficiently by delivering the same content to all listeners at the same time. Each additional listener adds no incremental overhead on the transmission side.

If I turn on my radio, the station I'm listening to doesn't have to add even one extra watt to accommodate me. Not so with our current scheme of Internet protocol (IP) delivery. On the Internet, every consumer of content requires a separate connection to the provider. It's a bit like a freeway where no two cars can occupy the same lane. Fifty morning commuters on their way to work require a 50-lane highway to get there.
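The contrast can be sketched in a few lines. This is an illustrative model, not any particular protocol: broadcast cost is flat no matter the audience, while unicast IP cost grows linearly with it.

```python
# Illustrative sketch: transmission-side bandwidth of broadcast vs. unicast
# (IP) delivery. The 48 kbps figure is the stream rate used later in the piece.

STREAM_KBPS = 48

def broadcast_bandwidth(listeners: int) -> int:
    # One over-the-air signal serves everyone; the cost is flat.
    return STREAM_KBPS if listeners > 0 else 0

def unicast_bandwidth(listeners: int) -> int:
    # Every listener needs a private "lane"; the cost grows linearly.
    return STREAM_KBPS * listeners

for n in (1, 50, 2_000):
    print(n, broadcast_bandwidth(n), unicast_bandwidth(n))
```

Fifty commuters on the broadcast "road" still need only 48 kbps; on the unicast road they need 2,400 kbps, one lane apiece.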

It turns out that our information superhighway — the one that is going to eat radio's lunch — is really a very narrow road.

To see what I mean, let's look at the technologies that now define streaming audio across the Internet.

Perceptual coding data reduction has advanced remarkably in the past two decades but the rate of progress has flattened considerably. There's not much more magic smoke in that pipe.

Right now a 48 kilobit per second stream (Fraunhofer AAC+, for example) can deliver acceptable fidelity stereo audio. Further dramatic improvements, even a reduction to 24 kbps, seem unlikely without sacrificing audio quality to a degree that would inhibit listener adoption. We'll use 48 kbps as our benchmark.

Now, let's say you run the IT department for a company that employs 2,000 people at 60 separate sites around the country.

Presuming everyone is listening to Internet radio at 48 kbps, that's an aggregate bandwidth of 96 Mbps, or 1.6 Mbps per site, more than a T-1 (1.544 Mbps) at each, just for employee entertainment.
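The office arithmetic checks out on the back of an envelope:

```python
# Office example: 2,000 employees at 60 sites, each streaming at 48 kbps.
employees = 2_000
sites = 60
stream_kbps = 48
T1_MBPS = 1.544  # nominal T-1 line rate

aggregate_mbps = employees * stream_kbps / 1_000   # 96.0 Mbps total
per_site_mbps = aggregate_mbps / sites             # 1.6 Mbps per site
print(aggregate_mbps, per_site_mbps, per_site_mbps > T1_MBPS)
```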

This has already come up on the corporate radar, with Cisco, Juniper and others offering products to analyze data use by employees. A neighbor of mine is responsible for the agent services network for a major insurance company. His biggest bandwidth headache right now is YouTube. He's about to switch it off across all his networks. Distractions from work that cost the firm money? They're not long for this world. Businesses don't pay employees to watch funny video clips. Bring a radio to work if you want music.

In-car

But what about the holy grail of radio, the automobile?

We've all seen Bluetooth handsets that pair with car audio systems. These provide reliable audio streaming to car audiences. Ford and Microsoft teamed up to create Sync — essentially a PC in your dashboard. It streams nicely, too. Chrysler and others are offering a Wi-Fi (802.11x) access point that covers your car interior. All these require data connectivity from a cellular provider, offered as a flat-rate data plan (though these are rapidly disappearing) and available wherever the cellular provider has data coverage. In metropolitan areas this usually means everywhere.

From our earlier office example, we know 2,000 listeners will consume 96 Mbps of data. What about 25,000, which is about the AQH for a successful Chicago FM station? Or for the top 10 Chicago stations? The arithmetic is pretty simple: 250,000 listeners will consume 12 Gbps.
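Scaling the same arithmetic up to the top 10 stations:

```python
# Top 10 Chicago stations at a 25,000 AQH apiece, all streaming at 48 kbps.
aqh_per_station = 25_000
stations = 10
stream_kbps = 48

listeners = aqh_per_station * stations        # 250,000 simultaneous listeners
gbps = listeners * stream_kbps / 1_000_000    # 12.0 Gbps aggregate
print(listeners, gbps)
```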

Shannon's Law sets an upper bound on how many bits can be stuffed into a given RF bandwidth. Technologically we're up against the practical limit, with peak performance of 3.7 bits per second per Hertz. But this is a "gross" number. Doppler errors, noise and all manner of other perils drive the reliable throughput down sharply for real-world mobile data delivery. Making delivery reliable means spending some of that capacity on redundancy.
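For the curious, Shannon's capacity formula is C = B log2(1 + SNR). Inverting it shows what signal-to-noise ratio the 3.7 b/s/Hz figure implies; the numbers below are illustrative, not a claim about any particular system:

```python
import math

# Shannon capacity: C = B * log2(1 + SNR), with SNR as a linear ratio.
def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# SNR needed to hit 3.7 bits/second/Hertz: about 12:1, or roughly 10.8 dB.
snr_needed = 2 ** 3.7 - 1
snr_db = 10 * math.log10(snr_needed)
print(round(snr_needed, 1), round(snr_db, 1))
```

A mobile receiver in a noisy urban canyon rarely enjoys that kind of SNR across a whole cell, which is why the net numbers fall so far short of the gross ones.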

As an example, HD Radio uses a set of OFDM subcarriers in about 200 kHz of total sideband space. According to the Shannon rule this should yield almost 750 kbps but in reality it doesn't.

First, the OFDM carriers are duplicated in the upper and lower sidebands. This forfeits half the throughput but adds significantly to mobile reliability by largely overcoming multipath. After data duplication, framing and other overhead have taken their bites, we're down to a net capacity of less than 200 kbps. These same basic ratios apply to WiMax, 3G, 4G, NG, OhG, GWhiz (OK, I made up those last three) and all the rest where streaming is involved.
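A rough budget reproduces the HD Radio numbers above; the 50 percent framing-and-overhead share is my assumption, chosen to land at the article's sub-200 kbps net figure:

```python
# HD Radio throughput budget, matching the figures in the text.
sideband_hz = 200_000        # total OFDM sideband space
bits_per_hz = 3.7            # practical peak spectral efficiency

gross_kbps = sideband_hz * bits_per_hz / 1_000   # ~740 kbps ("almost 750")
after_dup_kbps = gross_kbps / 2                  # sidebands carry duplicate data
overhead_share = 0.5                             # assumed framing/overhead bite
net_kbps = after_dup_kbps * (1 - overhead_share) # lands under 200 kbps
print(gross_kbps, after_dup_kbps, net_kbps)
```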


It's worth noting that the Internet also requires duplex communication, something that HD Radio doesn't have to waste spectrum on. Duplex allows for re-sends, though, so maybe the statistical advantage swings that way slightly. Some forward delivery robustness can be sacrificed if you can try again as needed. But those re-sends consume bandwidth, too. And the tall-tower world of FM requires no intercell handoffs.

Case study

A geography example will help to illustrate the dilemma.

In Chicago, the Kennedy Expressway is 17.8 miles long, running northwest and southeast from the city center to O'Hare Airport. From eight to 20 lanes wide, the Kennedy sees peak daily commuter traffic of 387,000 vehicles.

Let's also assume that half that traffic occurs during the morning and evening rushes, that each rush lasts about three hours and that average transit times during rush hour are 40 minutes. Let's further assume three-quarters of the cars are streaming during their commute. That works out to 16,125 cars to serve, spread over a 17.8-mile linear distance. The throughput must be 774 Mbps net of all transmission and connection management overhead, which means we'll need about a Gigabit per second gross.
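Reconstructing those figures step by step; the three-hour rush length is an assumption, chosen because it reproduces the 16,125-car figure:

```python
# Kennedy Expressway example, derived from the stated assumptions.
daily_vehicles = 387_000
rush_share = 0.5               # half the traffic rides the two rushes
rushes = 2
rush_hours = 3                 # assumed length of each rush period
transit_hours = 40 / 60        # 40-minute average transit
streaming_share = 0.75         # three-quarters of cars streaming
stream_kbps = 48

per_rush = daily_vehicles * rush_share / rushes      # 96,750 vehicles per rush
on_road = per_rush / rush_hours * transit_hours      # ~21,500 cars at once
streaming_cars = on_road * streaming_share           # 16,125 streams
net_mbps = streaming_cars * stream_kbps / 1_000      # 774 Mbps net
print(round(streaming_cars), round(net_mbps))
```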

Let's also assume that we have access to the entire spectrum in the former UHF TV Channels 51–73 that the feds have recently auctioned off, a total of 138 MHz.

Using Shannon's Law and the approximate bandwidth-to-data-throughput ratio we see for HDFM, we can deliver about 136 Mbps. Logically, we'd reuse frequencies along the cellular model, and our delivery capacity into any given cell is 1/4 that amount, or about 34 Mbps. To cover the 17.8 miles and deliver our Gigabit per second from cells with 34 Mbps capacity apiece requires 30 cell sites, or one about every half mile. This doesn't seem so daunting an infrastructure challenge.
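The cell-count arithmetic, under the article's assumptions (138 MHz of spectrum, the HD Radio net efficiency of roughly 1 b/s/Hz, four-way frequency reuse):

```python
import math

# Cell budget for the assumed 138 MHz of vacated UHF spectrum.
spectrum_mhz = 138
net_bits_per_hz = 0.985        # ~136 Mbps over 138 MHz, the HDFM ratio
reuse_factor = 4               # cellular-style frequency reuse

total_mbps = spectrum_mhz * net_bits_per_hz            # ~136 Mbps
per_cell_mbps = total_mbps / reuse_factor              # ~34 Mbps per cell
gross_target_mbps = 1_000                              # ~1 Gbps for the corridor
cells = math.ceil(gross_target_mbps / per_cell_mbps)   # 30 cells
spacing_miles = 17.8 / cells                           # about one per half mile
print(round(per_cell_mbps), cells, round(spacing_miles, 2))
```

Doubling that for realistic, bunched-up traffic, as the next paragraphs argue, halves the spacing again.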

But this model assumes our cars are uniformly distributed on the expressway. Anyone who has ever been stuck in a gaper's block jam-up knows that isn't a realistic assumption. Inbound and outbound traffic concentrates at the ends of the expressway.

Most probably the capacity would need to be doubled, at minimum, for reliable throughput. For the cellular model that means twice as many cell sites. That's a cell site every 1,500 feet or so. Otherwise our commuters will experience buffering, dropouts, etc. and they'll just turn on the radio. Back-of-the-envelope, this probably represents about a fivefold or more increase in the number of cell sites in most metropolitan areas.

And the earlier assumption that we'd have access to the entire Channel 51–73 vacated spectrum is unrealistic, too. With multiple carriers all controlling discrete segments of spectrum, the uniformity of customer penetration across multiple cellular carriers will play into the reliability issue, too. If an outdoor ad on the Kennedy causes a thousand commuters to buy an iPhone, those subscribers may discover that AT&T's cell infrastructure is suddenly insufficient. Google Maps and YouTube already are taxing the capabilities of existing data plans and delivery infrastructure. Unlimited data plans are disappearing quickly from the market in response to the stresses.

In short, the greatest appeal of Internet radio remains its most fundamental problem: the requirement that every user have a separate, custom-per-user data stream.

As long as this requirement remains, over-the-air radio need not be concerned about meaningful encroachment from the Internet. The only practical solution is for our data consumers to share content — essentially a multicast broadcast where many users capture the same transmitted packets at the same time. But this means they'll listen to the same thing at the same time, which is what radio already does with much greater reliability and at far lower cost to the user.

Eventually the press-release-driven media may figure this out. So far they've been spoon-fed demos using one smartphone streaming one source. Hey, why not hold our own demonstration at the next NAB Show? Have everybody stream their own station during lunch, then invite the press to check out the results. I predict a wireless data train wreck.

The author is former president of engineering for American Media Services, a radio brokerage and developmental engineering firm. His email is fmc@ieee.org.

Pandora/Slacker seamless? Try driving around DC with an iPhone! Some days seamless, next day hell on wheels!
Bandwidth is finite. Although PSS will help a lot, how much will the carriers charge to allow carriage on PSS!!

By streamer.
on 9/4/2009

MBMS (Multimedia Broadcast Multicast Service) and PSS (adaptive streaming within the 3GPP Packet Switched Streaming Service) are standards and solutions for distributing high-volume audio and video to mobile terminals.
http://en.wikipedia.org/wiki/Multimedia_Broadcast_Multicast_Service
http://alturl.com/26i3 (Adaptive Streaming)
MBMS uses multicast distribution in the core network instead of point-to-point links for each end device. The MBMS broadcast capability makes it possible to reach an unlimited number of users with constant network load; it also makes it possible to broadcast information, such as emergency alerts, to many cellular subscribers simultaneously.
With adaptive streaming, the station's server can switch to a lower audio bit rate during temporary drops in link rate, then step the bit rate back up toward 48-128 kbps. In tests, continuous playout of audio and video was possible and overall performance was significantly better than without adaptive streaming.
PSS is supported by all major mobile telecommunication equipment providers as well as streaming equipment providers. The deployment of PSS-based streaming solutions in 2.5G and 3G networks is currently under way.
These technologies make mobile broadcast relatively inexpensive compared to non-mobile broadcast technologies.
