CircleID: White Space
Latest White Space related postings on CircleID.

Wireless Innovations for a Networked Society (August 2017)
Last week, I had the honor of moderating an engaging panel discussion at Mozilla on the need for community networks and the important role they can play in bridging the digital divide. The panel highlighted the success stories of some of the existing California-based initiatives that are actively working toward building solutions to get under-served communities online.

The goals of the Mozilla challenge are simple: how do you connect the unconnected and how do you connect people to essential resources when disaster strikes? The technical challenge can be approached in many different ways, but the crux of the problem lies in understanding and meaningfully addressing specific community needs. Ideally, by empowering individuals at the grassroots level, the dream is also to cultivate new voices that shape the future of the web and members who partake in the digital economy, reaping the benefits of a connected society.

Championing Connectivity

A key takeaway from the event was the recurring challenge most of these organizations have faced at some point: how to build a project that is sustainable in the long term. Often, despite projects being adequately funded and having a clear technical plan of action, a lack of local leadership and community engagement to carry them forward can bring an abrupt end once the initial deployment is completed. Local champions for connectivity, people who understand that digital dividends can change lives and who know the value of a connected society, are critical to developing a solution that serves the community and builds an empowered Internet citizenry.

As speaker Steve Huter recounted from his many years of experience at the Network Startup Resource Center, "Request driven models tend to evolve better and have more engagement from developers and beneficiaries working together. Often as technologists, we get excited about a particular technology and we have the solution in mind; but it is really important to make sure that we are addressing the needs of the specific community and solving the right model."

Breaking Barriers

Important work driving this mission is already underway in the San Francisco Bay Area. On the Mozilla panel, experts from three initiatives furthering this cause discussed their work, but more importantly provided guidance to the attendees on how to approach these Challenges. Speakers represented on the panel included:

Marc Juul from People's Open – a community owned and operated peer-to-peer wireless network in Oakland

Thu Nguyen from Flowzo – a start-up trying to fix the issue of last mile connectivity through multi-stakeholder cooperation

Steve Huter from Network Startup Resource Center – a 25-year-old organization that has helped build Internet infrastructure in 100 countries around the world

Mozilla and the National Science Foundation are partnering to give away $2M in prizes for wireless solutions that help connect the unconnected, keep communities connected following major disasters, and make use of urban infrastructure. Got an idea? Apply here.

Written by Fatema Kothari, Vice-Chair of the San Francisco Bay Area ISOC Chapter

Policies to Promote Broadband Access in Developing Countries (September 2014)
At the 2014 IGF in Istanbul, Turkey, a workshop on policies to promote broadband access in developing countries, organised by Rui Zhong of ISOC China, made one thing clear: while technological solutions are advancing rapidly, policy and regulation remain a significant barrier to affordable Internet, especially in the developing world.

According to a report by the Alliance for Affordable Internet (A4AI), the key to affordability is the policy and regulatory environment that shapes the different actors in the market. Reforms that make markets more open, competitive and socially efficient are often the best and quickest way to drive prices down and increase broadband use.

Kenya launched its broadband strategy in 2013. The bold document projected rolling out Internet to all schools and hospitals by 2017, and increasing broadband speeds to 2 Gb/s in urban areas and 500 Mb/s in rural areas by 2030. Several policies for reducing the cost of broadband and increasing access have been suggested over the years:

1. Proper use and monitoring of Universal Service Funds (USF). The USF should be used not only to fund infrastructure investment in under-served regions, but also to promote digital literacy, which ensures the community knows the value of the Internet and how it can improve their lives. Better management of the USF, community involvement in how the funds are used, and political goodwill are all necessary ingredients for successful implementation. In Kenya, the USF was created under an act of parliament in 2009, but the USF management board was only inaugurated in July 2014; operators are required to pay up to 0.5 per cent of their annual turnover into the USF kitty. This snail's pace of implementation has delayed, even deferred, the dream of broadband access for under-served regions. A transparent and consultative process, incorporating stakeholder inputs and priorities, is a must for the success of the USF. Currently, service providers are protesting because they were not included in the management of the funds; Internet end-user representatives such as the Internet Society should also be part of the team that advises on how the USF is used.

2. Reduced luxury taxes on infrastructure equipment, end-user devices, and services, especially in under-served regions. Those living below the UN poverty line of $2/day have other priorities, such as food and health; for populations with low per capita income, the cost of broadband can be as high as 90% of income.

3. Sharing of resources (masts, fiber cable, etc.) by service providers. The end user will foot the bill if every provider competes to lay infrastructure that sits at less than 10% utilization overall. Service providers should be allowed to use infrastructure set up with taxpayers' money, especially that of the legacy government-owned telecommunication monopolies that litter the landscape across Africa. An example is the National Optic Fibre Backbone Infrastructure (NOFBI) laid across Kenya by the government with a target reach of 80%; NOFBI is not fully utilized, and the private sector could make use of it instead of laying parallel infrastructure. In the broadband strategy, the government has pledged to extend NOFBI by an additional 30,000 km.

4. Foster innovation, such as the use of TV white space. License free community Internet service in the least-privileged areas. An example is setting up free mesh networks like those pioneered by X-Lab, and connecting communities to local community servers with open-access content: Wikipedia, OpenStreetMap, ownCloud storage, news, local agricultural content, free e-books, municipal portals, chat rooms, and a directory of all this content.

5. Streamlined processes for infrastructure deployment: efficient and effective access to public rights of way, coordinated with other infrastructure projects (e.g., fiber or ducts laid during road works).

6. Establish local and/or regional Internet exchange points (IXPs), and introduce data caching. The AXIS project in Africa, through a partnership of ISOC and the AU, has already set up about four IXPs and held training across the continent.

7. Energy. There cannot be access without affordable, sustainable electricity. Electricity per kilowatt-hour is very expensive in developing countries, and this cost is usually passed down to end users. Developing countries should seriously look for permanent solutions to their perennial energy problems; computer laboratories, for example, can be powered by solar and wind energy.

8. Content. Develop policies that support relevant local content that users will feel the need to consume. Most societies have solved the content problem to a great extent.

9. Data collection of key indicators, to measure the effectiveness of the strategies implemented.

and FINALLY

10. Move from talking to acting: develop concrete policies, better regulation and monitoring. All of this is possible through collaboration and improved relationships between business, governments and local communities.

No Free Super WiFi, But the US Still Needs Improved WiFi Coverage (February 2013)
The FCC has long battled for a more efficient deployment of unused spectrum, endeavouring to adapt rules governing 'white space' TV spectrum (largely gifted to broadcasters years ago, and generally in the 700MHz band) to newly released spectrum (in the 600MHz band). This will considerably improve wireless broadband coverage where it is needed most — predominantly in rural areas of the country which can benefit from the propagation characteristics of lower-band spectrum, but also in municipalities which struggle against the commercial interests of a small number of telcos and broadcasters.

Certainly, in the US's disjointed broadband sector there are considerable challenges ahead: telcos and broadcasters have lobbying clout to influence policy makers, and have weighed in against unlicensed spectrum. Any progress in the FCC's proposals could be many years away. Regulatory amendments and new legislation will take time, while technologies which can fully exploit WiFi's potential in these bands are either nascent or not yet developed.

Greater consumer use of wireless broadband in recent years has heightened the call for wider spectrum availability. An increasing number of connections in public hotspots are via handheld devices (smartphones, tablets etc), whereas only a few years ago access was almost exclusively via laptops.

In common with this growing demand, municipalities are increasingly promoting broadband as an essential service along the lines of infrastructure such as water and power, which are now taken for granted. Although there are numerous cities with WiFi networks accessible to the public, citywide or near citywide coverage tends to be restricted to government use, such as for public safety and to improve efficiency within government departments.

Though cities and towns across the US have been exploring ways to fund and build WiFi infrastructure, it has not been easy, given the financial challenges and the blocking strategies of the powerful telecoms industry, which has pushed for laws in at least 19 states blocking or preventing municipalities from offering WiFi or broadband services.

There is some reason for encouragement, though. Following the FCC's approval of the use of white space spectrum at the end of 2011, the first public 'WhiteFi' network went live in North Carolina early in 2012. The service was initially deployed for municipal functions (such as surveillance cameras and transmitting water quality data), and made use of several frequencies.

Yet this has been a rare success. In contrast, Seattle in mid-2012 aborted its plan for a citywide WiFi network after a decade of feasibility studies. The network was intended to offset the near monopoly of Comcast for internet services in the area. The city authorities subsequently reached an agreement with Gigabit Squared to develop and operate an FttH broadband network, dubbed Gigabit Seattle, initially in 12 areas.

The project, with immense potential to stimulate business opportunities and employment, as well as a range of related advancements in health care, education and public services, fits well with the FCC's recent 'Gigabit City Challenge', aimed at encouraging broadband providers and state and municipal officials to cooperate in developing faster networks to drive innovation, economic growth and competitiveness.

At present the challenge calls for one community in each state to offer a 1Gb/s service by 2015. Given sufficient traction and favourable economic and regulatory conditions, gigabit cities should become commonplace, building on pioneering deployments such as Kansas City and Chattanooga.

Regarding both FttH and WiFi, the US needs greater regulatory control and government involvement if schemes beneficial to the nation are not to wither on the vine of corporate interest.

FCC Proposes Super WiFi Networks Across the U.S. (February 2013)
The Federal Communications Commission (FCC) is proposing the creation of "Super WiFi" networks across the United States that would provide free, high-speed, long-range WiFi, according to a report from the Washington Post.

"The proposal from the Federal Communications Commission has rattled the $178 billion wireless industry, which has launched a fierce lobbying effort to persuade policymakers to reconsider the idea, analysts say. That has been countered by an equally intense campaign from Google, Microsoft and other tech giants who say a free-for-all WiFi service would spark an explosion of innovations and devices that would benefit most Americans, especially the poor."

Medical Body Area Networks (June 2012)
The Federal Communications Commission (FCC) in Washington has advanced its wireless health care agenda by adopting rules that enable Medical Body Area Networks (MBANs): low-power wideband networks consisting of multiple body-worn sensors that transmit a variety of patient data to a control device. MBANs provide a cost-effective way to monitor every patient in a healthcare institution, giving clinicians real-time, accurate data and allowing them to intervene when necessary.

Wireless devices that operate on MBAN spectrum can be used to actively monitor a patient’s health, including blood glucose and pressure monitoring, delivery of electrocardiogram readings, and neonatal monitoring systems. MBAN devices will be designed to be deployed widely within a hospital setting and will make use of inexpensive disposable body-worn sensors. MBAN technology will also make it easier to move patients to different parts of the health care facility for treatment and can improve the quality of patient care by giving health care providers the chance to identify life-threatening problems or events before they reach critical levels.

The Commission allocates 40 MHz of spectrum at 2360-2400 MHz for MBAN use on a secondary basis. It will accommodate MBAN use through an expansion of the existing Medical Device Radiocommunication (MedRadio) Service. This structure, which will permit MBAN devices to operate on a 'license-by-rule' basis in which users will not have to apply for and receive individual station licenses, will lead to the rapid and widespread development of innovative new MBAN applications.

The 2010 National Broadband Plan recognized that the use of spectrum-agile radios and other techniques can significantly increase the efficient use of radio spectrum to meet growing demand for this valuable resource. The development of the MBAN concept illustrates how advanced technology can enable the more efficient use of spectrum to deliver innovative new services. Because MBAN devices will share the spectrum with existing primary users, the rules contain registration and coordination provisions to protect vital flight testing operations conducted by aeronautical mobile telemetry (AMT) licensees.
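For readers who think in code, here is a minimal sketch of the band plan described above. The 2360-2400 MHz allocation and the secondary-use, register-and-coordinate condition come from the order; the function name and status strings are purely illustrative, not the FCC's actual coordination procedure.

```python
# Toy model of the MBAN allocation described above: 40 MHz at
# 2360-2400 MHz, usable on a secondary basis subject to registration
# and coordination with primary aeronautical mobile telemetry (AMT).

MBAN_BAND_MHZ = (2360.0, 2400.0)  # band edges, per the FCC order

def mban_status(freq_mhz: float) -> str:
    """Classify a proposed MBAN centre frequency against the band plan."""
    lo, hi = MBAN_BAND_MHZ
    if lo <= freq_mhz <= hi:
        return "secondary use permitted: register and coordinate with AMT licensees"
    return "outside the MBAN allocation"

for f in (2355.0, 2380.0, 2401.0):
    print(f"{f} MHz -> {mban_status(f)}")
```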

Canada Emerging at the Forefront of LTE (March 2012)
Canada has made impressive progress in mobile broadband deployment in recent months. This is partly due to operators needing to arrest falling revenue from mobile voice services by buttressing their data capabilities, and partly due to the stimulus introduced by the auction of Advanced Wireless Services spectrum in 2008. That auction overhauled the wireless market, introducing a number of smaller players which have added to the competitive mix as well as furthered the development of LTE.

Despite provisions to encourage new market entrants, the major auction winners were, unsurprisingly, Rogers, TELUS and Bell Mobility, together accounting for around 60% of the spectrum available. Nevertheless, five new entrants also received spectrum, four of which have since launched services. Notably, Shaw Communications decided against investing in a wireless network, which it predicted could have cost up to CAN$1 billion, favouring instead a cheaper WiFi solution (costing perhaps CAN$400 million). The company recently sold its remaining wireless assets, though it continues to sit on its awarded spectrum. The WiFi network is unlikely to generate much revenue, so its true value may lie in offering access to mobile network operators needing to offload data from their LTE and HSPA infrastructure.

Meanwhile, mobile broadband services are fast expanding across the country. Rogers and Bell Canada have launched commercial LTE services in a number of major cities, with the top 25 markets to follow during 2012 (so reaching half of the population). Outside the LTE footprint, customers are provisioned with HSPA+, which covers about 96% of the population. Both companies are expected to extend their LTE coverage to rural areas pending the government's decision on the 700MHz spectrum auction. Telus also plans to extend its LTE network to cover 25 million Canadians by the end of 2012, while the regional telco SaskTel will follow its initial launch (in Regina and Saskatoon) with other urban and rural areas from 2013, based on demand.

Only Structural Change Can Save the Mobile Industry (February 2012)
I regularly raise this issue, much as I did with the discussion of the structural separation of fixed networks, which I began just over a decade ago.

What we are seeing in the mobile industry is an infrastructure and a spectrum crunch.

The amount of spectrum needed to satisfy people's demand from mobile phones, tablets and soon a range of other smart devices is limitless. Mobile carriers are scrambling for spectrum, but it is already known that the spectrum that will become available from the digital dividend (reuse of broadcast spectrum) will not be enough.

Another strategy for obtaining access to spectrum is to consolidate. In the USA, AT&T tried to merge with T-Mobile in order to lay its hands on T-Mobile's spectrum, but the deal was blocked by the FCC on competition grounds.

In Australia Optus is buying the minnow vividwireless. Because of its small size there won't be a major regulatory problem here, but it highlights the quest for spectrum. And it must be said that had Optus wanted to buy Vodafone Australia, for example, it would no doubt have created regulatory issues.

So with limited availability of spectrum and significant regulatory problems the industry is facing some serious challenges.

The spectrum problem is also evident in the marketplace. Many people will have noticed that their mobile call quality has deteriorated over the last year or so and the network problems with Vodafone Australia are also well-known. All this relates to network capacity problems. Operators have to manage their networks so as to cope with all of the new mobile data traffic and some of that traffic management involves prioritising traffic, which can lead to deterioration elsewhere.

On the physical infrastructure side there is another crunch. In order to cope with the traffic and the spectrum limitations, operators can build smaller cells and then reuse the spectrum. However, an infrastructure configuration of this kind is far more expensive to roll out.

To give an idea of the scale of the challenges mobile operators are facing — in order to provide proper LTE services in London more than 70,000 base stations are needed. How many operators are in a position to do this? Interestingly, all of these sites will need to be linked to a fibre network. The question becomes — are we still talking about mobile networks, or are we talking about mobile feeders into the fibre network?
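Some back-of-envelope arithmetic shows what a figure like 70,000 sites implies. The sketch below assumes Greater London covers roughly 1,600 km² and that cells are uniform, non-overlapping circles; both assumptions are mine, for illustration only.

```python
import math

area_km2 = 1600.0  # Greater London, roughly (assumption)
sites = 70_000     # base stations quoted above

area_per_cell_km2 = area_km2 / sites
radius_m = math.sqrt(area_per_cell_km2 / math.pi) * 1000

print(f"~{sites / area_km2:.0f} sites per km^2")  # ~44
print(f"implied cell radius ~{radius_m:.0f} m")   # ~85 m
```

Cells with a radius of well under 100 metres are small cells, and every one of them needs backhaul, which is exactly why the question above reduces to mobile feeders into a fibre network.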

While Telstra is winding down its WiFi network, the reality is that more and more mobile traffic needs to be offloaded from the mobile network through WiFi hotspots onto the fixed network. Many households are now experiencing congestion when four or five devices try to connect to the WiFi modem in the home. All smartphones and tablets automatically switch over to the WiFi modem if it is available. This, of course, is much cheaper for the user, but it also works to avoid a collapse of the mobile network. In Australia it is estimated that 80% of smartphone and tablet usage takes place in the home, office or internet cafe. As a side issue, this also clearly shows the need for fibre-to-the-home networks.

Mobile operators would not voluntarily offload mobile broadband usage through WiFi connections to the fixed network if it were not strictly necessary, as they do not get any money from people using the WiFi network. That really is a clear indication of the seriousness of the problem.

I have been predicting that these developments will lead to an urgent need for more efficient use of infrastructure and of spectrum. If further network problems arise — and there is no doubt that this will happen — there will be increased demands for better-quality services, similar to the warning the Australian bank regulators gave the banks regarding their network outages. These infrastructures are no longer a luxury; they are essential to the functioning of society and the economy.

In the case of mobile networks, consolidation will eventually need to take place. With 800 global mobile operators, the cost structure associated with so much duplication will not support future infrastructure and spectrum investments. However, because of competition issues consolidation will need to be based on structural separation. This would also fit in very well with the digital economic developments, which require far more open access in order to provide the applications and services that are increasing by the day.

Since the arrival of the iPhone in 2007 the platform has ceased being the network — it has become the smartphone. This is the most fundamental change in the history of the mobile market.

The winners will be the first mobile operators who have the vision and understand that the mobile network has fundamentally changed to become basically a fibre network with mobile feed-ins — with smartphones, tablets and other smart devices as the platforms on which to build new business models. Competing on mobile/fibre infrastructure through duplication will not be the smartest way forward.

Operators and service providers will have to abandon the old mobile infrastructure and business models and align themselves with open digital economy models. Already many mobile services are moving into the cloud and are purely data-based (not voice). Soon there will be no need for the current complex mobile (voice-based) infrastructure structures. Such operations could be run at a fraction of the current cost. If the operators use this opportunity they could remain a competitive voice in the broader mobile ecosystem; if not, they will become road-kill on the (fibre) superhighway. There is no way that the outdated IMS technology, for example, can turn the clock back — the future lies in web-based OTT applications. We questioned this technology as far back as 2005.

The current developments in LTE, linked to the growing number of smart devices, are only speeding up these transformation processes, and this is being led by the operators. Yet these same operators continue to cling to outdated business models, as if nothing is changing. Such an attitude will only lead to situations similar to what happened when Apple introduced the iPhone, an event that took the mobile operators completely by surprise and changed the mobile industry forever.

The lock-in options for SIM cards are also under threat, as BuddeComm reported last year, and several jurisdictions are now looking at this issue. Roaming — a goldmine for the operators — is under threat from new International Mobile Subscriber Identity (IMSI) cards.

With all these rigid systems linked to their mobile networks, service innovation and market leadership have slipped away from the operators. Rather than looking at innovations for their customers, carriers are looking at optimising their networks (e.g., LTE). Seventy-five per cent of current Apple products didn't exist five years ago; compare this with the product offerings from the telcos. Apple put innovation and customer experience ahead of profits — the telcos do the reverse. And which group of shareholders is smiling now?

Structural changes can of course be undertaken on a voluntary basis. This does not necessarily require regulation; however, history tells us that it is very hard for telco companies to make such decisions without being forced to do so. By that time many of the other business opportunities will again have slipped through their fingers.

Expect more of this kind of commentary over the next few years. I expect the crunch to happen between 2013 and 2015.

Spectrum Crisis: Wireless Auctions Preferred Method (February 2012)
Talk, conjecture and analysis have predicted a wireless spectrum crisis for years. The official word seems to project a culmination of dropped calls, slow data and outright denial of network access by 2015. If so, then we should look at the current argument over how additional spectrum can be disseminated to wireless carriers in a fair and balanced fashion. Public auctions are the preferred method, in my opinion, since they neither favor nor impede any wireless carrier.

Mergers to Gain Spectrum

Since the attempted AT&T/T-Mobile acquisition, companies have conspicuously been plotting and formulating ways to gain additional spectrum through buy-outs or mergers, seeking market dominance or, nicely put, a competitive edge. But have these companies focused on good business dynamics in doing so, or were they caught off-guard in predicting eventual spectrum depletion, forcing a merger or acquisition to alleviate the problem? I say the latter, and it smacks of bad market decisions that, if left unchecked, result in a less competitive market. Fortunately, the FCC has been able to see through the smoke and mirrors to block such tactics.

Auctions are fair and unbiased

So, why have spectrum auctions? The fact is that a public auction to sell additional spectrum is a market-friendly and competitive answer to each wireless carrier's needs. Allowing companies to gobble up a scarce resource as they see fit, with large companies merging with those holding existing inventories to reduce competition and increase market dominance, is a recipe for disaster. Each wireless carrier, large or small, should have an equal chance to bid at an auction. This allows for possible partnerships among smaller bidders, which could offset the dominance of large companies. In essence: fair and balanced. See (Genachowski: Auction Bill Could Limit Benefits of Spectrum Recovery)

Wireless Tiering: Profit based, not spectrum based

Those who cry foul when alliances that could hurt smaller competitors are curtailed, as the FCC and DOJ have done in the past, should come clean about their own alliances behind such rhetoric. Wireless companies have begun creating tiers which charge heavy users more for bandwidth than those who use a fraction of that amount. This is a normal progression of market dynamics: more usage means higher monthly bills. Cisco has for years promoted the concept that operators should tier to increase profits. So wireless companies should not misrepresent the reasons for tiering and/or throttling as spectrum-based rather than profit-based. See (Spectrum crunch: all talk, no action, and consumers suffer)

Auction Spectrum: sooner rather than later

The FCC should move forward with spectrum auctions. This issue has been debated long enough, and most of us are aware that more spectrum is needed given the projected growth of wireless devices and applications. Simply put, auctions are the fair and reliable way to promote a continued competitive wireless market.

AT&T's Randall & Stankey: Wireless Data Growth Half The FCC Prediction (January 2012)
"Data consumption right now is growing 40% a year," John Stankey of AT&T told investors, and his CEO Randall Stephenson confirmed it on the investor call. 40%, not 92%-120%. That's far less than the 92% predicted by Cisco's VNI model or the FCC's figures of 120% to 2012 and 90% to 2013 in its "spectrum crunch" analysis. AT&T is easily a third of the U.S. mobile Internet and is growing market share; there's no reason to think the result will be very different when we have data from the others.
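To see why the gap between those rates matters, compound them over a few years; the four-year horizon below is my own arbitrary choice for illustration.

```python
# Compounding the growth rates quoted above over four years.
for annual in (0.40, 0.92, 1.20):
    factor = (1 + annual) ** 4
    print(f"{annual:.0%}/yr -> {factor:.1f}x traffic in 4 years")
# 40%/yr -> 3.8x, 92%/yr -> 13.6x, 120%/yr -> 23.4x
```

A network planning for roughly 4x traffic is in a very different world from one planning for 14-23x.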

With growth rates less than half of the predictions, a data-driven FCC and Congress have no reason to rush into bad policy. Wireless technology is rapidly moving to shared spectrum, whether via in-building small cells, WiFi, White Spaces, shared RAN or the tools of what engineers are calling hetnets — heterogeneous networks. The last thing policymakers should do is tie up more spectrum for exclusive use; shared spectrum often yields three to ten times as much capacity.

Bad compromises on the video spectrum are unnecessary because plenty of spectrum is unused. That includes the 20 MHz that M2Z would be building out today if FCC Chairman Julius Genachowski hadn't blocked them; the 20 MHz the cable companies are sitting on and want to sell to Verizon; and the 30 MHz or so Stankey identifies as fallow at AT&T.

40% growth is still substantial, but wireless technology is improving at a breathtaking pace. LTE has about 10x the capacity of 2.5G and 4x the capacity of 3G. LTE Advanced, deploying at Verizon from 2013, is designed for 10x the capacity of LTE. Putting more spectrum to use would be great, but let's do it right.
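Taken at face value, those multipliers compound into a lot of headroom. The comparison with the 40% growth rate below is my own arithmetic on the figures quoted above, not the author's.

```python
import math

lte_vs_3g = 4     # LTE ~4x the capacity of 3G (quoted above)
ltea_vs_lte = 10  # LTE Advanced designed for ~10x LTE

headroom = lte_vs_3g * ltea_vs_lte
print(f"LTE Advanced vs 3G: ~{headroom}x capacity")

# How many years of 40%/yr traffic growth that headroom absorbs:
years = math.log(headroom) / math.log(1.40)
print(f"covers ~{years:.0f} years of 40%/yr growth")  # ~11 years
```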

Wireless speeds are actually going up dramatically, with AT&T delivering 2-5 megabits to most of the country and Verizon's LTE delivering 5-12 megabits to two-thirds of the population. Verizon is ahead of schedule to bring 5+ megabits to 92% of the country in 2013 and 96-98% in 2015-2016. AT&T and Sprint have raised capex to catch up. 80%+ of the U.S. will have a 5 megabit offering in 2013-2014, and 90%+ by 2015 or sooner. That's without any additional spectrum.

Today's wireless networks are designed to be shared: towers, WiFi, White Spaces, DAS and small cells all working together. The best engineers in the world are working on RAN sharing, SON, hetnets, 8x8 MIMO and techniques I'm writing about in my next book, Gigabit Wireless. AT&T in fact is one of the world leaders in DAS, WiFi and femtos and behind the scenes a key thought leader. There's wonderfully exciting stuff I'll be doing my best to translate for non-engineers.

Takeaway: The future is sharing the airwaves so let's get the policy right.

No Spectrum Shortage, Just an Allocation Problem (September 2011)
As a new study from Citi Investment Research & Analysis makes clear, the US does not have a spectrum shortage. We've just allowed a relatively small number of carriers to control the spectrum. Quoting the study's summary:

"Today, US carriers have 538MHz of spectrum. And, additional 300MHz of additional spectrum waiting in the wings. But, only 192MHz is in use today."

Perhaps if we had an effective "use it or lose it" policy in place, or a heavy tax on unused spectrum, a more vibrant market for this spectrum would emerge. But today the problem is not a shortage of spectrum; it is the fact that what's out there is not being utilized.
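The quoted figures make the utilisation gap easy to quantify; the percentages below are simply derived from Citi's numbers.

```python
held_mhz = 538     # spectrum US carriers hold today (Citi)
pending_mhz = 300  # additional spectrum "waiting in the wings"
in_use_mhz = 192   # spectrum actually in use

print(f"in use vs held:           {in_use_mhz / held_mhz:.0%}")                  # ~36%
print(f"in use vs held + pending: {in_use_mhz / (held_mhz + pending_mhz):.0%}")  # ~23%
```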

Obviously things vary by geography, but Citi's summary is completely justified. Their methodology is thorough, both as to who owns what and as to what is deployed, county by county, for 3,100 separate counties. [The original post included Citi's charts summarising the spectrum currently in use and the spectrum holdings of US carriers.]

So why would we repack the TV broadcasters and auction off that spectrum when we've just finished putting in place unlicensed access to TV white spaces? Unlicensed spectrum will be heavily utilized, while more exclusively owned spectrum will just add to the pool of underutilized resources.

Smartphones: Too Smart for Mobile Operators? (August 2011)
By: John de Ridder

In June, the net neutrality debate took an unexpected turn when the Netherlands leap-frogged the USA to become the first country to legislate for mobile net neutrality. Business models for fixed and mobile networks must shift toward volume charges.

The net neutrality debate has been seen as not having much relevance outside the USA, because the plight of carriers there was aggravated by unlimited usage. US carriers objected to carrying the extra traffic generated by the likes of YouTube and BitTorrent, while others, including Google, objected to the carriers' crude attempts to manage traffic by restricting customers' access to such sites.

The issue came to a head in 2008 when the FCC ordered Comcast, a cable TV and Internet access provider, to cease blocking or downgrading certain users' access to some peer-to-peer download services. The FCC's ruling, however, was subsequently struck down on appeal over the FCC's authority to implement net neutrality regulations.

US state of play

The FCC rallied, and in December 2010 issued a new policy (still to be tested in a federal court) that sets three basic rules for net neutrality:

Transparency. Fixed and mobile broadband providers must disclose the network management practices, performance characteristics, and terms and conditions of their broadband services;

No blocking. Fixed broadband providers may not block lawful content, applications, services, or non-harmful devices; mobile broadband providers may not block lawful websites, or block applications that compete with their voice or video telephony services; and

No unreasonable discrimination. Fixed broadband providers may not unreasonably discriminate in transmitting lawful network traffic.

It is interesting to note that only the first of the FCC's three rules applies equally to fixed and mobile providers. For mobiles, the no-blocking rule applies only to applications "that compete with their voice or video telephony services", and mobile providers are not mentioned at all in the third rule, because "existing mobile networks present operational constraints that fixed broadband networks do not typically encounter". This puts greater pressure on the concept of "reasonable network management" for mobile providers.

But mobile operators are under pressure. Free Press wants the FCC to take a close look at Google's move to curtail access to independent tethering apps (making them "unavailable for download" via the Android Market). Google says it is doing this in response to requests from wireless carriers. But Google's Droid partner Verizon says "Google manages what's available in the Android Market."

When Verizon acquired massive amounts of spectrum in the 700MHz "C Block" auction back in 2008, it promised to adhere to the FCC's "Open Access" rules, which forbid carriers from trying to "deny, limit, or restrict the ability of their customers to use the devices and applications of their choice." In the new Report and Order (paras 134-135), the FCC has hinted that its powers may not be restricted to users of this part of the spectrum. Again, this must be tested in court.

Dutch policy

The Netherlands has gone further. In April 2011, KPN announced plans to charge mobile customers extra for using Skype and WhatsApp (an application that, for $2 a year, enables smartphone users to send messages at no additional charge). KPN does not reveal much, but last year Telstra's messaging revenues were over $1 billion, and over 9 billion SMS were sent from its mobile phones. So losing voice and message revenues to Skype, Facebook and WhatsApp could seriously dent profitability.
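The Telstra figures give a sense of the scale of that threat. The arithmetic below simply divides the quoted revenue by the quoted message volume; the comparison with WhatsApp's price is mine.

```python
telstra_sms_revenue = 1_000_000_000  # "over $1 billion" in messaging revenue
telstra_sms_count = 9_000_000_000    # "over 9 billion SMS" sent in a year

per_sms = telstra_sms_revenue / telstra_sms_count
print(f"~${per_sms:.2f} of revenue per SMS")  # ~$0.11

whatsapp_per_year = 2.0  # WhatsApp's yearly price, as quoted
print(f"WhatsApp's yearly price equals ~{whatsapp_per_year / per_sms:.0f} SMS")  # ~18
```

In other words, a year of unlimited WhatsApp messaging costs the user roughly what eighteen text messages earn the carrier.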

To charge users for access to such services, KPN would need to inspect the data being transferred, using "deep packet inspection". Following protests about possible privacy violations, politicians moved quickly to stop the plan. In June 2011, the Dutch parliament passed a bill which will force mobile Internet providers to let customers use Skype and other rival services on their networks, without charging extra or giving preferential treatment to their own offerings (and without placing cookies except with the express permission of the end user).

KPN has responded [Business Week, 19 July], saying that from September the cheapest advertised price for one gigabyte of mobile data will be part of a €50 ($70)/month package, compared with current packages under €20 ($28) that include unlimited data. This is similar to the moves US fixed carriers have made towards Australian-style monthly caps.

Expect to see mobile handset prices increase and more volume-based charging. The latter makes sense for both fixed and mobile networks and is the next logical move after caps.

The Future of the Internet Economy: Chapter 2 (2011)
The OECD held a "high-level" meeting in June 2011 that was intended to build upon the OECD Ministerial on The Future of the Internet Economy held in Seoul, Korea in June 2008. I was invited to attend this meeting as part of the delegation from the Internet Technical Advisory Committee (ITAC), and here I'd like to share my impressions of this meeting.

The presentations I heard at this meeting could be broadly classified into a number of themes, as outlined below.

Public Policy: The Internet as a brilliant success of Multi-Stakeholderism

The first theme was somewhat self-congratulatory in nature, and noted that the Internet has been very effective in achieving economic growth. One speaker cited from a McKinsey report that the level of economic growth attributable to the Internet in 15 years, as measured by GDP growth, equalled the level of GDP growth experienced in the Industrial Revolution over 50 years.

The speakers who talked to this theme espoused freedom of expression, freedom of governance, and freedom of enterprise — online. The Secretary General of the OECD proposed that the OECD, with its working methods of including governments, the private sector, civil society and the technical community, was uniquely positioned to further this effort. As he noted in his presentation to this meeting, "The OECD has already established many of the social norms that define the Internet today." He espoused a light-touch public policy environment as a platform for growth and a driver of innovation that improves efficiency. In other words, when handled with some consideration and care from a perspective of public policy and governance, the Internet will continue to play the role of a critical enabling tool for wealth creation.

The prevalent meme of today appears to be "multi-stakeholderism", which refers to today's mixed environment of public and private sector activity, coupled with explicit recognition of civil society and other vested interests, including the technical sector, as stakeholders in the process.

The tone of such presentations on the success of the open Internet and upon light touch public policies and multi-stakeholderism was generally upbeat, with some concessions to the challenges of security and net neutrality, but overall there was a sense that if the process was well structured, then such challenges could be properly addressed to the satisfaction of all.

In many ways this is little more than self-congratulatory rhetoric about the positive outcomes that have resulted from the general deregulation of the telecoms sector in the late 20th century, and the associated shift of the service model in this sector from a single public-sector utility telecom operator to a diverse set of competitive private-sector actors. However, an implicit subtext within this theme was a critical commentary on alternative approaches to coordination frameworks for national and international communications, notably the ITU-T, and a rather barbed criticism of the ability of such treaty-based institutions to make the structural changes to their institutional model that would allow them to reflect the broader set of stakeholders that are peer players in today's landscape. Perhaps behind the rhetoric lies one more piece of preparatory activity in the extended lead-up to the renegotiation of the world telecommunications treaty, by a set of nation states that are committed to a communications industry structure now largely based on private-sector activity within a framework of open competition, and that generally desire to reduce the encumbrances, obligations and structural cross-subsidies associated with the current treaty obligations that stand behind the ITU-T.

The Faltering of the Traditional Carrier

A number of speakers on the topic of broadband infrastructure were critical of today's network infrastructure. A salient comment I heard at one point was: "This sector really has a problem in meeting demand."

Some of the now-privatised telcos (for example, Telecom Italia in its presentation) were effectively claiming that, with the imposition of net neutrality and of a public policy agenda of ubiquitous, equitable access for all, a high-speed broadband infrastructure funded through private capital investment is not a viable proposition.

The broader question of who should fund broadband network infrastructure construction was raised in a presentation from the Korean delegate. The Australian presentation made the case that such large-scale broadband infrastructure projects exceed the capacity of private enterprise, and that the responsibility to lead them therefore falls to the public sector. It has to be noted, though, that this leadership comes at the considerable cost of around $2,000 per capita in the Australian case, and it therefore takes a relatively robust economy to underwrite such a significant level of public capital expenditure among the broader collection of public-sector priorities. Many other OECD economies appear to have largely left the construction of broadband network infrastructure to the private sector, particularly where financing is concerned, and limited their involvement to cheering from the sidelines. The outcomes so far from such an approach are not exactly stellar.

Another carrier, AT&T, asserted that public communications policy in broadband infrastructure is being driven by a vocal minority rather than the mainstream and asserts that this imbalance in policy formulation will result in subsequent retrograde intervention that will restore what he termed as "20th century regulation." He argued for continuance of deregulation and a "hands-off" policy response by government. He noted that a policy priority of broadband access, at an affordable price, as a enabler of economic outcomes, and a lever to improve delivery of social services and utilities. Interestingly, he noted a $95B infrastructure investment by AT&T over the past 5 years and claimed that this cost could not feasibly be recovered from the end user base because the imposition of additional costs onto the consumer base would exclude large sectors of users from the network, and this would be counter to a an objective of ubiquity of access. Given the stated preference for continuation of an industry model that is a deregulated industry lead by private sector investment, it would appear that AT&T is constructing a case to forego the concept of network neutrality with respect to their carriage services, and they apparently wish to have the ability to impose additional costs on content industry actors if they want to have high speed visibility to users on AT&T's broadband network and recover a significant proportion of their investment in this manner.

Network neutrality is a significant issue in today's industry, and it appears to be used by the carriers and operators as a byword for their lack of incentive to invest in infrastructure beyond the existing copper-loop wired plant: they cite net neutrality as an investment disincentive that brings the financial returns on capital investment in infrastructure below what they consider acceptable levels, given the cost of private capital in their enterprises. At the same time they point to the lack of radio spectrum as the reason for a lack of further investment in mobile data infrastructure, and accuse application developers of generating mobile content applications that make extravagant use of bandwidth, and hence extravagant use of spectrum, as being part of the problem they face.

With some small level of dissension, there appeared to be a general admission that demand on today's Internet is not only outstripping current levels of supply; demand growth is now outstripping the sector's business plans, capital investment capability and even technical capability. The resultant need to exercise common restraint in an environment of limited resources runs counter to an industry whose relatively crude content and service models appear to be based on a continued abundance of the basic commodities of bandwidth and ubiquitous connectivity.

Security and Privacy

This is one of those mantra topics: everyone agreed that security is a Good Thing (at least I heard no one argue against the concept!), and all speakers who touched upon it appeared to agree that this is a current issue and by no means a solved problem. But where to go from here was definitely not so clear.

It was clearly recognised that the quantity, breadth and detail of information that is now online poses some serious concern. The risk profile of unintended information exposure now includes individuals, organisations and even nation states. The security industry is becoming overwhelmed with the onslaught of new threats on a continuing basis, and the underlying concern is that the current level of cyber attack may mutate at any time into attack profiles associated with cyber warfare between nation states.

Industry commentators perceive this topic to have a low priority on the political agenda, where politicians want lower prices and greater regulatory control, while the ability of the private sector to invest in the necessary resources and measures to support greater levels of online security is limited by the relatively low value placed on this activity by end users. In some ways the issue of security in today's networks, particularly the high-end measures needed to defend a national communications system against a broad-scale infrastructure attack of the scale and intensity anticipated from a concerted and well-resourced adversary (as envisaged in cyber warfare, for example), is seen as beyond the scope of conventional private-sector infrastructure operators. At the same time the public sector is showing some signs of uncertainty as to how to engage with this agenda, as it is a matter well beyond simple regulatory responses.

Hand-in-hand with security is the topic of privacy. It was asserted that the challenge about privacy is not about technology, as today's technology is adequately capable of supporting privacy, but is about the nexus of privacy policies and technology. In order to implement scalable systems that respect and adhere to privacy policies and are functional, there is a need to invest in an effort to define common privacy and authentication standards, i.e., standards relating to the nature of credentials that appropriately define individuals and roles, reputation mechanisms and validation of such credentials and the associated topic of negotiation of trust. The privacy management reference model is looking at operational privacy management in online services, and public standards need to be considered in the development of services. There is some optimism that policy entropy and conflicting standards can be addressed, assuming that the various actors in the area talk to each other and work in the context of industry-driven standards that are based on interoperable implementations. There is the expectation that the industry can deploy systems that can manage privacy conflict and ensure compliance with public policy frameworks that would engender trust and confidence. It was suggested that governments need to support the effort to foster the greater use of standards organisations to facilitate the development of data privacy standards and their adoption.

IPR and Intermediaries

This is a long-standing issue in this sector. Copyright holders have on the whole been reluctant, or unable, to modify their business model to adapt to the capability of computing systems and computer networks to replicate and redistribute content. In the face of monotonically declining sales of traditional media, and the collapse of many former major players in the media-based content distribution industry, the content industry resorted to legal means to attempt to curb the decline.

The Digital Millennium Copyright Act in the United States is perhaps the best-known, but by no means unique, example of this push for legislative remedies to unauthorized redistribution of content. The industry has, at least in the realm of public policy debate, successfully managed to apply a lexicon of emotive terms such as "theft", "illegal" and "piracy" to such redistribution activities, and to have this lexicon adopted by the broader industry and in public policy debates.

However, such actions have been largely unsuccessful in terms of reducing the level of such unauthorized redistribution of content and the associated revenue leak that such redistribution represents to copyright holders. The copyright industry has now turned its attention to attempts to coerce the carriage providers to act as co-opted vigilantes in the efforts to enforce intellectual property rights.

This effort runs counter to the general principle of the common carrier, where, in somewhat approximate terms, the carrier is bound to respect the privacy of the parties for whom it has contracted to carry traffic, and in return is not held liable for the content carried across its network. However, there is a strong push to have the public sector force the carriage sector, and all others who act as "intermediaries" in the provision of services and content to users, to play an active role in enforcing the intellectual property rights of copyright holders. Rather than starting from an assumption that carriage providers and intermediaries are not liable for the content they carry on behalf of users, the default position being pushed at this OECD meeting is that such liabilities already exist, and the consequent agenda is to "limit" them.

It has been pointed out by critics of this approach (such as in a recent blog on the topic) that the wording of the communiqué from this meeting suggests that some of the stakeholders, notably the technical community according to this particular critic, acted in a way that played into the hands of the IPR efforts: "Lacking the historical perspective, ITAC failed to see the camel's nose being inserted under the tent in the IPR and Intermediary Liability sections."

Some of the presentations at the meeting were staunchly in favour of the copyright industry's proposals for making carriers and ISPs liable for content. In particular the presentation by Vivendi went as far as claiming that the entire content creation industry would come to a complete halt if IPR theft was not halted using all available means. The assertion was made in this context that: "Copyright is a key component of economic growth."

An alternative view, put forward by Deezer (and presumably shared by Pandora, had they been present), is that "piracy" is just one competing service model for the distribution of content, and the real goal of this industry should be to create business and service models that represent a superior service proposition to users compared with resorting to unauthorized redistribution. Such new service models would allow IPR to be respected and due royalties to be paid on the use of copyright material. Judging from Deezer's reported commercial success, this is evidently an achievable objective.

In any case, the default position of assuming some unspecified level of liability on the part of intermediaries, including carriage providers, and the need to "limit" this liability with respect to copyright material was maintained in the deliberations prior to this meeting, and the Civil Society Information Society Advisory Committee (CSISAC) was unable to endorse the resultant communiqué.

IPv6 - The Elephant in the Room

Oddly enough, for a meeting intended to discuss the public policy aspects of the Internet's future growth and the maintenance of the Internet's openness and its ability to innovate, evolve and generate societal wealth through efficient and novel forms of connectivity and communication, the one topic that implicitly threatens the entire framework of today's Internet rated barely a mention: the exhaustion of the IPv4 address pool and the industry's marked indifference to adopting IPv6. It was the unacknowledged elephant in the room.

One speaker, Vint Cerf, highlighted the need to place IPv6 adoption as a matter of urgent priority on the public policy agenda, noting that without IPv6, innovation on the Internet will suffer, the beneficial outcomes of an open and accessible communications environment will cease, and we simply have no alternatives at this point in time. He noted that if the meeting could conclude with the imperative to deploy IPv6 across all parts of the Internet, it would be a useful meeting with a positive message. Oddly enough, the chairman's summary at the end of this particular session omitted any reference to IPv6, despite the topic being the major theme of Vint's presentation.

An air of disconnection persisted through the meeting over this continued omission of any mention of IPv4 address exhaustion and the risks posed to the further growth of the Internet if IPv6 is not adopted in a timely manner. It got to the point that when a speaker from the UK Regulatory Office subsequently mentioned IPv6 and the need for the public sector to actively support its adoption, parts of the audience broke into spontaneous applause.

It appears that despite many years of active promotion of IPv6, the message is still not being heard within the public policy arena. The comprehensive transition of the Internet to IPv6 is a central pillar of any expectation that the Internet can continue to grow and sustain a vibrant environment based on open competition and innovation. So far we appear to have failed to make the case that in a networked environment that stalls on IPv6, the resultant NAT- and ALG-ridden IPv4 environment is one where the current incumbents hold all the addresses, and any further competitive entry into the Internet by new actors, at the levels of both carriage and content services, would be effectively limited to the terms and conditions imposed by those incumbents. Such a scenario is about as good a definition of the failure of an open market as one could find, and it's one that the Internet would do very well to avoid.

Where To From Here?

Somehow I'm missing the sense of driving optimism and opportunity that was associated with the 2008 OECD Ministerial on the Future of the Internet Economy. It's not clear to me that multi-stakeholderism is sufficiently powerful a mantra to shake off the issues that confront this industry as it slowly shifts into a phase of success-disaster.

Yes, the mobile market is a massive commercial success, so much so that we are now running out of usable spectrum in the most populous parts of the networked world.

Yes, the wired Internet is transforming our economies, so much so that recabling our infrastructure from copper to fibre is now an essential prerequisite to keeping pace with demand, yet neither the capital nor the sustainable carrier business models are there to undertake this effort.

Yes, the provision of content is a runaway success, but the copyright industry still cries foul, and in an effort to curb the reportedly massive damage being inflicted on the entertainment industry there is a push to rip apart the common carrier principle and hold all elements of this industry liable for the unauthorised distribution of content.

And yes, we've managed to distribute billions of computers, but at the same time we've managed to create significant areas of vulnerability, and we are now witnessing the exploitation of these weaknesses shift from elements of organised crime to the distinct possibility of cyber warfare waged between nation states.

But I don't believe that any of these issues present insurmountable challenges. In seeking productive responses to them we need to make sure that we are looking in the right place. These problems appear to arise where a rapid shift in the technology base of this industry intersects a set of business and policy frameworks that are often conservative in their response to change. I would like to believe that many of the answers we are looking for lie in the adaptation of business models and public policy frameworks, and that the tools that will best assist this common effort are probably economic in nature.

For that reason I believe that the OECD has a valuable role in the coming months and years, and I am heartened to see the OECD continue to engage all stakeholders in a public dialogue that I hope will be ultimately fruitful and productive for the future of the Internet.

Unlicensed Wireless Broadcasting Spectrum in the USA
http://www.circleid.com/posts/20110209_unlicensed_wireless_broadcasting_spectrum_in_the_usa/

New developments announced by the FCC in the United States have rekindled the decade-old debate on using the so-called 'white spaces' in broadcast spectrum for telecoms purposes.

In September 2010, the FCC adopted a Second Memorandum Opinion and Order that updated the rules for unlicensed wireless devices that can operate in broadcast television spectrum at locations where that spectrum is unused by licensed services. This unused TV spectrum is commonly referred to as television 'white spaces'. The rules allow for the use of unlicensed TV devices in the unused spectrum to provide broadband data and other services for consumers and businesses.

To prevent interference to authorised users of the TV bands, TV bands devices must include a geo-location capability and the capability to access a database that identifies incumbent users entitled to interference protection, including, for example, full power and low power TV stations, broadcast auxiliary point-to-point facilities, PLMRS/CMRS operations on channels 14-20, and the Offshore Radio Telephone Service.

The database will tell a TV bands device which TV channels are vacant and usable at its location. It will also be used to register the locations of fixed TV bands devices, and the protected locations and channels of incumbent services that are not recorded in Commission databases. The rules state that the Commission will designate one or more entities to administer a TV bands database.

The TV bands databases will be used by fixed and personal portable unlicensed devices to identify unused channels that are available at their geographic locations. This action will allow the designated administrators to develop the databases that are necessary to enable the introduction of this new class of broadband wireless devices in the TV spectrum.
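
To make this mode of operation concrete, here is a minimal sketch of the kind of lookup a device would perform before transmitting. The FCC order mandates the capability but does not define a query protocol, so the endpoint, parameters and response format below are purely hypothetical:

    import requests

    def available_channels(lat, lon, device_type="fixed"):
        """Ask a (hypothetical) TV bands database which channels are
        vacant at the given coordinates."""
        response = requests.get(
            "https://tvbands-db.example/v1/channels",  # placeholder URL, not a real administrator
            params={"lat": lat, "lon": lon, "type": device_type},
            timeout=10,
        )
        response.raise_for_status()
        # Assumed response shape: {"channels": [21, 27, 34, ...]}
        return response.json()["channels"]

    # A device may transmit only on channels the database reports
    # as vacant at its geographic location.
    for channel in available_channels(37.77, -122.42):
        print(f"Channel {channel} is vacant at this location")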

While there was widespread support for the announcement, I have not, over the past twenty years, seen enough significant change to suddenly propel this technology into a new future.

None of the previous issues have changed enough to warrant more enthusiasm this time around. In particular, this includes the technical and standardisation issues, both for the technology itself and for the end-user devices needed to receive broadband services in this way. With all these different — and competing — companies now involved, coordination will become a nightmare.

It will be interesting to see if, this time around, the technology will be able to make some commercial progress.

New Technology Brings Wi-Fi to TV Antenna
http://www.circleid.com/posts/new_technology_brings_wi_fi_to_tv_antenna/

Josh Taylor reporting in ZDNet: "The CSIRO will tomorrow unveil a breakthrough in wireless technology that will allow multiple users to upload content at the same time while maintaining a data transfer rate of 12 megabits per second (Mbps), all over their old analog TV aerial. The technology, named Ngara, allows up to six users to occupy the equivalent spectrum space of one television channel (7 megahertz) and has a spectral efficiency of 20 bits per second per hertz..."
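
As a rough sanity check on these figures (my own arithmetic, not from the report): 20 bits per second per hertz across a 7 MHz channel implies roughly 140 Mbps of aggregate capacity, about twice what six simultaneous 12 Mbps uploads would consume, with the difference presumably absorbed by protocol overheads and less favourable radio conditions.

    # Rough sanity check on the reported Ngara figures (my own arithmetic).
    channel_bandwidth_hz = 7e6   # one analog TV channel, as reported
    spectral_efficiency = 20     # bits per second per hertz, as reported
    users = 6
    per_user_rate_bps = 12e6     # 12 Mbps per user, as reported

    aggregate_bps = channel_bandwidth_hz * spectral_efficiency
    print(f"Implied aggregate capacity: {aggregate_bps / 1e6:.0f} Mbps")             # 140 Mbps
    print(f"Six users at 12 Mbps each: {users * per_user_rate_bps / 1e6:.0f} Mbps")  # 72 Mbps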

Universal White Spaces: Moving Beyond the TV Bands
http://www.circleid.com/posts/universal_white_spaces_moving_beyond_the_tv_bands/

The FCC's recent decision allowing license-exempt access to TV White Spaces, i.e. unused TV channels, is a small but very important step in spectrum policy. But more important than the TV bands is the policy approach, and the fact that it was adopted in the face of extreme lobbying by well-established vested interests.

As I've argued in the past, TV spectrum is not as valuable as people think. Its apparently desirable propagation characteristics are a function of 20th century technology, not of the physics of electromagnetic radiation. As radio technology progresses we'll find a lot more use for higher frequencies (2 GHz - 10 GHz). If nothing else, there's a lot more spectrum available higher up, and that makes it easier to send a lot more data, for example 10 Gbps links instead of 100-300 Mbps (today's Wi-Fi) or 1-8 Mbps (today's mobile broadband).
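
The underlying point is simply Shannon's capacity bound: for a given signal-to-noise ratio, achievable bit rate scales linearly with bandwidth, and wide contiguous bandwidth is far easier to find at higher frequencies. A toy illustration (assuming an ideal channel and a 20 dB SNR of my own choosing, purely for comparison):

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_linear):
        """Shannon bound C = B * log2(1 + SNR) for an ideal channel."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr = 10 ** (20 / 10)  # 20 dB, an assumed figure for illustration
    for label, bw_hz in [
        ("one 6 MHz TV channel", 6e6),
        ("a 160 MHz slice in the low gigahertz", 160e6),
        ("a 2 GHz slice above 10 GHz", 2e9),
    ]:
        print(f"{label}: {shannon_capacity_bps(bw_hz, snr) / 1e6:,.0f} Mbps ceiling")

The exact numbers depend entirely on the assumed SNR, but the linear scaling with bandwidth does not, which is why the wide-open higher bands matter.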

Even more than in the TV bands, most of that higher frequency spectrum is unused most of the time, even in urban areas. This is the wireless problem of the 21st century. Based on radio technology of 1900-1930, we gave out government-guaranteed monopolies on almost all wireless spectrum — monopolies which make less and less sense in the light of 21st century technology and never made sense given basic electromagnetic theory. As a result of these spectrum licenses (monopolies), most of our wireless spectrum is completely unused.

The FCC's TV white spaces order is a major step in getting access to this wasted national resource! Yes, there are decades of work ahead, but the white spaces order gives us a viable political approach to breaking open other bands. Looking back 20 or 30 years from now, we'll remember the TV white spaces order as the first step in remaking our national spectrum policy.