Pretty Advanced New Stuff from CCG Consulting


Monthly Archives: February 2018

I’ve been investigating smart city applications and one of the features that many smart network vendors are touting is expanded public safety networks that can provide cameras and other monitoring devices for police, making it easier to monitor neighborhoods and solve crimes. This seems like something most police departments have on their wish list, because cameras are 24/7 and can see things that people are never likely to witness.

The question I ask today is whether this is what America wants. There are a few examples of cities with ubiquitous video surveillance, like London, but will that kind of surveillance work in the US?

I think we’ve gotten our first clue from Seattle. The City installed a WiFi mesh network using Aruba wireless equipment in 2013 with a $3.6 million grant from the Department of Homeland Security. The initial vision for the network was that it would be a valuable tool to provide security in the major port in Seattle as well as provide communications for first responders during emergencies. At the time of installation the city intended to also make the surveillance capabilities available to numerous departments within the City, not just to the police.

But when the antennas, like the one shown with this blog, went up in downtown Seattle in 2013, a number of groups began questioning the city about their surveillance policies and the proposed use of these devices. Various groups including the ACLU voiced concerns that the network would be able to track cellphones, laptops and other devices that have MAC addresses. This could allow the City to gather information on anybody moving in downtown or the Port and might allow the City to do things like identify and track protesters, monitor who enters and leaves downtown buildings, track the movement of homeless people who have cellphones, etc.

Privacy groups and the ACLU complained to the City that the network effectively was a suspicionless surveillance system that monitors the general population and is a major violation of various constitutional rights. The instant and loud protests about the network caught City officials by surprise and by the end of 2013 they deactivated the network until they developed a surveillance policy. The city never denied that the system could monitor the kinds of things that citizens were wary of. That surveillance policy never materialized, and the City recently hired a vendor to dismantle the network and salvage any usable parts for use elsewhere in the City.

I can think of other public outcries that have led to discontinuance of public monitoring systems, particularly speed camera networks that catch and ticket speeders. Numerous communities tried that idea and scrapped it after massive citizen outrage. New York City installed a downtown WiFi network a few years ago that was to include security cameras and other monitoring devices. From what I read they’ve never yet activated the security features, probably for similar reasons. A web search shows that other cities like Chicago have implemented a network similar to Seattle’s and have not gotten the negative public reaction.

The Seattle debacle leads to the question of what is reasonable surveillance. The developers of smart city solutions today are promoting the same kinds of features contained in the Seattle network, plus new ones. Technology has advanced since 2013 and newer systems promise to include the first generation of facial recognition software along with the ability to identify people by their walking gait. These new monitoring devices won't just track people with cellphones; they can identify and track everybody.

I think there is probably a disconnect between what smart city vendors are developing and what the public wants out of their city government. I would think that most citizens are in favor of smart city solutions like smart traffic systems that would eliminate driving backups, such as changing the timing of lights to get people through town as efficiently as possible.

But I wonder how many people really want their City to identify and track them every time they come within reach of one of the City's monitors. The information gathered by such monitors can be incredibly personal. It identifies where somebody is, including a time stamp. The worry is not just that a City might misuse such personal information; IT security guys I've talked to believe that many municipal IT networks are susceptible to hacking.

In the vendors' defense, they are promoting features that already function well. Surveillance cameras and other associated monitors are tried-and-true technologies that work. Some of the newer features like facial recognition are cutting edge, but surveillance systems installed today can likely be upgraded with software changes as the technology gets better.

I know I would be uncomfortable if my city installed this kind of surveillance system. I don't go downtown except to go to restaurants or bars, but what I do is private and is not the city's business. Unfortunately, I suspect that city officials all over the country will get enamored by the claims from smart city vendors and will be tempted to install these kinds of systems. I just hope that there is enough public discussion of city plans so that the public understands what their city is planning. I'm sure there are cities where the public will support this technology, but plenty of others where citizens will hate the idea. Just because we have the technical capabilities to monitor everybody doesn't mean we ought to.


The administration finally published their infrastructure plan last week. The document is an interesting read, particularly for those with a financial bent like me. There is no way this early to know if this plan has any chance to make it through Congress, or how much it might change if it does pass. But it’s worth reviewing because it lets us know what the government is thinking about infrastructure and rural broadband.

First, the details of the plan:

The proposed plan provides $200B of federal funding over 10 years;

$100B goes to States in the form of a 20% grant for new projects that won’t require additional future federal spending;

$50B is set aside as a grant program for states for rural infrastructure. States can use the money as they wish;

$20B goes to projects that are in the demonstration phase of new technologies and that can’t easily attract other financing;

$20B would go towards existing federal loan programs, including the Transportation Infrastructure Finance and Innovation Act (TIFIA) and the Water Infrastructure Finance and Innovation Act (WIFIA).

Finally, $10B would be used to create a revolving fund that would allow the purchase, rather than the lease, of federal infrastructure.
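The five pots above should account for the full proposal. A quick sketch confirms they sum to the $200B headline figure:

```python
# Tally the infrastructure plan's funding pots (in $ billions)
# against the $200B headline total.
pots_billions = {
    "state incentive grants (20% match)": 100,
    "rural infrastructure block grants": 50,
    "demonstration-phase technology projects": 20,
    "existing loan programs (TIFIA/WIFIA)": 20,
    "federal capital revolving fund": 10,
}

total = sum(pots_billions.values())
print(f"Total: ${total}B")  # → Total: $200B
```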

The funding for the program is a bit murky, as you would expect at this early stage. It appears that some of the funding comes from existing federal highway infrastructure funds, and one might suppose those funds will still be aimed at highways.

This plan gives governors and state legislators a lot of new money to disburse, meaning that every state is likely to tackle this in a different way. That alone is going to mean a varied approach to funding or not funding rural broadband.

The plan is completely silent on broadband funding. This makes sense since the plan largely hands funds to the states. The program does not promote rural broadband, but it certainly does not preclude it. The most likely source of any funding for rural broadband would be the $50B rural pot of funding. We'll have to wait and see what strings are attached to that money, but the proposal would largely hand this money to states and let them decide how to use it.

The larger $100B pot is to be used to provide up to 20% of the funding for projects, and very few rural fiber projects can be made viable with only 20% assistance. If the 20% funding basis is firm for this pot of funding I can't see it being used much for broadband.
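To make the 20% constraint concrete, here is a minimal sketch with hypothetical project numbers (the per-passing cost is an illustrative high-cost rural figure, not from the plan):

```python
# Illustrative rural fiber project economics under a 20% federal grant cap.
# All figures are hypothetical assumptions for the sketch.
homes_passed = 2_000
cost_per_passing = 10_000                       # assumed high-cost rural figure
project_cost = homes_passed * cost_per_passing  # $20M total build

federal_share = 0.20 * project_cost             # max grant from the $100B pot
local_share = project_cost - federal_share      # what must be raised locally

print(f"Project cost:             ${project_cost:,}")
print(f"Max federal grant (20%):  ${federal_share:,.0f}")
print(f"Local/private funding:    ${local_share:,.0f}")
```

A builder still has to raise 80% of the cost locally, which is why projects with per-passing costs this high rarely pencil out under a 20% grant.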

States are not going to like this $100B funding pool because it completely flips the role of the federal government in infrastructure funding. Today, for many road and bridge projects the federal government supplies as much as 80% of the funding, and this plan shifts that 80% onto the states. Because of this, states are likely to view the plan as an overall long-term decrease in federal infrastructure spending. The trade-off for the flip, though, is that the money is immediate and the states get to decide what to fund. Today, when the feds are involved it can take literally decades to finish some road projects.

The overall goal of the plan is to promote private investment in infrastructure projects, which contrasts with today, where almost all infrastructure projects are 100% government funded. Historically public/private partnerships (PPPs) have played only a small role in US infrastructure spending. PPPs have been successful in other countries for helping to build infrastructure projects on time and on budget – which is a vast improvement over government projects that routinely go over on both. But incorporating PPP financing into infrastructure spending is going to take a change of mindset. That challenge is going to be complicated by the fact that most of this spending will be disbursed by the states. And it's the states that will or will not embrace PPPs, so we'll probably have a varied response across the country.

One of the most interesting ideas embedded in the plan is that projects should be funded in such a way as to cover the long-term maintenance costs of a project. That’s a big change from today where roads, bridges and other major projects are constructed with no thought given about the funding for the ongoing maintenance, or even for related costs of a project for environmental and other ancillary costs. This is going to force a change in the way of thinking about infrastructure to account for the full life-cycle cost of a project up-front.
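The up-front life-cycle funding idea can be illustrated with a simple present-value calculation. All figures below are hypothetical assumptions, not numbers from the plan:

```python
# Sketch: up-front funding needed to cover construction plus ongoing
# maintenance over a project's life (all inputs are illustrative).
construction_cost = 10_000_000   # hypothetical project build cost
annual_maintenance = 150_000     # hypothetical yearly upkeep
life_years = 30                  # assumed useful life
discount_rate = 0.03             # assumed discount rate

# Present value of the maintenance stream, discounted annually.
pv_maintenance = sum(
    annual_maintenance / (1 + discount_rate) ** year
    for year in range(1, life_years + 1)
)

lifecycle_funding = construction_cost + pv_maintenance
print(f"Life-cycle funding needed up-front: ${lifecycle_funding:,.0f}")
```

Under these assumptions the maintenance stream adds roughly 30% to the up-front funding requirement, which shows why ignoring it understates what a project really costs.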

I’ve read a few private reports from investment houses and their take on the plans. The analysis I’ve seen believes that the vast majority of the money will go to highway, bridges and water projects. That might mean very little for rural broadband.

One thing is for sure, though. If something like this plan becomes law then the power to choose infrastructure projects devolves largely to states rather than the federal government. Today states propose projects to the feds, but under this plan the states would be able to use much of the federal funding as they see fit.

There are states that already fund some rural broadband infrastructure, and you might suppose those states would shuttle some of this new funding into those programs. But there are other states, some very rural, that have rejected the idea of helping to fund broadband. Expect a widely varying response if the states get the power to choose projects.

In summary, this plan is not likely going to mean any federal broadband grant program. But states could elect to take some of this funding, particularly the $50B rural fund, and use it to promote rural broadband. But there are likely to be as many different responses to this funding as there are states. We have a long way to go yet to turn this proposal into concrete funding opportunities.


The FCC just voted to implement a plan to give up to $4.53 billion to the cellular carriers over the next ten years to bring LTE cellular and data to the most remote parts of America. While this sounds like a laudable goal, this FCC seems determined to hand out huge sums of money to the biggest telecom companies in the country. This program is labeled Mobility II and will be awarded through an auction among the cellular companies.

As somebody who travels frequently in rural America there certainly are still a lot of places with poor or no cellphone coverage. My guess is that the number of people that have poor cellphone coverage is greater than what the FCC is claiming. This fund is aimed at providing coverage to 1.4 million people with no LTE cellphone coverage and another 1.7 million people where the LTE coverage is subsidized.

Recall that the FCC’s knowledge of cellphone coverage comes from the cellphone companies who claim better coverage than actually exists. Cellphone coverage is similar to DSL where the quality of signal to a given customer depends upon distance from a cellphone tower. Rural America has homes around almost every tower that have crappy coverage and that are probably not counted in these figures.

My main issue with the program is not the goal – in today's world we need cellphone coverage everywhere but the most remote places in the country. My problem is that the perceived solution is to hand yet more billions to the cellular carriers – money that could instead fund real broadband in rural America. Additionally, the ten-year implementation is far too long. That's an eternity to wait for an area with no cellular coverage.

I think the FCC had a number of options other than shelling out billions to the cellular companies:

The FCC could require the cellular companies to build these areas out of their own pockets as a result of having taken the licensed spectrum. Other than Sprint, these companies are extremely profitable right now and just got a lot more profitable because of the recent corporate tax-rate reductions. The FCC has always had build-out requirements for spectrum and the FCC could make it mandatory to build the rural areas as a condition for retaining the spectrum licenses in the lucrative urban areas.

The FCC could instead give unused spectrum to somebody else that is willing to use it. The truth is that the vast majority of licensed spectrum sits unused in rural America. There is no reason that spectrum can't come with a use-it-or-lose-it provision so that unused spectrum reverts back to the FCC to give to somebody else. There are great existing wireless technologies that work best with licensed spectrum and it's aggravating to see the spectrum sit unused but still unavailable to those who might use it.

Finally, the FCC could force the cellular carriers to use towers built by somebody else. I work with a number of rural counties that would gladly build towers and the necessary fiber to provide better cellphone coverage. It would cost the cellular carriers nothing more than the cell site electronics if others were to build the needed core infrastructure.

This idea of handing billions to the big telecom companies is a relatively new one. Obviously the lobbyists of the big companies have gained influence at the FCC. It’s not just this FCC that is favoring the big companies. Originally the CAF II program was going to be available to everybody using reverse auction rules. But before that program was implemented the Tom Wheeler FCC decided to instead just give the money to the big telcos if they wanted it. The telcos even got to pick and choose and reject taking funding for remote places which will now be auctioned this summer.

That same CAF II funding could have been used to build a lot of rural fiber or other technologies that would have provided robust broadband networks. But instead the telcos got off the hook by having to only upgrade to 10/1 Mbps – a speed that was already obsolete at the time of the FCC order.

Now we have yet another federal program that is going to shovel more billions of dollars to big companies to provide broadband that will supposedly meet a 10/1 Mbps speed. But as with CAF II, the carriers will get to report the results of the program to the FCC. I have no doubt that they will claim success even if coverage remains poor. Honestly, there are days as an advocate for rural broadband when you just want to bang your head against a wall. It's hard to see billions and billions wasted that could have brought real broadband to numerous rural communities.


Lately I’ve looked at a lot of what I call a hybrid network model for bringing broadband to rural America. The network involves building a fiber backbone to support wireless towers while also deploying fiber to any pockets of homes big enough to justify the outlay. It’s a hybrid between point-to-multipoint wireless and fiber-to-the home.

I've never yet seen a feasible business model for building rural FTTP without some kind of subsidy. There are multiple small telcos building fiber to farms using some subsidy funding from the A-CAM portion of the Universal Service Fund. And there are state broadband grant programs that are helping to build rural fiber. But otherwise it's hard to justify building fiber in places where the cost per passing is $10,000 per household or higher.

The wireless technology I'm referring to is a point-to-multipoint wireless network using a combination of frequencies including WiFi and 3.65 GHz. The network consists of placing transmitters on towers and beaming signals to dishes at a customer location. In areas without massive vegetation or other impediments this technology can now reliably deliver 25 Mbps download for 6 miles and higher bandwidth closer to the tower.

A hybrid model makes a huge difference in financial performance. I've now seen an engineering comparison of the costs of all-fiber and a hybrid network in half a dozen counties and the costs for building a hybrid network are in the range of 20% – 25% of the cost of building fiber to everybody. That cost reduction can result in a business model with a healthy return that creates significant positive cash over time.
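A rough sketch of that comparison, using a hypothetical county and the 20%–25% range cited above (all inputs are illustrative assumptions):

```python
# Sketch comparing all-fiber vs hybrid (fiber backbone + fixed wireless)
# build costs for a rural county. All inputs are illustrative.
households = 6_000
fiber_cost_per_passing = 10_000   # high-cost rural figure cited above

all_fiber_cost = households * fiber_cost_per_passing

# Engineering comparisons cited above put hybrid builds at roughly
# 20%-25% of the all-fiber cost; use the midpoint for the sketch.
hybrid_fraction = 0.225
hybrid_cost = all_fiber_cost * hybrid_fraction

print(f"All-fiber build: ${all_fiber_cost:,.0f}")
print(f"Hybrid build:    ${hybrid_cost:,.0f}")
print(f"Savings:         ${all_fiber_cost - hybrid_cost:,.0f}")
```

Cutting the capital requirement by three quarters is what turns an infeasible rural business plan into one that can cash-flow.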

There are numerous rural WISPs that are building wireless networks using wireless backhaul rather than fiber to get bandwidth to the towers. That solution might work at first, although I often see new wireless networks of this sort that can't deliver the 25 Mbps bandwidth to every customer due to backhaul constraints. It's guaranteed that the bandwidth demands from customers on any broadband network will eventually grow to be larger than the size of the backbone feeding the network. Generally, over a few years a network using wireless backhaul will bog down at the busy hour while a fiber network can keep up with customer bandwidth demand.

One key component of the hybrid network is to bring fiber directly to customers that live close to the fiber. This means bringing fiber to any small towns or even small pockets of 20 or more homes that are close together. It also means giving fiber to farms and rural customers that happen to live along the fiber routes. Serving some homes with fiber helps to hold down customer density on the wireless portion of the network – which improves wireless performance. Depending on the layout of a rural county, a hybrid model might bring fiber to as much as 1/3 of the households in a county while serving the rest with wireless.

Another benefit of the hybrid model is that it moves fiber deeper into rural areas. This can provide the basis for building more fiber in the future or else upgrading wireless technologies over time for rural customers.

A side benefit of this business plan is that it often involves building a few new towers. Areas that need towers typically already have poor or nonexistent cellular coverage. The new towers can make it easier for the cellular companies to fill in their footprint and get better cellular service to everybody.

One reason the hybrid model can succeed is the high customer penetration rate that comes when building the first real broadband network into a rural area that’s never had it. I’ve now seen the customer numbers from numerous rural broadband builds and I’ve seen customer penetration rates range between 65% and 85%.

Unfortunately, this business plan won’t work everywhere, due to the limitations of wireless technology. It’s much harder to deploy a wireless network of this type in an area with heavy woods or lots of hills. This is a business plan for the open plains of the Midwest and West, and anywhere else with large areas of open farmland.

County governments often ask me how they can get broadband to everybody in their county. In areas where the wireless technology will work, a hybrid model seems like the most promising solution.


In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting back to the older definition of 10/1 Mbps.

I'm guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same for a household as landline broadband – and that desire was a big factor in the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is some way to create a rule that would better define the speed of needed broadband. It's worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC's method was too simple and used the assumption that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that's not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams there are collisions of packets that get lost and have to be retransmitted. This is described as adding 'overhead' to the transmission process. Depending on the nature of the data streams the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I've covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home; it also slows when it pauses to recognize demands for connection from your neighbor's WiFi network.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
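The non-additive effect can be sketched numerically. The stream mix and the overhead factors below are illustrative assumptions, not measured values:

```python
# Sketch: why simultaneous streams aren't simply additive on a home network.
# Stream mix and overhead factors are illustrative assumptions.
streams_mbps = [5.0, 5.0, 5.0, 4.0, 2.0]  # e.g. 3 HD streams + video call + browsing

raw_demand = sum(streams_mbps)  # the simple additive estimate

collision_overhead = 1.15  # assumed 15% packet-collision/retransmission overhead
wifi_overhead = 1.20       # assumed 20% WiFi protocol/contention overhead

effective_demand = raw_demand * collision_overhead * wifi_overhead
print(f"Additive estimate: {raw_demand:.0f} Mbps")
print(f"With overheads:    {effective_demand:.1f} Mbps")
```

Under these assumptions a household whose additive demand looks like 21 Mbps actually needs nearly 29 Mbps of capacity – already more than a 25 Mbps connection can deliver.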

The FCC's definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it's available. That means a stream of 15-20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

The FCC’s speed definition finally needs to consider the busy hour of the day – the time when a household uses the most broadband. That’s the broadband speed that the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that would somehow measure actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways that this could be done. For example, the FCC could do something similar for broadband to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. Volunteer sample homes that meet specific demographics could allow monitoring of their bandwidth usage. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC's definition of broadband. Rather than changing the official speed periodically, the FCC could change the definition as needed, as dictated by the real-world data.
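A minimal sketch of the sampling idea: collect each sample home's busy-hour throughput and set the definition from a high percentile of actual usage. The data here is randomly generated stand-in data, not real measurements:

```python
import random
import statistics

# Sketch: estimate a broadband-definition threshold from sampled
# busy-hour usage across volunteer homes. Data is synthetic.
random.seed(42)
busy_hour_mbps = [random.gauss(mu=30, sigma=10) for _ in range(1_000)]

# 99 cut points dividing the sample into 100 equal groups;
# index 89 is the 90th percentile.
quantiles = statistics.quantiles(busy_hour_mbps, n=100)
p90 = quantiles[89]

print(f"Median busy-hour usage: {statistics.median(busy_hour_mbps):.1f} Mbps")
print(f"90th percentile:        {p90:.1f} Mbps")
```

A regulator could then peg the official definition at, say, the 90th percentile of observed busy-hour demand and re-run the calculation as the panel data updates.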

The FCC does some spot checking today of the broadband speeds as reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn't provide the same kind of feedback that a formal ongoing measuring program would show. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn't want real facts to get in the way.


The Schools, Health &amp; Libraries Broadband Coalition (SHLB) announced a strategy to bring broadband to every anchor institution in the continental US. They estimate this would cost between $13 billion and $19 billion. They believe this would act as a first step to bring broadband to unserved and underserved rural communities.

While this sounds like a reasonable idea, we’ve tried this before and it largely hasn’t worked. Recall that the BTOP program in 2009 and 2010 funded a lot of middle mile fiber projects that brought broadband deeper into parts of the country that didn’t have enough fiber. That program required the BTOP middle mile fiber providers to serve all anchor institutions along the path of their networks and was a smaller version of this same proposal.

We’re approaching a decade later and a lot of the communities connected by BTOP middle mile grants still don’t have a last mile broadband network. There are some success stories, so I don’t want to say that middle mile fiber has no value – but for the most part nobody is making that last mile investment in rural areas just because the BTOP middle mile fiber was built.

BTOP isn’t the only program that has built fiber to anchor institutions. There are a number of states and counties that have built fiber networks for the express purposes of serving anchor institutions. There are also numerous fiber networks that have been built by school systems to support the schools.

In many cases I’ve seen these various anchor institution networks actually hurt potential last mile fiber investment. Anybody that is going to build rural fiber needs as many ‘large’ customers as it can get to help offset building expensive rural fiber. I’ve had clients who were thinking about building fiber to a small rural town only to find out that the school, city hall and other government locations already had inexpensive broadband on an existing fiber network. Taking those revenues out of the equation can be enough to sink a potential business plan.

At least BTOP fiber required that the network owners make it easy for last mile providers to get reasonably priced backbone access on their networks. Many of the state and school board networks are prohibited from allowing any commercial use of their network. I’ve never understood these prohibitions against sharing spare pairs of government fiber with others, but they are fairly common. Most come from State edicts that are likely prompted by the lobbyists for the big carriers.

I’m sure I’ll take some flak for my position, but I’ve seen the negative results of this idea too many times in the real world. Communities get frustrated when they see a gigabit connection at a school or City Hall when nobody else in the area has decent broadband. I’ve even seen government staff and officials who have fast broadband in their offices turn a deaf ear to the rest of the community that has poor or no broadband.

To make matters worse, many of the BTOP networks have run into economic difficulties. The companies that invested in BTOP bought into the hype that the middle mile fiber networks would attract last mile fiber investments, and they counted on those extra revenues for long-term viability. But a significant portion of the BTOP middle mile networks ended up being fiber to nowhere. Companies funded by BTOP needed to bring matching capital, and a number of the BTOP providers have had to sell their networks at a huge discount and walk away from their unpaid debt since the revenues to cover debt payments never materialized.

This also raises the question of who is going to maintain the enormous miles of fiber that would be built by this proposal. Somebody has to pay the electric bill to keep the fiber lit. Somebody needs to do routine maintenance as well as fix fiber cuts and storm damage. And somebody has to pay to periodically replace the electronics on the network, which have an average economic life of around ten years.

I feel certain I will get an inbox full of comments about this blog. I’m bound to get stories telling me about some of the great success stories from the BTOP networks – and they do exist. There are cases where the middle mile fiber made it easier for some ISP to build last mile fiber to a rural community. And certainly a lot of extremely rural schools, libraries and other anchor institutions have benefitted from the BTOP requirement to serve them. But I believe there are more stories of failure that offset the success stories.

I seriously doubt that this FCC and administration would release this much money for any kind of rural broadband. But this is the kind of idea that can catch the interest of Congress and that could somehow get funded. There is no politician in DC who will take a stance against schools and libraries.

I can think of much better ways to spend that much money in ways that would bring broadband solutions to many whole rural communities, not just to the anchor institutions. That's not enough money to fix all of our rural broadband issues, but it would be a great start, particularly if distributed in a grant program for last mile projects that requires matching private investment.

There is a lot of speculation that we might be seeing some money aimed at broadband due to the budget passed by Congress on February 9. That bill contains $20 billion for infrastructure spending spread evenly in fiscal years 2018 and 2019. In a floor speech as part of the vote, Senate Minority Leader Charles Schumer said the money will go towards "existing projects for water and energy infrastructure as well as expanding broadband to rural regions and improving surface transportation".

Any broadband money that comes out of this funding will have to be spent quickly by the government. Fiscal year 2018 is already almost half over and ends on September 30 of this year. It's likely that any grants coming out of such money would have to be awarded before that September date to count as spending in this fiscal year. In order to move that fast, I'm guessing the government is going to have to take shortcuts and use processes already in place. That probably means using the BTOP grant forms and processes again.

The short time frame for any of this funding also likely means that only 'shovel-ready' projects will be considered. But that aligns with statements made by the administration last year when talking about infrastructure projects. Anybody hoping to go after such grants had better already have an engineered project in mind.

Assuming that funding follows the BTOP funding program, there were a few issues in those grants that ought to be kept in mind:

The grants favored areas that had little or no broadband. This is going to be more muddled now, since a lot of rural America is seeing, or soon will be seeing, broadband upgrades from the CAF II and A-CAM programs funded by the FCC. It’s doubtful that the big telcos are updating the national databases for these upgrades on a timely basis, so expect mismatches and challenges from them if somebody tries to get funding for an area that’s just been upgraded.

The BTOP grants required that anybody who wanted funding had to already have the matching funds in place. There were some notable BTOP failures from winners who didn’t actually have the funding ready, and I expect tighter restrictions this time.

There were several requirements that added a lot of cost to BTOP projects – a requirement to pay prevailing wages along with environmental and historic preservation reviews. There has been talk in Congress about eliminating some of these requirements, and hopefully that happens before any new funding, but it will take Congressional action soon.

The BTOP process surprisingly awarded a number of projects to start-up companies. Some of those start-ups struggled and a few failed, and it will be interesting to see if the government makes it harder for start-ups this time. The BTOP process also made it difficult, but not impossible, for local governments to get funding.

If there is going to be any money allocated for broadband it’s going to have to be announced soon, and the deadline to ask for this funding will have to come quickly – in very early summer at the latest.

The alternative to a federal grant program would be to award the $20 billion as block grants to states. If that happens it might be bad news for rural broadband. There are only a handful of states that have created state broadband grant programs. Any state with an existing program could easily shuttle some of this funding into broadband.

States without existing broadband programs will have a harder time. Most states will need legislative approval to create a broadband grant program and would also have to create the mechanisms for reviewing and approving these grants – a process that we’ve seen take a year in the few states that are already doing this.

It’s been almost two weeks since the budget was passed and I’ve read nothing about how the $20 billion will be used. Regardless of the path chosen, if any of this money is going to go to rural broadband we need to know how it will work soon, or else the opportunity for using the money this year will likely be lost.

The current FCC has a clear bias towards the big cable companies, telcos and cellular companies. There is nothing particularly wrong with that, since this FCC represents an administration that is also big-business oriented. Past FCCs have clearly favored policies that reflected the administration in charge. For instance, the prior FCC under Tom Wheeler was pro-consumer in many ways and pushed things like net neutrality and privacy – issues that had the support of the administration but not of the giant ISPs.

However, in the last few decades the FCC has gotten a lot more partisan. It’s becoming rare to see a policy vote that doesn’t follow party lines. This isn’t true of everything, and we see unanimous FCC Commissioner support for things like providing more spectrum for broadband. But FCC voting on any topic that has political overtones now seems to follow party lines.

The most recent example of the increased partisanship is evident in the release of this year’s 2018 Broadband Deployment Report to Congress. In that report Chairman Pai decided to take the stance that the state of broadband in the country is fine and needs no FCC intervention. The FCC is required to determine the state of broadband annually and report the statistics and its conclusions to Congress. More importantly, Section 706 of the Telecommunications Act of 1996 requires the FCC to take proactive steps to close any perceived gaps in broadband coverage.

In order to declare that the state of broadband in the country doesn’t require any further FCC action, Chairman Pai needed to come up with a narrative to support his conclusion. The argument he’s chosen to defend his position is a bit startling because by definition it can’t be true.

The new broadband report, released on February 9, concludes that “advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion . . . This finding does not mean that all Americans now have broadband access. Rather, it means that we are back on the right track when it comes to deployment.” The kicker comes when the report says that the FCC’s data from ISPs “does not yet reflect the beneficial effects of the Commission’s actions in 2017,” such as “ending the adverse impact on investment caused by the [2015 Net Neutrality] Order . . . For instance, several companies, including AT&T, Verizon, Frontier, and Alaska Communications either commenced or announced new deployments in 2017.”

In effect, the FCC now says that broadband deployment is back on track due to its December 2017 net neutrality repeal. But the ‘facts’ it cites don’t support that argument. First, any broadband improvements made by the cited telcos in 2017 would have been authorized and started in 2016, before this new FCC was even in place. Further, a big percentage of the recent broadband deployments by these particular telcos are due to FCC decisions that predate the Pai FCC. For example, AT&T was required, as a condition of its purchase of DirecTV, to pass 12 million new residences and businesses with fiber. A lot of the broadband spending by AT&T, Frontier and Alaska Communications uses CAF II funds given to them by the Wheeler FCC, which the companies are required to spend. None of those expenditures have anything to do with the repeal of net neutrality. And since net neutrality was only reversed a few months ago, it’s impossible to believe that any of the broadband spending in 2017 was due to this FCC. It’s far too early to see if that order will have any impact on rural broadband expenditures (something no industry expert expects).

This FCC Broadband Report concludes that deployment of broadband in the country is on track and reasonable. Yet the numbers in the report show that there are still 19 million Americans in rural areas without access to adequate broadband, and 12 million school-aged children suffering from the homework gap because they don’t have broadband at home.

By declaring that broadband deployment is adequate, Chairman Pai has let his FCC off the hook for having to take any actions to address the issue. But his stated reasons are based upon an argument that is clearly not supported by any facts. This would seem to put the Chairman in violation of his Section 706 obligations, although that’s probably something only Congress can determine.

I’m saddened to see the FCC become so partisan. This is not a new phenomenon and we saw partisan voting under the last several FCCs. Before that we had pro-big business FCCs such as the one under Chairman Michael Powell. But that FCC was far less partisan and still basically used the facts at its disposal in making decisions and setting policy.

The FCC has a mandate to balance what’s good for both the public and the telecom companies. In an ideal world the FCC would be a neutral arbiter that listens to the facts and sides with the best arguments. This trend towards a partisan FCC is bad for the industry because it means that major policies will flip-flop when we change administrations – and that’s not good for ISPs or the public. Partisanship does not excuse this FCC from abrogating its responsibility and making specious arguments not supported by facts. This FCC has taken partisanship too far.

Several different events in the last week got me thinking about an interesting trend in the cable industry. First, in my community there is a Redbox outlet in a neighborhood grocery store. My wife and I were discussing how busy it seems to be renting out movie DVDs. All of the Blockbuster and other movie rental outlets have closed, and until I moved to this neighborhood recently I hadn’t noticed any video stores or related outlets in a long time. But this Redbox seems to have a lot of business.

I also saw an article in FierceCable that noted that only 5% of US households have subscribed to a vMVPD – an online cable provider like Sling TV, DirecTV Now or Playstation Vue. My first thought is that a 5% market penetration seems pretty phenomenal for an industry that is barely two years old. But the article notes that while 5% of households are current subscribers to online programming, another 8% of the market has tried and dropped one of the services. Since only about 20% of total households don’t have traditional cable service, it makes you wonder what the real upper potential for this market might be – it might be a lot smaller than the vMVPDs are hoping for.
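The arithmetic behind that worry can be sketched quickly. This is only a back-of-envelope illustration using the percentages above; the US household count is an assumed round figure, not from the article:

```python
# Back-of-envelope sketch of the remaining vMVPD market, using the
# percentages cited above. The household count is an assumption used
# only to turn shares into rough home counts.
us_households = 126_000_000  # assumed, roughly the 2017 Census estimate

current_vmvpd = 0.05   # share of households currently on a vMVPD
tried_dropped = 0.08   # share that tried a vMVPD and dropped it
no_trad_cable = 0.20   # share without traditional cable service

# Households that have already sampled a vMVPD, one way or another.
already_tried = current_vmvpd + tried_dropped   # 13% of all homes

# Cord-cutting households that have never tried a vMVPD -- the pool
# the online providers still have left to win over.
untapped = no_trad_cable - already_tried        # about 7% of all homes

print(f"Already tried a vMVPD: {already_tried:.0%} "
      f"(~{already_tried * us_households / 1e6:.0f}M homes)")
print(f"Untapped cord-cutters: {untapped:.0%} "
      f"(~{untapped * us_households / 1e6:.0f}M homes)")
```

On these assumptions the untried pool of cord-cutters is smaller than the group that has already tried and walked away – which is exactly why the ceiling may be lower than the vMVPDs hope.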

I also went to a Super Bowl party. The half dozen families attending are from my neighborhood, and it turns out all of the households are cord cutters that don’t subscribe to traditional cable service. I was the only one who used a vMVPD – I currently have a subscription to Playstation Vue. None of them had tried a vMVPD and they seemed to have no interest in doing so. (I only use Playstation Vue because it’s the cheapest way to get Big Ten sports and Fox Sportsnet so I can watch Maryland sports teams – I rarely watch the other linear programming.)

National broadband penetration rates are now at 84% of all households. I’ve seen many of the opponents of spending money to build rural broadband say that households just want broadband to watch video. Netflix has made a huge dent in the market and served nearly 55 million US homes at the end of 2017. Add to that some percentage of the 90 million homes that subscribe to Amazon Prime, and it seems like there might be some truth in that claim.
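As a rough sanity check on that claim, here is a quick sketch of what share of broadband homes Netflix alone reaches (the US household count is an assumed figure for illustration):

```python
# Rough arithmetic on the claim above: what share of US broadband
# homes does Netflix alone reach? The household count is an assumed
# figure used only to convert the 84% penetration into home counts.
us_households = 126_000_000             # assumed, ~2017 Census estimate
broadband_homes = 0.84 * us_households  # 84% broadband penetration
netflix_homes = 55_000_000              # Netflix US homes, end of 2017

share = netflix_homes / broadband_homes
print(f"Netflix alone reaches about {share:.0%} of broadband homes")
```

Under these assumptions Netflix alone is in roughly half of all broadband homes, before counting any Prime video viewing – which is why the "people want broadband for video" argument has some bite.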

But if households are cutting the cord, why aren’t more of them buying one of the on-line cable alternatives? Those services have packages that carry only the most popular cable channels at half the price of buying traditional cable.

I think the answer is a combination of two factors. The first, and predominant, factor is price. Every family at the neighborhood party has kids, and they dropped traditional cable because it was too expensive. That also has to explain why the Redbox outlet is doing so well. Most of the movies available from Redbox are also available online. But getting online means having an Internet-enabled TV or else buying a Roku or other web interface. And even then, watching many of these newer movies means subscribing to yet another online service. I think there is a cost barrier, or perhaps a technology barrier, that keeps households using a traditional DVD player and Redbox.

Two different households at the party told me that they were satisfied with just watching Netflix and the free programming available on YouTube. And that is the second factor: households are getting used to watching just a subset of the programming that is available to them. When somebody drops cable TV and doesn’t buy a vMVPD service, they have walked away from all of the content that is available only on those two media.

Most of my neighbors still watch the major networks using rabbit ears (something I don’t do). So they still watch whatever is available on the local CBS, NBC, ABC and FOX stations. But the families on my street are learning to live without Game of Thrones or The Walking Dead. They no longer watch ESPN, Discovery, Comedy Central, the Food Network and the hundreds of other channels that make up traditional cable TV.

This means their kids are not growing up watching traditional cable networks, and thus are not developing any brand loyalty to those networks or their programming. If you don’t learn to love a network when you are a teenager, will you decide to watch it when you are older?

I don’t have any answers to these questions, and obviously I can’t define a trend just from talking with some of my neighbors. But I found it intriguing that they all had dropped traditional cable and had not replaced that programming with something online. This tells me that there must be a lot of people who are not enamored with linear programming whether it’s on cable or online. And a lot of people are convincing themselves that it’s okay to walk completely away from the big pile of programming that is offered by the cable TV networks.

This is potentially a watershed phenomenon, because somebody who walks away from traditional programming is probably not coming back. These folks are cord cutters who are walking away from most of the programming available on traditional cable. Those networks and their programming are going to become irrelevant to them. But interestingly, they are still going to consume a lot of video content – just not content created by the traditional cable networks. In my mind these households look a lot like Generation Z in that they are forgoing traditional programming and watching something else.

The vMVPDs are banking that people will transition down to their smaller packages when they leave a cable TV provider. But will they? This is a phenomenon that you can’t determine from industry-wide statistics, other than perhaps by seeing the dropping number of paid subscriptions to the various cable networks. People like my neighbors are dropping cable due to the expense, but they are quickly learning to live without traditional cable programming and aren’t chasing the online alternatives.

The big ISPs know that the public is massively in favor of net neutrality. It’s one of those rare topics that polls positively across demographics and party lines. Largely through the lobbying efforts of the big ISPs, the FCC not only killed net neutrality regulation but surprised most of the industry by walking away from regulating broadband at all.

We now see states and cities trying to bring back net neutrality in some manner. A few states like California are creating state laws that mimic the old net neutrality rules. Many more states are limiting state telecom purchasing to ISPs that don’t violate net neutrality. Democratic politicians in Congress are introducing bills that would reinstate net neutrality and put it back under FCC jurisdiction.

This all has the big ISPs nervous. We certainly see this in the way the big ISPs are talking about net neutrality. Practically all of them have released statements talking about how much they support the open Internet. These big companies already have terrible customer service ratings, and they don’t want to also be painted as the villains who are trying to kill the web.

A great example is AT&T. The company’s blog posted a letter from Chairman Randall Stephenson that makes it sound like AT&T is pro-net neutrality. It fails to mention that the company went to court to overturn the FCC’s original net neutrality decision, or how much it spent lobbying to get the rules overturned.

AT&T also took out full-page ads in many major newspapers making the same points. In those ads the company added a new talking point: that net neutrality ought to also apply to big web companies like Facebook and Twitter. That is a red herring, because web companies, by definition, can’t violate net neutrality since they don’t control the pipe to the customers. Many would love to see privacy rules that stop the web companies from abusing customer data – but that is a separate issue from net neutrality. AT&T seems to be making this point to confuse the public and deflect blame away from itself.

Stephenson says that AT&T is in favor of federal legislation that would ensure net neutrality. But what he doesn’t say is that AT&T favors a bill the big companies are pushing that would implement a feel-good, watered-down version of net neutrality. Missing from that proposed law (and from all of AT&T’s positions) is any talk of paid prioritization – one of the three net neutrality principles. AT&T has always wanted paid prioritization. It wants to be able to charge Netflix or Google extra to access its network, since those two companies are the largest drivers of web traffic.

In my mind, abuse of paid prioritization can break the web. ISPs already charge their customers enough money to fully cover the cost of the network needed to support broadband. Customers with unlimited data plans, like most landline connections, have the right to download as much content as they want. The idea of AT&T then also charging the content providers for the privilege of reaching customers is a terrible one for a number of reasons.

Consider Netflix. It’s likely that Netflix would pass any fees paid to AT&T on to its customers. In doing so, AT&T would violate the principle of non-discrimination of traffic, albeit indirectly, by making it more expensive for people to use Netflix. AT&T will always say that it is not the cause of a Netflix rate increase – but AT&T is able to influence the market price of web services, and in doing so it discriminates against web traffic.

The other problem with paid prioritization is that it is a barrier to the next Netflix. New companies without Netflix’s huge customer base could not afford the fees to connect to AT&T and other large ISPs. And that barrier will stop the next big web company from launching.

I’ve been predicting for a while that the ISPs are not going to do anything that drastically violates net neutrality. They are going to be cautious about riling up the public and legislators, since they understand that Congress could reinstate both net neutrality and broadband regulation at any time. The ISPs are enjoying the most big-company-friendly FCC there has ever been, and they are getting everything they want out of it.

But big ISPs like AT&T know that the political and regulatory pendulum can and will likely swing the other way. Their tactic for now seems to be to say they are for net neutrality while still working to make sure it doesn’t actually come back. So we will see more blogs and newspaper ads and support for watered-down legislation. They are clearly hoping the issue loses steam so that the FCC and administration don’t reinstate rules they don’t want. But they realistically know that they are likely to be judged by their actions rather than their words, so I expect them to ease into practices that violate net neutrality in subtle ways that they hope won’t be noticed.