Pretty Advanced New Stuff from CCG Consulting

Monthly Archives: March 2018

It’s really easy for ISPs to assume that customers want what we are selling. I have clients that march into a market and assume that the majority of customers in the market will gladly change to a fiber network if they are using an older technology today. But sometimes we find that customers don’t want our products.

I have a number of examples of this. Many years ago I helped a client who was building traditional HFC cable TV networks and one of the markets had a substantially lower customer penetration rate than he expected. It didn’t take much digging to find out that a substantial portion of the community were conservative Mennonites who just weren’t interested in cable. He might still have built the town anyway, but he had to pare back his earnings expectations for that particular portion of his buildout.

I have another client who was building fiber in small towns and also experienced low take rates. He had guessed at his likely take rate by counting the homes with a satellite dish, assuming that a reasonable percentage of those households would want fiber broadband since the town had almost non-functional DSL. But it turns out that this particular little town was full of folks who considered themselves political outsiders and didn’t trust any ISP in their homes.

Those two examples are the extremes, but I see this phenomenon even more when clients introduce new products. New product launches often are not the success that an ISP hopes for. Consider some of the following examples:

I had a rural client who got almost no takers for a new burglar alarm and security product. Turns out most of the people in this market didn’t even lock their doors and weren’t worried about security.

I had a new client roll out the triple play in a college town. He expected a lower-than-average take rate for telephone and cable TV because of the students, but was shocked when he sold less than half of those products compared to his expectations.

I have a number of clients who are getting almost no takers for smart home products, while they know neighboring ISPs who are doing well with the products.

All of these situations could have been made better with some market research. We’ve always found that a survey, if done correctly, is an accurate predictor of residential penetration rates. A valid survey must be administered randomly and given to enough people to produce statistically valid results. I’ve seen many clients rely on non-random surveys, such as sending out postcards – and then be surprised by the results of their new roll-out.
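The “enough people” requirement can be pinned down with the standard sample-size formula for estimating a proportion. A minimal sketch in Python – the 95% confidence level, ±5% margin of error, and town size here are illustrative assumptions, not numbers from the examples above:

```python
import math

def sample_size(z=1.96, p=0.5, margin=0.05):
    """Minimum random sample needed to estimate a proportion.

    z: z-score for the confidence level (1.96 = 95% confidence)
    p: assumed proportion (0.5 is the most conservative choice)
    margin: desired margin of error (0.05 = plus or minus 5%)
    """
    return math.ceil(z**2 * p * (1 - p) / margin**2)

def adjusted_for_population(n, population):
    """Finite-population correction - smaller markets need fewer responses."""
    return math.ceil(n / (1 + (n - 1) / population))

n = sample_size()                              # 385 completed responses
small_town = adjusted_for_population(n, 1000)  # 279 in a town of 1,000 homes
```

Roughly 385 completed random responses give a ±5% read on penetration, and the finite-population correction brings that down in a small market – useful context when weighing the survey budgets discussed below.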

So asking customers is always a good idea. But sometimes it’s not particularly cost effective. It can easily cost $7,000 – $10,000 to do a proper survey and that might not make sense when just introducing a new product line into a small market.

But there are other techniques. One is to pre-sell with a campaign that says you’ll roll out a product if enough people in the market show interest. If the new product is something that people want badly enough you’ll find neighbors talking to neighbors about the campaign.

One interesting marketing technique I saw recently involved giving customers temporary upgrades for free. I had a client who improved the network and could now offer faster broadband speeds. But after a round of marketing nobody was upgrading, so the ISP started offering a free upgrade for three months, with the provision that customers could go back to the slower speeds if they didn’t want the new product. Virtually every customer kept the faster product and paid the higher price after the trial.

Finally, you can never discount customer education. For example, I have a client selling a smart home product that gives folks a free assessment of the ways it can help their home. They walk through the home and talk about all of the ways that it might make the customer’s life easier. They don’t use any hard sell techniques – it’s done by technicians and is strictly a factual listing of what might or might not work for each customer depending upon their home. Price is only brought up if the customer shows interest. This has resulted in significant sales of the new product line, which the client believes is because the customer has gained an understanding of the real-life benefits of the product.

All too many times I’ve seen traditional marketing programs fail. Many customers have grown immune to mailers and don’t even open mail. And people generally will not call for a new product when they don’t understand its immediate relevance to them. When you find yourself running into ineffective marketing it’s time to get creative and try something new and different.

It didn’t take long for somebody to say they will have a 6G cellular product. Somebody has jumped the gun every time there has been a migration to a new cellular standard, and I remember the big cellular companies making claims about having 4G LTE technology years before it was actually available.

But this time it’s not a cellular company talking about 6G – it’s Charter, the second largest US cable company. Charter is already in the process of implementing LTE cellular through the resale of wholesale minutes from Verizon – so they will soon be a cellular provider. Judging by Comcast’s early success with the same arrangement, Charter might do well, since they have almost 24 million broadband customers to market to.

Tom Rutledge, the Charter CEO, made reference to 5G trials being done by the company, but also went on to tout a new Charter product as 6G. What Rutledge is really talking about is a new product that would put a cellular micro cell in a home that has Charter broadband. This hot spot would provide strong cellular coverage within the home and use the cable broadband network as backhaul for the calls.

Such a network would benefit Charter by collecting a lot of cellular minutes that Charter wouldn’t have to buy wholesale from Verizon. Outside of the home customers would roam on the Verizon network, but within the home all calls would route over the landline connection. Presumably, if the home cellular micro transmitters are powerful enough, neighbors might also be able to get cellular access if they are Charter cellular customers. This is reminiscent of the Comcast WiFi hotspots that broadcast from millions of their cable modems.

This is not a new idea. For years farmers have been buying cellular repeaters from AT&T and Verizon to boost their signal if they live near the edge of cellular coverage. These products also use the landline broadband connection as backhaul – but in those cases the calls route to one of the cellular carriers. But in this configuration Charter would intercept all cellular traffic and presumably route the calls themselves. There are also a number of cellular resellers who have been using landline backhaul to provide low-cost calling.

This would be the first time that somebody has ever contemplated this on a large scale. One can picture large volumes of Charter cellular micro sites in areas where they are the incumbent cable company. When enough homes have transmitters they might almost create a ubiquitous cellular network that is landline based – eliminating the need for cellular towers.

It’s an interesting concept. A cable company in some ways is already well positioned to implement a more traditional small cell network. Once they have upgraded to DOCSIS 3.1 they can place a small cell site at any pole that is already connected to the cable network. For now the biggest hurdle to such a deployment is the slow upload speeds of the first generation of DOCSIS 3.1, but CableLabs has already released a technology that will enable faster upload speeds, up to symmetrical connections. Getting faster upload speeds means finding more empty channel slots on the cable network, which could be a challenge in some networks.

The most interesting thing about this idea is that anybody with a broadband network could offer cellular service in the same way if they can make a deal to buy wholesale minutes. But therein lies the rub. While there are now hundreds of ‘cellular’ companies, only a few of them own their own cellular networks and everybody else is reselling. Charter is large enough to feel reasonably secure about having long-term access to wholesale minutes from the big cellular companies. But very few other landline ISPs are going to get that kind of locked-in arrangement.

I’ve always advised clients to be wary of any resell opportunity because the business can change on a dime when the underlying provider changes the rules of the game. Our industry is littered with examples of companies that went under when the large resale businesses they had built lost their wholesale product. The biggest such company that comes to mind was Talk America that had amassed over a million telephone customers on resold lines from the big telcos. But there are many other examples of paging resellers, long distance resellers and many other telco product reselling that only lasted as long as the underlying network providers agreed to supply the commodity. But this is such an intriguing idea that many landline ISPs are going to look at what Charter is doing and wonder why they can’t do the same.

Like this:

I started to write a blog a few weeks ago asking the question of whether we should be regulating big web companies like Google and Facebook. I put that blog on hold due to the furor about Cambridge Analytica and Facebook. The original genesis for the blog was comments made by Michael Powell, the President and CEO of NCTA, the lobbying arm for the big cable companies.

At a speech given at the Cable Congress in Dublin, Ireland, Powell said that edge providers like Facebook, Google, Amazon and Apple “have the size, power and influence of a nation state”. He said that there is a need for antitrust rules to rein in the power of the big web companies. Powell put these comments into a framework of arguing that net neutrality is a weak attempt to regulate web issues and that regulation ought to instead focus on the real problems of the web, like data privacy, technology addiction and fake news.

It was fairly obvious that Powell was trying to deflect attention away from the lawsuits and state legislation that are trying to bring back net neutrality and Title II regulation. Powell did make some good points about the need to regulate big web companies. But in doing so I think he also focuses attention back on ISPs for some of the same behavior he sees at the big web providers.

I believe that Powell is right that there needs to be some regulation of the big edge providers. The US has made almost no regulations concerning these companies. It’s easy to contrast our lack of laws here to the regulations of these companies in the European Union. While the EU hasn’t tackled everything, they have regulations in place in a number of areas.

The EU has tackled the monopoly power of Google as a search engine and advertiser. I think many people don’t understand the power of Google ads. I recently stayed at a bed and breakfast and the owner told me that his Google ranking had become the most important factor in his ability to function as a business. Any time they change their algorithms and his ranking drops in searches he sees an immediate drop-off in business.

The EU also recently introduced strong privacy regulations for web companies. Under the new rules consumers must opt in to having their data collected and used. In the US web companies are free to use customer information in any manner they choose – and we just saw from the example of Cambridge Analytica how big web companies like Facebook monetize consumer data.

But even the EU regulations are going to have little impact if people grant the big companies the ability to use their data. One thing that these companies know about us is that we willingly give them access to our lives. People take Facebook personality tests without realizing that they are providing a detailed portrait of themselves to marketers. People grant permissions to apps to gather all sorts of information about them, such as a log of every call made from their cellphone. Recent revelations show that people even unknowingly grant some apps the right to read their personal messages.

So I think Powell is right that there needs to be some regulation of the big web companies. Probably the most needed regulation is one of total transparency, where people are told in a clear manner how their data will be used. I suspect people might be less willing to sign up for a game or app if they understood that the app provider is going to glean all of the call records from their cellphone.

But Powell is off base when he thinks that the actions of the edge providers somehow lets ISPs off the hook for similar regulation. There is one big difference between all of the edge providers and the ISPs. Regardless of how much market power the web companies have, people are not required to use them. I dropped off Facebook over a year ago because of my discomfort from their data gathering.

But you can’t avoid having an ISP. For most of us the only options are one or two of the big ISPs. Most people are in the same boat as me – my choice of ISP is either Charter or AT&T. There is some small percentage of consumers in the US who can instead use a municipal ISP, an independent telco or a small fiber overbuilder that promises not to use their data. But everybody else has little option but to use one of the big ISPs and is then at the mercy of their data-gathering practices. We have even fewer choices in the cellular world, since four providers serve almost every customer in the country.

I was never convinced that Title II regulation went far enough – but it was better than nothing as a tool to put some constraints on the big ISPs. When the current FCC killed Title II regulation they essentially set the ISPs free to do anything they want – broadband is nearly totally unregulated. I find it ironic that Powell wants to see some rules the curb market abuse for Google and Facebook while saying at the same time that the ISPs ought to be off the hook. The fact is that they all need to be regulated unless we are willing to live with the current state of affairs where ISPs and edge providers are able to use customer data in any manner they choose.

As a nation we are approaching an 85% overall penetration of residential broadband. The following statistics come from the latest report from the Leichtman Research Group and compare broadband customers at the end of 2017 to those at the end of 2016.

Company            4Q 2017       4Q 2016        Change   % Change
Comcast         25,869,000    24,701,000     1,168,000       4.7%
Charter         23,903,000    22,593,000     1,310,000       5.8%
AT&T            15,719,000    15,605,000       114,000       0.7%
Verizon          6,959,000     7,038,000      (79,000)      -1.1%
CenturyLink      5,662,000     5,945,000     (283,000)      -4.8%
Cox              4,880,000     4,790,000        90,000       1.9%
Altice           4,046,200     3,962,500        83,700       2.1%
Frontier         3,938,000     4,271,000     (333,000)      -7.8%
Mediacom         1,209,000     1,162,000        47,000       4.0%
Windstream       1,006,600     1,051,100      (44,500)      -4.2%
WOW!               730,000       718,900        11,100       1.5%
Cable ONE          524,935       513,908        11,027       2.1%
Cincinnati Bell    308,700       303,200         5,500       1.8%
Fairpoint          301,000       306,624       (5,624)      -1.8%
Total           95,056,435    92,961,232     2,095,203       2.3%

These large ISPs control over 95% of the broadband market in the country – so looking at them provides a good snapshot of the industry. Not included in these numbers are the broadband customers of the smaller ISPs, the subscribers of WISPs (wireless ISPs) and customers of the various satellite services. Cable companies still dominate the broadband market and have 61.2 million customers compared to 33.9 million customers for the big telcos.

These overall numbers don’t tell the whole story, since there are a lot of broadband upgrades happening around the country. Cable companies are starting to upgrade to DOCSIS 3.1, which will mean even faster download speeds. A number of telcos are building fiber to replace DSL, which they hope will stop the erosion of DSL customers fleeing to the cable companies. Mergers also continue in the industry, but Leichtman folds the impact of mergers into the prior year’s numbers – which is why the 2016 customer counts won’t match the same Leichtman table from a year ago.

It’s obvious that the cable companies are still taking DSL customers away from telcos in the cities of America. The cable companies added 2.7 million customers last year, and a significant percentage of these are DSL customers converting to cable modems. But the telcos as a whole lost only 626,000 customers. What accounts for this difference?
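The difference can be checked directly from the table above. A quick sketch, with the net adds copied from the Change column:

```python
# Net 2017 subscriber changes, copied from the Change column of the table above.
cable_change = {
    "Comcast": 1_168_000, "Charter": 1_310_000, "Cox": 90_000,
    "Altice": 83_700, "Mediacom": 47_000, "WOW!": 11_100,
    "Cable ONE": 11_027,
}
telco_change = {
    "AT&T": 114_000, "Verizon": -79_000, "CenturyLink": -283_000,
    "Frontier": -333_000, "Windstream": -44_500,
    "Cincinnati Bell": 5_500, "Fairpoint": -5_624,
}

cable_net = sum(cable_change.values())  # +2,720,827 - roughly 2.7 million gained
telco_net = sum(telco_change.values())  # -625,624 - roughly 626,000 lost
```

The roughly 2.1 million gap between cable’s gains and the telcos’ losses has to come from households that are new to broadband, not just from defecting DSL customers.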

Part of the answer has to be the FCC’s CAF II program. The telcos were given money to bring broadband to over 4 million rural households, and the telcos all report that they finished some portion of that buildout by the end of 2017. But 2017 was only the second year of a six-year program, and we should be seeing more broadband customers for AT&T, CenturyLink, Frontier and the other big telcos starting with the 2018 statistics.

The company that surprises me the most is AT&T. They told the FCC and stockholders that they had built fiber to reach 4 million new customers by the end of 2017, out of their goal of 12 million potential new customers. This goal of 12 million new fiber passings was a condition of their merger with DirecTV. Perhaps they lost a mountain of DSL customers – but between CAF II and the new fiber builds you would expect to see an increase in AT&T broadband subscribers.

CenturyLink is also a bit of a puzzle. They say that they built fiber past 900,000 passings last year and I would have expected to see some benefit from that effort. Granted, they would have converted a lot of DSL customers to fiber, but you would expect them to finally have a competitive advantage with fiber compared to cable networks. And they also claim a substantial rural build from CAF II.

Probably the most important thing about broadband growth in 2017 is that it is 30% less than the growth in 2016. We are finally seeing broadband reach its peak, and if the rate of growth continues on that curve we are only a few years away from reaching broadband equilibrium, where the only growth will come from new housing units entering the market. That will be the point when we will see the big ISPs panic and start struggling to find ways to satisfy Wall Street’s demand for ever-continuing profits.
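As an illustration of how fast that curve closes, here is a simple projection that assumes (purely hypothetically) that the 30% annual decline in net adds continues:

```python
# Net industry adds in 2017, from the table above.
adds = 2_095_203

# Purely hypothetical: assume the 30% year-over-year decline in growth continues.
decline = 0.30
for year in range(2018, 2022):
    adds *= (1 - decline)
    print(year, round(adds))
```

On that assumption net adds fall by about two-thirds within three years – consistent with the “few years away” estimate, at which point growth would come mostly from new housing units.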

The Omnibus Budget bill that was passed by Congress last Thursday and signed by the President on Friday includes $600 million of grant funding for rural broadband. This is hopefully a small down payment towards the billions of funding needed to improve rural broadband everywhere. As you might imagine, as a consultant I got a lot of inquiries about this program right away on Friday.

The program will be administered by the Rural Utility Service (RUS). Awards can consist of grants and loans, although it’s not clear at this early point if loan funding would be included as part of the $600 million or made in addition to it.

The grants only require a 15% matching from applicants, although past federal grant programs would indicate that recipients willing to contribute more matching funds will get a higher consideration.
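The matching requirement is simple arithmetic, but worth making concrete. A sketch using a purely hypothetical $1 million project:

```python
def grant_split(project_cost, match_rate=0.15):
    """Split a project budget between the RUS grant and the applicant's match.

    match_rate: the minimum 15% applicant share under this program.
    """
    match = project_cost * match_rate
    return project_cost - match, match

grant, match = grant_split(1_000_000)   # 850,000 of grant, 150,000 from applicant
```

An applicant offering more than the 15% minimum simply shifts that split further toward its own contribution, which is what tends to score well in federal grant reviews.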

When I look at the first details of the new program I have a hard time seeing this money being used by anybody other than telcos. One of the provisions of the grant money is that it cannot be used to fund projects except in areas where at least 90% of households don’t already have access to 10/1 Mbps broadband. One could argue that there are no longer any such places in the US.

The FCC previously awarded billions to the large telcos to upgrade broadband throughout rural America to at least 10/1 Mbps. The FCC also has been providing money from the A-CAM program to fund broadband upgrades in areas served by the smaller independent telephone companies. Except for a few places where the incumbents elected not to take the previous money – such as in some Verizon areas – these programs effectively cover any sizable pocket of households without access to 10/1 broadband.

Obviously, many of the areas that got the earlier federal funding have not yet been upgraded, and I had a recent blog that noted the progress of the CAF II program. But I have a hard time thinking that the RUS is going to provide grants to bring faster broadband to areas that are already slated to get CAF II upgrades within the next 2 ½ years. Once upgraded, all of these areas will theoretically have enough homes with broadband to fail the new 90% test.

If we look at past federal grant programs, the large incumbent telcos have been allowed a chance to intervene and block any grant requests for their service areas that don’t meet all of the grant rules. I can foresee AT&T, CenturyLink and Frontier intervening in any grant request that seeks to build in areas that are slated for near-term CAF II upgrades. I would envision the same if somebody tried to get grant money to build in an area served by smaller telcos who will be using A-CAM money to upgrade broadband.

To make matters even more complicated, the upcoming CAF II reverse auction will be providing funds to fill in the service gaps left from the CAF II program. But for the most part the homes covered by the reverse auction are not in coherent geographic pockets; they are widely scattered within existing large telco service areas. In my investigation of the reverse auction maps I don’t see many pockets where fewer than 10% of homes already have access to 10/1 broadband.

Almost everybody I know in the industry doesn’t think the large telcos are actually going to give everybody in the CAF II areas 10/1 Mbps broadband. But it’s likely that they will tell the FCC that they’ve made the needed upgrades. Since these companies are also the ones that update the national broadband map, it’s likely that CAF II areas will all be shown as having 10/1 Mbps broadband, even if they don’t.

There may be some instances where little pockets of homes qualify for these grants, and where somebody other than telcos could ask for the funding. But if the RUS strictly follows the mandates of the funding and won’t provide funds for places where more than 10% of homes already have 10/1 Mbps, then this money almost has to go to telcos, by definition. Telcos will be able to ask for this money to help pay for the remaining CAF II and A-CAM upgrades. There is nothing wrong with that, and that’s obviously what the lobbyist who authored this grant language intended – but the public announcement of the grant program is not likely to make that clear to the many other entities who might want to seek this funding. It will be shameful if most of this money goes to AT&T, CenturyLink and Frontier, who were already handed billions to make these same upgrades.

I also foresee one other effect of this program. Anybody who is in the process of seeking new RUS funding should expect their request to go on hold for a year since the RUS will now be swamped with administering this new crash grant program. It took years for the RUS to recover from the crush of the Stimulus broadband grants and they are about to get buried in grant requests again.

It was just a year ago that there were numerous industry articles asking if cord cutting was real. There were many who thought that cord cutting would fizzle out and would not be a big deal for the cable industry. But the numbers are now in from Leichtman Research Group for the end of 2017, and they show that cord cutting is quite real. The following numbers compare the fourth quarters of 2017 and 2016.

Company         4Q 2017       4Q 2016        Change   % Change
Comcast      22,357,000    22,508,000     (151,000)      -0.7%
DirecTV      20,458,000    21,012,000     (554,000)      -2.6%
Charter      16,997,000    17,236,000     (239,000)      -1.4%
Dish         11,030,000    12,025,000     (995,000)      -8.3%
AT&T          3,657,000     4,281,000     (624,000)     -14.6%
Cox           4,200,000     4,290,000      (90,000)      -2.1%
Verizon       4,619,000     4,694,000      (75,000)      -1.6%
Altice        3,405,500     3,534,500     (129,000)      -3.6%
Frontier        961,000     1,145,000     (184,000)     -16.1%
Mediacom        821,000       835,000      (14,000)      -1.7%
Cable ONE       283,001       320,246      (37,245)     -11.6%
Total        88,788,501    91,880,746   (3,092,245)      -3.4%

These companies represent roughly 95% of the entire cable market, so these numbers tell the story of the whole market. From what I can see from many of my clients, many small cable companies are likely doing even worse than the big companies.

What’s probably most significant to me in these numbers is that overall industry cable penetration dropped to 70% by the end of 2017, down from a high of 75% a few years ago. There were 126.2 million households at the end of 2017, per Statista, and only 70% of them are buying traditional cable – and that number has certainly dropped further into 2018.

The rate of growth of cord cutting is increasing. In 2016 the industry lost just over 1 million customers and in one year that grew to over 3 million.

It’s not hard to see where these customers went. FierceCable reported recently that 5% (over 6 million) of US households subscribe to a vMVPD service – online services that carry smaller bundles of traditional cable channels, like Sling TV, PlayStation Vue and DirecTV Now. It’s easy to forget that just a year ago most of these services were just getting started.

It’s worth noting that AT&T overall saw only a minor drop in total cable subscribers. While AT&T and their DirecTV subsidiary lost 1.2 million traditional customers, DirecTV Now has gained just over 1.1 million customers. But this still has to be hurting the company since analysts all believe that the margins on the vMVPD services are much slimmer than traditional cable.

Also of note are the large percentage losses of cable customers at Dish, Frontier and Cable ONE.

Another way to consider these losses is on a daily basis: the industry lost nearly 8,500 customers per calendar day during the year.
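Both the 70% penetration figure and the per-day loss fall straight out of the totals above. A quick check, using the 126.2 million household count cited earlier:

```python
subs_2017 = 88_788_501     # total traditional cable subscribers, 4Q 2017
subs_2016 = 91_880_746     # 4Q 2016
households = 126_200_000   # US households at end of 2017, per Statista

penetration = subs_2017 / households          # about 70% of households
lost_per_day = (subs_2016 - subs_2017) / 365  # about 8,500 customers per day
```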

It’s obvious in looking at these numbers that the cable industry is now in the same kind of free fall we saw a decade ago with landline telephones. The phenomenon is widespread – 3 million cord cutters means this is happening in every neighborhood in the country. I believe that the pace of cord cutting will continue to accelerate. I’ve looked around my own neighborhood and can’t find anybody who hasn’t either cut the cord or started thinking about doing so.

What surprises me the most is that the big cable companies are not screaming to Congress and the FCC to change the rules governing traditional cable. Those rules force the big channel line-ups, and cord cutting shows that people can be happy with far less than what the programmers are selling. Cable companies could be offering more of the skinny bundles offered by the vMVPDs and could retain more bundled customers.

US Representative Anna Eshoo of California has submitted a ‘dig once’ bill every year since 2009, and the bill finally passed in the House. For this to become law the bill still has to pass the Senate, but it got wide bipartisan support in the House.

Dig once is a simple concept: it would mandate that when roads are under construction, empty conduit is placed in the roadbed to provide inexpensive access for anybody who later wants to bring fiber to an area.

This would apply to Federal highway projects, but also to state projects that get any federal funding. It encourages states to apply this more widely.

For any given road project there would be ‘consultation’ with local and national telecom providers and conduit would be added if there is an expected demand for fiber within 15 years.

The conduit would be installed under the hard surface of the road at industry standard depths.

The conduits would contain pull tape that would allow for easy pulling of fiber in the future.

Handholes would be placed at intervals consistent with industry best practices.

This all sounds like good stuff, but I want to play devil’s advocate with some of the requirements.

The initial concept of dig once was to never pass up the opportunity to place conduit into an ‘open ditch’. The cost of digging to put in conduit probably represents 80% of the cost of deployment in most places. But this law is not tossing conduit into open construction ditches. It instead requires that the conduit be placed at depths that meet industry best practices. And that is going to mean digging at a foot or more deeper than the construction that was planned for the roadbed.

To understand this you have to look at the lifecycle of roads. When a new road is constructed the road bed is typically dug from 18 inches deep to 3 feet deep depending upon the nature of the subsoil and also based upon the expected traffic on the road (truck-heavy highways are built to a higher standard than residential streets). Typically roads are then periodically resurfaced several times when the road surface deteriorates. Resurfacing usually requires going no deeper than a few inches into the roadbed. But at longer intervals of perhaps 50 years (differs by local conditions) a road is fully excavated to the bottom of the roadbed and the whole cycle starts again.

This means that the conduit needs to be placed lower than the planned bottom of the roadbed. Otherwise, when the road is finally rebuilt all of the fiber would be destroyed. And going deeper means additional excavation and additional cost. This means the conduit would not be placed in the ‘open ditch’. The road project will have dug out the first few feet of the needed excavation, but additional, expensive work would be needed to put the conduit at a safe depth. In places where the substrate is rock this could be incredibly expensive, but it wouldn’t be cheap anywhere. It seems to me that this shifts the cost of deploying long-haul fiber onto road projects rather than onto fiber providers. There is nothing wrong with that if it’s the national policy and there are enough funds to pay for it – but in a country that already struggles to maintain its roads, I worry that this will just mean less money for roads, since every project just got more expensive.

The other issue of concern to me is handholes and access to the fiber. This is pretty easy for an Interstate and there ought to be fiber access at every exit. There are no customers living next to Interstates and these are true long-haul fibers that stretch between communities.

But spacing access points along secondary roads is a lot more of a challenge. For instance, if you want a fiber route to serve businesses and residents in a city, that means an access point every few buildings. In more rural areas it means an access point at every home or business. Adding access points is the second most labor-intensive part of the cost after the construction itself. If access points aren’t where they are needed, in many cases the fiber will be nearly worthless. It’s probably cheaper in the future to build a second fiber route with the proper access points than it is to try to add them to a poorly designed existing route.

This law has great intentions. But it is based upon the concept that we should take advantage of construction that’s already being paid for. I heartily support the concept for Interstate and other long-haul highways. But the concept is unlikely to be sufficient on secondary roads with lots of homes and businesses. And no matter where this is done it’s going to add substantial cost to highway projects.

I would love to see more fiber built where it’s needed. But this bill adds a lot of cost to building highways, which are already underfunded in this country. And if not done properly – meaning placing fiber access points where needed – this could end up building a lot of conduit that has little practical use for a fiber provider. By making this a mandate everywhere, it is likely to mean spending a whole lot of money on conduit that might never be used, or used only for limited purposes like feeding cellular towers. This law is not going to create fiber that’s ready to serve neighborhoods or those living along highways.


For the second year in a row, Turner Sports, in partnership with CBS and the NCAA, will be streaming March Madness basketball games in virtual reality. Watching the games comes with a few catches. The content can only be viewed on two VR headsets – the Samsung Gear VR and the Google Daydream View. Viewers can buy individual games for $2.99 or all of them for $19.99. And a viewer must be subscribed to the networks associated with the broadcasts – CBS, TNT, TBS and truTV.

Virtual reality viewers get a lot of options. They can choose which camera to watch from or else opt for the Turner feed that switches between cameras. When the tournament reaches the Sweet 16 viewers will receive play-by-play from a Turner team broadcasting only for VR viewers. The service also comes with a lot of cool features like the ability to see stats overlays on the game or on a particular player during the action. Games are not available for watching later, but there will be a big library of game highlights.

Last year Turner offered the same service, but only for six games. This year the line-up has been expanded to 21 games, including selected regional games in the first and second rounds plus the Sweet Sixteen and Elite Eight games. The reviews from last year’s viewers were mostly great and Turner is expecting a lot more viewers this year.

Interestingly, none of the promotional materials mention the needed bandwidth. The cameras being used for VR broadcasts are capable of capturing virtual reality in 4K, but Turner won’t be broadcasting in 4K because of the required bandwidth. Charles Cheevers, the CTO of Arris, said last year that a 4K VR stream requires at least a 50 Mbps connection. That’s over 30 times more bandwidth than a Netflix stream.

Instead these games will be broadcast in HD video at 60 frames per second. According to Oculus that requires a data stream of 14.4 Mbps for ideal viewing; viewing at slower speeds results in missing some of the frames. Many VR viewers complain about getting headaches while watching VR, and the primary reason for the headaches is missing frames. While the eye might not notice the missing frames, the brain apparently can.

One has to ask if this is the future of sports. The NFL says it’s not ready yet to go to virtual reality until there is more standardization between different VR sets – they fear for now that VR games will have a limited audience due to the number of viewers with the right headsets. But the technology has been tried for football and Fox broadcast the Michigan – Notre Dame game last fall in virtual reality.

All the sports networks have to be looking at the Turner pricing of $2.99 per game and calculating the potential new revenue stream from broadcasting more games in VR in addition to traditional cable broadcasts. Some of the reviews I read of last year’s NCAA broadcasts said that after watching a game in VR, normal TV broadcasts seemed boring. Many of us are familiar with this feeling. I can’t watch linear TV any more – it’s not just sitting through the commercials, but being captive to the stream rather than watching the way I want. We can quickly learn to love a better experience.

Sports fans are some of the most intense viewers of any content. It’s not hard to imagine a lot of sports fans wanting to watch basketball, football, hockey or soccer in VR. Since the format favors action, it’s not hard to imagine it also drawing viewers to rugby, lacrosse and other fast-moving sports.

It’s possible that 4K virtual reality might finally be the application that justifies fast fiber connections. Nothing else on the Internet today requires that much speed plus low latency. Having several simultaneous viewers in a home watching 4K VR would require speeds of at least a few hundred Mbps. And you don’t need to look too far ahead to imagine virtual reality in 8K, requiring a data stream of at least 150 Mbps – which might be the first home application that can justify a gigabit connection.
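The household math behind that claim can be sketched with the per-stream rates quoted in this post; the 25% headroom factor below is my own illustrative assumption for overhead and other household traffic, not a figure from any of the sources cited.

```python
# Per-stream rates as quoted in this post (Mbps); actual rates vary by codec.
HD_VR_60FPS = 14.4   # Oculus figure for HD VR at 60 frames per second
VR_4K = 50           # Arris estimate for a 4K VR stream
VR_8K = 150          # projected rate for an 8K VR stream

def household_demand_mbps(viewers, rate_mbps, headroom=1.25):
    """Aggregate connection speed needed for several simultaneous VR
    viewers, padded 25% for protocol overhead and other household
    traffic (the headroom factor is an illustrative assumption)."""
    return viewers * rate_mbps * headroom

print(household_demand_mbps(3, VR_4K))   # three 4K VR viewers -> 187.5 Mbps
print(household_demand_mbps(3, VR_8K))   # three 8K VR viewers -> 562.5 Mbps
```

Even three simultaneous 4K VR viewers push past what many cable tiers deliver today, and the 8K case lands squarely in gigabit territory.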


The FCC voted yesterday that telecom deployments are now exempt from any environmental or historic preservation reviews. This is seen as the first step at the FCC making it easier to deploy 5G.

It’s an interesting rule change because in my experience those rules have more often applied to federally funded broadband projects than to local ones. For example, the BTOP stimulus grants added costs to every project by requiring both environmental and historic preservation reviews – even when it was obvious they didn’t apply. The vast majority of telecom deployments want to put fiber or telecom equipment into already-established rights-of-way. It’s hard to justify doing an environmental review when fiber is to be laid on the shoulder of an existing road or on poles. And the vast majority of 5G equipment will be on utility poles, light poles or buildings, so it’s hard to think that there can be much environmental impact from using established rights-of-way.

But that doesn’t mean that there is never a reason for a locality to have these requirements. Consider my town of Asheville, NC. There is a neighborhood around the Biltmore mansion that has strict zoning codes to keep it looking historic. The City ought to have the option to review and approve 5G or any utility deployments that might clutter the historic nature of such an area. Cities have often forced utilities into the ground in historic districts, but 5G transmitters can’t be put underground, by definition. I’ve seen some proposed small cell transmitters that are large and unsightly, and it doesn’t seem unreasonable for a community to have some say into where such gear can be used. Do we really need to see unsightly telecom equipment in Williamsburg, the Gettysburg battlefield or near the Liberty Bell? Local communities also need to have some say before a telecom deployment disturbs graves or archaeological sites.

The same goes for an environmental review. A better rule would be to only allow an environmental review when new telecom facilities are to be built into virgin rights-of-way – and where there is some local concern. We don’t really want to allow somebody to lay fiber through a sensitive wetland or bird sanctuary without some local say in the deployment.

These rules are the first step in what is perceived as the FCC’s desire to preempt all local control over 5G deployments. This FCC created several Broadband Deployment Advisory Committees (BDACs) to look at industry issues, and one of these committees looked at ‘Removing State and Local Regulatory Barriers’. It’s fairly obvious from the name of the group that they made a long list of local regulations that should be preempted.

That BDAC group essentially recommended that the FCC override all local control of rights-of-ways or any kind of review of telecom infrastructure deployment. Their recommendations read like a wish list from the lobbyists of the large cellular carriers and ISPs. If the FCC enacts all of the BDAC group’s recommendations, they will have handed over control of the 5G deployment process to wireless carriers and ISPs with no local say in the process.

I am certainly sympathetic to carriers that encounter major barriers to infrastructure deployment. I will have a blog coming soon on a particularly egregious abuse of local authority that is greatly increasing the cost of a rural fiber deployment. But I’ve worked with hundreds of fiber deployments and for the most part local rules are sensible and realistic. Cities have legitimate concerns over fiber deployments. They usually insist on getting as-built records so they know what is deployed under their streets. They often require contractors to use sensible traffic control and to clean up after construction. And they often charge fees that compensate the city for processing permits, for locating existing utilities and for inspecting the construction. If these kinds of rules are overridden by the FCC we’ll soon see horror stories of fiber builders who dig up streets and then walk away with no consequences. In my experience local rules are needed to stop utilities from taking shortcuts to save money.

I was talking to a colleague about this topic and they asked if we really need to be as concerned about 5G as we are about fiber deployments. After some thought my answer is yes – the same sort of common-sense local rules need to be allowed for 5G. I picture a future where there will be multiple companies deploying 5G into neighborhoods. It’s not hard to picture wireless devices of various sizes hanging from every available pole and structure. It’s not hard to envision wireless providers erecting 100’ poles on streets to reach above the tree canopy. It’s not hard to envision 5G providers drastically trimming trees to get line of sight to homes. I know I want my city to have some say in this before AT&T and Verizon make a mess out of my own street.

I am sure these new rules will be challenged in court. The legal question will be whether the FCC has the authority to override local laws on these issues. I have no idea how the law might apply to environmental or historic preservation reviews. But if the FCC tries to do the same with 5G pole attachments, it runs smack into the Telecommunications Act of 1996, which gives states (and by inference, localities) the ability to craft their own laws concerning poles, conduits and rights-of-way. It’s always a tug of war when the FCC tries to override states, and the courts are almost always the final arbiter of these attempts.

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. But I foresee data caps being reintroduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps. Comcast backed down from the plan due to pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can instead cause them trouble in other areas.

To put that Comcast data cap into perspective, in September of 2017 Cisco predicted that home downloading of video would increase 31% per year through 2021. They estimated the average household data download in 2017 was already around 130 GB per month. You might think that means most people wouldn’t be worried about the data caps. But it’s easy to underestimate the impact of compound growth – at a 31% growth rate the average household download of 130 GB would grow to 383 gigabytes by 2021, considerably over Comcast’s proposed data cap.
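The compounding behind that 383 GB figure is easy to verify; the starting point and growth rate are the Cisco estimates cited above.

```python
# Compound growth: 130 GB/month in 2017, growing 31% per year through 2021.
usage_gb = 130.0
for year in (2018, 2019, 2020, 2021):
    usage_gb *= 1.31   # 31% annual growth in household downloads

print(round(usage_gb))  # -> 383 GB per month by 2021
```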

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data caps would punish those who drop their video. My daughter is off to college now and our usage has dropped, but when she was a senior Comcast sent us a report saying we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

Charter, Altice, Verizon and Frontier have no data caps.

Comcast moved their data cap to 1 terabyte, with a charge of $10 for each additional 50 GB, or $50 monthly for unlimited downloading.

AT&T has some of the stingiest data caps. The cap on DSL is 150 GB, on U-verse 250 GB, and on 300 Mbps FTTH 1 TB; gigabit service is unlimited. They charge $10 per extra 50 GB.

CenturyLink has a 1 TB cap on DSL and no cap on fiber.

Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.

Cable One has no overage charge but largely forces customers who exceed the cap to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps connection is 50 GB.

Mediacom has perhaps the most expensive data caps – the cap on 60 Mbps service is 150 GB and on 100 Mbps service is 1 TB. The charge for exceeding the cap is $10 per GB, or $50 for unlimited.

Other than those of AT&T, Mediacom and Cable One, none of the caps sound too restrictive.
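To see what an overage actually costs a household, here is a sketch using the 2015-style Comcast plan described earlier (300 GB cap, $10 per extra 50 GB, $30 for unlimited); the assumption that a customer would take whichever option is cheaper is mine.

```python
import math

def monthly_overage(usage_gb, cap_gb=300, block_gb=50,
                    block_fee=10, unlimited_fee=30):
    """Overage charge under a metered cap, assuming the customer
    pays the cheaper of per-block fees vs. the unlimited add-on."""
    if usage_gb <= cap_gb:
        return 0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)  # partial blocks billed in full
    return min(blocks * block_fee, unlimited_fee)

print(monthly_overage(600))  # the 600 GB household from above -> $30
print(monthly_overage(250))  # a light user stays under the cap -> $0
```

A cord-cutting household like the author’s would simply hit the $30 unlimited ceiling every month – effectively a permanent rate increase.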

Why do I think we’ll see data caps again? All of the big ISPs are looking ahead just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still adding broadband customers, mostly by taking them from DSL. But they understand that the US broadband market is approaching saturation – much as happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and also is seeing good success with security and smart home service. But even they will be impacted when broadband sales inevitably stall – other ISPs will feel the pinch before Comcast.

ISPs only have a few ways to make more money once customer growth has stalled, the primary one being higher rates. We saw some modest increases in broadband rates earlier this year – notable because rates had been unchanged for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates – which go straight to the bottom line for ISPs. The impact of broadband rate increases is major for these companies – Comcast and Charter, for example, make an extra $250 million per year from a $1 increase in broadband rates.
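A quick sanity check on that $250 million claim: the arithmetic is just subscribers times twelve months. The subscriber count used below is a round illustrative assumption of my own, not a reported figure.

```python
# A $1/month broadband rate increase adds subscribers * 12 dollars per year,
# essentially all of it falling straight to the bottom line.
def annual_revenue_gain(subscribers, monthly_increase=1):
    return subscribers * monthly_increase * 12

# Roughly 21 million broadband subscribers (an assumed round number)
# yields about $250M per year from a $1 rate increase.
print(annual_revenue_gain(21_000_000))  # -> 252000000
```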

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying that they are only charging more for those who use the network the most. As earnings pressure mounts I can’t see these companies passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives with each passing year. And since the FCC has now walked away from broadband regulation, there will be no future regulatory hindrance to the return of stricter data caps.