Having recently won a landmark victory upholding the Federal Communications Commission’s authority to establish and enforce Net Neutrality regulations, it wouldn’t surprise me if FCC chairman Tom Wheeler were hesitant to keep pushing his luck, particularly when it comes to investigating the popular zero-rating trend.

The argument, of course, is that while zero-rating offers certain streaming video services at no data cost, such practices give broadband providers a great deal of power over subscribers’ online activity: they allow carriers to direct users towards certain favoured services, give them leave to throttle or otherwise manage data streams, and ostensibly let them block (or at least render completely irrelevant) services unwilling to participate. So will an investigation finally begin?

Video streams offered by Verizon’s own Go90 are not counted towards consumers’ data caps

As the table shows, with the exception of Sprint all of the top nationwide carriers offer some sort of zero-rating program, with T-Mobile leading the way with its ever-expanding Binge On service, while AT&T and Verizon offer similar, yet distinctly more in-house, services to exempt certain content from data caps.

While on the face of it zero-rating seems like an amazing pro-consumer service, where subscribers get access to data-gobbling video streams without them counting against their monthly allotments, consumer advocacy groups including Fight for the Future and Free Press have reportedly delivered 100,000 letters to the FCC, written by consumers criticizing data caps and such exemptions.

For its part, the FCC is not convinced. “We’re collecting information, as I’ve been telling you for months, we’re in ongoing discovery mode to try and have an understanding of just what is the spectrum that we’re dealing with here so that we can deal with these issues on a case by case basis,” Wheeler said, on a day when protesters had delivered petitions on the issue.

As one of the issues that has been near and dear to my heart since I began writing here at thetelecomblog (and so perhaps it’s appropriate it serves as my last post), I have to say I’m interested to see where this goes. In fact, given the deviously clever nature of zero-rating—that it masks clear Net Neutrality violations in a pro-consumer service—I would argue that this might serve as the greatest test the FCC will ever face in establishing and preserving a free and open Internet.

If the Commission has the stones to tackle something people like, but which is ultimately not in the public’s best interest, I think we’ll be well on our way towards true net neutrality. Unfortunately if history has shown us anything, what we’re in for is months of heel dragging followed by the inevitable knuckling under. Where’s John Oliver when you need him?

In the 2004 dystopian action movie “I, Robot,” the main character (played by Will Smith) harboured a great deal of resentment towards advanced robotic assistants because of their inability to make complex moral decisions. In fact, as you find out through the course of the film, a robot had chosen to save his life, rather than that of a young girl, based on logical calculations of their respective chances of survival during a catastrophic car accident. The point was simple: the decision-making power of robots will always be flawed because they lack the emotional capacity to make nuanced moral choices.

While a decade ago considering moral theory as it relates to robotics might have seemed like some futuristic thought experiment, today it has become a reality, as the advent of self-driving cars is presenting unique moral challenges, particularly related to what decisions robotic cars should make in the event of a crash.

The fact of the matter is that while self-driving cars purport to deliver advantages related to more efficient traffic systems, reduced accidents and lower emissions, even robots will get into accidents, and autonomous vehicles will have to decide how to respond to those accidents and make decisions as to who might be injured in them: passengers or pedestrians.

In that split second of an automobile accident the driver may make thousands of instantaneous moral decisions, particularly related to the safety of passengers, pedestrians, and self. Not only that, but repeat the same accident scenario 1000 times with 1000 different people, and you might get 1000 different outcomes, each person making unique instinctual choices (as far as it’s possible to make “choices” in such instances).

In fact, as you read this you might think your immediate response would be naturally altruistic, that you would look to save others before yourself; or perhaps you’re more concerned about you and yours, thinking of personal safety first and foremost. Say what you will about one’s innate propensity towards either end, these are decisions that we make, and thus they’ll need to be decisions that autonomous vehicles make as well.

But here’s the rub: according to new research by the University of California, when people were asked whether self-interest or the public good should predominate when programming moral principles into self-driving robotic cars, most approved of the concept of a self-driving car sacrificing a passenger (or passengers) to save others, yet those same people would rather not purchase or ride in such vehicles. Or, to put it another way, “participants were less likely to purchase a self-driving car that would sacrifice them and their passengers.”

“Defining the algorithms that will help AVs make these moral decisions is a formidable challenge. We found that participants in six Amazon Mechanical Turk studies approved of utilitarian AVs (that is, AVs that sacrifice their passengers for the greater good) and would like others to buy them, but they would themselves prefer to ride in AVs that protect their passengers at all costs,” the study authors said.

Simply put, people like the idea of self-sacrifice, but they have trouble when it might be demanded of them.
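The tension the study describes can be made concrete with a toy sketch. Everything below is invented for illustration (the survival probabilities, the passenger counts, and both decision rules); no real autonomous vehicle works this way:

```python
def utilitarian_choice(n_passengers, p_passengers_survive_swerve,
                       n_pedestrians, p_pedestrians_survive_stay):
    """Pick the action that maximizes expected survivors overall."""
    # Expected survivors if the car swerves, risking its own passengers
    swerve = n_passengers * p_passengers_survive_swerve + n_pedestrians
    # Expected survivors if the car holds course, risking the pedestrians
    stay = n_passengers + n_pedestrians * p_pedestrians_survive_stay
    return "swerve" if swerve > stay else "stay"


def self_protective_choice(n_passengers, p_passengers_survive_swerve,
                           n_pedestrians, p_pedestrians_survive_stay):
    """The rule buyers in the study preferred for their own car:
    protect the passengers at all costs, regardless of the numbers."""
    return "stay"


# One passenger, five pedestrians: the two rules diverge exactly where
# survey respondents did.
print(utilitarian_choice(1, 0.2, 5, 0.1))      # swerve
print(self_protective_choice(1, 0.2, 5, 0.1))  # stay
```

The gap between the two functions is the whole problem: people endorse the first rule for everyone else’s car and the second for their own.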

Going forward it will be interesting to see how the automotive and technology industries meet these unique challenges, finding ways to satisfy the seemingly incompatible objectives of responding consistently, not causing public outrage, and not alienating buyers. Given the outcome of the study mentioned above, finding an algorithm that aligns with complex, nuanced (and, not to mention, fluctuating) human values will be challenging indeed.

Granted, Google has unveiled its own Nexus line of devices, but over the years those projects have been done in partnership with a variety of Android vendors, almost like Google throwing a bone to the likes of HTC, Huawei, and LG.

For a second let’s say all these rumours prove true: that Google is planning on closing the Android ecosystem, likely charging licensing fees for Android, and otherwise not allowing partners to alter the platform. Given the changes in Google’s traditional Android revenue streams, it makes complete sense.

You see, for years now Google has made money off Android in one way, and one way only: advertising. The company has linked its entire revenue chain to partners building unique devices that get people to buy apps and click on advertising. While for years I’ve questioned the viability of such a strategy, there’s no question that it worked…at least while the advertising industry operated like the Wild West.

Not only that, but if you’re going to close the Android ecosystem, what better way than on your own top of the line Android device? The search giant will be able to better control Android development, updates, and security, as well as better control the product that goes to market.

Of course this would serve as a titanic shift in the mobile world, as for the first time Google and Apple, both of whom dominate the mobile OS market, would be competing head-to-head in mobile hardware as well, to say nothing of what it would do to the already shaky Android ecosystem.

In today’s world you might think that the last people to be reached by broadband service would be those in remote or rural locations, particularly given that many of us in North American urban centres likely consider broadband access and affordability to be an integral and ubiquitous part of our very existence. In fact, I’ll admit that I’ve long considered city life and broadband access to go hand-in-hand…but not so.

To put it another way, in a classic tale of haves and have-nots, being part of the so-called unconnected billions (by the study’s own numbers, about 2.2 billion people in cities alone) has little to do with where you live within a given country, and everything to do with how much money you have while living there.

According to the findings of the study, conducted by research group Maravedis, there not only exists a shockingly high number of unconnected people within city centres, but within each of those individual cities there exists a wide disparity of broadband access. This means that a sizeable population percentage inside many large cities is excluded from the digital age, “either because they cannot afford the service or because the service is simply not available in their neighbourhood.”

Now perhaps that’s not entirely shocking, given that many of the larger cities in the developing world still lack the resources and infrastructure to provide widespread broadband service, but again surprisingly the study found that even in large, developed cities, like New York and Shanghai, more than a quarter of the population were still without broadband service.

Granted, the number of unconnected in urban centres varied by region and country, as one might expect: cities in the two-thirds world tended to have a higher percentage of residents without broadband access than did cities in Europe or North America. To that end, it makes sense that the study found urban centres across the Middle East, followed by Asia, topped the list with the highest percentage of the population without Internet access. But again, those are people living in large urban centres, the places one might think Internet infrastructure, coverage, and availability would be at their highest.

I mean, consider the moonshot Internet connection campaigns launched by Google and Facebook in recent years, where balloons and drones are being deployed to beam Internet access down to places where Internet infrastructure is weakest. Now when we think of those kinds of far-off, unconnected places we undoubtedly assume we’re talking about rural life around the world, but not so, meaning that perhaps Google and Facebook might want to focus on connecting cities first, and expand from there.

All that to say, where a person lives, whether in the city or in the country, seems to have less to do with the available access to broadband service than does wealth and affluence, and that means that some of those unconnected billions may be our friends and neighbours, not just some faceless population of people on the other side of the world. What it also means is that cities, not telcos or ISPs, need to start taking responsibility for delivering broadband infrastructure, considering it a public utility for everyone rather than a luxury for the rich.

For more than a year now telcos and vendors have been championing the Internet of Things (IoT) as the greatest new vertical revenue stream for operators, arguing that the development of 5G network technology will open up a brave new world for ubiquitous, sustained connectivity of our entire digital existence. The sustained connection of ever-proliferating devices equals increased profits…or so the dream goes.

As Entwistle said earlier this week, “I’m perfectly prepared to accept that the internet of things is extraordinarily interesting to equipment makers and vendors, to systems integrators, to policy makers, and to people concerned with the social role of communications services in our lives, but there is an awful lot of noise about the internet of things that doesn’t actually translate into, to put it strongly, a whole hill of beans for the telecoms operator who’s looking to sell services to achieve revenue per customer or revenue per device.”

Entwistle provided several reasons why IoT may not be the saviour the telecom industry thinks it is: First, while the growth of IoT presents unique opportunities for vendors, system integrators and other market participants, it offers much less for network operators (if they remain solely operators, of course). As Entwistle explains, “Traffic volumes and revenue yields tend to be low and margins tend to be thin,” meaning that even as data traffic increases, it won’t deliver the windfall telcos are hoping for.

Finally, with the hype surrounding IoT, particularly in niche markets like driverless cars, regulators are now getting involved, and that could end very badly for the telecom industry, as pressure to commit to “non-commercial investments” as part of the greater good of growing IoT for everyone will likely mean losses for telcos. As Entwistle notes, when that happens, “Distortion, waste and ‘crowding out’ are the likely results.”

“The telecoms operator will not see a single penny from any one of those devices; they sell a 5Gbps fibre into the data room of the hospital today, and in 10 years’ time they’ll probably still be selling a 5Gbps connection, or 10Gbps fibre at half the price of today’s 5Gbps fibre,” he claimed.

“I can’t see any business case for a telecoms operator.”

Now telecom operators are completely unprepared for this radically different reality, he said, with many simply assuming that multiplying devices means multiplying revenues. Not so.

“An operator said: ‘We will have 1,000 times as many devices and we only need 1,000th of the ARPU [average revenue per user] in order to build a business as big as our existing business’. That’s not a business plan, that’s just multiplying two numbers together and making a brave assumption,” he argued.
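The “brave assumption” is easy to stress-test with back-of-envelope arithmetic. Revenue under the quoted plan does come out the same, but per-device costs (billing, provisioning, support) don’t shrink a thousandfold, so the margin collapses. All of the dollar figures below are hypothetical, chosen only to show the shape of the problem:

```python
def annual_profit_cents(devices, arpu_cents, cost_cents):
    """Profit = devices x (revenue per device - cost per device)."""
    return devices * (arpu_cents - cost_cents)

# Today (hypothetical): 1M subscribers at $600/yr ARPU, $100/yr to serve each
today = annual_profit_cents(1_000_000, 60_000, 10_000)

# The quoted plan: 1,000x the devices at 1/1,000th the ARPU ($0.60/yr).
# Revenue is identical, but if serving each device still costs even
# $1/yr, the "business as big as our existing business" loses money.
iot = annual_profit_cents(1_000_000_000, 60, 100)

print(today // 100)  # $500,000,000 in profit
print(iot // 100)    # a $400,000,000 loss, despite identical revenue
```

Multiplying two numbers together keeps revenue flat; it is the third number, cost per device, that the plan quietly assumes away.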

Now I have to say that Entwistle makes some compelling points: much of the hype surrounding the growth of the Internet of Things means little to the telecom industry when it comes to growing profits and finding untapped revenue streams. In fact, IoT seems to be creating a perfect storm for the telecom industry, where telcos will have the network resources necessary to grow IoT, yet due to regulations and restrictions will likely have less opportunity to profit off their participation than, say, IoT hardware vendors.

Even as Canadians attempt to sort out their own Net Neutrality regulations, repairing the debacle left by previous federal administrations, there is one Canadian city that is hoping to pave the way forward towards establishing high speed broadband service as a public utility, even if federal regulatory bodies are unwilling to define it as such.

“BridgeNet is a key element in our Intelligent City initiative,” said New Westminster City Councillor Bill Harper. “This is part of a strategy to attract knowledge-based startups and high-tech companies into the city. There are a lot of pieces to this plan, but the idea is to come up with a cohesive strategy for building a health-care cluster.”

Frustrated by the slow upgrade schedule of the country’s main telecom companies, the city has decided to piggyback the installation of gigabit broadband service to its other public utilities, meaning that whenever a road is repaired or a new community created, Internet is now added as part of the infrastructure.

If you ever wondered what treating broadband service as a public utility looks like, well here it is folks.

“Our economy is shifting from basically the old industrial model to the new digital innovation model,” Harper said, adding that the hope is that pushing Internet to the community will foster business growth and reduce disparity in information access.

That’s not to say the city is going into business as an Internet service provider, though, only that it has taken on the task of creating the backbone network to serve the city. Harper noted that four ISPs have already signed on to lease the network from the city, proceeds from which will be used both to recoup the capital investment and to reinvest into continued expansion of the project.

According to Harper, it makes sense for the city to be involved in the development and expansion of high speed gigabit broadband service because the city is uniquely equipped to provide the infrastructure at a lower cost. “This is a city-funded and city-owned network, and because the conduit is already there the installation is much less expensive,” said Harper. The city will spend $9 million over five years on the installation, an investment it hopes to see returned within a decade.
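The reported figures imply a simple break-even target: a $9 million build recouped within ten years requires roughly $900,000 a year in net lease revenue. The even four-way split below is my own illustrative assumption, not something the city has said:

```python
capital = 9_000_000   # reported five-year installation cost, in dollars
payback_years = 10    # the hoped-for payback window

# Net lease revenue needed per year to recoup the capital on schedule
required_annual_revenue = capital / payback_years

# Illustrative only: what each ISP would owe if the four that signed on
# split the target evenly
per_isp_share = required_annual_revenue / 4

print(f"${required_annual_revenue:,.0f}/yr in net lease revenue")
print(f"${per_isp_share:,.0f}/yr from each of the four ISPs")
```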

As it stands, the network will first serve the city’s municipal centre, business district, office buildings, and new high-density residential developments. “That revenue will be re-invested to expand the system to the rest of the city over time,” Harper explained.

So as the telecom industries both north and south of the border continue to rail against treating Internet access as a public utility, consider the City of New Westminster as a prime example of what that really looks like, where cities invest in their local broadband infrastructure for the betterment of the entire community, not needing to rely on the languorous upgrade schedules of telcos to dictate when people can get connected to fast, effective and affordable Internet service.

Over the years the one thing that has set Google’s Android operating system apart from Apple’s own iOS is that Android has always been open source (sort of), available for all to tinker with and modify…if they’re willing to live by Google’s rules, of course. By contrast, Apple has always controlled its proprietary platform, not allowing anyone else to modify it or use it in any way. Two paths to success, both with significant positives and huge drawbacks.

For Android the greatest drawback has always been fragmentation: with so many people deploying so many different versions of Android, security and cohesion across all devices become a serious issue. For iOS, the problem has always been Apple’s draconian control, dictating everything with a “take it or leave it” attitude.

But as friendly and open as Android has appeared over these last few years, it seems Google may be poised to follow Apple’s path towards proprietary control. Rumours continue to circulate that, given the company’s desire to gain better control over Android development, it will have to combat fragmentation, and the only way to do that would be to standardize a closed version of the platform.

Such speculation has once again come to the fore during this week’s Recode conference, where Google CEO Sundar Pichai discussed at length his firm’s desire to “put more thought” into its own branded Nexus line of devices. For many, such thought is veiled code for “take more control,” and to do that, Google is going to have to change the way Android is run.

As one blogger notes, it’s likely that Google is going to want to improve the user experience on its own Nexus devices by offering more features built into its platform. But to ensure those features perform up to standard and don’t drain the battery, the company will need to implement tweaks and changes to Android’s code. What many suspect is that this process of tweaks and changes will invariably lead Google to the realization (or perhaps simply the admission) that the only way to ensure a consistently high-quality user experience will be to create a fully closed, proprietary version of Android for its Nexus line.

Then, once that closed version of Android is in the wild, and of course assuming that it’s considerably better than the open source version we currently have, Android partners will begin lining up to adopt it in their own Android devices.

That would then lead, I assume, to a two-tiered approach to Android deployment, the free, open-source version for entry level partners in the Android ecosystem, the closed, proprietary version for the serious, established vendors.

While open source is what made Android so popular, security concerns are growing and the user experience is increasingly differentiated across the entire ecosystem. By creating a proprietary version of Android, Google would be able to fix the fragmentation issues that have always plagued the platform, gaining better control of software updates and, thus, of device security and the overall experience. Of course, given that Google has always held considerable control over Android anyway, perhaps calling it open source has always been a bit of a misnomer.

Proving that people are little more than trained seals, slapping their fins together when you toss them a free snack, T-Mobile acknowledged via Twitter late last week that it had lost Domino’s as part of the latest un-carrier promotion, explaining that the company simply couldn’t keep up with customer demand for the offered free two-topping pizzas.

But more to the point, this just goes to show you how woefully inadequate I am at gauging consumer behaviour, as once again I’ve attempted to give consumers the benefit of the doubt, arguing that they won’t be fooled by such woeful attempts at pandering, only to be once again surprised when they are.

For T-Mobile, though, the departure of Domino’s pizza was simply a bump in the road, all part of the carrier’s strategy of rotating through various promotions and partners to give customers something new and fresh every week. During the first week the company offered pizza, free Frostys at Wendy’s, free GoGo Wi-Fi, and company stock, while week two saw the removal of the stock option and the addition of a $50 credit with ride-sharing service Lyft. It seems the carrier will continue with Lyft as the replacement for Domino’s going forward, although any other new additions have yet to be announced.

It also seems the carrier will be depending heavily on MLB related content promos this week as well, as they’ve quietly added a $20 coupon to the MLB online store along with contests to win MLB gift cards and a trip to the mid-season All-Star game.

As evidence of just how attractive free stuff is for people, during the first week of promotions earlier this month the overwhelming volume of consumer traffic crashed the T-Mobile app, causing significant delays and a PR headache that saw company head John Legere take to Twitter to beg for patience.

But again, I simply can’t believe people are flocking to these cheap promotions, for while I thought T-Mobile was really on to something when it started abolishing overages, eliminating contracts, and otherwise revolutionizing the wireless market, this latest round of un-carrier promotions seems like nothing more than unabashed pandering to the lowest common denominator. Whatever works, I guess.

The Internet is a Public Utility. So has anything changed?

As we reported last week, a U.S. Court of Appeals upheld the Federal Communications Commission’s legal authority to implement and enforce Net Neutrality regulations, a landmark victory for the open Internet movement. With it the FCC has had its reclassification of broadband service as a public utility affirmed, and we now officially live in an age where Internet (like water, power, and phone service) is a publicly regulated service. So has anything changed?

The ironic thing about the entire fight over Net Neutrality is that the problems the regulations were conceived to battle (arbitrary network throttling, prioritized service, blocked traffic) have already been conceded by the broadband industry, with almost every ISP moving on to bigger and better (and not to mention far more complicated and convoluted) ways to manage Internet traffic the way they want.

What this means is after years of battling we finally have the regulatory structure in place to keep the broadband service providers from doing the things they were doing five years ago…practices they’ve long given up in favour of other, more nebulously ethical practices that fall outside the purview of the FCC’s rules.

Now one might argue that the last five years have been necessary to get the broadband industry to evolve into what we see today, and to that end, I suppose Net Neutrality has served its purpose. Without this fight we might very well have seen the establishment of a two-tiered Internet with fast and slow lanes, where the most costly traffic (like Netflix) and the least profitable traffic (like the lowest tiered service plan) are throttled, blocked, or otherwise “managed” to make room for those who paid for priority service.

Consider this: Under Net Neutrality regulations carriers aren’t allowed to arbitrarily favour any traffic over others, so instead of doing that, they made some data traffic free to the user (zero-rating), achieving the same goal of driving users to certain content or services, but with a complete end-around of the FCC’s rules.

Or this: Under Net Neutrality regulations carriers are not allowed to arbitrarily throttle data speeds, so instead they offer services that contain opt-out provisions about throttling, so consumers tacitly agree to have their data speeds slowed in order to access the free content or services they desire. Again, a complete end-around.

Or this: By the FCC’s regulations service providers are not allowed to block any content, so instead carriers have devised a way to have content “partners,” who again offer content or services free to the consumer, whereby all other non-partners are left out in the cold. Not blocked, to be sure, but not equal either.

So while we might see the broadband industry follow through on its threats to bring this fight to the Supreme Court, I happen to think they learned their lesson from Verizon’s ill-advised actions the first time around, when such threats and lawsuits only resulted in more regulations, not fewer. Given that the industry has already found FCC-approved workarounds to the regulations, maybe it’s time to leave well enough alone. So is this a victory for Net Neutrality and the fight for an open Internet? I’m not so sure.

In the telecom world (heck, in every service industry) company CEOs love to talk about customer service. Oh my goodness they like talking about it! Nary a quarterly conference call goes by without some talking head assuring investors that customers come first, that customer service is at the heart of everything they do, and that consistent customer satisfaction is the top priority.

Why then do we continue to be inundated with stories of poor customer service? Now perhaps it’s that such stories offer more attractive clickbait than, say, “Customer Satisfied with Telecom Service,” or maybe it’s because we’ve all had one or two bad experiences ourselves, so we happen to notice the negative ones more readily. Or perhaps it’s that company CEOs don’t want to admit what I’ve known for decades: that for many companies (particularly those that answer to shareholders) customer service isn’t the priority; it’s simply a means to an end…the end being the growth of profits.

And it’s for that reason that we’re seeing such a generational disconnect when it comes to customer service, as companies resist thinking about new service strategies out of fear that they’ll cost the company more. But as I’ve learned over the years, when profits come before service, the latter will always negatively influence the former.

During this past week we covered a story about customer satisfaction ratings of major American wireless carriers, and I’ll admit to my surprise: T-Mobile emerged as the top-rated carrier for satisfaction, despite trailing in almost all major network performance categories. On the face of it that seems completely counter-intuitive, that a carrier lagging behind in network reach and reliability outperforms the leaders of all the traditional performance metrics. But T-Mobile has done one thing that sets it apart; it has put service before profits.

But as I said, far too often CEOs are more concerned about the bottom line than they are about the customers themselves, evidenced again by the overall industry concern regarding churn.

You might think that concern about churn (customer turnover) would mean, at its core, that companies are concerned about customer retention and satisfaction, yet despite all the talk in that regard, inefficient, irrelevant legacy customer service processes speak to the opposite: that companies have little time to really consider what makes customers happy.

But oftentimes companies avoid overhauling ineffective legacy processes for fear that it will cost them money. Spoiler alert: it will. Getting customer service right is an investment, and while it might reduce your revenue stream in the short term when compared to more cutthroat competitors, what I’ve found is that when you truly focus on customer service you’ll always have the resources to grow and prosper, and you’ll have satisfied, loyal customers along with it.

So what’s the best way to begin thinking practically about improving your company’s customer service? Start to think like a customer. One of the most effective exercises I’ve ever done was approaching my own company as a customer, viewing our customer service processes through outside eyes, and thinking to myself what I would want out of a customer service experience. I was amazed at just how quickly the inefficiencies came to light, and really just how easy it was to fix them.

“We have always expected this issue to be decided by the Supreme Court, and we look forward to participating in that appeal,” said David McAtee, senior EVP and general counsel for AT&T, a similar sentiment echoed by our friendly neighbourhood trade association, the CTIA.

“The wireless industry remains committed to preserving an open Internet and will pursue judicial and congressional options to ensure a regulatory framework that provides certainty for consumers, investors and innovators,” said CTIA President and CEO Meredith Attwell Baker.

As always, much of the talk out of the telecom industry was about market uncertainty, stifling innovation and the possible detriment to customer satisfaction, none of which I find convincing in the slightest.

“We believe the industry will take its appeal to the U.S. Supreme Court,” stated Wells Fargo Securities senior analyst Jennifer Fritzsche, noting the decision was not unexpected. “In terms of next steps, we look for the industry to announce its appeal, and would note the process likely will span into a new administration, which might ultimately change the dynamic of the ruling and [net neutrality] rules. The other unknown is whether Congress will step in and act in light of the court’s ruling. … The decision, if it stands, will also impact the pricing and network management tools available to mobile carriers.”

As expected, some trade groups voiced the ongoing cry that this court decision to uphold Net Neutrality will result in uncertainty in the market.

“The court’s decision means today’s dynamic, ever-changing Internet will face the strict, inflexible rules designed to regulate our grandparents’ phone service,” said Telecommunications Industry Association CEO Scott Belcher. “We continue to believe the FCC has overstepped its authority and we are deeply disappointed by the decision.”

“Today’s decision from the D.C. Circuit creates uncertainty for manufacturers and is a major disincentive to investment in this essential infrastructure,” added Linda Kelly, SVP and general counsel for the National Association of Manufacturers. “The Manufacturers’ Center for Legal Action and the NAM will continue to fight the FCC’s misguided policy – in the courts and in Congress – to ensure manufacturers’ growing technology infrastructure needs can be met.”

What’s interesting in all this, though, is that Verizon, once the loudest voice in the fight against Net Neutrality, has decided to take a different course, attempting to curate its own compliance with the FCC’s regulations by dictating its own terms of open Internet conduct. It’s not the first time we’ve seen that, either, as carriers pay hollow lip service to the regulations while searching for ways to undermine them altogether.

While T-Mobile may not have scored the highest marks in a new report from Market Force Information for network coverage, data speeds, or reliability, its ongoing un-carrier campaign has demonstrated, without a doubt, that none of those things really matters in generating overall customer satisfaction.

In fact, in the recent survey of 8,600 mobile customers, T-Mobile edged out Verizon for top spot in customer satisfaction (with AT&T a distant 3rd and Sprint so far in 4th that they didn’t seem to be in the conversation at all), and that was despite the fact Verizon scored highest in all the aforementioned traditional network metrics. What T-Mobile was able to do, however, was win the hearts of the people by providing them with the best value, the most flexibility, the greatest ease of changing plans, and the best access to “new cell technology.”

Aside from ranking first in satisfaction while not topping the charts in any of the traditional performance categories, what’s even stranger is that T-Mobile placed last among nationwide carriers regarding the frequency of dropped calls, yet despite that, has watched its satisfied subscriber base continue to grow.

So what does this tell us? First, traditional network metrics don’t matter much. As Sprint CEO Marcelo Claure noted recently, the difference between networks these days is negligible, often as little as 1% in reliability between the top four nationwide networks. With that in mind, perhaps it’s no surprise that Verizon topped all of the performance-related categories yet came in second in customer satisfaction, because it clearly lags behind in one vitally important category: value.

Second, it tells us that wireless customers are more satisfied when they feel a company cares about their needs than when a company simply provides the best network technology. T-Mobile has gone out of its way of late to present its un-carrier promotion as a campaign for the little guy, one that is finally changing the way customers interact with carriers. Verizon, on the other hand, has resisted all such changes, holding steadfast to its domineering control over its customers, which leaves many people feeling they pay far more than customers of other companies while getting very little extra in return.

Aside from demonstrating that value and customer care generate satisfaction far more reliably than network technology and performance, the survey showed something else: that T-Mobile, despite ranking third among nationwide carriers in size, is quickly becoming the closest competitor to Verizon, the nation’s largest carrier by subscribers.

AT&T had a decent showing, topping no category and tying for last only once, in plan flexibility, but it seems to be sliding clearly into third place, not in size, but at least in popularity. Sprint, as one might expect, had a horrendous showing, scoring at the bottom in five of nine categories.

It should also come as no surprise that when it comes to long-term customer loyalty, overall value topped coverage and data plans as the most likely reason to seek out a new carrier, which is good news for T-Mobile.

The win constitutes a landmark decision for the FCC and the Obama administration, whose combined efforts to regulate the broadband industry were upheld, leaving the FCC’s authority over the industry fully intact.

The appeals court majority rejected the onslaught of challenges from the telecommunications industry, whose lawsuits were levied against the rules almost the moment they were announced. It is a significant achievement given that the same appeals court had twice rejected the FCC’s earlier attempts to impose different incarnations of its Net Neutrality rules.

The rules have long sought to establish and protect an open Internet for all, barring broadband service providers from a host of dubious network management practices. While having faced defeat numerous times before, at the hands of the same appeals court no less, the FCC took a different route last year when it reclassified broadband service as a public utility, with an aim to “protect free expression and innovation on the Internet and promote investment in the nation’s broadband networks. The Open Internet rules are grounded in the strongest possible legal foundation … As part of this decision, the [FCC] also refrains (or forbears) from enforcing provisions of Title II that are not relevant to modern broadband service.”

As expected, the broadband industry was not pleased. In fact, a telecom industry consortium, the National Cable and Telecommunications Association, wasted no time in railing against the decision, saying in its initial legal challenge that, “These rules will undermine future investment by large and small broadband providers, to the detriment of consumers.” Unfortunately, even with the appeals court decision, the NCTA is unlikely to stop its work to curb Net Neutrality.

“We are reviewing today’s split decision by the D.C. circuit panel and will carefully review the majority and dissenting opinions before determining next steps,” the NCTA said in a statement. “While this is unlikely the last step in this decade-long debate over internet regulation, we urge bipartisan leaders in Congress to renew their efforts to craft meaningful legislation that can end ongoing uncertainty, promote network investment and protect consumers.”

That said, it had appeared over the last several months that the mood regarding Net Neutrality was beginning to soften, as service providers seemed to think adherence to the regulations was something they could curate, or perhaps even use as a bargaining chip in other FCC negotiations. One could argue, though, that the changing moods were simply a result of the realization that the FCC was going to win this round.

“Today’s ruling is a victory for consumers and innovators who deserve unfettered access to the entire web, and it ensures the internet remains a platform for unparalleled innovation, free expression and economic growth,” said FCC Chairman Tom Wheeler in a statement on today’s ruling. “After a decade of debate and legal battles, today’s ruling affirms the commission’s ability to enforce the strongest possible internet protections – both on fixed and mobile networks – that will ensure the internet remains open, now and in the future.”

While I’ll say again that this is the first landmark victory for the FCC’s laudable Net Neutrality regulations, it still seems far from the end, as opponents still have the option of appealing the decision in the Supreme Court.

Last week House Minority Leader Nancy Pelosi announced that the iconic iPhone was not “invented” by Apple’s Steve Jobs, as many might think, but was instead the product of tireless research on the part of the Federal government. You heard that right: the Feds made your iPhone… if Pelosi is to be believed, of course.

“Anybody here have a smartphone?” the California Democrat asked attendees at a hearing on the Democratic National Convention platform. “In this smartphone, almost everything came from federal investments in research. […] They say Steve Jobs did a good idea designing it and putting it together. Federal research invented it.”

But does that give the Federal government the right to take credit for the iPhone? Not a chance.

Recently I came across a syndicated rerun of the History Channel show “Modern Marvels,” which for the last two decades has given viewers a behind-the-scenes look at the production and development of various technologies we use in our lives. This particular episode, originally aired in 2001, focused on the dominance of the tech company Palm, and its then famous PDA organizer. As I watched the explanation of the features of the now defunct handheld device, I couldn’t believe just how much of that technology we saw appear several years later in the first iteration of Apple’s iPhone. Sleeker and more advanced, to be sure, but the same basic tech nonetheless.

That episode immediately sprang to mind when Pelosi’s comments came across my desk, not only because they come with the clickbait shock value we’ve all come to expect from modern online news reporting, but because it’s absolutely true. As Pelosi’s full quote goes on to say, “GPS, created by the military, flatscreens, LLD [sic], digital camera, wireless data compression, research into metal alloys for strength and lightweight, voice recognition — the list goes on and on…. They say Steve Jobs did a good idea designing it and putting it together. Federal research invented it.”

It would be like crediting Archimedes with the invention of the airplane because he designed the first basic propeller screw or because he understood lift. Sure, he knew the basics, and should be credited for them, but there’s no way we would call him the inventor of the first airplane.

With that in mind, let’s give credit where credit has always been due: to Apple and the late Steve Jobs. Sure, we might say mobile development was more innovation than invention, but it took such visionaries to turn what the Feds might have invented into something useful and desirable. Or as Forbes writer Tim Worstall explains, “Governments are really bad at innovations even if they can be useful in inventions. Or, if you prefer, the Feds really didn’t create the iPhone, Jobs and Apple did.”

“The late Steve Jobs and the team at Apple that made the iPhone would be the first to tell you that they didn’t invent many of its core technologies we now take for granted,” Pelosi’s office responded. “Leader Pelosi counted Steve Jobs as friend and meant no disrespect to his legacy, but the point she was making is a valid one. Leader Pelosi believes that Steve Jobs and his colleagues at Apple, deserve enormous credit for taking federally-backed innovations off the shelf, refining them, commercializing them and turning them into a beautiful device that changed the world.”

While data caps have long been part of the mobile scene, broadband providers have been slowly and subtly implementing them across the gamut of Internet connections, particularly for wired, home use. On the face of it, providers are hoping the explanation given for capping wireless use will be accepted by wired customers as well: that capping data is the most effective way to manage networks to the benefit of everyone, users and providers alike.

In fact, there is very little evidence to support the idea that there is a scarcity of Internet bandwidth, or that capping data will help better manage networks, and it’s even something CEOs of some of these ISPs now readily admit.

Now, the fact that most customers are unlikely to hit the current caps is beside the point, for the rise of streaming video and other online content likely means those caps will be insufficient for the majority of customers in the very near future, allowing AT&T to reap the rewards of overage penalties (something it is in danger of losing out on in the wireless market).

It is, to put it simply, nothing more than a way for AT&T to charge people more for the service they were already using, arbitrarily penalizing those who don’t want to shell out the extra money for unlimited data use. Not only that, but as other rival CEOs are quick to admit, those data caps are nothing but a cash grab, completely pointless when it comes to better network management and better broadband service.

Dane Jasper, CEO of Sonic, a California-based internet provider, said recently that “The cost of increasing [broadband] capacity has declined much faster than the increase in data traffic.” Further, he noted that his company’s costs for managing broadband have dropped from 20% of revenue to 1.5% in the past few years.

Other CEOs have said the same, as Frontier Communications boss Dan McCarthy told trade publication Fierce Telecom, “There may be a time when usage-based pricing is the right solution for the market, but I really don’t see that as a path the market is taking at this point in time.”

In fact, the cost of managing networks and providing Internet access has decreased dramatically over the last few years, meaning it doesn’t cost carriers that much to provide the data customers are using.

According to St. Louis broadband company Suddenlink CEO Jerry Kent, companies don’t have to spend a lot to keep up with customers’ data demands. “Those days are basically over,” he said on an investor call last year, “and you are seeing significant free cash flow generated from the cable operators as our capital expenditures continue to come down.”

Simply put, with recent moves by the likes of AT&T to pointlessly toll broadband access and even charge customers to opt out of the company’s mandatory targeted advertising, one thing is becoming abundantly clear: the things we used to enjoy, unlimited wired online access and a modicum of privacy, are now luxuries of the digital world, meaning we’ll have to pay more not to get more, but simply to keep getting what we’ve always had.