Stop drinking the 5G bathwater

The telecoms industry is fatally caught between reinventing circuits with 5G, and an envy of vertical application businesses. Survival is not mandatory.

Back in September 2003, fourteen years ago, I started my Telepocalypse blog whilst working in a strategy role at Sprint in Overland Park, KS. The permanent record of the Wayback Machine has the evidence of what I subsequently wrote.

As Sprint didn’t have any policy on blogging or social media, I didn’t break any rules. However, writing (albeit implicitly) how your employer’s product plans are inherently delusional is a pretty sure way to a short tenure in product development. In under a year I was gone.

At the time I made a pretty accurate assessment of the prospects for telco voice and messaging services in the face of “over the top” competition. You can even read the anonymised strategy report I later sold, as an independent consultant, to the handset divisions of both Motorola and Nokia.

Sometimes in professional life you end up watching this kind of corporate slow-motion crash, with its inevitable horror and destructive denouement. The participants insistently corral themselves into a future with an unpleasant ending that makes Thelma and Louise look like strategic geniuses. At least they got a friendship and a kiss out of it before their descent of doom!

And so the wheels of the Telepocalypse turn. Last week I presented at the Federation of Communications Services in the City of London as the opening speaker. I’ve been turning down all speaking offers for some time, and this was the first I had accepted in ages. It’s a smaller event with the right people in the room.

One of the “right people” is Dan Warren, who presently holds the title of Head of 5G Research at the Samsung R&D Institute. He is perhaps better known in one of his previous roles as a very visible Senior Director of Technology of the GSM Association, where among other achievements he fathered VoLTE into this world, so we all can still talk on 4G LTE phones as well as we did on 2G ones.

Dan was open about his personal skepticism of how 5G is presently being positioned. His relationship to 5G might best be thought of in terms of the views of certain Anglican bishops with respect to God. He’s clearly a senior and secure member of the institutionalised form of belief, but individuals are allowed to go a long way “off scripture” with respect to the theoretically non-negotiable doctrines.

Dan raised several important issues. The first is that backwards compatibility is forcing the industry into inappropriate design and architecture choices. Specifically, he stated that TCP/IP and GTP, two core protocols ubiquitous in mobile, are unfit for purpose. This is not news in the fixed world, with 128 Technology being the present leader in abandoning the failed TCP/IP model and developing products on a new paradigm.

Dan also discussed how network “slicing”, where you split apart the resource, lacks its necessary counterpart of “splicing”, where you aggregate resources back together again. Furthermore, he described the current performance requirements for very low latency and unbroken coverage and availability as marketing boasts not backed by actual customer demand (at the price it would cost to deliver) or technological reality.
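To make the slicing/splicing asymmetry concrete, here is a toy model of my own (all names are hypothetical, with no relation to any 3GPP interface): slicing partitions capacity easily, but splicing cannot simply invert it, because performance guarantees do not recompose the way raw capacity does.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Slice:
    """A share of a network resource: capacity plus a latency bound."""
    capacity_mbps: float
    latency_bound_ms: float

def slice_resource(total_mbps: float, shares: list[float],
                   latency_bound_ms: float) -> list[Slice]:
    """Slicing: partition total capacity by fractional shares."""
    assert abs(sum(shares) - 1.0) < 1e-9, "shares must sum to 1"
    return [Slice(total_mbps * s, latency_bound_ms) for s in shares]

def splice(slices: list[Slice]) -> Slice:
    """Splicing: aggregate slices back into one resource. Capacities
    add up, but the recombined latency bound is only as good as the
    worst slice -- splicing is not the trivial inverse of slicing."""
    return Slice(sum(s.capacity_mbps for s in slices),
                 max(s.latency_bound_ms for s in slices))

# Split 100 Mbps three ways, then recombine.
parts = slice_resource(100.0, [0.5, 0.3, 0.2], latency_bound_ms=10.0)
whole = splice(parts)
print(f"{whole.capacity_mbps:.0f} Mbps within {whole.latency_bound_ms} ms")
```

The point of the sketch is the `max()` in `splice`: the capacity ledger balances, but the quality guarantee is set by the weakest piece, which is exactly the counterpart problem Dan identifies.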

It wasn’t just Dan who painted a somewhat downbeat future for mobile. Adrian Barnard of Alpha Beta Solutions gave a grounded vision of an industry “drinking its own bathwater”, having indulged in many flights of executive management fantasy over the years. Each attempt to crawl out of the network into the application and content space results in an unsightly mess down the road.

I was particularly taken by his phrase of mobile network operators (MNOs) and fixed carriers being “unfriended by regulators”. The present “best effort” paradigm is creating increasing user dissatisfaction. Its Sovietesque mode of production — you get what you get, when we decide to deliver it, and tough luck if the quality is poor — is driving growing public and political revolt.

Nothing else in modern technological life has the capricious experience of broadband Internet access. 5G is presently mainly focused at attempting to extend and improve this legacy 1970s model, with some lip service to new vertical cloud and industrial applications.

The times most definitely are a changin’, and in telecoms the new mantra is fitness-for-purpose, both in terms of meeting end user demand, and constructing appropriate supply. Regulators are being forced, if they are to retain their legitimacy, to hold ISPs and carriers to account for the experience they deliver.

After all, “best effort” is just a bullshit excuse for the inability to engineer performance.

These conversations eerily echoed some from the start of last week, when I visited Miguel Ponce de Leon down at Waterford Institute of Technology in Ireland. They certainly win the accolade of “best view from a conference room”!

Miguel helps to run the EU’s ARCFIRE project, where I am on the advisory board. They are performing large-scale experiments with the RINA architecture (a modern replacement for TCP/IP that would also eliminate obsolete tunnelling protocols). I am sure he won’t mind me sharing some of our private conversation.

We explored how telcos and their equipment vendors were locked in an embrace of death. They constantly double down on the broken “beads on a string” model of packet networks, despite its manifest technical, economic and experiential failure, certainly when compared to traditional voice and messaging.

Yet when telcos are presented with an alternative paradigm like RINA, it is rejected. It simply doesn’t fit how people currently think about the network. It’s a fundamental reframing from a communications service to a computing one. That doesn’t go down well if your status and security are attached to delivering the next iteration of the incumbent paradigm.

The same has been true of the quality attenuation framework and ∆Q mathematics for managing performance of distributed systems. The application demand side wants something better, but the telco supply side has little interest in providing it. The status quo has, until now, been too comfortable.
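For readers unfamiliar with ∆Q: its central move is to treat delay as a distribution rather than a point average, with the delay of segments in series composing by convolution. Here is a much-simplified sketch of that composition (my own illustration; real ∆Q also models packet loss, which this toy ignores):

```python
# Toy composition of per-hop delay distributions, in the spirit of the
# quality-attenuation (delta-Q) framework. Simplification: pure discrete
# delay distributions, no loss modelling.

def convolve(a: dict[float, float], b: dict[float, float]) -> dict[float, float]:
    """Delay distribution of two segments in series: the probability of
    each total delay is the convolution of the per-segment distributions."""
    out: dict[float, float] = {}
    for da, pa in a.items():
        for db, pb in b.items():
            out[da + db] = out.get(da + db, 0.0) + pa * pb
    return out

def quantile(dist: dict[float, float], q: float) -> float:
    """Smallest delay t with P(delay <= t) >= q."""
    acc = 0.0
    for d in sorted(dist):
        acc += dist[d]
        if acc >= q:
            return d
    return max(dist)

# One hop: usually fast, occasionally queueing (delays in ms).
hop = {5.0: 0.9, 50.0: 0.1}
path = convolve(hop, hop)          # two such hops in series
print(quantile(path, 0.95))        # -> 55.0: the tail is what users feel
```

The averages barely move when hops are chained, but the tail quantiles degrade quickly, which is why managing performance on means rather than distributions misleads.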

This is what today’s mobile experience failure looks like. It’s my data connection using my mobile hotspot. Packets in the downstream from the UK to Ireland are taking over a minute to arrive. Yes, they are being buffered up for delivery for over 60 seconds. This is pure madness.

It is the result of a systemic failure. It is not the problem of Apple (my laptop), Huawei (my hotspot), or Three (providing the UK SIM and Irish network). The user buys into an ecosystem, and this ecosystem is delivering an absolutely rotten outcome.

The industry as a whole has a performance engineering problem, and 5G is going to make it worse, not better. This example shows how the dynamic properties of the system are not being correctly managed. 5G makes the dynamics increasingly dominant, but does nothing of note to address the engineering issues that resulted in this appalling experience.
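The failure described above, packets queued for over a minute, is the textbook “bufferbloat” pathology: oversized buffers absorb bursts instead of signalling congestion, so delay under load balloons. A hypothetical check (all names and numbers here are illustrative, not from any real tool) compares loaded round-trip times against an idle baseline:

```python
# Illustrative bufferbloat check: compare round-trip times under load
# against an idle baseline. Names and sample values are hypothetical.

def delay_inflation(idle_rtts_ms: list[float],
                    loaded_rtts_ms: list[float]) -> float:
    """Ratio of median loaded RTT to best idle RTT. Near 1.0 means
    queues are well managed; large values mean standing queues."""
    baseline = min(idle_rtts_ms)                    # best-case path latency
    n = len(loaded_rtts_ms)
    median_loaded = sorted(loaded_rtts_ms)[n // 2]  # crude median
    return median_loaded / baseline

idle = [42.0, 45.0, 41.0, 44.0]                   # quiet network
loaded = [800.0, 900.0, 1200.0, 1500.0, 61000.0]  # under load; one 61 s outlier
print(f"load inflates delay {delay_inflation(idle, loaded):.0f}x")
```

Even setting aside the one-minute outlier, a median inflation of this size means the network is storing traffic rather than delivering it, which no amount of extra radio capacity fixes.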

More of the same methods will lead to more of the same outcomes.

And the industry’s current effort? To reduce the static radio overhead from 25ms to 2ms. It would be hilariously funny if it wasn’t so seriously misbegotten. (Just to compound the credibility catastrophe, a key application is “autonomous vehicles”, which, um, don’t need a connection by definition.)

It’s not that improvements in the radio network are undesirable or unworthy. Quite the opposite: they are the communications bedrock on which the computing service is constructed. RAN engineering is a mature discipline done by highly experienced physicists and professional engineers.

But at present the distributed computing service built from those radio resources is an industry science and engineering dead zone. The protocols and practices we are using are simply not up to the jobs we are asking of them.

The root cause is a category error about the nature of packet networking. This arose because it was historically necessary and expedient to package the new distributed computing service as if it was a legacy telecoms circuit. 5G is just another iteration on that theme, and only serves to deepen the performance engineering deficit. Fitting the compute onto the communications as an afterthought isn’t working.

In other words, 5G is in conflict with the underlying computer science and mathematics. This doesn’t end well, as without strong foundations, any edifice will lean and fall if built too “tall”. It was very much in my mind at the FCS event, with the 1670s craft-built St Paul’s Cathedral out of one window, and the engineered 2010s Shard skyscraper out of another.

5G is attempting more than our “medieval” network craft can deliver. It doesn’t mean the 3G churches and 4G cathedrals of the past were defective. It’s just that when you reach a certain size, cathedrals tend to fall down in storms and at the slightest earth tremor. They cannot bear the dynamic loads that taller structures must withstand to a known safety margin.

If you want safety, then choose engineered performance based on science, not emergent performance based on craft.

On its present course, 5G could conceivably “Nokia” the whole mobile industry, it’s that far off-beam technically. It opens the window of opportunity for a “game changer” to come along, like Apple did with the iPhone, and displace the current mobile industry ecosystem. A spectrum cartel won’t protect you: the Industrial Internet unleashes forces that are bigger even than today’s mobile industry.

But it is not all doom and gloom. We can put RINA-like principles into the current network, and do RINA-on-TCP/IP and TCP/IP-on-RINA. There’s a simple migration path, like from dedicated hosting to shared cloud.

We can calibrate the existing networks to understand their quality floor, and begin the journey from emergent to engineered performance. We have the high-fidelity measurement systems to do this, and they are sufficiently mature and scalable for mass deployment within the short-to-medium-term timeframe on which 5G is working.

We can switch to a composable resource model so that slicing and splicing result in predictable outcomes. Managing digital supply chains and the resulting experience doesn’t have to be a dark craft full of black magic. It’s just ordinary engineering and reasoning about supply and demand, and the difference between the two.

There is a migration path to a saner technical model with strong computer science and mathematical foundations. It means acknowledging that instead of developing 5G of the circuit-based broadband Internet model, we are really developing 1G of the new cloud application access one.

5G is presently built on hubris. History tells you how much that can hurt. To avoid a Telepocalypse when building a network skyscraper, ensure you put the science and engineering foundations in place first.

You know who to call for help.

The view departing Dublin by ferry last Tuesday. Too pretty not to share it.

For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.


I am an expert on the telecommunications business. I help senior executives to make sense of what is happening, anticipate what is coming, and to act decisively in the face of uncertainty. My long-term professional goal is to facilitate three paradigm shifts: for data networking to become a true science; for voice to evolve its own native form of hypermedia; and for cloud-based enterprises to have the most efficient and effective possible means to communicate with their customers - Martin Geddes. Contact us here