It could be argued that before the switch-over from analogue to digital television broadcasting, the value of terrestrial broadcasting was on the decline. Faced with fierce competition from cable and satellite, each offering 10 or more times the number of programmes, terrestrial television was a poor cousin whose main use was often to deliver public service broadcasting and, through general interest obligations imposed by national governments, to provide an accessible (free-to-air) service to 95% or more of a country's population (measured both geographically and demographically).

As digital switch-over has taken hold, terrestrial broadcasting has had a reprieve and is now able to offer true multi-channel television. Where previously it was only possible to broadcast a single television station on a single frequency, that frequency can now hold 10 or more standard definition (SD) channels, or 4 or 5 HD channels. Where there may have been 6 analogue stations on air, there can now be upwards of 60 stations. In many cases, 60 stations is enough for the average viewer, and the bouquets of channels offered on cable or satellite may now begin to seem expensive compared to free-to-air DTT. In many countries the draw of cable and satellite TV is no longer the sheer variety of channels available, but the premium content that is on offer. Pay-TV services offering sport and movies continue to be popular, but such premium content is not usually available on DTT. Nonetheless, for many viewers DTT is perfectly sufficient.

But just as digital terrestrial TV (DTT) has had a new lease of life, the other broadcasting platforms are once again biting at its heels with new service offerings. Whilst 3D television seems to have taken a back seat for the time being, new, even higher definition television is stepping to the fore. Ultra-High Definition (UHD) has twice the resolution of standard HD, and large UHD televisions are already on show and on sale in many retailers. At present there is minimal UHD content, however a handful of new UHD channels are being launched and, for example, the World Cup football in Brazil will be broadcast in UHD.

To be able to see the difference that UHD makes compared to standard HD, a very large television set is needed (42 inches or greater) and it could therefore be argued that UHD will always be a niche product. Then again, many broadcasters believed that HD would be a niche, but it is becoming the de facto standard and average television sizes are on the increase.

Technically speaking, UHD requires a bit-rate of around 20 Mbps. Whilst such bit rates are relatively easy for cable and satellite networks to deliver, broadcasting UHD over DTT would require at least half of a DVB-T2 multiplex, and the most advanced video (HEVC) codecs. In practice this means that where a terrestrial frequency currently carries 10 or more SD channels, or 4 or 5 HD channels, it might, at best, be able to offer 2 UHD channels.

But that is not the end of the story. Just around the corner is Super Hi-Vision (SHV), sometimes called 8K, which once again doubles the resolution of the picture compared to UHD. SHV will require around 75 Mbps to be broadcast, and at this point, whilst cable and satellite are still in the game, DTT is no longer able to broadcast even a single programme on a single frequency (without very complex transmitter and receiver arrangements that would require, for example, householders to install new, and potentially more than one, antenna). Of course, to benefit from SHV, an even larger TV screen will be necessary, but with the growth in home cinema, it can perhaps be expected that in time, a goodly proportion of homes will want access to material in this super high resolution.
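The capacity squeeze described above can be sketched with a little arithmetic. The 20 Mbps (UHD) and 75 Mbps (SHV) figures come from the text; the ~40 Mbps DVB-T2 multiplex payload and the per-channel SD and HD bit rates below are assumptions based on typical configurations, so treat this as an illustration rather than a precise capacity plan:

```python
# Rough capacity check for a DVB-T2 multiplex. The ~40 Mbps payload is an
# assumption based on typical configurations; actual capacity depends on
# the modulation and coding chosen.

MUX_CAPACITY_MBPS = 40        # assumed typical DVB-T2 multiplex payload

services = {
    "SD (MPEG-4)": 3,         # assumed ~3 Mbps per SD channel
    "HD": 8,                  # assumed ~8 Mbps per HD channel
    "UHD (HEVC)": 20,         # figure quoted in the text
    "SHV / 8K": 75,           # figure quoted in the text
}

for name, bitrate in services.items():
    channels = MUX_CAPACITY_MBPS // bitrate
    print(f"{name}: {bitrate} Mbps -> {channels} channel(s) per multiplex")
```

On these assumed figures the multiplex holds around 13 SD or 5 HD channels, just 2 UHD channels, and no SHV channel at all, which is the nub of DTT's problem.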

So where does that leave DTT? Arguably, within a few years, it will once again be unable to compete with the sheer girth of the bandwidth pipe that will be provided by cable and satellite networks. It's probably worth noting at this point that most IP-based video services (with the possible exception of those delivered by fiber-to-the-home) will also be unable to deliver live SHV content. This time there will be no reprieve for DTT as it simply does not have the capacity to deliver these higher definition services.

What is therefore to be done with DTT? Is its role to continue providing the public service, universal access, free-to-air services that were the drivers for the original terrestrial television networks? Is it to provide increased local content which might be uneconomic to broadcast over wide areas? Should it be used to deliver broadcast content to mobile devices, where it has more than sufficient capacity to provide the resolution needed for smaller screens? Or should it be turned off completely, and the spectrum it occupies given over to something or someone else?

In countries where cable and satellite penetration is already high, there is arguably nothing much to lose by switching DTT off. In Germany, for example, RTL have already withdrawn from the DTT platform and there is talk of turning off the service completely. In countries that have not yet made the switch-over, it might be more cost effective to make the digital switch-over one that migrates to satellite (and cable where available) than to invest in soon-to-be-obsolete DTT transmitters.

Broadcasters should be largely agnostic to the closure of DTT. After all, their business is producing content and as long as it reaches the audiences, they ought not to care what the delivery mechanism is. Other radio spectrum users (e.g. mobile phones, governments) would surely welcome the additional spectrum that would become available. So who loses? Those companies who currently provide and operate the DTT transmitter networks, such as Arqiva in the UK, Teracom in Sweden and Digitenne in the Netherlands, who stand to lose multi-million pound (or Euro) contracts. For these organisations the stakes are high, but even the most humble economist would surely admit that the benefits elsewhere outweigh the costs. So let's turn off DTT - not necessarily today - but isn't it time to plan for a 'digital switch-off' to follow the 'digital switch-over'?

It seems that following the ESOA submission to Ofcom concerning the apparent errors in the RealWireless study on spectrum demand for mobile data, reported by Wireless Waffle on 15 February, the offending report has now been re-issued (note the publication date is now 11 April 2014). The axis of Figure 44, which shows data traffic density, has been re-labelled from 'PB/month/km²' (PetaBytes) to 'TB/month/km²' (TeraBytes), thereby reducing the calculated data traffic by a factor of 1,000 and making the document internally consistent. Well done Ofcom and RealWireless, though they could have publicly admitted the apparent error instead of quietly re-issuing the document with no fanfare. Presumably this now makes ESOA look rather silly.

But... even a 10th grade student could complete the sum that lies behind the ITU data forecasts and realise that the axis should have read 'PB' all along (and therefore that the internal inconsistencies are not fixed, and that the data in the ITU and RealWireless models is still hundreds of times too large). Here, for you to try, are the values - taken from the ITU's 'Speculator' model - and the maths you need to apply. The values are for 'SC12 SE2', which represents people using 'high multimedia' services in urban offices, with the ITU model in its 'low market' setting (it has a higher one too).

User density: 120,975 users per km²

Session arrival rate per user: 3.3 arrivals per hour per user

Mean service bit rate: 13.83 Mbps

Average session duration: 81 seconds per session

Now for the maths...

First, multiply the first two numbers to get 'sessions per hour per km˛'. (120,975 × 3.3 = 399,217.5)

Then multiply this by the average session duration to get 'seconds of traffic per hour per km˛'. (399,217.5 × 81 = 32,336,617.5)

Then multiply by the mean bit rate to get 'Megabits of traffic per hour per km˛'. (32,336,617.5 × 13.83 = 447,215,420)

To make the numbers more manageable, divide by 8 to get from bits to bytes, then by 1,000,000 to get from Megabytes to Terabytes. (447,215,420 ÷ 8,000,000 = 55.9)

So the traffic assumed by the ITU model for people using 'high multimedia' services in urban offices is 55.9 Terabytes per hour per square km. But the figure in the graph in the RealWireless report is per month, so we need to scale this up from hours to months. We now have the thorny question of 'how many hours are there in a day', which for mobile data traffic is not necessarily 24 as you might expect. If the above figures are meant to represent the busy hour (the busiest hour of the day), it would not be right to multiply the value by 24 to get daily traffic, as this would assume every hour to be as busy as the busiest. As a conservative measure, let's assume that the daily traffic is 10 times that of the busiest hour. So daily traffic per square km would be 559 TeraBytes (55.9 × 10 just in case you couldn't work this out in your head).

The number of days in a month is relatively easy to work out: it's 30.4 on average (365.25 ÷ 12). So monthly traffic per square km would be 559 × 30.4 = 16,994 TeraBytes per month per km².
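For anyone who would rather let a computer do the sums, the whole chain above can be reproduced in a few lines of Python. The input values are the quoted ITU 'Speculator' figures; the only extra assumption is the busy-hour-to-day factor of 10 already used above:

```python
# Reproduce the busy-hour-to-monthly traffic sum for the ITU 'Speculator'
# case 'SC12 SE2' (high multimedia, urban offices, 'low market' setting).

users_per_km2 = 120_975          # user density
sessions_per_user_hour = 3.3     # session arrival rate per user
mean_bitrate_mbps = 13.83        # mean service bit rate
session_duration_s = 81          # average session duration

busy_hour_to_day = 10            # assumption: daily traffic = 10x busy hour
days_per_month = 30.4            # 365.25 / 12, rounded as in the text

sessions_per_hour_km2 = users_per_km2 * sessions_per_user_hour
traffic_seconds = sessions_per_hour_km2 * session_duration_s
megabits_per_hour_km2 = traffic_seconds * mean_bitrate_mbps

# Megabits -> TeraBytes: divide by 8 (bits to bytes), then 1,000,000 (MB to TB)
tb_per_hour_km2 = megabits_per_hour_km2 / 8 / 1_000_000
tb_per_month_km2 = tb_per_hour_km2 * busy_hour_to_day * days_per_month

print(f"{tb_per_hour_km2:.1f} TB/hour/km2")     # ~55.9
print(f"{tb_per_month_km2:,.0f} TB/month/km2")  # ~16,994
```

Running this confirms the roughly 17,000 TB/month/km² result for this single traffic type.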

This is the monthly data for just one urban traffic type in the ITU model; there are 19 others. Ignoring the others completely, Figure 44 of the RealWireless report should show monthly traffic in urban areas for the ITU model of around 17,000 TeraBytes per month per square km; include the other activities that urban office workers undertake and the value should be much higher still. But it now shows just over 100 TB/month/square km for the ITU, and less for the RealWireless prediction - 100 or more times too low. Oh dear!

So having corrected the figure in the RealWireless report, it is now wrong. It was correct before. And it still does not tally with the total data forecast for the UK that is in the same report.

Surely there are people at Ofcom who own a calculator, have a GCSE in maths, and possess a modicum of professionalism such that they would want to check the facts before blithely allowing their suppliers to fob them off with an 'oops, we mis-labelled an axis' argument. Presumably they thought that it was ESOA who couldn't handle a calculator properly.

Following the recent Wireless Waffle piece on the Valles-Marineris-sized chasm in the values used by the ITU in predicting the demand for IMT spectrum in 2020 - spotted by the European Satellite Operators Association in their response to Ofcom's mobile data consultation - others have noted similar gulfs.

Telecoms analyst Tim Farrar published an article in GigaOm entitled 'Note to the telecom industry: beware of false models'. In it he takes a different view to ESOA. The ESOA response uses the values in the ITU's 'Speculator' model to calculate the data traffic that the UK would experience in 2020, and discovers that applying the ITU's values yields results that far exceed forecasts. The GigaOm article instead looks directly at the values found in the ITU model and concludes that they are up to 1,000 times too high, which generally concurs with the findings of the ESOA analysis.

Next, the European Broadcasting Union (EBU) have chipped in. Their document, 'Crystal balls, tea leaves or mathematics', critically examines the ITU's model and, like the others, concludes that there are a 'number of erroneous elements'.

Wireless Waffle has been able to get hold of a copy of the 'Speculator' and so exclusively for you, here are some of the values that are causing people such as ESOA, Mr Farrar and the EBU such consternation:

Call Blocking Rate: 1%

This represents the chance of not being able to make a call (i.e. that there is a 99% chance of success).

Population Density: Maximum of 222,333 per km²

This occurs in 'SE2, SC12', which equates to interactive high multimedia use in offices in dense urban areas.

Mean Service Bit Rate:

SC6 (streaming super high multimedia): Up to 1 Gbps
SC11 (interactive super high multimedia): Up to 1 Gbps

Really? 1 Gbps on average!

The population density figure for urban offices using 'interactive high multimedia' is brain-achingly odd. For other uses in urban offices, the population densities are significantly lower, so it is not clear why interactive high multimedia use would be so prevalent in offices compared to other applications. Have the ITU assumed that all office workers do all day is play games and watch videos?

A mean (average) service bit rate of 1 Gbps seems excessively excessive. If this was the peak service rate then, maybe, just maybe, this would be possible (and only possible on LTE-Advanced networks, not on the others). But to assume that it is an average seems just crazy.

Of course the big question is, what would the 'Speculator' say, if the values input to it were more realistic? To try and answer this question requires some kind of estimation of what realistic actually means. Whilst we make no claims for the realism of any of the values proposed below, here are some alternative values...

Parameter: Spectrum Efficiency
New value: For GSM/UMTS/LTE: 0.55 to 1.5 bits/second/Hz/cell. For LTE-Advanced: 1.1 to 3 bits/second/Hz/cell.
Notes: The values for LTE-Advanced are taken from the ITU's own Report M.2134. Those for GSM/UMTS/LTE are half the LTE-Advanced values (roughly in line with the original ratios).

Parameter: Call Blocking Rate
New value: 2%
Notes: A value that more operators would recognise.

Parameter: Population Density
New value: Reduced so that the weighted average values are the same as those in the ESOA report for the UK (e.g. ~11,000 per sq km in urban areas).
Notes: This should mean that running the ESOA calculations would at least yield the correct population for the UK.

Parameter: Mean Service Bit Rate
New value: Capped at 100 Mbps.
Notes: Seems a little more reasonable based on the technologies likely to be in use by 2020.

The big question is obviously therefore, what does this do to spectrum demand? The original and revised figures are shown in the table below.

Setting | GSM/UMTS/LTE (Original / Revised) | LTE-Advanced (Original / Revised) | Total (Original / Revised)
Low | 440 MHz / 580 MHz | 900 MHz / 480 MHz | 1340 MHz / 1060 MHz
High | 540 MHz / 660 MHz | 1420 MHz / 600 MHz | 1960 MHz / 1260 MHz

What does this tell us? Oddly, in both cases, the demand for GSM/UMTS/LTE spectrum has increased. This is probably due to the lower spectrum efficiency that these technologies have been assumed to achieve. Conversely, the total spectrum demand has dropped significantly and all of this reduction has come from spectrum for LTE-Advanced.
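The figures above can be given a quick sanity check programmatically: the totals are indeed the sums of the two technology columns, and the breakdown shows exactly where the change comes from. A minimal sketch, using only the values from the table:

```python
# Check the spectrum demand figures: totals should equal the sum of the
# GSM/UMTS/LTE and LTE-Advanced columns, and show where the change lies.

demand_mhz = {
    # (setting, variant): (GSM/UMTS/LTE, LTE-Advanced)
    ("Low", "Original"):  (440, 900),
    ("Low", "Revised"):   (580, 480),
    ("High", "Original"): (540, 1420),
    ("High", "Revised"):  (660, 600),
}

for setting in ("Low", "High"):
    orig = demand_mhz[(setting, "Original")]
    rev = demand_mhz[(setting, "Revised")]
    print(f"{setting}: total {sum(orig)} -> {sum(rev)} MHz "
          f"(legacy {rev[0] - orig[0]:+d} MHz, LTE-A {rev[1] - orig[1]:+d} MHz)")
```

This confirms the pattern described: the legacy (GSM/UMTS/LTE) requirement rises in both settings, while the LTE-Advanced requirement falls by far more, dragging the total down.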

But what is most striking about these calculations is not necessarily the differences in the results, but the simplicity with which it is possible to present alternative values and find a different outcome. For example, no effort has been made in the above analysis to check the way in which the ITU model apportions traffic between the 2G/3G networks and the LTE-Advanced network. Could it be argued, for example, that by 2020 major carriers in advanced markets (e.g. the USA) will have moved all of their data traffic to LTE-Advanced, and that only 2G will remain for legacy voice services? This would almost certainly serve to vastly reduce the amount of 2G/3G spectrum that would be needed, whilst providing only a modest increase in the amount of spectrum that would be needed for LTE-Advanced, given the technology's improved spectrum efficiency. In this case, the total requirement would probably fall further. Or could it be that we will all be living in a virtual environment, with Google Glass projecting us a view of the world in full HD as we stroll around the office - requiring umpteen times more data than the ITU model predicts?

The fact is that any model of this kind, no matter how many brains were employed in developing it, can never be more than a 'best guess', especially when looking 7 to 10 years into the future. Weather forecasters struggle to predict the level of precipitation 7 to 10 days into the future, and no-one in their right mind would decide whether they needed to carry an umbrella a week next Tuesday based on their forecast. Nor should the vast wireless community take decisions based on this one forecast; it would be irresponsible of them to do so, and if the weather changes, they may end up getting soaked!

When using your mobile phone, smart phone or tablet, have you ever noticed that next to the signal strength bars (usually found at the top of the screen), there is often a letter (or two) that seems to change almost at random as you move around, and even sometimes when you aren't moving at all? Have you ever wondered what these letters are there for, and what the implications of them changing from one letter to another are? Well wonder no more, because the answers are about to follow, as Wireless Waffle explains it all...

R: Roaming. Beware - this means you are connected to a network outside your home country and data costs could be astronomical! The R is sometimes shown in a triangle.

X: No signal. On some phones, an X appears above the signal bars if there is no signal at all.

Note that the typical connection speeds given above are those that are generally achieved in real life. Though in theory the technologies used can offer faster connections, much depends on how many users are in a cell and what they are doing, how close to the centre of the cell you are, whether you are stationary or on the move, and a whole host of other factors.

Arrows (sometimes coloured, and sometimes integrated into the signal bars) pointing up and down may also be illuminated. These simply show whether you are downloading (the downward arrow) or uploading (the upward arrow) data over the mobile network.

In addition, the number of bars shown on your signal meter will also affect how good your connection is. So a '3G' connection with all signal bars lit might be better than an 'H+' connection with only one bar lit. However, a full-strength signal may not necessarily mean a fast connection, as most phones show the 'strength' of a signal and not its 'quality'. It's quite possible to have a full-strength signal that's suffering lots of interference and is thus bad quality.

What does any of this matter? It doesn't really, but if you want to view a YouTube video and your phone is showing 'G' or 'E', the chances of you getting a fast enough connection are virtually nil.