It's been a long while since anyone at Wireless Waffle installed any satellite dishes, but as part of a project to improve language skills, it was decided that the WW HQ would be fitted with the kit needed to receive German television. This is the sad story of the trials and tribulations of what should have been a simple job, told in the hope that it may help others trying the same thing to avoid the traps that befell our attempts!

Firstly, a visit to Lyngsat and a browse through the dozens of satellites that cover Europe quickly yielded the fact that the channels that were wanted could be found on various Astra 1 satellites at an orbital position of 19.2 degrees East (19.2E). As a ready reckoner, the following orbital positions are the 'hot-slots' for various European languages:

English - 28.2E

French - 5W

German - 19.2E

Italian - 5W or 13E

Polish - 13E

The next thing to do is find out what size of dish is needed to receive the satellite that's of interest. This is more complex as it requires a knowledge of the satellite's footprint and the strength of signal at a particular location. For 19.2E in the UK, even a 55cm dish should be fine pretty much everywhere, so a Triax 54cm dish was duly purchased together with a suitable wall bracket and an Inverto LNB.

The mounting of the dish on the wall was relatively straightforward, having made sure that there were no obstructions in the line-of-sight from the dish to the satellite (such as trees or other buildings). With the dish on the wall, the next step is to align it so that it is pointing at the satellite. In general, if you know your latitude and longitude and the satellite you wish to receive, a rough idea of the right direction can be gathered from one of many online tools (such as dishpointer.com).
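For the mathematically inclined, the rough pointing direction can also be worked out directly using the standard geostationary look-angle formulas. Here is a quick Python sketch (not taken from any particular tool, and only valid for sites in the northern hemisphere):

```python
import math

def look_angles(site_lat, site_lon, sat_lon):
    """Approximate azimuth and elevation (in degrees) towards a
    geostationary satellite, for a northern hemisphere site.
    Standard textbook formulas; longitudes are degrees East."""
    phi = math.radians(site_lat)
    dlon = math.radians(site_lon - sat_lon)  # negative if the satellite is to the east
    # Central angle between the site and the sub-satellite point
    beta = math.acos(math.cos(phi) * math.cos(dlon))
    # 0.1512 is the ratio of Earth radius (6378 km) to geostationary orbit radius (42164 km)
    el = math.degrees(math.atan((math.cos(beta) - 0.1512) / math.sin(beta)))
    az = 180 + math.degrees(math.atan(math.tan(dlon) / math.sin(phi)))
    return az, el

# Example: London (51.5N, 0E) towards the Astra 1 satellites at 19.2E
az, el = look_angles(51.5, 0.0, 19.2)
print(round(az), round(el))  # azimuth about 156 (SSE), elevation about 28
```

Plugging in your own coordinates gives a starting point; the fine tuning described below is still needed.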

Getting the dish pointing in roughly the right direction is not too difficult, but even a small dish needs to be pointing with an accuracy of better than plus or minus 1 degree (bigger dishes have to be even more accurately aligned) and so some form of fine tuning is needed.

In analogue days gone past, by far the best way to align a dish was to connect it to a satellite receiver, and connect the satellite receiver to a television, and put the whole lot in a place where the TV could be seen from the dish. With the satellite receiver tuned to a channel on the appropriate satellite, it was then just a matter of moving the dish about until a signal could be seen on the TV. Once the signal was found, gently moving the dish from side-to-side and up-and-down to a point where the quality of the picture was maximised was all that was needed. Of course the same method can still be used today, but there has to be a less crude way, right? Right...

The SLX Satellite Finder costs less than a few metres of CT-100 coax, and provides both a visual indication of signal strength (using the in-built meter) and an audible indication (using the in-built buzzer). All that is then required to use this to align a dish is a 'patch lead' so that the dish can be connected to a socket on the meter and then a lead coming from the (indoor) satellite receiver connected to the other socket on the meter to supply power. So far, so good.

Now, turn on the satellite receiver and return to the dish. In theory, the meter should only register a signal if the dish is pointing at a satellite. However, the modern Inverto LNB was obviously doing a far better job of receiving than the systems that the crusty SLX meter was designed to work with, resulting in a full-scale meter deflection (and an annoying beep that could not be turned off) almost regardless of the position of the dish. No amount of experimentation yielded anything other than full-strength or nothing, and the full-strength indication happened across a wide arc of the sky and with the elevation angle of the dish anywhere within 10 degrees of where it should have been. In a word, beeping useless!

Not to be defeated, and rather than cart the TV and receiver outdoors, a second, seemingly more modern meter was purchased, the SF-95DR Satellite Finder. This proved to be marginally better, but having the dish within 'a few' degrees of the right position still yielded a full-scale signal. At least the beep could be turned off.

An old trick from the analogue days, used to make fine-tuning the position of the dish easier when the signal was very strong, was to cover the dish with a damp tea-towel. The water in the towel attenuates the signal, making the signal weaker and thus the dish easier to align. This trick was tried using the SF-95DR but alas, it only resulted in the need to keep picking a damp tea-towel up from the floor every time the wind blew it off.

Eventually, more through luck than skill, a point was found where the meter indicated a peak that was within a degree or so of nothingness in nearby directions, suggesting that the dish was aligned to a satellite. An excited scan of the receiver revealed some signals but alas, from the wrong satellite (13 East instead of 19.2 East). Of course the meter would no more know which satellite it was pointing at than an amoeba would know the difference between a car and a lorry, just that both seem pretty big. More fiddling, and a slightly damper tea-towel and a second 'peak' was found. Another tune of the receiver and 'Allelujah!' channels that were being transmitted from 19.2 East were found. But only from one transponder...

What could this mean? Was it that the dish was roughly aligned but that only the very strongest signal was being received? Was it that the LNB was faulty? Was there a fault in the cable from the dish to the receiver indoors? Any (or all) of these could be the problem and with nothing more to go on, it seemed that the only way to resolve the issue was to resort to carting the TV and receiver outdoors so that the screen could be seen from the location of the dish. Doing this would mean that the 'signal strength' and 'quality' bars on the receiver's on-screen menu display could be used to point the dish more accurately.

A new patch lead from the dish to the receiver was fitted with F-connectors (thereby ruling out any problem with the coax feeding indoors). Power up... And the receiver is showing 100% signal strength (very good!) but a signal quality of only 60% (OK but not brilliant). No amount of dish repositioning would yield any improvement and still just the one transponder was receivable. Before giving up and ordering a new LNB, and with an increasing level of suspicion building up, the meter was taken out of line so that the dish was connected directly to the receiver without the meter in circuit.

Hey presto...! Now the receiver was showing 100% signal and 80% quality and, wait for it, all of the transponders on the satellite could be received. A final fine-tune of the dish position and the quality of reception was increased to 90% - not a bad result at all. Moving the TV and receiver back indoors to the other end of the original run of coax and this excellent result was maintained. It seems that the meter may have been overloaded by the signal from the satellite and was somehow distorting the signal (possibly it was generating harmonics or intermodulation products).

So the lessons from this cautionary tale are:

Don't use cheap 'satellite finder' meters to help align dishes; they cause more problems than they solve.

Stick to the tried and tested methods and just move a TV and receiver to a place where they can be seen from the dish and use the receiver's signal meter for alignment.

Damp tea-towels should be used for wiping down surfaces in kitchens and not for the setting-up of sensitive electronic equipment.

At this point you're probably thinking that this is the end of this cautionary tale, but you'd be wrong... there's more to come! Stay tuned to Wireless Waffle for our next extremely uninspiring episode of: HOW NOT TO INSTALL A DISH.

There is growing evidence to suggest that Google is planning to enter space by launching satellites of its own. Two separate pieces of news point in this direction. Firstly, Google has announced plans to purchase Skybox. Skybox operates low-earth orbit satellites whose purpose is to take high resolution images, of exactly the kind that are used by Google Maps. Following the recent lifting of US government restrictions on the use of images with better than 50 cm resolution, the move by Google to own its own earth imaging satellites makes complete sense.

Why is the use of Ku-Band more complex? Ku-Band is already very heavily used for satellite broadcasting, as well as for a number of satellite broadband networks (such as Dish in the USA). As it is proposed that the WorldVu network will be non-geostationary (i.e. the satellites will move around in the sky as seen from the Earth), the downlinks will have to be switched off when the satellites are in a position that could cause interference to existing geostationary satellite services (which will generally be when the WorldVu satellites are over the equator, and whilst a few degrees either side of it). This is made worse because any uplink that could cause interference also needs to be switched off if it is pointing at the geostationary arc. The same is true of O3B, but the more dense packing of Ku-Band satellites will make the situation far more complex.

For example, there are over 60 Ku-band satellites visible in the sky in the UK. As the arc as viewed from the UK is 70 degrees from end-to-end, this means there is approximately one satellite every degree.

All of this switching on and off every time a satellite is over the equator (and, of course, when a satellite disappears beyond the horizon), and the consequent requirement to connect to a different satellite at those times to maintain a connection, is complex and also creates the environment for tremendous problems with 'dropped calls' if handover between satellites fails for any reason. Similarly, this switching will cause severe jitter (changes in timing) which in itself can cause problems for some internet applications (e.g. streaming).

Finally, and probably the weirdest issue with WorldVu, concerns the antennas planned for user ground stations. It is suggested that Google plan to use antennas based on meta-materials. O3B have also, apparently, signed a deal to work on the development of meta-material based antennas. But at present such antennas have been proven only at Ka-Band (and then only in a developmental and not commercial form), and not at the Ku-Band proposed by Google. Even if they could be made to work at Ku-Band, there would be a loss in efficiency making transmission from Earth to space next to impossible, and the fact that the antennas can only be steered around +/- 45 degrees will mean that some satellites, even when in view, will be unreachable.

There is a suggestion that Google's purchase of Skybox will provide a potential platform for an early launch of the WorldVu space segment. On the one hand it makes some sense: if you are launching one satellite, why not make it multi-purpose? Though it would increase the size and weight of the satellite, and the launch cost too, having the cameras directly connected to the Internet might make sense. Then again, eyes in the sky connected to the Internet is eerily similar to the world-changing paradigm that is posited in Arthur C. Clarke's book, The Light Of Other Days. Big Brother will definitely be able to watch you, as will your neighbour, your partner, the government, and anyone else with voyeuristic tendencies who wants to. Who's zoomin (in on) you?

It seems that following the ESOA submission to Ofcom concerning the apparent errors in the RealWireless study on spectrum demand for mobile data reported by Wireless Waffle on 15 February, the offending report has now been re-issued (note the publication date is now 11 April 2014) with the axis on Figure 44 which shows data traffic density re-labelled from 'PB/month/km²' (PetaBytes) to 'TB/month/km²' (TeraBytes), thereby reducing the calculated data traffic by a factor of 1000 and now making the document internally consistent. Well done Ofcom and RealWireless, though they could have publicly admitted the apparent error, instead of quietly re-issuing the document with no fanfare. Presumably this now makes ESOA look rather silly.

But... even a 10th grade student could complete the sum that is behind the ITU data forecasts and realise that the axis should have read 'PB' all along (and therefore that the internal inconsistencies are not fixed and that the data in the ITU and RealWireless models is still hundreds of times too large). Here, for you to try, are the values - taken from the ITU's 'Speculator' model - and the maths you need to apply. The values are for 'SC12 SE2', which represents people using 'high multimedia' services in urban offices, with the ITU model in its 'low market' setting (it has a higher one too).

User density:

120,975 users per km²

Session arrival rate per user:

3.3 arrivals per hour per user

Mean service bit rate:

13.83 Mbps

Average session duration:

81 seconds per session

Now for the maths...

First, multiply the first two numbers to get 'sessions per hour per km˛'. (120,975 × 3.3 = 399,217.5)

Then multiply this by the average session duration to get 'seconds of traffic per hour per km˛'. (399,217.5 × 81 = 32,336,617.5)

Then multiply by the mean bit rate to get 'Megabits of traffic per hour per km˛'. (32,336,617.5 × 13.83 = 447,215,420)

To make the numbers more manageable, divide by 8 to get from bits to bytes, then by 1,000,000 to get from Megabytes to Terabytes (447,215,420 ÷ 8,000,000 = 55.9)

So the traffic assumed by the ITU model for people using 'high multimedia' services in urban offices is 55.9 Terabytes per hour per square km. But the figure in the graph in the RealWireless report is per month, so we need to scale this up from hours to months. We now have the thorny question of 'how many hours are there in a day', which for mobile data traffic is not necessarily 24 as you might expect. If the above figures are meant to represent the busy hour (the busiest hour of the day), it would not be right to multiply the value by 24 to get daily traffic, as this would assume every hour to be as busy as the busiest. As a conservative measure, let's assume that the daily traffic is 10 times that of the busiest hour. So daily traffic per square km would be 559 TeraBytes (55.9 × 10 just in case you couldn't work this out in your head).

The number of days in a month is relatively easy to work out, it's 30.4 on average (365.25 ÷ 12). So monthly traffic per square km would be 559 × 30.4 = 16,994 TeraBytes per month per km².
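For those who would rather let a computer do the sums, the whole calculation above fits in a few lines of Python (the 10x busy-hour scaling and the 30.4 days per month are the same assumptions as in the text):

```python
# Reproducing the arithmetic using the ITU 'Speculator' values for 'SC12 SE2'
users_per_km2 = 120_975      # user density
arrivals_per_hour = 3.3      # session arrival rate per user
bitrate_mbps = 13.83         # mean service bit rate
session_seconds = 81         # average session duration

sessions_per_hour_km2 = users_per_km2 * arrivals_per_hour
traffic_seconds = sessions_per_hour_km2 * session_seconds
megabits_per_hour_km2 = traffic_seconds * bitrate_mbps
tb_per_hour_km2 = megabits_per_hour_km2 / 8 / 1_000_000  # bits to bytes, then MB to TB

busy_hour_ratio = 10         # daily traffic assumed to be 10x the busiest hour
days_per_month = 30.4        # 365.25 / 12, rounded as in the text

tb_per_month_km2 = tb_per_hour_km2 * busy_hour_ratio * days_per_month
print(round(tb_per_hour_km2, 1), round(tb_per_month_km2))  # 55.9 and 16994
```

The output matches the 55.9 TB per hour and roughly 17,000 TB per month figures derived above.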

This is the monthly data for just one urban traffic type in the ITU model; there are 19 others. Ignoring the others completely, Figure 44 of the RealWireless report should show monthly traffic in urban areas for the ITU model as being 17,000 TeraBytes per month per square km; include the other activities that urban office workers undertake and the value should be much higher still. But it now shows as being just over 100 TB/month/square km for the ITU and less for the RealWireless prediction, 100 or more times too low. Oh dear!

So having corrected the figure in the RealWireless report, it is now wrong. It was correct before. And it still does not tally with the total data forecast for the UK that is in the same report.

Surely there are people at Ofcom who own a calculator, have a GCSE in maths, and possess a modicum of professionalism such that they would want to check the facts before blithely allowing their suppliers to fob them off with an 'oops, we mis-labelled an axis' argument. Presumably they thought that it was ESOA who couldn't handle a calculator properly.

A consortium led by the Media Development Investment Fund (MDIF), and calling itself Outernet is planning to launch hundreds of small satellites (at 30x10x10 cm at their largest, they are about the size of a loaf of bread) to 'broadcast' the Internet. The idea is that selected portions of the internet will be broadcast using UDP-based WiFi multicasting (as well as, potentially, DVB and DRM).

Stepping aside from the political questions about who will decide which portions of the Internet will be broadcast - and which will not - there is the much bigger question of whether or not it is even possible to broadcast WiFi successfully from a satellite. There are several technical issues to overcome:

The satellites, presumably in low earth orbit, will be several hundred kilometres above the planet, so the path loss will be significant.

They will have to overcome interference from terrestrial WiFi networks on the same channel.

The low earth orbit means they will not be stationary in the sky, leading to Doppler shift on the received signal.

The power of the transmitter on the satellites is not known, but we can back-calculate what it would have to be in order to deliver a service. WiFi typically needs to receive a signal of around -90 dBm (1 picoWatt of power) in order to function, and preferably more (especially for faster connection speeds), but let's take that as the baseline.

At a frequency of 2450 MHz, the free space path loss over 500 km (a typical height for low earth orbit satellites) is just over 154 dB. In reality, atmospheric absorption will increase the path loss, as will clouds and rain, but let's assume it's a relatively clear, low humidity day. The satellite will therefore have to radiate a power of 154 - 90 = 64 dBm in order to achieve the necessary signal level on the ground. This is a power of just over 2.5 kiloWatts. At a satellite height of 150 km (about the minimum possible), path loss is around 10 dB less, meaning it would have to radiate a power of around 250 Watts.
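The path loss figure comes straight from the standard free space path loss formula, and the power conversion from the definition of dBm. A quick Python check (the -90 dBm target is the baseline assumed above):

```python
import math

def fspl_db(freq_mhz, dist_km):
    """Free space path loss in dB (standard formula, frequency in MHz, distance in km)."""
    return 32.44 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_km)

def dbm_to_watts(dbm):
    """Convert a power in dBm (dB relative to 1 milliWatt) to Watts."""
    return 10 ** (dbm / 10) / 1000

loss_500 = fspl_db(2450, 500)     # path loss at 2450 MHz over 500 km
needed_dbm = loss_500 - 90        # radiated power needed to deliver -90 dBm on the ground
print(round(loss_500, 1), round(dbm_to_watts(needed_dbm)))  # about 154 dB and 2.6 kW
```

Note that 64 dBm works out at roughly 2.5 kiloWatts; each extra 10 dB of path loss multiplies the required power by ten.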

If the transmit antenna has a gain of 10 dBi, which is very feasible, the transmitter power requirements end up being 250 Watts at a height of 500 km and 25 Watts at a height of 150 km. Note that no transmitter is 100% efficient, and the satellite would have to have receivers and control systems too, so the power requirements would be greater than that which the transmitter alone requires. If it is also assumed that the satellite is over the dark side of the Earth for some proportion of time, and has to rely on batteries, the solar power generation requirements increase, or alternatively the satellites would have to switch off at night.

Of course, high gain antennas could be used on the ground, but this would then require special equipment for the satellite to be received and would go against the concept of receiving the signal on 'smartphones and tablets'.

It is not possible to easily generate 250 Watts of power on a satellite the size of a loaf of bread. A typical satellite solar panel can generate around 300 Watts per square metre of area. The total surface area of the 'loaf' would be 0.14 square metres, meaning it could potentially generate 42 Watts of power if all faces were covered in solar panels and were in full sunlight (which is, of course, impossible as at least one face would be in shadow).

Of course, the 'loaf' could have its solar panels unfold after it is launched to make a bigger panel, so the 0.8 square metres or so required to generate 250 Watts might just be possible. But this would still only provide power when the satellite was in daylight. To be powered at night it would need to generate at least double the power (one lot for the transmitter and another to charge the battery) and contain a battery capable of holding the charge. This would again be difficult on a satellite of this size.
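The solar budget above is easily sanity-checked (the 300 Watts per square metre figure is the typical panel output assumed in the text):

```python
# Back-of-envelope solar budget for the 30x10x10 cm 'loaf' satellite
panel_w_per_m2 = 300             # typical satellite solar panel output per square metre
l, w, h = 0.30, 0.10, 0.10       # dimensions in metres

surface_area = 2 * (l * w + l * h + w * h)   # total area of all six faces
max_power = surface_area * panel_w_per_m2    # upper bound, every face in full sun
print(round(surface_area, 2), round(max_power))  # 0.14 square metres and 42 Watts
```

Even this upper bound assumes all six faces are simultaneously in full sunlight, which geometry forbids; the realistic figure is lower still.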

The above transmitter power calculations assume that there is no interference on the channel. If standard WiFi channels are to be used, then depending on the location it could be expected that there would be other signals around causing interference. Assuming that the main use of the satellite will be in areas where there are no other forms of Internet connection, we could take it that there would not be WiFi interference, and so arguably we could look upon the satellite kindly and disregard this effect.

On the Doppler shift issue, at 2450 MHz, the received frequency of a low earth orbit satellite will vary by around +/- 50 kHz as it passes overhead. The IEEE standard for WiFi specifies a maximum frequency error of +/- 25 parts per million (ppm) for the 2.4 GHz band. This equates to roughly +/- 60 kHz, meaning that the Doppler shift of the satellite leaves it just within acceptable frequency tolerances.
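The Doppler arithmetic can be sketched in a few lines of Python. The 6.1 km/s line-of-sight velocity is an assumption consistent with the +/- 50 kHz figure above (LEO orbital speed is nearer 7.6 km/s, but the radial component seen from the ground is smaller except right at the horizon):

```python
# Worst-case Doppler shift for a LEO satellite pass at 2450 MHz
c = 3e8           # speed of light, m/s
v_radial = 6100   # assumed worst-case line-of-sight velocity, m/s
f = 2450e6        # WiFi channel frequency, Hz

doppler_hz = f * v_radial / c    # classic Doppler shift, f * v / c
tolerance_hz = 25e-6 * f         # IEEE 802.11 limit of +/- 25 ppm at 2.4 GHz
print(round(doppler_hz / 1000), round(tolerance_hz / 1000))  # about 50 vs 61 kHz
```

As the text notes, the shift squeaks in just under the tolerance, with little margin to spare.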

So, in conclusion:

If the 'loaf' was at 150km height it might just be able to generate enough power to transmit a WiFi signal that is strong enough to be received on the Earth. At a height of 500 km, extending solar panels would be necessary. For use at night even larger panels, plus batteries would be needed.

Any terrestrial interference in the band would largely obliterate the satellite signal, so it would only really be receivable in remote areas (which is, after all, its main intention).

Doppler shift is just within acceptable tolerances.

So is it technically feasible? The Wireless Waffle answer is 'just about'. But if it were launched, it would increase WiFi interference levels over the majority of the planet, especially outdoors.

And you can bet that if it did work, those Governments that censor Internet access would find ways to jam the signal either terrestrially or by building their own 'loaf sat', increasing WiFi interference further. The loaf-sat-wars may be just about to get toasty...