Posted
by
samzenpus
on Friday July 06, 2012 @09:58AM
from the where-in-the-world dept.

Dr. Ramsey Faragher graduated from the University of Cambridge in 2004 with a first-class degree in Experimental and Theoretical Physics. He then completed a PhD in 2007 at Cambridge in Opportunistic Radio Positioning under the direction of Dr. Peter Duffett-Smith, a world expert in this field. He is now a Principal Scientist at the BAE Systems Advanced Technology Centre specializing in positioning, navigation, sensor fusion and remote sensing technologies in the land, air, sea and space domains. We recently covered his NAVSOP project, an advanced positioning system that exploits existing transmissions such as Wi-Fi, TV, radio and mobile phone signals, to calculate the user’s location to within a few meters. Dr. Faragher has graciously agreed to answer any questions you may have about NAVSOP, the future of GPS, or what a theoretical physicist puts on his business card. Ask as many questions as you like, but please confine your questions to one per post.

He seems to have written a post [slashdot.org] on how this works and then later made an account [slashdot.org]. Sorta verified here [twitter.com]. His post is very informative and might answer a lot of questions and generate more meaningful ones.

It's a good and legitimate question for people who venture away from cities and urban centers to locations which have no radio or cell phone coverage. Those are areas that I frequent, and I asked the question to understand HOW close you would have to be to an electronic signal to receive a location from this method, i.e. how sensitive a device can be made with this method.

I'd imagine a lot of positioning calculations involve accounting for or adjusting for known effects or noise. For example, accounting for general relativity in GPS. What is the most surprising correction you've ever come across (even on an exam or done in theory)? Have you ever found yourself saying "I didn't think that could affect the calculations so much."

When you are talking about pseudoranges measured in nanoseconds, darn near everything is significant. In research, surveying, and other high-accuracy positioning systems, the ionosphere, troposphere, multipath, antenna-to-receiver cable lengths, the data acquisition and computation time, and several other effects are all modeled.

For myself, I think the most surprising was needing to pay attention to the number of significant bits in the mantissa of the single- and double-precision floating-point numbers used in the calculation.
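To illustrate that point (a minimal sketch with made-up numbers, not from the interview): at satellite-range magnitudes, single precision cannot even represent a pseudorange to sub-metre level, which is why these solvers need double precision throughout.

```python
import numpy as np

# A pseudorange on the order of an Earth-to-GPS-satellite distance,
# carried to millimetre-level detail (made-up value).
true_range = 20_200_000.123  # metres

r32 = np.float32(true_range)  # 24-bit mantissa: ~7 significant digits
r64 = np.float64(true_range)  # 53-bit mantissa: ~16 significant digits

# At this magnitude a float32 step ("ulp") is 2 m, so the sub-metre
# detail is simply gone; float64 keeps it to nanometre level.
print(f"float32 error: {abs(float(r32) - true_range):.3f} m")
print(f"float64 error: {abs(float(r64) - true_range):.9f} m")
```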

What kind of accuracy is possible to achieve using NAVSOP - or other systems you know of - if I can place stuff like APs, mobile phones, etc. myself in a factory area? Do you have methodology for designing placement of such devices so positioning accuracy is reached at every point? How low can one get with costs of such solutions?

Let me add why this is an important next step in indoor robotics. The Roomba and similar robots and toys have avoided this problem by operating blindly, or by using limited aids such as virtual walls or markers. But these parlor tricks aren't good for long-term intelligent indoor robots. They need to not just avoid obstacles, but also find the refrigerator, skip vacuuming the garage, and know to use different cleaning fluids on the kitchen ceramic tile vs. the dining room hardwood. They should stay aw

Indeed, though I think it's easier at home/flat -- you could just place a few dozen RFID tags as markers and be done with it.
On the other hand, at industrial level it is not a viable solution and it would be great if low-cost solution existed, based on placing cheap APs instead of expensive laser stuff, etc. and still obtain accuracy of, say, 10 centimeters.

Yes, I know Western Michigan University Library technology staff have an in-house-built app for an in-out board that uses wifi signal strengths to locate staff. It's been active for over a year. It does have to be initially trained for any different location. Not hard with the push of a button. Also Android only; iPhone doesn't let you sniff information on other wifi nodes, just the one you're connected to. I don't have the published article reference offhand but there are a variety of papers on similar id

This approach is nifty, but is too limited for general-purpose robotics use. It isn't accurate. It requires multiple wifi hotspots. It doesn't let you move your wifi routers or change them out - even a firmware update that changes the power output or the channel (which some routers do automatically) might break it. It requires training.
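For the curious, the signal-strength fingerprinting idea described in these comments boils down to a nearest-neighbour lookup against a trained database. Everything below (room names, RSSI values, three access points) is hypothetical, not from WMU's actual app:

```python
import math

# Hypothetical training data: RSSI in dBm from three access points, keyed
# by room, as recorded during the system's "push of a button" training pass.
fingerprints = {
    "office_101": (-42, -67, -80),
    "office_102": (-55, -48, -71),
    "lobby":      (-70, -60, -50),
}

def locate(observed):
    """Return the room whose stored fingerprint is closest (Euclidean
    distance in RSSI space) to the observed signal-strength vector."""
    return min(fingerprints,
               key=lambda room: math.dist(fingerprints[room], observed))

print(locate((-44, -65, -78)))  # nearest to office_101's fingerprint
```

This also makes the fragility complaint concrete: a router firmware update that shifts transmit power shifts every observed RSSI, and the stored fingerprints no longer match until retraining.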

It would seem that to use this technology, the client would need to have a much larger datastore than with GPS: Whereas only the positions of the GPS satellites need to be known to make a calculation, the dataset here is in the many thousands to millions. In addition to the data required for map storage, it would seem any implementation of this would require an internet connection to download the data in a geographically-restricted fashion. This opens the door to privacy issues that standalone GPS clients do not have.

Huh? GPS units have had GBs of map data for almost a decade; it's not like the '90s, where you had to load a new map set if you were traveling more than a few hundred miles. Compared to the full map set for, say, North America, the list of radio/cellphone towers and locations is trivially small (although the cellphone ones do change somewhat regularly, so you'd want a way to update the database).

I think you're misunderstanding the point. OP talks about finding bare coordinates, not position on a map. Not all applications of GPS are tied to the map data, and map data is "external" to finding your position anyway (but can be used to correct it with assumptions like "you're on the ground and on a road"). Sure, maps are big, but all the data needed to get raw coordinates is quite small. But with this system, you need a database of all your new "reference points" - cell towers, wifi, etc. What's worse, thi

His question is more about the map being required to locate itself; your comment is about maps on which you can show yourself after you know where you are. They are completely different sets of "maps".

Something that surprises me is that we're so obsessed with the exact positioning of things on Earth but at great exo-solar distances, we seem to be okay with measurements to the nearest million light years. A couple days ago I read about a new method devised to measure location to within a few hundred meters of something 200 million kilometers away from Earth [aalto.fi] and it struck me as odd that more effort isn't put into this. While the practicality of Earthbound work is far greater, the implications for physics and verifying theories seems to be an obvious benefit for better positional measurements in space. I know satellites and objects near Earth are heavily measured but why isn't there more attention paid to precision of deep space objects? What problems prevent sensor fusion from being applied to space? Too much noise? No way to actually verify your results?

I know satellites and objects near Earth are heavily measured but why isn't there more attention paid to precision of deep space objects?

For those objects for which high precision is required - there is. For those objects for which it's not - there isn't. There are surprisingly few of the former, and many, many of the latter. The ultimate problem, however, is that the input data is (relative to what we're used to for terrestrial applications) generally of fairly poor quality. Thus you need either a

With the improvement, both in time and coverage, that using WiFi and cell tower triangulation adds to straight GPS or A-GPS, and with NAVSOP going even further, what signals (or aspects thereof), in the EM spectrum or otherwise, remain untapped? What's the next step in improving time, coverage, energy efficiency, affordability of location systems?

How much of this can be done automatically and how much of this must be hand guided? For example you talked about [slashdot.org] fingerprints changing over time and being used only as a guide. Is there a measurement or confidence variable that you can employ to automate when the fingerprint is still valid or has morphed too much? Or is that something that a human overlord must monitor and do research to notice that a new apartment building has just been opened and there are now hundreds of new signals? It feels like you are using an open domain that could have outliers and irregularities that require a human to clean the data before it can be trusted to give you low false positives and true negatives. What statistical methods do you use to overcome these sort of real world problems so that your system can be put anywhere and work?

Hi, Dr. Ramsey! What is your best estimate as to the US DOD's current GPS backup system?
IIRC Obama cut the budget for LORAN around 2010, and until then the system was financed with the explicit purpose of serving as a GPS backup. But no more...
I am currently teaching ECDIS systems to mariners and I always emphasize the weaknesses of GPS under jamming.
Ever since Selective Availability was switched off, the jamming topic pops up more and more as a soft spot of the whole process, so I think we are not fooling ourselves in assuming that the US would not leave such a gaping hole in its systems uncovered...

Because civilian aircraft and many other mission-critical functions depend on selective availability not being enabled, it's highly unlikely they'd use it domestically.

The other argument against SA is that there are several methods of interpolating the GPS signals to achieve a lock that don't require decoding anything but the almanac. So mucking up the signal intentionally doesn't have to affect equipment, as long as it is designed to use the more sophisticated methods of acquiring a lock. So if you're a te

Wouldn't this thing require a whole slew of regulatory approvals since you'd be fishing for different types of signals? Or would this involve mere processing of data already available to, say, the smartphone armed with this technology?

Global positioning signals are used to help target various weapon systems in the United States arsenal. These signals can be--and have been--spoofed, to mis-direct these devices. Do you see spoofing technology as a meaningful threat to our offensive and defensive capabilities?

That exists already, but you don't have access to it. Maybe when the European Galileo system comes online you will. It is supposed to allow centimeter-level accuracy if you pay, but it is open enough that everyone can have access to that level instead of just the military.

Or a high-sensitivity antenna in the device. I was shocked at the increased signal strength my newer GPS (which has a high-sensitivity antenna) gets compared to my old one. I have used them side by side, and the old one will frequently lose satellite lock in dense cover while the new one doesn't.

From my understanding it is possible to have a device that supports the American GPS [wikipedia.org] as well as the Russian GLONASS [wikipedia.org] system and the planned European Galileo system [wikipedia.org]. I have done some checking and it appears that there are some consumer level devices available [amazon.com] that are capable of receiving and processing both GPS and GLONASS signals for faster and more reliable location lock. I would imagine that adding support for Galileo would be simpler as those frequencies are much closer to those used by the US GPS (so mu

We all know how GPS works, and even the basics of the math behind it, but where does one find more in-depth materials? Like if one were to create a GPS receiver, how does one translate those signals into something useful? Not just dumping the equations out and saying "solve this," but working step by step through a solution and then adding in corrections like relativity?
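The core position-fix step those materials build up to can be sketched compactly. Below is a minimal, textbook-style iterative least-squares (Gauss-Newton) solve on pseudoranges, deliberately ignoring the ionospheric, tropospheric, and relativistic corrections discussed elsewhere in this thread; the satellite geometry and numbers are made up, not real ephemeris data:

```python
import numpy as np

def solve_position(sat_pos, pseudoranges, iters=10):
    """Gauss-Newton least squares for receiver ECEF position and clock bias.
    A textbook sketch only: noise-free pseudoranges, no atmospheric or
    relativistic terms.  sat_pos: (N, 3) satellite positions in metres;
    pseudoranges: (N,) measured pseudoranges in metres."""
    x = np.zeros(4)  # [x, y, z, clock_bias * c], starting at Earth's centre
    for _ in range(iters):
        diff = sat_pos - x[:3]
        rho = np.linalg.norm(diff, axis=1)   # geometric ranges to satellites
        predicted = rho + x[3]               # modelled pseudoranges
        # Jacobian: negative unit line-of-sight vectors, plus 1 for the bias
        H = np.hstack([-diff / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)
        x = x + dx
    return x

# Synthetic scenario: four satellites near GPS orbital radius (~26,600 km),
# receiver on the surface with a 100 m clock-bias equivalent.
sats = 26_600_000.0 * np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.577, 0.577, 0.577],
])
truth = np.array([6_371_000.0, 0.0, 0.0, 100.0])
pr = np.linalg.norm(sats - truth[:3], axis=1) + truth[3]
est = solve_position(sats, pr)
print(est)
```

The real receiver pipeline adds the corrections the poster asks about as extra terms in the `predicted` pseudorange model before each iteration.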

So, since the GPS satellites and other systems are enmeshing the world in streams of digital data, can a portion of the data stream be used to transmit some sort of key so that it can be proven the receiver was at a specific space-time coordinate?

Like if satellite A was transmitting a continuously varying stream of random numbers and satellite B was doing the same then a receiver could take the product of the "random" numbers that it captured over a short window of time and use it to encrypt and "space-time

In order to combine all the sources of information, are you relying on a messy approach, something based on many signature machine learning algorithms (think boosting, SVMs, random forests, etc.), or are you writing an explicit generative model for the noise and then applying filtering to it, with a particle filter for instance?

erratum: In order to combine all the sources of information, are you relying on a messy approach, something based on some classic machine learning algorithms (think boosting, SVMs, random forests, etc.), or are you writing an explicit generative model for the (trajectory, sensor output) and then applying filtering to it?

So I've heard that a problem with the GPS and presumably other systems is that the radio signals are slowed by varying amounts by going through the ionosphere (thus reducing accuracy). I realize that you cannot use quantum entanglement to send (new?) information but does that include the information that a measurement has been made (if not the result)?

So could GPS use entanglement to precisely determine the time of a measurement? I think it's been demonstrated that they can send an entangled photon hundre

Following the downing of an American drone in Iran, the hypothesis was put forward that the Iranians spoofed the GPS signal and convinced the drone that it wasn't where it thought it was in order to get it to land in Iran (I'm not sure if this was ever confirmed). A recent issue of Aviation Week reported on a group, I believe in the U.S., working on the same idea: spoofing the GPS signal in a transparent manner to convince an autonomous vehicle that it was somewhere other than its actual location. Would NAVSOP

So I seem to remember a proposal to use pulsars to provide a sort of galactic GPS. (Pulsars, spinning neutron stars, are extremely stable periodic emitters of radio waves at interstellar distances). I think this might be what an earlier poster was referring to for spacecraft navigation, I believe they were used on the famous Pioneer 10 plaque (with the naked humans) to show aliens where we live.

Anyway, what's the accuracy for this (the previous poster mentions several hundred meters over hundreds of kilom

A lot of the "this is not new" comments refer to differential positioning using reference receivers and having access to databases of transmitter locations (Rosum, the old Cursor positioning system from Cambridge Positioning Systems, etc). We consider those aspects to be undesirable constraints on a flexible opportunistic positioning system and don't rely on them.

Is the idea to be a fully self contained (and self teaching) system? Is there any way to (reliably) share transmitter location data between clients using some sort of P2P or swarm connection?

Have you considered doing the same using the Earth's magnetic field rather than RF? Local variance within buildings or geology, Earth field lines, or using an RLG/GPS reference to see parallax in declination as a basis for rough positioning?

Would it be possible to get more accurate location data from GPS by using multiple receivers separated by some relatively close, approximately 1 meter, known distance and then averaging the returned positions to get a more accurate center position?

I am envisioning something that uses 4 or 5 receivers arranged in either a triangle or a square with one receiver located in the center of the others. The distance between any 2 of the receivers would be at most slightly more than 1 meter, which is below the accuracy

That is done today. It is called "Differential GPS" (DGPS) and "Real-Time Kinematics" (RTK). However, it's much more than just averaging - there is more complex mathematics involved to remove the errors common to both sets of measurements. RTK is done all the time in surveying and can achieve centimeter-level accuracy in real time. http://www.geod.nrcan.gc.ca/edu/rtk_e.php [nrcan.gc.ca]

What the OP asked is not "Differential GPS"; actually it is the opposite.

We can assume that if you measure GPS coordinates at a given time and location, you will have a systematic error (inaccuracy in satellite position, different speed of radio signal due to weather) that is the same for all GPS receivers in an area, plus a random error different from GPS to GPS. Taking lots of measurements would tend to cancel out the random errors and leave you with the systematic error.
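A quick simulation of that argument (with made-up error magnitudes): averaging many co-located receivers drives the random part toward zero as 1/sqrt(N), but the shared systematic bias survives untouched.

```python
import random

random.seed(42)
truth = 100.0       # true coordinate, metres (hypothetical)
systematic = 3.0    # shared bias: ephemeris/atmospheric error, same for all
noise_sigma = 2.0   # per-receiver random error, independent between units

# Hypothetical readings from 10,000 receivers at the same spot.
readings = [truth + systematic + random.gauss(0, noise_sigma)
            for _ in range(10_000)]
mean = sum(readings) / len(readings)

# Random errors average out to ~sigma/sqrt(N) = 0.02 m;
# the 3 m systematic bias remains in full.
print(f"residual error after averaging: {mean - truth:.2f} m")
```

This is exactly why DGPS/RTK need a reference receiver at a known point: it measures the systematic part so it can be subtracted, which no amount of averaging can do.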

RTK is often done based on a relative location from the base station. If a known monument is available, then the surveyor will usually set up the base station over it, but if not, they often just use a landmark and do relative baselines.

I found it striking that in the case of the drone that was forced down in Iran, for example, the 'return to base' failsafe seemed to be completely dependent on electronic signals.

It would seem much harder to spoof the position of the sun, the Earth's magnetic field, the position of the stars, etc. I presume, perhaps naively, that the sensors and algorithms to do this wouldn't be all that expensive or complex. What's the challenge in getting computers to use traditional navigation techniques?

What's the challenge in getting computers to use traditional navigation techniques?

There isn't a challenge - it's a well trodden path. The problem is that for the most part the sensors required are much more expensive than a GPS receiver, moderately complex, and require significant calibration and maintenance.

What would you recommend for a real time low cost small size light weight position/speed sensor that could be used for the sport of racing homing pigeons? I'm not talking about a data logger, but a practical device that can transmit information to allow for remote real time information delivery.

How will the system function during/after a major solar event? Assuming a worst case scenario, satellites in orbit could be disabled for good. And depending on the severity, transmitters on earth may be affected as well.

I'd like to fly drones over a, say, 100x100 meter area with centimeter precision, possibly indoors, for filmmaking. GPS is clearly not going to work, even outdoors. Time Domain sells a system with 5 cm resolution, using UWB technology [timedomain.com] -- but is there anything better than that?

Do you use Bayesian inference to combine positional information from many sources, some of which might be sorrily mistaken? I'd be interested in hearing more about the algorithms used to stitch this data together, and if there are any heuristics or approximations that help.
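One common building block for this kind of fusion, assuming (as a simplification) independent Gaussian errors, is inverse-variance weighting: the Bayesian posterior mean for Gaussian estimates, which naturally down-weights the "sorrily mistaken" sources if their reported variance is honest. The numbers below are hypothetical:

```python
def fuse(estimates):
    """Fuse independent 1-D Gaussian estimates given as (value, variance)
    pairs by inverse-variance weighting: each source contributes in
    proportion to its precision (1/variance)."""
    weights = [1.0 / var for _, var in estimates]
    mean = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    var = 1.0 / sum(weights)  # fused variance is smaller than any input's
    return mean, var

# A tight Wi-Fi fix (variance 1 m^2) and a sloppy cell-tower fix (16 m^2):
fused_mean, fused_var = fuse([(10.0, 1.0), (14.0, 16.0)])
print(fused_mean, fused_var)
```

A source that is wrong but confidently so defeats this weighting, which is where the outlier-rejection and particle-filter machinery asked about elsewhere in the thread comes in.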

You mentioned earlier the domination of signal strength when indoors. Can you also use patterns in observed environmental data for automated mapping and exploration?

For example a robot exploring a cave or a large indoor structure like a power plant might be able to even use information such as ambient temperature / humidity, echoic nature of surroundings, or patterns in ambient air pressure / acoustic input from machinery or the sound of treads against floor.

We bought our first GPS receiver in 1992 for $1500. Today we have GPS receivers for pennies in your cell phone with better accuracy. Why have costs not come down on higher-end systems? Patents? Lack of competition? For instance, in agriculture you buy a $1500 receiver and the vendor sells you different levels of $2000 software unlock codes to go from 8" to 4" to 2" to 1.5" to 1" accuracy. Are they selling the receivers initially at a loss?