Some authorities have stated publicly, and without explanation, that if the theories of Special and General Relativity were not taken into account in the design of GPS (by building the satellite clocks to run 38 μs/day slower than GPS time before launch, the so-called 'factory offset'), the position indicated by an earthbound GPS user device would drift by about 11 km/day. I've considered this for various GPS models but can only predict much smaller effects. The fact that multiplying the 38 μs/day uncorrected difference from GPS time by the speed of light yields 11.6 km/day does not, to me, seem to relate to GPS receiver function. Please reply with references if possible. I'd be very glad of any pointers.
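For concreteness, the headline figure is just the accumulated clock drift multiplied by the speed of light (using the commonly quoted 38.6 μs/day net relativistic offset):

```python
c = 299_792_458.0  # speed of light, m/s
drift = 38.6e-6    # net relativistic clock rate offset, s per day

# One day's uncorrected drift, expressed as a range error
print(f"{c * drift / 1e3:.1f} km/day")  # prints "11.6 km/day"
```

Whether that range figure translates into a *position* error is exactly the question.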

If you look at the Wikipedia page on GPS and relativistic corrections, it makes clear that this 10 km/day drift applies to the 'pseudoranges', the initial distances calculated between the receiver and each satellite. This error would cancel out when solving the triangulation problem for the receiver position, since it is an equal error in all the satellite clocks.
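The cancellation can be demonstrated numerically with a toy four-satellite fix. The satellite geometry below is made up for illustration and the pure-Python Newton solver is a sketch, not any real receiver algorithm; the point is only that a bias common to all pseudoranges is absorbed entirely into the solved receiver clock term:

```python
import math

C = 299_792_458.0   # speed of light, m/s
BIAS = C * 38.6e-6  # one day of uncorrected drift as a range error, ~11.6 km

# Hypothetical satellite positions at roughly GPS-orbit distances (metres)
SATS = [(26_560e3, 0.0, 0.0),
        (0.0, 26_560e3, 0.0),
        (0.0, 0.0, 26_560e3),
        (15_000e3, 15_000e3, 15_000e3)]
RX = (6_371e3, 0.0, 0.0)  # receiver on the Earth's surface

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(b)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda k: abs(M[k][i]))
        M[i], M[p] = M[p], M[i]
        for k in range(i + 1, n):
            f = M[k][i] / M[i][i]
            for j in range(i, n + 1):
                M[k][j] -= f * M[i][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fix(prs, iters=20):
    """Newton iteration for (x, y, z, receiver clock bias in metres)."""
    x = [0.0, 0.0, 0.0, 0.0]
    for _ in range(iters):
        A, r = [], []
        for s, pr in zip(SATS, prs):
            d = dist(s, x[:3])
            # Row: partial derivatives of (range + clock bias) w.r.t. unknowns
            A.append([(x[i] - s[i]) / d for i in range(3)] + [1.0])
            r.append(pr - (d + x[3]))
        x = [xi + dxi for xi, dxi in zip(x, solve_linear(A, r))]
    return x

true_prs = [dist(s, RX) for s in SATS]

# Every satellite clock fast by the same 38.6 us: all pseudoranges shrink
# by the same BIAS, and the solver absorbs it into the clock term.
sol = fix([pr - BIAS for pr in true_prs])
pos_err = dist(sol[:3], RX)
print(f"position error: {pos_err:.2e} m")  # essentially zero
print(f"clock term:     {sol[3]:.1f} m")   # approximately -BIAS
```

The position solution is unaffected; the common error reappears, to numerical precision, as the receiver clock offset.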

Here is my guess as to why they chose to correct this effect: the individual clocks all have some drift as well, and are periodically synced to a master timebase on Earth. If they were allowed to drift so drastically from the master, it would be necessary to adjust every clock simultaneously, or navigation would be completely out of whack. It is somewhat simpler to adjust individual units as their drift becomes noticeable.

The satellites' clocks are corrected for GR and SR, but this is irrelevant for navigation purposes: your receiver is comparing the differences between the times sent by a number of different satellites.

Whether this time is measured in 'earth' seconds or in 'space' seconds sped up by about 4.5 parts in 10^10 is, to first order, irrelevant, so long as all the satellites experience the same effect. So the choice is either to broadcast at 10.23 MHz and let the signal arrive at a slightly different frequency on the ground, or to adjust the onboard frequency to 10.22999999543 MHz so that it is 10.23 MHz on the ground.
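The frequency adjustment follows directly from the drift figure. A quick check, assuming the 38.6 μs/day net offset:

```python
drift_per_day = 38.6e-6   # s of clock offset accumulated per day
seconds_per_day = 86_400.0
fractional = drift_per_day / seconds_per_day  # ~4.5 parts in 10^10

nominal = 10.23e6  # Hz, the nominal L-band reference frequency
adjusted = nominal * (1 - fractional)

print(f"fractional rate offset: {fractional:.3e}")
print(f"factory-set frequency:  {adjusted:.5f} Hz")  # prints "10229999.99543 Hz"
```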

I think this is where the urban legend of the 'USAF didn't believe in relativity and weren't going to correct the clocks' comes from.

Of course, although your position relative to the satellites is unaffected by the time dilation, each satellite's own knowledge of time, and hence of its position in its orbit, would accumulate an error. To allow you to find your absolute position, each satellite also broadcasts its own position in orbit.

The satellites are in orbit at about 20,000 km altitude, roughly 26,500 km from the centre of the Earth, so each orbit is about 165,000 km, covered every 12 hours. An error of 38.6 μs/day on a path of 333,000 km/day still gives a position error (of the satellite) of only a fraction of a metre, although this accumulates with time.
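Working through that arithmetic with the nominal figures above:

```python
import math

radius = 26_500e3      # orbital radius, m (~20,000 km altitude)
period = 12 * 3600.0   # ~12 h orbital period, s
speed = 2 * math.pi * radius / period  # ~3.9 km/s

# Along-track position error from one day's accumulated clock drift
err = speed * 38.6e-6
print(f"orbital speed:  {speed:.0f} m/s")
print(f"position error: {err:.2f} m")  # a fraction of a metre
```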

This could be corrected by giving the satellites an adjusted figure for their orbital speed, or by updating their ephemeris as they pass over the ground station.

Found the answer after drawing a blank with several experts. Two US professors of high GPS pedigree independently explained that the '10 km/day' claim presupposes that between one and three of the satellites used for a four-satellite fix do not incorporate the 38 μs/day clock-rate ('factory') offset. They also remarked that GPS is often used as a time source, where the observed time shifts are clearly important.
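That presupposition can be illustrated with the same kind of toy four-satellite fix used to show the cancellation (made-up geometry, pure-Python Newton solver, not a real receiver algorithm): when only one satellite lacks the factory offset, its pseudorange error of c × 38.6 μs does not cancel, and the position solution shifts by kilometres. The exact magnitude depends on the assumed geometry.

```python
import math

C = 299_792_458.0   # speed of light, m/s
BIAS = C * 38.6e-6  # one day of uncorrected drift as a range error, ~11.6 km

# Hypothetical satellite positions at roughly GPS-orbit distances (metres)
SATS = [(26_560e3, 0.0, 0.0),
        (0.0, 26_560e3, 0.0),
        (0.0, 0.0, 26_560e3),
        (15_000e3, 15_000e3, 15_000e3)]
RX = (6_371e3, 0.0, 0.0)  # receiver on the Earth's surface

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(b)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda k: abs(M[k][i]))
        M[i], M[p] = M[p], M[i]
        for k in range(i + 1, n):
            f = M[k][i] / M[i][i]
            for j in range(i, n + 1):
                M[k][j] -= f * M[i][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fix(prs, iters=20):
    """Newton iteration for (x, y, z, receiver clock bias in metres)."""
    x = [0.0, 0.0, 0.0, 0.0]
    for _ in range(iters):
        A, r = [], []
        for s, pr in zip(SATS, prs):
            d = dist(s, x[:3])
            A.append([(x[i] - s[i]) / d for i in range(3)] + [1.0])
            r.append(pr - (d + x[3]))
        x = [xi + dxi for xi, dxi in zip(x, solve_linear(A, r))]
    return x

true_prs = [dist(s, RX) for s in SATS]

# Only the first satellite's clock is uncorrected: its pseudorange alone
# carries the bias, so the error no longer cancels into the clock term.
sol = fix([true_prs[0] - BIAS] + true_prs[1:])
pos_err = dist(sol[:3], RX)
print(f"position error with one uncorrected satellite: {pos_err / 1e3:.1f} km")
```

With all four satellites uncorrected the same solver recovers the true position exactly; with a partial correction the error is of the same order as the 11.6 km pseudorange bias.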

I and others have been vexed by several scientific authorities publicly repeating the 10 km/day position-error claim without any mention of that presupposition. The question is resolved, but the presupposition seems strange, because relativity shifts all the observed satellite clock rates approximately equally. It seems only to show that GPS position finding is about as susceptible to transmitter clock differences as radio-location systems such as Loran, where relativity is not a consideration.