It appears there are several conventions in digital communications for
measuring signal levels relative to the full scale of the A/D converter.
Is there a de-facto standard definition of 0 dBFS in ham SDR work?
I'm currently using the convention that 0 dBFS corresponds to a
rail-to-rail square wave (or full-scale DC) in *both* the I and Q channels.
I like this convention because it makes 0 dBFS exactly equal to the
absolute largest signal possible. I'm not likely to encounter it in
normal operation but I can conceive of an A/D being severely overdriven
by a malfunctioning AGC.
Another convention defines 0 dBFS as a sine wave with peaks at the
rails. But then a square wave (if you could generate one) would be +3 dBFS.
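For concreteness, the 3 dB gap between the two references can be checked
directly. A quick sketch in Python (full scale normalized to 1.0 here,
which is my choice for illustration, not part of either convention):

```python
import math

FS = 1.0  # full-scale amplitude, normalized to 1.0 (assumption for illustration)

p_sine = FS ** 2 / 2   # mean-square power of a sine with peaks at the rails
p_square = FS ** 2     # a square wave sits at +/-FS, so its mean square is FS^2

# With 0 dBFS defined as the full-scale sine, the square wave reads:
square_dbfs = 10 * math.log10(p_square / p_sine)
print(round(square_dbfs, 2))   # -> 3.01
```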
Unlike the digital audio field, where each channel is measured
separately, I use a single level for the sum of the I and Q channel
powers. E.g., if the I channel were a rail-to-rail square wave while the
Q channel was zero, this would be -3 dBFS. A single carrier that is just
starting to clip as it swings through the I and Q axes would also be
-3 dBFS: each channel then carries a full-scale sine wave (3 dB below a
square wave), and the two channels together make up the other 3 dB.
You do have to be careful to include the factor of 2 when summing up the
squares of the samples in each channel.
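Here's a minimal sketch of that bookkeeping in Python (full scale
normalized to 1.0; the function name and normalization are my own, not
from any particular SDR package):

```python
import math

FS = 1.0  # full-scale A/D value, normalized to 1.0 (assumption for illustration)

def dbfs(i_samples, q_samples):
    """Level in dBFS under the convention that 0 dBFS is a rail-to-rail
    square wave (or full-scale DC) in BOTH the I and Q channels.
    The factor of 2 in the denominator is the one mentioned above:
    the full-scale reference power is 2*FS**2, not FS**2."""
    n = len(i_samples)
    mean_power = sum(i * i + q * q for i, q in zip(i_samples, q_samples)) / n
    return 10 * math.log10(mean_power / (2 * FS ** 2))

# Full-scale DC in both channels: the absolute largest signal -> 0 dBFS
print(dbfs([FS] * 8, [FS] * 8))          # -> 0.0

# Rail-to-rail square wave in I, Q silent -> about -3.01 dBFS
print(dbfs([FS, -FS] * 4, [0.0] * 8))    # -> about -3.01
```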
But if there's a de-facto standard in ham SDR work, I'm willing to
switch to it. Is there one?
Thanks, Phil