Are most rigs these days using a "USB" or "LSB" offset for CW? I have an FT-101E, which uses a "USB" offset -- i.e., a CW signal received in USB mode keeps the same pitch when I change the mode switch to CW. The FT-101E is always a little off frequency because it uses separate circuits to set the LO frequency on receive and transmit, but I leave the RIT on and have a pretty good idea where to zero-beat a USB or LSB phone signal to within a few dozen Hz, where the modes always match.
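To make the two conventions concrete, here's a small Python sketch (my own illustration with assumed numbers, not anything from a rig's manual) of how a USB-style versus LSB-style offset maps an incoming carrier to an audio pitch. It assumes an idealized receiver whose dial reads the suppressed-carrier frequency:

```python
def audio_pitch_hz(signal_hz, dial_hz, mode):
    """Audio tone produced by a carrier at signal_hz with the dial at dial_hz.

    'usb' hears signals above the dial frequency; 'lsb' hears signals below.
    Returns None if the tone falls on the suppressed side.
    """
    if mode == "usb":
        tone = signal_hz - dial_hz
    elif mode == "lsb":
        tone = dial_hz - signal_hz
    else:
        raise ValueError("mode must be 'usb' or 'lsb'")
    return tone if tone > 0 else None

# A station at 7030.000 kHz: a USB-offset rig hears a 700 Hz note with the
# dial at 7029.300 kHz, while an LSB-offset rig must sit at 7030.700 kHz to
# hear the same note -- the two dials end up 1.4 kHz apart.
print(audio_pitch_hz(7_030_000, 7_029_300, "usb"))  # 700
print(audio_pitch_hz(7_030_000, 7_030_700, "lsb"))  # 700
```

That 1.4 kHz spread (twice the pitch) is consistent with a station landing "way over on the other side" of a narrow passband.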

I've just started operating CW, and of the three contacts I've made, two were near the RIT setting I always use for USB, but one was way over on the other side (out of the RX passband until I tuned it in). Rather than being off frequency, I think his rig (a Kenwood TS-930) was operating with an LSB offset.

With the mix of USB and LSB offsets in use, how do you compensate, especially with super-narrow filters? So far it's been easy for me to swing the RIT and zero-beat, but I've only been sending CQs. When I've tried to answer a couple of calls, I got "?"s and "QRZ?"; I didn't try to tune the other op in, but only swung the RIT.

Is the etiquette to depend on the other op to swing their RIT around, or do they expect you to retune? I hear a lot of QSOs down at the bottom of 40, blasting away merrily at several hundred Hz offset.

Several hundred Hz of offset? That is certainly unusual, and the most likely way it happens is that the stations involved are using R.I.T. when they really shouldn't be.

With most modern transceivers, including the TS-930S (not that modern, but surely not ancient), when the CW mode is selected the transmitter offset matches the sidetone: tune a received CW signal until its pitch matches the CW sidetone, and your transmit frequency lands precisely on that station. No reason to use R.I.T.

In every rig I have (several, as old as a Drake TR-7, vintage 1977, and as new as a Ten Tec Scout vintage 1992 and Kenwood TS-850S/AT vintage 1991), I never need to use the R.I.T. -- ever -- when operating CW. If I send a string of "dits" (without transmitting!) and listen to my sidetone pitch, and tune the VFO (main tuning) dial to match the received tone to that pitch, I'm done. Perfectly zero-beat, no variation, ever.
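The match-the-sidetone procedure above can be sketched in a few lines of Python (an illustration under my own assumptions -- the 700 Hz sidetone and a USB-style offset are typical values, not taken from any of these rigs' specifications):

```python
SIDETONE_HZ = 700  # assumed CW sidetone/offset; many rigs use 600-800 Hz

def received_pitch_hz(station_hz, dial_hz):
    # USB-style CW offset: pitch is how far the station sits above the dial.
    return station_hz - dial_hz

def tx_freq_hz(dial_hz):
    # In CW mode the rig transmits offset from the dial by the sidetone amount.
    return dial_hz + SIDETONE_HZ

# Tune the main dial until the received pitch equals the sidetone...
station = 7_030_000
dial = station - SIDETONE_HZ
assert received_pitch_hz(station, dial) == SIDETONE_HZ
# ...and the transmitter lands exactly on the other station's frequency.
assert tx_freq_hz(dial) == station
```

Because the receive offset and the transmit offset are the same number, matching the pitch by ear is all it takes -- which is why no R.I.T. is needed.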

I guess if someone has a horrible sense of pitch and can't differentiate an A-flat from a C-sharp, this could be a problem, and if so, then the standard "zero beat" process could be used. But that's time consuming, and I'm sure glad I don't have to resort to that -- I don't think many would.

OK, thanks. I worked through this on my FT-101 with the crystal oscillator generating a carrier, and I realize now that the USB-vs-LSB ambiguity is resolved by the suppression of the other sideband. Whether in USB or LSB, when I tune a 7100 kHz carrier to the sidetone frequency, the VFO pointer points to 7100. So the mode does not matter.
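The cancellation at work here can be written out as a tiny sketch (illustrative numbers and an assumed 700 Hz sidetone, not FT-101 circuit values): the carrier oscillator sits on opposite sides of the passband in USB and LSB, but the dial calibration shifts with it, so both modes read the true signal frequency once the pitch matches the sidetone.

```python
SIDETONE = 700  # Hz, assumed

def dial_reading_hz(signal_hz, mode):
    # BFO placement differs by mode, but the dial calibration tracks the BFO,
    # so the two offsets cancel and both modes agree on the reading.
    if mode == "usb":
        bfo = signal_hz - SIDETONE   # signal sits SIDETONE above the BFO
        return bfo + SIDETONE
    else:  # lsb
        bfo = signal_hz + SIDETONE   # signal sits SIDETONE below the BFO
        return bfo - SIDETONE

print(dial_reading_hz(7_100_000, "usb"))  # 7100000
print(dial_reading_hz(7_100_000, "lsb"))  # 7100000
```

The algebra is trivial, which is the point: as long as the dial is calibrated against the same oscillator that sets the passband, the mode drops out.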

In the BFO age, I wonder how one knew which sideband to zero-beat to. I recall my Heath GR-64 would tune both "sidebands" of a carrier. There must have been a convention.

Such rigs, which did not feature "single signal" reception, were easy to use anyway. The trick was to tune the desired signal to zero beat (no tone at all, tuned dead center on the carrier), then use the "SPOT" function of the transmitter's VFO to tune it to zero beat as well. Then the transmitter would be precisely on the receiver's frequency, and from that point forward it didn't matter which sideband you tuned to.

And that whole operation took about one second, once you've done it a thousand times. (I think I did it more like a million times, since I had "separates" for many years, through the 1960's.)

Copyright 2000-2016 eHam.net, LLC