What's the Frequency, Kenneth?

Early in my career, I got a job as a contract engineer for a local radio station. The station's assigned frequency was 1.000 MHz, and the transmitter was a Gates BC-1T, which was probably the best transmitter ever built for AM broadcasting. The crystal was cut for maximum stability over time and temperature and then sealed in a vacuum. The overall stability was better than one part per million (ppm).

In those days, the operator was required to check the frequency every half hour or so because when the regulations were written, crystals were far less stable. At the transmitter site was a General Radio frequency meter that showed deviation from the assigned frequency. Once a month, the frequency had to be checked against the National Bureau of Standards. Most stations contracted an outside company to conduct that test.

Because of the stability of the transmitter, the logs showed every frequency reading to be zero. Since the operator was in the studio, miles from the transmitter, for all he knew, the meter could have been turned off. The engineer who preceded me readjusted the transmitter so that it was two Hertz high and then the operator recorded rows and rows of “2” in the frequency column. The FCC required that the frequency be within ±20 Hertz of the assigned frequency.

During the time I was the contract engineer, the requirement to measure frequency was eliminated, and I typically checked the frequency each week when I calibrated the meters. I also reviewed the report from the company that checked our frequency, and each month the report said 1,000,002 Hertz. Since the transmitter was well within the regulations, I saw no reason to readjust it. "If it ain't broke, don't fix it!"

Years after leaving the station, I learned that some companies in the area, including the Army base, used our station as a frequency standard to calibrate signal generators, frequency counters, and other equipment, rather than WWV, which could only be received at night. I got a sick feeling because it meant that everything that came out of those companies and the Army base was 2 ppm high in frequency.

This entry was submitted by Frank Karkota and edited by Rob Spiegel.

Frank Karkota worked with power transmitters in the range of less than 1 MHz to 5 GHz. He designed and built equipment for radio stations and eventually started a company that made commercial and consumer receivers that covered 500 kHz to almost 1 GHz.

Very interesting article. It is a real-world occurrence of the old joke about the clock store where they set all the clocks by the local factory whistle, only to find out that the whistle operator was setting his watch by the clocks in the store window as he walked by every morning.

And that story always reminds me of working with CSMP (Continuous Systems Modeling Program), a digital simulation of an analog computer, back in the day. To generate a sine wave, we would pretend we had a cosine wave and integrate it. Then we would take the resulting sine wave, integrate and invert it, and feed it back into the first integrator in place of our pretend cosine wave. It always seemed kind of like magic, producing a sine wave out of nothing, but apparently random noise in the system was enough to eventually get the pair of integrators oscillating.
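The two-integrator loop described in that comment can be sketched in a few lines of Python (a rough stand-in for CSMP, not the actual program; the step size and noise seed are arbitrary values of mine). With plain forward Euler, each step carries a tiny gain of sqrt(1 + h²), which plays much the same role as the random noise that kick-started the analog loop:

```python
import math

# Two cross-coupled integrators: dx/dt = y, dy/dt = -x.
# Seed with a tiny "noise" value and integrate with forward Euler;
# Euler's slight per-step amplitude gain, sqrt(1 + h^2), slowly
# pumps the loop up into a full-amplitude sine/cosine pair.
h = 0.05                        # integration step (arbitrary)
x, y = 1e-6, 0.0                # tiny noise seed (arbitrary)
amp0 = math.hypot(x, y)

for _ in range(10_000):
    x, y = x + h * y, y - h * x  # update both integrators together

amp = math.hypot(x, y)
print(amp > 100 * amp0)          # -> True: the loop "self-starts"
```

A symplectic or trapezoidal integrator would hold the amplitude constant instead; the "magic" the commenter remembers is exactly this marginal instability of the simple integration scheme.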

He is off by 1000 on his units. AM stations typically have a frequency between 540 and 1600 kHz; 1000 MHz would put him at a higher frequency than TV (1000 MHz = 1 GHz). Your comment was right on in saying you used a station in Chicago at 1 MHz, not 1000 MHz.

Back in the '60s I would surf the AM band for stations from Pittsburg, Kansas, finding them from all over the country. We could pick up WBBM, WLS, WWLS, WJR, KOMA, KOA, the Mexican station on 800 kHz, and many others; it has been a long time. I had a shortwave radio to pick up WWV and others as well. I still remember the call letters of the stations and their frequencies, to some extent, 50 years later.

Using a strong 1 MHz signal makes calibration incredibly easy. Connect an antenna to the vertical input of the scope, with a tuned circuit from the input to ground. Connect the frequency standard to be calibrated to the horizontal input. Adjust the standard until the Lissajous pattern is stable and you are done!
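For anyone trying the scope trick above: the reason a Lissajous pattern works for calibration is that any residual frequency offset makes the figure rotate, completing one full cycle every 1/Δf seconds. A stationary pattern means the offset is (essentially) zero. A tiny sketch with made-up numbers:

```python
# A stationary Lissajous figure means the two scope inputs are at the
# same frequency; an offset of df Hz makes the ellipse cycle through
# its shapes once every 1/df seconds.
f_ref = 1_000_000.0     # broadcast carrier used as the reference (Hz)
f_dut = 1_000_000.5     # standard under adjustment (hypothetical value)

df = abs(f_dut - f_ref)
rotation_period = 1.0 / df
print(rotation_period)   # -> 2.0 seconds per pattern cycle
```

So if the pattern holds still for, say, a minute, the standard is within a few parts in 10⁸ of the carrier.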

Back in the days of NTSC television, the color burst was generated from a rubidium oscillator for network programs. The exact frequency was calibrated against NIST standards and published. A frequency standard, traceable to NIST, was no further than the nearest television!

This is indeed an interesting article, and the part about the meter reading the deviation brought up an issue that I have had to deal with, which, to summarize, was "How do you tell the difference between zero and nothing?" One part of the answer explains part of the reason for the use of 4 to 20 mA analog signal loops: it is quite simple to detect the difference between "zero" = 4 mA and "nothing" = 0 mA, so an opened data pair would be quite obvious.
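The "live zero" idea in that comment can be shown in a few lines (a hypothetical decoder of my own, not any particular instrument's API; the 3.5 mA fault threshold is an assumption, chosen below the live zero):

```python
def decode_loop(current_ma):
    """Map a 4-20 mA loop reading to 0-100 %; None signals an open loop."""
    if current_ma < 3.5:               # well below the 4 mA "live zero":
        return None                    # the pair is broken, not reading zero
    return (current_ma - 4.0) / 16.0 * 100.0

print(decode_loop(4.0))    # 0.0  -> a genuine zero reading
print(decode_loop(12.0))   # 50.0 -> mid-scale
print(decode_loop(0.0))    # None -> "nothing": the loop is open
```

The same distinction is exactly what the transmitter logs lacked: a meter pinned at zero and a meter that is dead look identical.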

I never thought about the concerns of remotely reading a frequency deviation display, but a 4-20 mA loop would have solved at least part of the problem.

If what you say is correct (the Army base's references always ran 2 ppm high), maybe that was the fault of this particular radio station. Someone in the Army finally realized the reference they had been using for years was actually 2 ppm high, so rather than fix a "problem," they simply made 2 ppm high THE "standard."

I remember the first time I saw my favorite college professor, Dr. Howe. He was sitting on his desk, feet on his chair, watching a galvanometer with a stopwatch in his hand. He had just gotten a brand new frequency-synthesized signal generator, and he was checking it out by driving the galvanometer on one side with a synthesized 10 MHz and WWV on the other side. The stopwatch was to time how long it took for them to drift a cycle apart.
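The stopwatch arithmetic behind that check is simple: two signals offset by Δf slip one full cycle every 1/Δf seconds, so the measured slip time gives the offset, and hence the fractional error, directly. A sketch with an invented stopwatch reading:

```python
# Two signals offset by df Hz drift one full cycle apart every 1/df
# seconds, so a stopwatch reading T directly measures df = 1/T.
f = 10e6        # comparison frequency, 10 MHz
T = 50.0        # seconds per cycle slipped (hypothetical stopwatch reading)

df = 1.0 / T                 # 0.02 Hz offset
print(df / f * 1e9)          # -> 2.0 parts per billion fractional error
```

A 50-second slip at 10 MHz thus means the synthesizer agrees with WWV to 2 parts in 10⁹, far better than the 2 ppm the broadcast-station "standard" offered.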

Nancy, you are correct that the article is very interesting and captivating, but as I understood it, even when the meter showed no deviation, the engineer deliberately offset the transmitter to 2 Hertz high so the operator would have something to log. Secondly, I am not sure, but I believe different organizations are assigned different bands; similarly, the Army base's frequency ended up always 2 ppm high.

You're right, Nancy: "anyone else using it would be responsible for verifying their reference." My guess is that they knew the tolerance for the station and figured that was more than accurate enough. Back then it was tough to have any lab instrument that accurate, and we did the same thing, picking up WCFL in Chicago at 1 MHz.

Very interesting article on early radio and the Gates BC-1T transmitter. I found the 2 Hertz high adjustment solution interesting and an easy solution to verifying the frequency – especially since it was a very stable transmitter and was operating well within the FCC guardband. As for its use as a reference - Anyone else using it would be responsible for verifying their reference source...and sometimes it is consistency that reflects proper calibration rather than the actual reading itself. Not my area of expertise, but I am guessing they were using it because of the stability of the transmitter and the 2ppm was either known or negligible.
