A typical Heath-era rig might have a sensitivity rating of 1mv. Typical rigs today are .2mv. If you were using two rigs with those ratings with the same antenna, conditions, etc., what difference would that make as far as "S" meter readings? Approximately speaking.

I presume that by '1mv' you mean '1 microvolt' - mV usually means millivolt.

Supposing (and it's a BIG supposition) that the S meters have been calibrated for 50 microvolts equals S9 AND that the linearity is identical, then they should read the same. The difficulty is that the linearity is almost certainly different, so although the S9 may be the same, the other readings probably won't be.
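Assuming the common HF convention of S9 = 50 microvolts and 6 dB per S-unit (a calibration that, as noted above, real meters only approximate), the gap between the two sensitivity figures in the question can be sketched like this; the function name and the idealised linearity are assumptions for illustration only:

```python
import math

def s_units(uv, s9_uv=50.0, db_per_s=6.0):
    """Microvolts at the antenna terminal -> S-units, assuming an
    idealised meter with S9 = 50 uV and exactly 6 dB per S-unit."""
    db_relative_to_s9 = 20 * math.log10(uv / s9_uv)
    return 9 + db_relative_to_s9 / db_per_s

print(s_units(50.0))   # exactly S9 by definition
print(s_units(1.0))    # roughly S3.3
print(s_units(0.2))    # roughly S1
```

On such an idealised meter the 1 uV and 0.2 uV levels sit about 14 dB, or a bit over two S-units, apart; real meters with differing linearity will of course disagree.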

As far as HF is concerned, most receivers are rather more sensitive than they need to be: this is especially so at 7 MHz and below, because of atmospheric and man-made noise - especially the latter in urban and even suburban areas. I live in the country and it's pretty quiet, but even so, on 10m, I can use 3dB of attenuation in the antenna lead and still get a 4 dB rise in noise in switching from dummy load to a 4-ele SteppIR - and even more if the beam is pointing to the sun when it's playing up.

Steve is spot on. Sensitivity is a non-issue. If you hear more noise when you connect your antenna, then you have enough sensitivity. I get a kick when I hear someone comment on how "quiet" a receiver is. This most assuredly is a result of how the AGC is set up, as our HF noise floor is not determined by the receiver, but rather by man-made, atmospheric and galactic noise.

A 'quiet' receiver could be sensitive or deaf as a post. Wire a shorted plug across your antenna connection, then turn RF gain and audio gain to full; then you can tell if you have a quiet receiver. Life gets really interesting when you do this test with the pre-amplifier switched in - some of them are truly horrible.

I normally use a variable attenuator from 160m-30m, nothing on 20m and a 10dB low noise amplifier on 30m and higher.

>If you hear more noise when you connect your antenna- then you have enough sensitivity.<

Not totally true, because especially on frequencies below 10 MHz, the rise in noise on connecting the antenna may be caused by multiple-signal intermodulation as well as by phase noise.

Higher sensitivity was a goal in the 1930s, when it WAS needed. Unfortunately, it's stayed as necessary marketing hype 70+ years later, possibly because of confusion with noise figure requirements at VHF/UHF.

from G3RZP: >Not totally true, because especially on frequencies below 10 MHz, the rise in noise on connecting the antenna may be caused by multiple-signal intermodulation as well as by phase noise. Higher sensitivity was a goal in the 1930s, when it WAS needed. Unfortunately, it's stayed as necessary marketing hype 70+ years later, possibly because of confusion with noise figure requirements at VHF/UHF.<

No argument there at all. I am well aware of phase noise and IM, although phase noise may have been less of an issue with the good PTOs like Collins, Drake or TenTec than it is with some of today's DDS stuff. And, as you said, VHF/UHF is a totally different story. On my 23cm EME system, I can make use of LNAs with NF below 0.2 dB.

73,
Dale W4OP

When talking about sensitivity, there are a lot of standards. I use hard microvolts. I add a 6 dB pad between the generator and receiver. Using 30% modulation at 1000 Hz, I look for a 6 dB drop in audio when I disable the modulation, then I read the microvolts off the generator. Most of the time, the sensitivity is 1 to 2 microvolts.
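The step described above - switch off the 30% modulation and look for a 6 dB fall in audio - amounts to reading a signal-plus-noise to noise ratio at the audio output. A minimal sketch of that reading (the helper name and example voltages are mine):

```python
import math

def audio_drop_db(v_mod_on, v_mod_off):
    """dB change in audio output when the 30% modulation is switched
    off; the generator level that produces a 6 dB drop is read off as
    the sensitivity figure in the method described above."""
    return 20 * math.log10(v_mod_on / v_mod_off)

# e.g. 100 mV of audio with modulation on, 50 mV with it off:
print(round(audio_drop_db(0.100, 0.050), 2))  # 6.02 dB, so the generator reading is the sensitivity
```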

Certainly in the UK, and I believe a lot of Europe, historically generators were calibrated in terms of the EMF, i.e. the open-circuit volts. The US, on the other hand, calibrated in terms of the actual volts into the correct load - which is half the EMF.

Then the radar/ECM people came into the game. For them, it's actual power that matters, and so we found generators calibrated in dBm, which is the power that would be delivered into 50 ohms if the load were matched. In most cases at HF, the receivers have quite a high input SWR, but provided the cables are short at HF, the results are repeatable enough.
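The three calibration conventions above - EMF, terminated volts ('PD'), and dBm - differ only by fixed conversions. A sketch, assuming a 50-ohm system (the function names are mine):

```python
import math

R = 50.0  # assumed system impedance in ohms

def dbm_to_uv_pd(dbm, r=R):
    """Generator level in dBm -> terminal voltage (US-style 'PD')
    in microvolts for a matched load: V = sqrt(P * R)."""
    p_watts = 10 ** (dbm / 10) / 1000
    return math.sqrt(p_watts * r) * 1e6

def pd_to_emf(uv_pd):
    """US-style PD reading -> UK/European EMF (open-circuit) reading,
    which is twice the voltage across a matched load."""
    return 2.0 * uv_pd

print(round(dbm_to_uv_pd(-107), 2))   # about 1 uV PD
print(round(dbm_to_uv_pd(-73), 1))    # about 50 uV PD, i.e. the usual S9 level
print(pd_to_emf(dbm_to_uv_pd(-107)))  # about 2 uV EMF
```

This is why a spec quoted "for 12 dB SINAD, EMF" can look 6 dB worse than the same receiver quoted in terminated microvolts.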

Marine radio standards required on AM that switching off 30% modulation gave a 10dB change in output, while the specifications were all for EMF voltages. Below 3.8 MHz, the dummy antennas were specified - for 1.6 to 3.8 MHz, it was 250 pF in series with 10 ohms, so it was an interesting and lossy little network that presented 50 ohms to the generator and a source impedance of 10 ohms and 250 pF in series to the rx. I can't remember the lower-frequency dummy antennas, though - it may have been 6 ohms and 350 pF at 500 kHz.
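As a check on how lossy that network looks from the receiver side, the series impedance of the 10-ohm / 250 pF dummy antenna recalled above works out as follows (a sketch; the 2 MHz spot frequency is just an example within the 1.6-3.8 MHz range):

```python
import math

def dummy_antenna_z(f_hz, r_ohm=10.0, c_farad=250e-12):
    """Impedance of a series R-C dummy antenna: Z = R - j/(2*pi*f*C)."""
    return complex(r_ohm, -1.0 / (2 * math.pi * f_hz * c_farad))

z = dummy_antenna_z(2e6)  # 2 MHz, mid-band
print(round(z.real, 1), round(z.imag, 1))  # 10 ohms resistive, about -318 ohms reactive
print(round(abs(z), 1))                    # magnitude about 318 ohms, dominated by the capacitor
```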

So with a multiplicity of methods, it has to be defined which one is being used!

Incidentally, there's a good article in the latest QEX on antenna noise, although his reference to ITU-R Rec. P.372-7 is several years out of date: the current version is P.372-10.

I am seeing that exact situation on my Hallicrafters FPM-300 MK II. The VFO runs straight to the 9 MHz mixer on 80M, while all other bands go through a het/mixer with gain. The output into the 9 MHz portion is exactly 3 dB lower on 80M, and this shows up in the final PA output level.

Copyright 2000-2018 eHam.net, LLC