75 ohm cable mismatch question

Assuming someone had a transceiver / 50 Ohm cable / HF antenna setup with a perfect 1:1 SWR, then changed just the cable from 50 Ohm to 75 Ohm, what would be the change in SWR? I am asking because I want to know how important the impedance match of the cable is and how much that alone would change the SWR.

I read somewhere that 75 Ohm cable is more efficient than 50 Ohm cable. I am also wondering why the "standard" for two-way radios is not something higher than 50 Ohms, since a lower impedance requires higher currents for a given wattage, and that is a potential safety issue.

Another reason I ask: what if someone got a free long piece of 75 ohm "cable TV" cable and wanted to use it to wire up an HF antenna for testing, such as 10 meters? How good or bad might the SWR be? Would the length of the cable affect how much the SWR deteriorates? Could an antenna tuner be used to help compensate for the 75 ohm mismatch if the operator doesn't care about maximum output power, just a cheap way to test HF antennas?

What if someone used two equal lengths of 75 ohm cable wired in parallel and coupled together? Would that be a reasonable match for 50 ohms?
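The thread never answers the parallel-cable idea directly, but the arithmetic is easy to check: two identical lines in parallel behave like one line with half the characteristic impedance. A minimal Python sketch (the function names are mine, not from any library):

```python
# Two identical coax lines in parallel: same voltage across both,
# currents add, so the characteristic impedance is halved.
def parallel_z0(z0: float, n: int = 2) -> float:
    """Characteristic impedance of n identical lines in parallel."""
    return z0 / n

def swr(z_load: float, z0: float) -> float:
    """SWR of a purely resistive load on a line of impedance z0."""
    gamma = abs(z_load - z0) / (z_load + z0)
    return (1 + gamma) / (1 - gamma)

z_pair = parallel_z0(75.0)           # 37.5 ohms
print(round(swr(50.0, z_pair), 2))   # about 1.33
```

So the parallel pair overshoots to 37.5 ohms, giving roughly a 1.33:1 SWR against 50 ohms: slightly better than a single 75 ohm line's 1.5:1, at the cost of twice the cable.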

I read somewhere that 75 Ohm cable is more efficient than 50 Ohm cable.


75 ohm cable is more efficient for receiving but the center conductor size may not be optimum for transmitting.

Another reason I ask: what if someone got a free long piece of 75 ohm "cable TV" cable and wanted to use it to wire up an HF antenna for testing, such as 10 meters?


Back in the '50s, when tube transmitters had built-in pi-network outputs, many hams, including me, used 75 ohm cables like RG-11. In actual practice, there is little difference in efficiency at an indicated SWR of 1.5:1.

If a 50 ohm load is fed with 1/2WL of 75 ohm coax, a 50 ohm SWR meter will read 1:1 even though the actual SWR is 1.5:1. (My old Heathkit SWR meter could be calibrated for either 50 ohms or 75 ohms.)

If a 75 ohm load is fed with 75 ohm coax, a 50 ohm SWR meter will read 1.5:1 even though the actual SWR is 1:1.

Since an SWR of 1.5:1 is usually perfectly acceptable, one usually doesn't need to worry about the 50/75 ohm mismatch.
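The half-wave behavior described above drops out of the standard lossless-line impedance transformation. A short Python sketch (lossless line assumed; the functions are illustrative, not from a library):

```python
import math

def z_in(z_load: complex, z0: float, length_wl: float) -> complex:
    """Input impedance of a lossless line length_wl wavelengths long,
    characteristic impedance z0, terminated in z_load."""
    t = math.tan(2 * math.pi * length_wl)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

def swr(z_load: float, z0: float) -> float:
    """SWR of a resistive load on a line of impedance z0."""
    gamma = abs(z_load - z0) / (z_load + z0)
    return (1 + gamma) / (1 - gamma)

# A half-wave 75 ohm line repeats the 50 ohm load, so a 50 ohm
# meter at the input reads about 1:1...
print(abs(z_in(50.0, 75.0, 0.5)))    # ~50 ohms
# ...even though the SWR on the 75 ohm line itself is 1.5:1:
print(swr(50.0, 75.0))               # 1.5
# A quarter-wave line instead transforms the load to z0**2 / z_load:
print(abs(z_in(50.0, 75.0, 0.25)))   # ~112.5 ohms
```

This also bears on the length question: the SWR on the 75 ohm line is 1.5:1 regardless of length (ignoring loss), but what a 50 ohm meter at the transmitter indicates swings between about 1:1 and 2.25:1 depending on the line's electrical length.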

The real problem with using CATV cable is that it is darn hard to solder to.

Usually the braid is aluminum over a metalized-plastic foil shield.

This works alright if you crimp your connectors, but like I said, it's a bugger to solder.

If you are going to the bother of building twin lead, it is much easier to make it out of two lengths of wire, solid or stranded.

The impedance depends somewhat on the diameter of your conductors, and if you use coax, the outside diameter of the shield is the diameter you would use in your calculations. Yes, even if you only solder to the center conductor. Not sure why, but that is the way it is.
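The diameter dependence the poster mentions is just the parallel-wire line formula: for an air-spaced pair, Z0 = 120*acosh(s/d) ohms, with s the center-to-center spacing and d the conductor diameter (roughly 276*log10(2s/d) when s >> d). A Python sketch with illustrative numbers:

```python
import math

def twin_lead_z0(spacing: float, diameter: float) -> float:
    """Characteristic impedance (ohms) of an air-spaced parallel-wire
    line; spacing (center to center) and diameter in the same units."""
    return 120.0 * math.acosh(spacing / diameter)

# Same 150 mm spacing, two conductor sizes: a thin wire versus
# something the size of a coax shield's outer diameter.
# Fatter conductor, lower impedance.
print(round(twin_lead_z0(150.0, 2.0)))   # ~601 ohms
print(round(twin_lead_z0(150.0, 8.0)))   # ~435 ohms
```

As for "not sure why": presumably the external fields of the balanced line see the outer surface of the whole conductor, so the shield's outer diameter is the d that matters regardless of what was soldered inside.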

I used some cast-off CATV coax for several years for HF. I wired up a lot of CB antennas with the stuff too. The folks I hooked up with it got out as well as the folks who could afford to buy RG-8U.

At HF the mismatch between 50 ohm and 75 ohm is measurable with the equipment the average ham might have on hand, but not worth worrying about.

Here is a spinoff question from the original posted question. If I get an HF system working at a certain wattage (say 1 W) with an SWR of 1.5, should that SWR stay the same as I increase power (to say 4 watts), or is the SWR dependent on the power output? I remember back in my CB days, I would get one SWR reading at 1 W and a slightly higher one at 4 W, but nothing changed on the antenna/cable setup.

Does this happen on variable-power amateur gear too? I read in the operating manual of an all-band transceiver that the power output should be set to maximum before using the automatic antenna tuner. Why is that? Isn't it "safer" to use low power first and have the radio try to tune up before putting full power through it? What if there is a gross mismatch that even the antenna tuner couldn't handle? Would that cause major reflected power to come back down the transmission line?

Then another spinoff question from the paragraph above: where does the reflected power go? If the SWR is severe (say 10:1), does most of the reflected power go to Earth ground? Why do they say don't operate a two-way radio with high SWR? What is actually going on that can damage the radio?

Here is a spinoff question from the original posted question. If I get an HF system working at a certain wattage (say 1 W) with an SWR of 1.5, should that SWR stay the same as I increase power (to say 4 watts), or is the SWR dependent on the power output?


SWR is the same at any power level.
The reason you saw a change was the less-than-perfect directivity of the directional coupler in the SWR bridge.

Part of the reflected power ends up as heat lost in the line (line losses) and as radiation from the transmission line due to the mismatched load, and some does end up at the antenna.
This is a less-than-precise answer, and it also assumes the transmitter is matched to the line.
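To put numbers on the "where does it go" question: the fraction of forward power reflected at the load is the square of the reflection-coefficient magnitude, which follows directly from the SWR. A quick Python check (function names are mine):

```python
def reflection_coefficient(swr: float) -> float:
    """Magnitude of the voltage reflection coefficient for a given SWR."""
    return (swr - 1) / (swr + 1)

def reflected_fraction(swr: float) -> float:
    """Fraction of incident power reflected at the load."""
    return reflection_coefficient(swr) ** 2

for s in (1.5, 3.0, 10.0):
    print(s, round(100 * reflected_fraction(s), 1), "% reflected")
# 1.5:1 -> 4%, 3:1 -> 25%, 10:1 -> about 67%
```

At 1.5:1 only 4% of the power is reflected at the load, which is why the 50/75 mismatch is rarely worth fixing. At 10:1 about two thirds comes back; on a real line it is dissipated as heat over repeated round trips or re-reflected toward the antenna, not dumped to earth ground, and the resulting voltage and current peaks are what stress the transmitter's output stage.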

Sue, et al.: The difference between readings at low vs. higher power is probably due to the fact that diodes are not linear devices. If you plot the forward resistance as you increase the applied voltage, you will see that the result is a curve rather than a straight line. This curve affects the SWR reading even though the SWR itself is not changing!

Typically, the indicated SWR will appear to rise as the power is increased. It takes a much more expensive design to minimize this effect. That is why a new Bird(tm) directional wattmeter costs a LOT more than an MFJ unit, for example!
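A toy numeric model (my own, not any particular meter's circuit) makes the diode effect concrete: subtract a fixed forward drop from both detected voltage samples before forming the ratio, and the indicated SWR climbs toward the true value as power rises, matching what Sue saw between 1 W and 4 W.

```python
def apparent_swr(v_forward: float, true_swr: float = 1.5,
                 diode_drop: float = 0.3) -> float:
    """Toy model of a diode-detector SWR bridge: the diode eats
    roughly diode_drop volts of each voltage sample. All numbers
    here are illustrative, not from a real instrument."""
    gamma = (true_swr - 1) / (true_swr + 1)
    vf = max(v_forward - diode_drop, 0.0)          # detected forward sample
    vr = max(gamma * v_forward - diode_drop, 0.0)  # detected reflected sample
    g = vr / vf if vf > 0 else 0.0
    return (1 + g) / (1 - g)

print(apparent_swr(2.0))    # low power: reads well under the true 1.5
print(apparent_swr(20.0))   # higher power: reads close to 1.5
```

The real nonlinearity is a curve rather than a fixed offset, as the post says, but the direction of the error is the same: inexpensive meters read optimistically at low power.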