Cathode to ground question

There is a cathode-to-ground resistor on a KT66 power tube in an amp kit. The value is 1 ohm. I ASSUME the only reason for it being there is to allow the current flow to be measured, and I ASSUME 1 ohm is not going to change the cathode bias. Are my assumptions correct?

The KT66 is a beam power tetrode, and this amp kit is based on a Marshall JTM45 schematic. The original schematic has the cathode going directly to ground.

The cathode-to-ground resistor introduces a slight negative feedback. At a cathode current of 100 mA it will shift the grid bias by 100 mV, which is not very much. The famous Williamson amplifier (http://www.preservationsound.com/?p=3823) uses a more complex cathode resistor network, but assuming an equivalent of 120 Ω per tube, that gives a cathode bias (at 80 mA) of ≈ 10 V.
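The comparison above is just Ohm's law applied to each cathode resistor. A minimal sketch of the arithmetic, using the currents and resistances quoted above:

```python
# Sketch of the bias-shift arithmetic discussed above (plain Ohm's law).
def cathode_bias_shift(cathode_current_a: float, r_cathode_ohms: float) -> float:
    """Voltage developed across the cathode resistor: V = I * R."""
    return cathode_current_a * r_cathode_ohms

# 1-ohm sense resistor at 100 mA -> only a 0.1 V shift in grid bias
print(cathode_bias_shift(0.100, 1))    # 0.1

# Williamson-style equivalent of ~120 ohms per tube at 80 mA -> ~9.6 V
print(cathode_bias_shift(0.080, 120))  # ~9.6
```

The two orders of magnitude between 0.1 V and ~10 V is why the 1 ohm resistor can be treated as a pure current-sense element rather than a bias resistor.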

No Jim, not the same amp. This is a new kit I am trying to fix for a kid, to keep him from killing himself. You would not believe how badly he had it messed up. Solder running everywhere. Really dangerous stuff. I had to redo everything.

I am now looking it over to see if the instructions are correct. I really dislike these kit projects. No bleeder resistors. No cover on the AC line. On and on... not good.

I am learning this at the moment, but can I assume that a 1 ohm resistor in the cathode circuit allows the plate current to be read directly as a voltage, because across 1 ohm the reading in millivolts equals the current in milliamps (V = I × R, with R = 1)? Once I have my plate voltage I can work out my plate dissipation and then adjust the bias accordingly.
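That procedure can be sketched in a few lines. Note two assumptions that are mine, not from the thread: the 25 W figure is the KT66's rated plate dissipation and 70% is a commonly quoted bias target (check the datasheet for your tube), and strictly the 1 ohm resistor senses *cathode* current, which includes the screen current, so it slightly overstates the plate current:

```python
# Sketch of the bias check described above. Assumed values: 25 W plate
# dissipation rating and a ~70% bias target -- verify against the datasheet.
KT66_MAX_PLATE_W = 25.0

def plate_dissipation_w(cathode_mv: float, plate_v: float,
                        r_sense_ohms: float = 1.0) -> float:
    # mV across a 1-ohm resistor reads directly as mA of cathode current.
    # (Cathode current = plate current + screen current, so this is a
    # slight overestimate of true plate dissipation.)
    current_a = (cathode_mv / 1000.0) / r_sense_ohms
    return plate_v * current_a

# Example: 40 mV across the 1-ohm resistor, 430 V on the plate
p = plate_dissipation_w(cathode_mv=40.0, plate_v=430.0)
print(round(p, 1))                      # 17.2  (watts)
print(round(p / KT66_MAX_PLATE_W, 2))   # 0.69  (fraction of rated dissipation)
```

So yes: with 1 ohm, volts equal amps (and mV equal mA), and the rest is multiplying by the plate voltage and comparing against the tube's rating.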