Why do voltmeters have a high resistance?

Hi guys,
In my AS-level physics book, it states that voltmeters should have a high resistance to prevent a lot of current from flowing through them, so that the ammeter in the circuit records the true value of the current. I don't understand why this matters, since the current leaving a junction is the same as the current entering it (Kirchhoff's 1st law). Does anyone understand why this is important?

To make a voltage measurement between two points in a circuit, the voltmeter is placed in parallel with the true current path (which has a finite resistance) and not in series as with an ammeter.

The parallel placement creates an additional current path through the voltmeter. This reduces the resistance presented between the two measurement points: the combined current path (circuit plus voltmeter) has a lower overall resistance than the circuit with no voltmeter connected.

A high voltmeter resistance is therefore required so that the current through the circuit under test is affected as little as possible by the voltmeter parallel resistance.
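As a quick numerical sketch of that point (the resistor and voltmeter values below are made up for illustration), here is how the combined resistance between the measurement points drops whenever a finite voltmeter resistance is connected in parallel:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel (ohms)."""
    return r1 * r2 / (r1 + r2)

r_component = 100e3  # resistance between the two measurement points

# The higher the voltmeter's resistance, the less it disturbs the circuit.
for r_meter in (10e3, 100e3, 1e6, 10e6):
    r_combined = parallel(r_component, r_meter)
    print(f"voltmeter = {r_meter:>10.0f} ohm -> combined = {r_combined:>9.0f} ohm")
```

The combined value is always below 100K, and it only approaches 100K once the voltmeter's resistance is much larger than the component's.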

I still don't understand. Say there is 2A of current in a series circuit. When it meets the parallel combination of the resistor and voltmeter, 1A flows through the voltmeter branch and 1A flows through the branch with the resistor. When these branches rejoin, the currents combine and become 2A once again (current is conserved). An ammeter placed next to this parallel combination would then record the true value of current (2A), so why does it matter how high the voltmeter's resistance is? Whatever the current in the voltmeter, it will still rejoin at the end of the junction, giving the true value of current every time.

(Original post by QuantumBoi)
I still don't understand. Say there is 2A of current in a series circuit. When it meets the parallel combination of the resistor and voltmeter, 1A flows through the voltmeter branch and 1A flows through the branch with the resistor. When these branches rejoin, the currents combine and become 2A once again (current is conserved). An ammeter placed next to this parallel combination would then record the true value of current (2A), so why does it matter how high the voltmeter's resistance is? Whatever the current in the voltmeter, it will still rejoin at the end of the junction, giving the true value of current every time.

The voltmeter has changed the circuit resistance, so the current flowing is no longer the same as when no voltmeter is present.

Let's try an example. Assume the resistance in the circuit comprises 2 x 100K ohms resistors in series connected to a supply of 20V. The current in the circuit would be 20V / (200K) = 0.1 mA. This is the actual current in the circuit.

Now connect a voltmeter of 100K resistance in parallel with one of the resistors. The resistance of that section is now 100K in parallel with 100K = 50K ohms, and this is in series with the other 100K resistor, so the total circuit resistance is now 150K ohms.
The current in the circuit is now 20V / 150K = 0.133 mA.
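Both cases can be checked with a short script (values taken directly from the example above):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel (ohms)."""
    return r1 * r2 / (r1 + r2)

V = 20.0         # supply voltage
R = 100e3        # each of the two series resistors
R_meter = 100e3  # voltmeter resistance

i_true = V / (R + R)                       # no voltmeter: 0.100 mA
i_loaded = V / (parallel(R, R_meter) + R)  # voltmeter across one resistor

print(f"true current:            {i_true * 1e3:.3f} mA")
print(f"current with voltmeter:  {i_loaded * 1e3:.3f} mA")
print(f"current error:           {(i_loaded / i_true - 1) * 100:.0f}%")
```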

Adding the voltmeter has changed the load presented to the supply: the overall circuit resistance has dropped and the current has risen by 33%. In other words, the voltmeter's resistance has introduced a 33% error relative to the current that flows with no voltmeter in circuit.

This is why a voltmeter with the highest possible resistance is required in order to minimise the change in circuit resistance when the voltmeter is placed in circuit.

Put it this way. You're measuring the potential difference across a component, yes? The voltmeter is in parallel with that component.
Now, it's important to make sure that you don't change the value by measuring it. Or at least, you change it by as little as possible.

The formula for two resistors in parallel is

R_total = (R1 x R2) / (R1 + R2), or equivalently 1/R_total = 1/R1 + 1/R2.

If either resistance is very large, its 1/R term tends to 0. So if the voltmeter's resistance tends to infinity, R_total tends to the component's own resistance.

The voltmeter then no longer affects the total resistance and therefore doesn't affect the potential difference.
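Here is a small numerical sketch of that limit, using a hypothetical divider of two 100K resistors across a 20V supply: as the voltmeter's resistance grows, the reading approaches the true 10V across the measured resistor.

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel (ohms)."""
    return r1 * r2 / (r1 + r2)

# Hypothetical divider: 20 V across two 100K resistors, measuring across one.
V, R1, R2 = 20.0, 100e3, 100e3
v_true = V * R2 / (R1 + R2)  # 10 V with an ideal (infinite-resistance) meter

for r_meter in (100e3, 1e6, 10e6, 1e9):
    r_eff = parallel(R2, r_meter)     # measured resistor loaded by the meter
    v_read = V * r_eff / (R1 + r_eff)
    print(f"R_meter = {r_meter:.0e} ohm -> reading = {v_read:.3f} V")
```

With a 100K meter the reading is badly wrong (about 6.7V instead of 10V); with a 1G meter it is essentially exact.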

Another way again of looking at it: it's the energy per unit charge that goes through the component you're measuring. You want all of the charge to be going through the component and none of it to go through the voltmeter, because that's how it normally would be. How do you do that, when they're in parallel? Give the voltmeter as high a resistance as possible.
