Power and transformers

For example: Let's say a manufacturer creates a device that runs at 120 W (say, a lightbulb). Does this mean that I can use any combination of voltage and current I want, as long as V*I = 120 W?
Another example: Let's say that the maximum voltage across a capacitor is 300 V. Can I run it with as high a current as I want, as long as I don't exceed 300 V?

My second question: instead of adding resistors to decrease current, can't we just add a transformer instead, which will decrease the voltage to produce the same current?


Thank you for answering!

Don't forget the lightbulb also has a resistance 'R' that must be used in your power calculations.
Capacitors also have internal resistance and possible dielectric losses that cause heating, so they have an AC current/power rating too.

Your 'second' question is not really clear about which current you mean (primary or secondary).

A transformer is not a magic box that makes current from voltage. Think of how the ratio of voltage to current (100 V * 1 A -> 1 V * 100 A) might vary over a large range yet still correspond to the same amount of power (100 W), and what resistance at each voltage (as we adjust the transformer turns ratio) would be needed to draw the current required for 100 W.
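A short Python sketch of that trade-off (voltages chosen purely for illustration): every voltage/current pair that carries 100 W corresponds to a different load resistance.

```python
# Sketch of the idea above: many voltage/current pairs carry the same 100 W,
# but each pair demands a different load resistance (values are illustrative).
P = 100.0  # watts

pairs = []
for volts in (100.0, 10.0, 1.0):
    amps = P / volts           # current needed at this voltage for 100 W
    r_needed = volts / amps    # the load resistance that would draw that current
    pairs.append((volts, amps, r_needed))

for volts, amps, r_needed in pairs:
    print(f"{volts} V * {amps} A = {volts * amps} W, needs R = {r_needed} ohm")
```

The resistance swings from 100 ohms down to 0.01 ohms across that range, which is why the turns ratio alone does not determine the current: the load does.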

Staff: Mentor

For example: Let's say a manufacturer creates a device that runs at 120 W (say, a lightbulb). Does this mean that I can use any combination of voltage and current I want, as long as V*I = 120 W?

No. The filament of the lightbulb has a set resistance. This means that there is only one voltage that gives you the right current so that V*I = 120 W.
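A minimal sketch of that point, assuming a hypothetical 120-ohm filament: with R fixed, P = V²/R pins down a single operating voltage.

```python
# With a fixed filament resistance there is exactly one voltage that
# gives 120 W. The 120 ohm figure here is assumed for illustration.
R = 120.0              # ohms (hypothetical filament resistance)
P_rated = 120.0        # watts

V = (P_rated * R) ** 0.5   # from P = V**2 / R
I = V / R                  # from Ohm's law

print(V, I)  # 120.0 V at 1.0 A is the only combination that works
```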

Another example: Let's say that the maximum voltage across a capacitor is 300 V. Can I run it with as high a current as I want, as long as I don't exceed 300 V?

You don't select the current a device uses. You select the applied voltage, which then determines the current as per Ohm's law.

My second question: instead of adding resistors to decrease current, can't we just add a transformer instead, which will decrease the voltage to produce the same current?

I think you're misunderstanding how a transformer works. With a step-down transformer, the voltage is taken from a higher value in the primary circuit and stepped down to a lower value in the secondary circuit. This voltage is applied to the secondary circuit, and the current will depend on the resistance/impedance of the secondary side. The key idea here is that (for an ideal transformer) the power on the primary and secondary sides is always the same.

Let's say that we have a transformer with a primary RMS voltage (Vrms) of 100 volts. The secondary steps this down to 50 volts. If the load on the secondary side is consuming 100 watts of power, then the current through the secondary circuit is 2 amps, as 50 volts * 2 amps = 100 W. However, the primary has 100 volts, so the current flowing through the primary is only 1 amp.

If we change the load so that it consumes 200 watts, then the current jumps to 4 amps in the secondary side and 2 amps in the primary side. Note that the voltage on the secondary side is set by the transformer itself and doesn't change. The current will change depending on the resistance (or impedance) of the secondary circuit, which includes the load.
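The two cases above can be sketched in a few lines of Python, treating the transformer as ideal (no losses):

```python
# Sketch of the two cases above: an ideal 2:1 step-down transformer
# (100 V primary -> 50 V secondary), with the same power on both sides.
V_PRI, V_SEC = 100.0, 50.0   # volts

def currents(load_power):
    """Secondary and primary currents (amps) for a given load power, ideal case."""
    return load_power / V_SEC, load_power / V_PRI

print(currents(100.0))   # (2.0, 1.0) -> the 100 W case
print(currents(200.0))   # (4.0, 2.0) -> the 200 W case
```

Note that the load power is the input here: the transformer fixes the voltages, and the load then determines both currents.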

If you halved the voltage going to your lightbulb by using a transformer, then the current flow through the lightbulb would also be halved and the power would drop to 1/4 of what it used to be. For a 100 watt bulb being supplied with 120 volts, the resistance of the bulb would need to be 144 ohms in order to get the required 833 milliamps of current so that V * I = 100 watts. If you used a transformer to drop the voltage to 60 volts, then the current flow through that same lightbulb is now only 417 milliamps and the power falls to 25 watts. (Because we didn't change the resistance of the bulb, which is still 144 ohms. So I = 60 volts / 144 ohms, which is 0.417 amps.)
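The bulb arithmetic above can be checked directly:

```python
# Sketch of the bulb figures above: fixed resistance, so halving the
# voltage also halves the current and quarters the power.
R_BULB = 144.0                  # ohms, from the 100 W / 120 V figures

def bulb(voltage):
    current = voltage / R_BULB
    power = voltage * current   # equivalently voltage**2 / R_BULB
    return current, power

i_full, p_full = bulb(120.0)    # ~0.833 A, ~100 W
i_half, p_half = bulb(60.0)     # ~0.417 A, ~25 W
print(p_full, p_half)           # half the voltage -> a quarter of the power
```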

Remember! We don't directly manipulate current itself! We have to change either the voltage or the resistance of something in order to get the right amount of current flow through it.


It helps if you get things in the right order. A load (heater, lamp, motor) is designed to work at a particular voltage. For a simple resistive load (R), the current that will flow will be I = V/R. The quoted power of a device is always at its specified working voltage.

If the device is specified to work at, say, 12 V, then a transformer will be needed to give it the 12 V it needs if you want to connect it to 240 V mains. Say that device uses 12 W - the current it draws will be 1 A. The power supplied to the transformer from the mains (ignoring losses) will still be 12 W, so the primary current will be 12/240 = 1/20 A. It's a matter of cause and effect, if you want to appreciate what happens. The mains will 'think' it is working into a resistance of R = V/I = 240/(1/20) = 4800 Ω, even though the actual device has a resistance of R = 12/1 = 12 Ω. Transformers transform resistance as well as volts and current.
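That resistance transformation follows from the turns ratio: an ideal transformer scales impedance by the square of the ratio. A quick sketch with the numbers above:

```python
# Sketch of the reflected-resistance idea: an ideal transformer scales
# impedance by the square of its turns ratio.
V_primary = 240.0
V_secondary = 12.0
R_load = 12.0                                  # ohms, the 12 W / 12 V device

turns_ratio = V_primary / V_secondary          # 20
R_seen_by_mains = R_load * turns_ratio ** 2    # 12 * 400 = 4800 ohms

I_secondary = V_secondary / R_load             # 1 A through the device
I_primary = V_primary / R_seen_by_mains        # 1/20 A drawn from the mains
print(R_seen_by_mains, I_primary)              # power is 12 W on both sides
```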

And, yes - a transformer is a much better way of supplying power to a low-voltage device. You don't lose the sort of power that a series resistor would dissipate.
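A rough comparison, using the 240 V -> 12 V / 1 A numbers from above: a series dropper resistor would have to burn off the other 228 V as heat at the full load current, while an ideal transformer wastes nothing.

```python
# Comparing two ways of getting 12 V / 1 A from a 240 V supply.
V_mains, V_load, I_load = 240.0, 12.0, 1.0

# Series resistor: the full 1 A flows from the mains and the resistor
# drops the remaining 228 V, dissipating it as heat.
P_wasted = (V_mains - V_load) * I_load          # 228 W wasted
P_delivered = V_load * I_load                   # 12 W reaches the load
efficiency_resistor = P_delivered / (P_delivered + P_wasted)

# Ideal transformer: primary draws only 1/20 A, so (ignoring real-world
# core and winding losses) nothing is wasted.
efficiency_transformer = 1.0

print(efficiency_resistor)   # only 5 % of the power reaches the load
```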

So by using transformers we lose half of the power in the primary? Under which conditions do we use resistors in favor of transformers, and vice versa? Which of them is more efficient? Thank you!

We lose no power through a transformer (except for a bit of inefficiency). If you understand how they work then isn't it obvious that no power is lost through a transformer?
Did I get my sums wrong in that post?

If you have an AC supply, a transformer is always better - unless you need to vary the supply. Variable transformers (Variacs) are available but tend to be more expensive than a low-power rheostat. Alternatively, there are thyristor controls ("dimmers") that let current from the mains through in the form of a series of short pulses. These will, effectively, deliver low volts without much dissipation.