Something I've always wondered. If I constantly hook up cellphones, hard drives and the like via USB to my computer, will it eat up more on the electricity bill? Or are the USB ports using up electricity by just being enabled anyway, thus not affecting the power usage?

@DanielRHicks If he plugs in five devices at 0.5A each (12.5W at 5V), that makes about 16W at the wall with 80% efficiency. That may not be relevant for the electricity bill, but it's easily measurable with a $15 watt-meter.
– zakinster, Jun 6 '13 at 11:52

No. Otherwise you could start making a profit by pumping energy from USB sockets.
– Val, Jun 6 '13 at 16:29


My UPS has a power meter, and when I suspend my computer with no USB devices plugged in, the power usage measures 0 watts. If I plug in a tablet and two phones for charging (the USB ports are always powered while the computer is suspended), the power usage reads 7 watts. I don't know how accurate the UPS power meter is, but there's definitely measurable power used. I haven't checked USB power usage while the computer is powered on, but the computer hovers around 80W while idle, so I'm assuming that USB charging would push it to around 87W.
– Johnny, Jun 6 '13 at 17:18


Good question. It's like asking: does putting an extra item in your fridge make it use more electricity?
– Tymek, Jun 7 '13 at 7:39

7 Answers

Short answer: Probably, but not necessarily; it won't be free power, but it might be obtained more efficiently. It really depends on the particular power supply's efficiency curve and the point at which you are operating it (and power consumption is also affected by software).

Long answer:

A USB port can output a maximum of 500mA (USB 1 & 2) or 900mA (USB 3) at 5V, which gives maximums of 2.5W (USB 1 & 2) and 4.5W (USB 3).

USB ports don't consume power by themselves. Without anything plugged in, they are effectively open circuits.

Now, if you draw 1A (5W) out of a USB port, it will usually increase the overall power consumption by ~6W (depending on your power supply's efficiency), which is an increase of 2% to 5% of your computer's power consumption.
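That arithmetic can be sketched quickly; the 83% efficiency used here is an assumed constant, purely for illustration:

```python
# Wall-power cost of a USB load through a PSU with constant efficiency.
# The 83% figure is an assumption for illustration, not a measured value.

def wall_draw(usb_watts, efficiency=0.83):
    """Extra power drawn from the wall to deliver usb_watts at the port."""
    return usb_watts / efficiency

usb_load = 5.0  # roughly 1 A at 5 V
print(f"{usb_load:.0f} W at the port costs about {wall_draw(usb_load):.0f} W at the wall")
# -> 5 W at the port costs about 6 W at the wall
```

In reality the efficiency is not constant, which is exactly what the rest of this answer is about.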

You'll see the efficiency is not a constant value; it varies a lot depending on the load applied to the PSU. On that 900W PSU, at low power (50W to 200W) the curve is so steep that an increase in load entails a substantial increase in efficiency.

If the increase in efficiency is high enough, it would mean that in some cases, your computer may not need to actually draw an extra 5W from the wall socket when you're drawing an extra 5W from a USB port.

Let's take the example of a computer drawing 200W from a PSU with an actual efficiency of 80% at 200W:

In this case, the PSU efficiency is increasing between 200W and 205W, so you can't deduce the power cost of the USB device without taking the whole computer's power consumption into account, and the relative increase at the wall socket may actually be lower than 5W.

This behavior only happens because, in that case, the PSU is under-loaded, so it's not the usual case, but it's still a practical possibility.
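A quick calculation with the example's figures (80% at 200W, and the 80.5% at 205W suggested in the comments below; both are hypothetical, not measurements):

```python
# Marginal wall draw when PSU efficiency rises with load, using the
# hypothetical efficiencies from the example above.

def wall_power(load_watts, efficiency):
    return load_watts / efficiency

before = wall_power(200, 0.800)   # 250.00 W at the wall
after = wall_power(205, 0.805)    # ~254.66 W at the wall
print(f"extra wall draw: {after - before:.2f} W for 5 W of extra USB load")
# -> extra wall draw: 4.66 W for 5 W of extra USB load
```

So with a rising efficiency curve, the 5W USB load can show up as less than 5W at the wall socket.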

In this case, the PSU draws the same power from the wall socket whatever load it receives. This is the behavior of a zener regulator, where all unneeded power is dissipated as heat. It can be observed in some kinds of low-end PSUs at very small loads.

That last case is purely hypothetical: the PSU would actually consume less power at higher load. As @Marcks Thomas said, this is not something you can observe in a practical power supply, but it's still theoretically possible, and it proves that the instinctive TANSTAAFL rule cannot always be applied that easily.

Conclusion:

If you need to charge a lot of 5V devices, it's better to do it from an already-running computer than from multiple wall chargers. It won't be free, but it will be more efficient.

Also note that you may need USB ports with 1A capability (e.g. USB3) in order to get the same charge speed.

I don't think any practical power supplies have a sufficiently steep efficiency curve to actually reduce consumption under increased load, but +1 for making the very relevant point that a computer can be more efficient than a wall charger.
– Marcks Thomas, Jun 6 '13 at 11:38


@MarcksThomas I don't think so either, but it's theoretically possible, and it would be easy to build a dummy inefficient PSU that behaves this way. I was just making the point that the simple TANSTAAFL reasoning only holds if you ignore the fact that the computer PSU may already be drawing power that you're not using. Obviously the overall consumption won't decrease, but I wouldn't be surprised if it didn't increase as much as expected.
– zakinster, Jun 6 '13 at 12:06


@zakinster If a PC draws 200W with 80% efficiency, it will draw 250W from the wall (since 20% is lost in PSU conversion). Adding 5W to the PC's draw amount gives 205W drawn, and at 80% efficiency this gives 256.25W drawn from the wall (or an additional 6.25W).
– Breakthrough, Jun 6 '13 at 15:27


@Breakthrough Completely true if the efficiency is a constant 80% at both 200W and 205W, but I specified in my example that the PSU efficiency was actually 80.5% at 205W.
– zakinster, Jun 6 '13 at 15:35

You don't get power for nothing. Otherwise we could just use the USB ports to power another computer, and use the other computer to power the first. It is a fun idea, but it does not work.

The energy for charging is rather small, though. USB 1 or 2 uses 100 to 500 mA at 5 volts, a maximum of 2.5 watts. Compared to the normal idle power drain of a PC, that is rather small. (Typical: 50 watts idle for an office PC, up to 150 watts idle for a high-end PC, and roughly three times that when gaming, compiling, etc.)
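To put those numbers side by side (the idle-draw figures are the rough estimates from this answer, not measurements):

```python
# USB charging power as a fraction of a PC's idle draw, using the rough
# idle estimates quoted above.

usb_max_watts = 0.5 * 5  # 500 mA at 5 V -> 2.5 W

for label, idle_watts in [("office PC", 50), ("high-end PC", 150)]:
    share = usb_max_watts / idle_watts * 100
    print(f"{label}: {share:.0f}% of idle draw")
# -> office PC: 5% of idle draw
# -> high-end PC: 2% of idle draw
```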

Oops, math fixed. "volt x watt = watt" was a brainfart; that should have been volt x amp = watt.
– Hennes, Jun 6 '13 at 10:29


@Hennes You can't apply the free-lunch rule that easily: the computer's power supply may already be wasting the energy needed by the USB devices, and may be able to power them without even increasing the load on the wall socket. That may not be the usual case, but it's common behavior for a seriously under-loaded PSU.
– zakinster, Jun 6 '13 at 15:52


TANSTAAFL is also known as the principle of "conservation of energy."
– WChargin, Jun 6 '13 at 19:04


This answer is ill-founded. The conservation of energy principle alone does not guarantee us that a charging device uses more energy while charging and less while not charging. A charging device could consume the same energy regardless of whether or not it is charging, by wasting power when it's not charging. You can get something for nothing when you utilize that which is otherwise wasted. (Thus it is necessary to argue that this does not happen in a computer with USB ports.)
– Kaz, Jun 6 '13 at 21:24

Yes. It's a basic rule of physics; if something's taking power away from your computer, your computer must get that power from somewhere. USB ports don't consume power just by being enabled*, any more than a power outlet would consume power just by having the switch "on" with nothing plugged in.

* Alright, there is a minimal amount of power consumed by the USB controller chip monitoring to see if something's plugged in, but that's a tiny amount of power.

It's possible that a computer could draw the same power while charging devices, as when not charging devices (all else being equal, like the CPU load). Laws of physics, like the principle of conservation of energy, do not provide any guarantee that this cannot happen.

For that to happen, the computer would have to be wasting power when the devices are not plugged in, such that when they are plugged in, the otherwise wasted power is then redirected into them and thereby utilized.

Electronic designers would have to go out of their way to contrive such a wasteful design, but it is possible. A circuit that draws exactly the same amount of power, whether or not it is charging one or more batteries, is harder to design than one which draws power in proportion to the charging work, and the result is a wasteful device that nobody wants.

In reality, designers reach for off-the-shelf voltage regulators to power the components of the motherboard. Voltage regulators have the property that the less loaded they are, the less power they draw overall, and the less they waste internally. (Linear regulators waste more, switching ones less, but both consume less when less loaded.)

Anything in the system that is powered down contributes something to net energy saving: powered down ethernet port, powered down Wi-Fi transmitter, spun down disk, sleeping CPU, or USB port not delivering current. The saving is two-fold: firstly, the subsystem itself doesn't use energy, and secondly, less energy is wasted upstream as heat dissipation in the power supply chain.
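As a concrete illustration of that load-proportional waste, a linear regulator dissipates (Vin − Vout) × Iload in itself; the 12V-to-5V rail below is an assumed example, not taken from any particular board:

```python
# A linear regulator's own dissipation scales with load current:
# P_loss = (Vin - Vout) * I_load. The 12 V -> 5 V rail is an assumed example.

def linear_reg_loss(v_in, v_out, i_load):
    return (v_in - v_out) * i_load

for amps in (0.0, 0.1, 0.5):
    watts = linear_reg_loss(12.0, 5.0, amps)
    print(f"{amps:.1f} A load -> {watts:.1f} W wasted in the regulator")
# -> 0.0 A load -> 0.0 W wasted in the regulator
# -> 0.1 A load -> 0.7 W wasted in the regulator
# -> 0.5 A load -> 3.5 W wasted in the regulator
```

A switching regulator wastes a smaller fraction, but the same principle applies: less load, less loss.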

Actually, power supply circuits which draw a relatively constant amount of power regardless of how much power is needed used to be somewhat common, and I wouldn't be surprised if they're still used in some applications. If a mains-powered device never needs more than 1mA, a 100K resistor, an "ordinary" diode, a zener, and a cap can pretty cheaply convert 120V AC to an unregulated voltage which is low enough to feed into a cheap regulator. Such a device would probably draw about 1/8 watt continuously, independent of how much was used, but could likely be cheaper than any practical alternative.
– supercat, Jun 6 '13 at 23:59

Short Answer:

YES; you'll always pay for the USB power with at least that much more power from the wall. Not only is this required by the laws of thermodynamics, it's also inherent in the way power supplies work.

Longer Answer:

We'll take the whole system of the computer, its internal power supply, its operating circuits and the USB port circuitry to be one big, black box called the Supply. For the purposes of this illustration, the whole computer is one oversized USB charger, with two outputs: the computer operating power, which we will call Pc, and the output USB power, which we will call Pu.

Converting power from one form, (voltage, current, frequency), to another, and conducting power from one part of a circuit to another, are all physical processes which are less than perfect. Even in an ideal world, with superconductors and yet-to-be-invented components, the circuit can be no better than perfect. (The importance of this subtle message will turn out to be the key to this answer). If you want 1W out of a circuit, you must put in at least 1W, and in all practical cases a bit more than 1W. That bit more is the power lost in the conversion and is called loss. We will call the loss power Pl, and it is directly related to the amount of power delivered by the supply. Loss is almost always evident as heat, and is why electronic circuits which carry large power levels must be ventilated.

There is some mathematical function, (an equation), which describes how the loss varies with output power. This function will involve the square of output voltage or current where power is lost in resistance, and a frequency multiplied by output voltage or current where power is lost in switching. But we don't need to dwell on that; we can wrap all that irrelevant detail into one symbol, which we will call f(Po), where Po is the total output power, and relate output power to loss by the equation Pl = f(Pc+Pu).

A power supply is a circuit which requires power to operate, even if it is delivering no output power at all. Electronics engineers call this the quiescent power, and we'll refer to it as Pq. Quiescent power is constant, and is absolutely unaffected by how hard the power supply is working to deliver the output power. In this example, where the computer is performing other functions besides powering the USB charger, we include the operating power of the other computer functions in Pq.

All this power comes from the wall outlet, and we will call the input power, Pw, (Pi looks confusingly like Pl, so I switched to Pw for wall-power).

So now we are ready to put the above together and get a description of how these power contributions are related. Well firstly we know that every microwatt of power output, or loss, comes from the wall. So:

Pw = Pq + Pl + Pc + Pu

And we know that Pl = f(Pc+Pu), so:

Pw = Pq + f(Pc+Pu) + Pc + Pu

Now we can test the hypothesis that taking power from the USB output increases the wall power by less than the USB power. We can formalise this hypothesis, see where it leads, and see whether it predicts something absurd, (in which case the hypothesis is false), or predicts something realistic, (in which case the hypothesis remains plausible). Written out, the hypothesis is:

(Pq + f(Pc+Pu) + Pc + Pu) - (Pq + f(Pc) + Pc) < Pu

Now we can simplify this by eliminating the same terms on both sides of the minus sign and removing the brackets:

f(Pc+Pu) + Pu - f(Pc) < Pu

then by subtracting Pu from both sides of the inequality (< sign):

f(Pc+Pu) - f(Pc) < 0

Here is our absurdity. What this result means in plain English is:

The extra loss involved in taking more power from the supply is negative

This means negative resistors, negative voltages dropped across semiconductor junctions, or power magically appearing from the cores of inductors. All of this is nonsense, fairy tales, wishful thinking of perpetual-motion machines, and is absolutely impossible.
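A toy numeric check makes the same point: with any non-decreasing loss function, the extra wall draw is always at least Pu. The quadratic form and the constants below are illustrative assumptions (the same shape as the model used later in this answer):

```python
# Toy check of the power balance Pw = Pq + f(Pc + Pu) + Pc + Pu with a
# non-decreasing loss function f. The quadratic form and constants are
# illustrative assumptions.

A, B = 1e-4, 1e-2

def f(po):
    """Loss as a function of total output power."""
    return A * po**2 + B * po

def Pw(Pq, Pc, Pu):
    return Pq + f(Pc + Pu) + Pc + Pu

extra = Pw(30.0, 200.0, 5.0) - Pw(30.0, 200.0, 0.0)
print(f"5 W of USB load costs {extra:.2f} W at the wall")
# -> 5 W of USB load costs 5.25 W at the wall
```

The quiescent term Pq cancels out of the difference, so only the loss function matters, and f(Pc+Pu) - f(Pc) >= 0 keeps the extra wall draw above 5W.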

Conclusion:

It is not physically possible, theoretically or otherwise, to get power out of a computer's USB port with less than the same amount of extra power coming from the wall outlet.

What did @zakinster miss?

With the greatest respect to @zakinster, he has misunderstood the nature of efficiency. Efficiency is a consequence of the relationship between input power, loss and output power, and not a physical quantity for which input power, loss and output power are consequences.

To illustrate, let's take the case of a power supply with a maximum output power of 900W, losses given by Pl = A·Po² + B·Po where A = 10^-4 and B = 10^-2, and Pq = 30W. Modelling the efficiency (Po/Pw) of such a power supply in Excel and graphing it on a scale similar to the Anand Tech curve gives a very steep initial curve, just like the Anand Tech supply, yet this model is built entirely according to the analysis above, which makes free power absurd.

Let's take this model and look at the examples @zakinster gives in Case 2 and Case 3. If we change Pq to 50W and make the supply perfect, with zero loss, then we can get 80% efficiency at 200W load. But even in this perfect situation, the best we can get at 205W is 80.39% efficiency. To reach the 80.5% that @zakinster suggests as a practical possibility requires a negative loss function, which is impossible. And achieving 82% efficiency is further out of reach still.
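That bound is easy to reproduce: with Pq = 50W and zero conversion loss, efficiency is Po / (Pq + Po). The figures are this answer's own example; the code is just a sketch of it:

```python
# Best-case efficiency for the perfect-supply example above:
# Pq = 50 W, zero conversion loss, so efficiency = Po / (Pq + loss + Po).

PQ = 50.0

def efficiency(po, loss=0.0):
    return po / (PQ + loss + po)

print(f"{efficiency(200):.2%} at 200 W")  # -> 80.00% at 200 W
print(f"{efficiency(205):.2%} at 205 W")  # -> 80.39% at 205 W
# Any real (positive) loss only lowers these figures, so 80.5% at 205 W
# is unreachable in this model.
```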

Great answer, but I disagree with your conclusion; the loss function does not need to increase everywhere. In fact, it is trivial to design, for the sake of argument, a power supply that reduces loss under load, although this feature would not be useful. This answer very thoroughly shows unlikeliness, not impossibility.
– Marcks Thomas, Jul 1 '13 at 11:26

The OP was referring to charging from a practical computer. While I have no doubt that one could artificially add dissipative elements which switch in under certain circumstances, to prove a point, that would constitute increased load, (for the purposes of trying to prove a point), and not increasing loss. But if there's a reasonable and practical power supply design that exhibits a negative loss function, and is not improved measurably in terms of cost or performance by eliminating the negative loss function, then I'd like to see it.
– Billysugger, Jul 1 '13 at 13:55

Yes. It's basic physics (thermodynamics). In the same way, charging your phone in your car uses a little more petrol. Another example is kinetic watches: you have to eat a tiny bit more food because you wear a kinetic watch! It is probably immeasurable, but the law of conservation of energy demands it. Energy cannot be created or destroyed.