A computer with a 300-watt PSU almost certainly does not draw 300 watts
of power, any more than a car with a 300-horsepower engine is always running
with its throttle wide open.

Only if your 300-watt PSU is fully loaded - in which case the computer
will probably misbehave quite atrociously, as PSU maximum ratings are often
quite optimistic - will it deliver its full rated power. To deliver
300 watts to PC components (assuming that it can), it actually has to draw
a bit more than 300 watts from the wall socket, since PSUs are not
100% efficient at converting AC into multi-voltage DC. But a "300 watt"
PSU is still not likely to actually ever be drawing anything like that much
power, except perhaps for a moment at startup.

A more realistic power consumption figure for a stacked PC, working
hard, is 150 to 200 watts (less than half of which is likely to be accounted
for by the
motherboard, for modern machines) plus another 75 to 150 watts for a
CRT monitor, depending
on its size and vintage.

[Note that this piece was written in late 2002; PC power consumption has crept upwards, since. As of 2009, a
stacked and overclocked Core i7 PC can draw 400 watts
when working hard; if it's got two graphics cards and lots of drives, it might hit 500 watts when playing 3D
games. If all you need is an office-and-Internet box, though, you can get an
Intel Atom, Nvidia Ion or Via Nano desktop machine. Even when working hard,
those'll draw no more than about 40 watts. Oh, and a giant
30-inch LCD monitor has a peak power draw of about 140 watts, but draws only
about 70 watts if it's set to its minimum brightness, which is what you want if your computer room isn't very
brightly lit.]

Very few PCs work hard all day, and few PCs qualify
as stacked machines. Some older machines with power-hungry components can
draw just as much as a new top-flight desktop box, but most office PCs don't
get close.

A realistic power consumption figure for a single office PC with CRT
monitor, when it's being used, is around 150 watts. Or a lot less; the 400MHz
Celeron box with 15 inch monitor that I powered from a jury-rigged UPS
here only draws about 100
watts from the DC side of its inverter, during startup. That computer's
got the same specs as a lot of current business machines.

But let's say 150 watts, for the sake of argument.

Allow for ACPI power saving, and
the fact that all NT-series Windows flavours instantly put the CPU in power-save
mode whenever it doesn't need to do anything (you can do it with Win95-series
OSes as well, but only with
extra software), and
I don't think it's unreasonable to peg the average power consumption of
a 150-watt-peak office box, even if it's turned on all day and used for
a whole eight hours each day, and does not have its monitor manually turned
off, at about 1.6 kilowatt-hours (kWh) per day. For 16 hours of the day,
it'll only be drawing about 25 watts, if that.
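If you'd like to check that 1.6kWh figure for yourself, the sum is trivial; here it is as a few lines of Python, using only the numbers already quoted above (150 watts for eight working hours, about 25 watts in power-save for the rest of the day):

```python
# Back-of-the-envelope check of the 1.6kWh-per-day figure.
active_watts = 150   # the 150-watt-peak office box, being used
idle_watts = 25      # power-save draw for the rest of the day
active_hours = 8
idle_hours = 24 - active_hours

# Watt-hours divided by 1000 gives kilowatt-hours.
daily_kwh = (active_watts * active_hours + idle_watts * idle_hours) / 1000
print(daily_kwh)  # 1.6
```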

The calculation in "About
PC power calculations" is less unrealistic than it might be, since it
compares PCs to laptops and thus implicitly includes monitors in the PC
calculation - a CRT monitor can easily draw more power than the computer
it's connected to. But it still over-estimates.

Given the above more accurate assessment of a PC's power consumption,
the power consumption for four PCs running for a year without being turned
off drops from 10,512kWh to only 2,336kWh.
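For the record, here's where both of those annual totals come from - four PCs assumed to suck a flat 300 watts around the clock, versus four PCs at the 1.6kWh per day worked out above:

```python
# Four PCs, running all year: the naive 300-watts-flat estimate
# versus the realistic 1.6kWh-per-day figure.
pcs = 4
hours_per_year = 24 * 365

naive_kwh = pcs * 300 * hours_per_year / 1000   # 300 watts, 24/7
realistic_kwh = pcs * 1600 * 365 / 1000         # 1.6kWh (1600Wh) per day

print(naive_kwh)      # 10512.0
print(realistic_kwh)  # 2336.0
```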

The quoted 75-watt figure for a laptop is similarly exaggerated, mind
you; it's not unreasonable to still say that a laptop will have about a
quarter of the draw of a PC (with a CRT monitor on the PC, and with the
laptop using its LCD
panel), so the four-laptops-for-a-year figure would be down around 584kWh.
In any case, though, the smallness of the numbers means the laptops will
only save you $US87.60 a year, given the $US0.05/kWh power price quoted.
That price is on the low side; here in Australia we've got some of the cheapest
power in the world, yet we still pay around 7 US cents per kilowatt-hour.
There aren't many places in the States where you'll pay less.
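The dollar figure falls straight out of the difference between those two annual totals, at the quoted (lowball) five-cent price:

```python
# Annual saving from swapping four PCs for four laptops, using the
# realistic figures: 2,336kWh for the PCs, a quarter of that for laptops.
pc_kwh = 2336
laptop_kwh = pc_kwh // 4       # 584kWh
price_per_kwh = 0.05           # the quoted five-US-cent power price

annual_saving = (pc_kwh - laptop_kwh) * price_per_kwh
print(round(annual_saving, 2))  # 87.6
```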

Even if you're paying 20 US cents per kilowatt-hour, though, the PCs-to-laptops
change that's supposed to save you $US788.40 over two years (with power
at 5 cents per kilowatt-hour) will only actually save you $US700.80, and that's a
highball estimate. Regardless of what power price you pay, the total power
expense for the four PCs will be only 22% of what's estimated based on the
erroneous 300-watts-per-PC and 75-watts-per-laptop, and this fact puts a
big dent in the argument for trading your PC CPUs from relatively power-hungry
Intel or AMD chips to Transmeta
(or, more plausibly, Via
C3) processors.
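The 20-cent scenario and the 22% figure work out the same way, from the same numbers:

```python
# Two-year saving at 20 US cents per kWh, plus the ratio of realistic
# to erroneous PC power consumption.
pc_kwh, laptop_kwh = 2336, 584   # realistic annual figures
naive_pc_kwh = 10512             # the erroneous 300-watts-flat estimate

two_year_saving = (pc_kwh - laptop_kwh) * 0.20 * 2
print(round(two_year_saving, 2))            # 700.8

print(round(100 * pc_kwh / naive_pc_kwh))   # 22 (per cent)
```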

The purchase price of the equipment, and ongoing service
costs, are likely to be much more important than the power consumption.
In both purchase price and service costs, laptops are seriously inferior
to desktop PCs, and so are proprietary Transmeta-powered boxes - though
a company may well save money on service-staff back pain treatment by opting
for portables rather than desktops with CRTs.

C3-powered PCs have the advantage of being plain old PCs that just use
a super-low-power CPU, but that also means that their power consumption
advantage isn't very impressive. This is because the CPU is far from the
only power-consuming component of a PC.

On the face of it, it looks as if ditching your Intel or AMD chips, which
must be sucking a lot of power or they wouldn't need that big fat heat-sink-and-fan
cooler, would obviously be a good way to reduce your computer systems' power
consumption. And also your air conditioning expenses, if you've got a big
office; all large buildings need considerable cooling all the time, if they're
doing anything more than standing empty with the blinds closed.

But it's not actually worth it.

Yes, top-end Intel and AMD CPUs, working hard, use about 80 watts. And
an 800MHz Crusoe should peak at about six watts, if Transmeta's
numbers are accurate. And an 800MHz C3 shouldn't consume much more than
double that, no matter how hard you work it.

The first issue to consider, though, is that these processors are very
different. It is only barely possible, at the moment, to buy a new Intel
or AMD CPU that's as slow as the fastest Crusoe you can buy; the C3's not
much better. The slowest AMD and Intel CPUs on most retail shelves
today are twice to three times as fast as a high-end Crusoe.

The exact numbers vary depending on the benchmark you're running, of
course, but there's little
argument that an 800MHz Crusoe is in roughly the same class as a 600MHz
P-III, 700-800MHz Celeron (P-III based, not the current P4-based Celerons
with much higher clock speeds), or 600MHz Duron. An 800MHz C3 is a little
faster, but only a little.

The 1GHz Crusoe is proportionally faster than the 800MHz one, but I don't
think it's turned up in many (any?) products yet, and 1GHz C3s aren't exactly
leaping off the shelves at the moment, anyway. The 25% performance gain
offered by the 1GHz versions of these chips isn't terribly exciting when
you consider what it's improving upon.

But let's compare both chips with a 900MHz Celeron, for the sake of argument;
they're around the same speed, for office applications at least.

Full Thermal Design Power for a 0.13 micron core 900MHz Celeron is 26.3
watts; you can make it draw more if you really try, but in normal use you
probably won't even see full TDP very often. Anyway, even the 1.4GHz 0.13
micron Celeron only has a 34.8 watt TDP. In normal use, the 900MHz Celeron
is likely to be a ten to twenty watt CPU, at most; for office apps under
Win2000 or WinXP, the difference between the Celeron and the Crusoe is very
likely to be below 10 watts. The difference between Celeron and C3 will
be even less exciting.

Now, the difference in power consumption between a Celeron-based PC and
a Crusoe machine is likely to be much greater than that, but that's because
the PC is, well, a PC, with cheap standard interchangeable big-box components
and, probably, a CRT monitor. Crusoe boxes are proprietary laptop-style
slimline units with LCD screens and much higher price tags. C3 boxes can
be very low power, if they use a non-standard super-small-form-factor like
Via's own Eden;
these sorts of machines are like a cross between a largely proprietary Transmeta
box and a largely standard PC.

If you're in the market for blade servers, then the equation changes a
bit, because blades are pretty proprietary whichever
way you go. But then the power consumption difference will be much smaller
anyway, because blade servers don't have monitors, fancy video cards and
so on in the first place.

Now, if a Crusoe- or C3-class CPU is all you need for an application
- and, in many cases, it is - then that's fine. No problem. But it's silly
to compare them with screaming Athlon XPs or P4s as if they were aimed at
the same market; anybody who's kitting out their word processing drones
with 3.06GHz P4s or Athlon XP 2800+s needs their head read. And it's also
silly to compare a $US500 desktop box with a $US1500 Crusoe notebook, as
if they were meant for the same jobs.

Getting back to the subject of power consumption - the range we're talking
about here, for business boxes, is not as big as it might seem if you assume
that a 300-watt-PSU-PC must be sucking 300 watts all the time, and that
a six-watt-CPU Transmeta box must consume only six watts. In reality, the
difference between an unassuming Intel- or AMD-powered business PC and a
Crusoe machine is going to be less than 100 watts under heavy load, unless
the PC has a huge and/or very old monitor. In standby mode - which, I remind
you, is what the computers will be in for most of the day, if they're not
actually turned off or put in
hibernate - the difference is likely to be under ten watts.

C3-powered machines will, once again, score somewhere between the Intel/AMD
boxes and the Transmeta ones. But it's not really much of a spread.

If the lower-power technology were priced the same as the plain PC gear
then it'd be a sensible option, at least when upgrade time rolled around.
That's part of the reason why so many businesses are buying LCD monitors,
even though LCD makes you pay more money for less screen area.

But laptops, and Transmeta-powered computers in particular, are much
more expensive than plain PC gear, both in purchase price and, often, in
service costs. That extra cost swamps any saving on the power bill.

Example: NEC's
Powermate Eco,
"the first desktop PC engineered specifically with the environment in mind".
Which doesn't just mean low power consumption; it's apparently more environmentally
friendly in other ways as well. But we're talking watts, here.

I've no idea what the actual power consumption of the Eco is, but its
mains adapter is rated at 80 watts output (18VDC, 4.44 amps), which means
more than 80 watts input. I dare say it won't be using all of that most
of the time; possibly not ever. And the CPU will never consume much of a
slice of its input. But its LCD screen is likely to be at least a 30 watt
device, and so I don't think 50 watts is at all an unreasonable figure for
the whole thing, working hard.

A reasonably modern P-III Celeron box with a 15 inch LCD will draw little
more than a hundred watts under the same load, will be considerably faster
than the Powermate, and if you pay more than $US500 for the Celeron machine,
you've been robbed.

If you want to spend money to save power out of the goodness of your
heart, then that's great. More power, no pun intended, to you. But public
companies that do things like dropping megabucks on fancy low-wattage computers
that they don't need are apt to get sued by their shareholders.

People keep advancing the argument "OK, saving a few tens of watts per
desk doesn't mean much for an individual user, but for a company with 20,000
PCs, the savings must really add up."

The trouble is, they don't. Not really.

A difference of an average of, say, twenty watts (taking into account
a lengthy standby period), consumed for 24 hours a day, for 365 days a year,
times 20,000 computers, equals 3,504,000kWh, which costs $US175,200 at
five US cents a kilowatt-hour, more if you're paying more for power.
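Here's that sum, for the sceptical:

```python
# A 20 watt average difference, 24 hours a day, all year, across
# 20,000 computers, at five US cents per kilowatt-hour.
watts_saved = 20
pcs = 20000
hours_per_year = 24 * 365

kwh_saved = watts_saved * hours_per_year * pcs / 1000
dollars_saved = kwh_saved * 0.05

print(kwh_saved)            # 3504000.0
print(round(dollars_saved)) # 175200
```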

From the point of view of one employee, that's pretty impressive.

But from the point of view of a corporation with more than 20,000 employees,
a hundred and seventy-five grand a year is invisibly small.

Scaling up accordingly and allowing for power that costs 10 US cents
per kWh, you're looking at an extra $US2.1 million or so, for AT&T, if every
employee has a desk, and every desk sucks 20 watts more than it needs to.

That's a lot of money if you put it in a pile in your bedroom, but only
about one-thirty-seven-hundredth of AT&T's last reported
profits. If AT&T got themselves a $US1000 bargain deal on low-power-consumption
PCs, they'd be paying a mere hundred and twenty million to kit out 120,000
desks with them. That's 57 times as much as the extra power for the older
machines costs. Power prices would have to rise a lot to
make it an attractive idea, just on a sticker-price basis; never mind the
higher service costs for proprietary machines.
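The AT&T-scale version of the arithmetic, with the numbers above - 120,000 desks, 20 extra watts each, power at 10 US cents per kilowatt-hour, and $US1000 per replacement PC:

```python
# Extra annual power cost of 120,000 desks drawing 20 watts more than
# they need to, versus the sticker price of replacing the lot.
desks = 120000
extra_watts = 20
hours_per_year = 24 * 365

extra_kwh = desks * extra_watts * hours_per_year / 1000
extra_dollars = extra_kwh * 0.10        # roughly $US2.1 million a year
upgrade_dollars = desks * 1000          # $US120 million up front

print(round(extra_dollars))                     # 2102400
print(round(upgrade_dollars / extra_dollars))   # 57
```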

Lower power PCs are a good thing. PCs that aren't more powerful than
they need to be to do the job at hand are also a good thing. PCs that are
made from recycled materials, and can more easily be recycled, are
another good thing, if only because
China seems to be getting sick of sucking down the Western world's nasty
techno-waste.

But insofar as there is a problem with PC power consumption -
and that problem isn't as big as it looks if you just read the power rating
stickers on your PSUs - you can't do much to fix it by changing the kind
of computer architecture you use.