Posted by samzenpus on Thursday December 13, 2012 @07:59PM
from the getting-your-money's-worth dept.

MrSeb writes "If you've gone shopping for a power supply any time over the last few years, you've probably noticed the explosive proliferation of various 80 Plus ratings. As initially conceived, an 80 Plus certification was a way for PSU manufacturers to validate that their power supply units were at least 80% efficient at 25%, 50%, 75%, and 100% of full load. In the pre-80 Plus days, PSU prices normally clustered around a given wattage output. The advent of the various 80 Plus levels has created a second variable that can have a significant impact on unit price. This leads us to three important questions: How much power can you save by moving to a higher-efficiency supply, what's the premium of doing so, and how long does it take to make back your initial investment?"

Or don't: it comes out at several tens of years in any realistic scenario.

Scenario 1: an always-on computer running near-idle for four years.

Idle power draw, 85% efficient PSU: 66 watts
Idle power draw, 80% efficient PSU: 70 watts
Delta: 4 watts
Total power difference over the four-year life of the computer: 140 kilowatt-hours.
At 5.5 cents per kilowatt-hour (cheapest power in the US), building with a more-efficient power supply makes sense if it costs no more than $7.70 beyond what the less-efficient power supply does.

Scenario 2: an always-on computer running Folding@Home for four years using both CPU and GPU.

Power draw, 90% efficient PSU: 215 watts
Power draw, 80% efficient PSU: 245 watts
Delta: 30 watts
Total power difference over the four-year life of the computer: 1.05 megawatt-hours.
At 36 cents per kilowatt-hour (most expensive power in the US), building with a more-efficient power supply makes sense if it costs no more than $378 beyond what a less-efficient power supply does.
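The break-even arithmetic in both scenarios can be sketched in a few lines (the wattages, rates, and four-year always-on lifetime are taken from the scenarios above):

```python
# Break-even premium: the most you should pay extra for the efficient unit.
HOURS_PER_YEAR = 24 * 365  # always-on machine

def break_even_premium(delta_watts, years, price_per_kwh):
    """Maximum extra up-front cost justified by the wall-draw difference."""
    kwh_saved = delta_watts * HOURS_PER_YEAR * years / 1000
    return kwh_saved * price_per_kwh

# Scenario 1: 4 W delta at 5.5 cents/kWh over four years
print(round(break_even_premium(4, 4, 0.055), 2))   # ~7.71
# Scenario 2: 30 W delta at 36 cents/kWh over four years
print(round(break_even_premium(30, 4, 0.36), 2))   # ~378.43
```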

The second scenario represents someone running F@H on a modern high-end computer in Hawaii -- not exactly "unrealistic".

If you're trying to be energy neutral or positive in your living (e.g. you want to be off the grid with a wind/solar setup), then every efficiency gain will more than offset the cost of producing/storing the power required.

If you just want to view movies / eBay / email and live in a McMansion with the full home theater setup, then there's no point, because the rest of your lifestyle says "Fuck the planet, I'm all right."

Given that a more efficient power supply generates less heat, does it last longer? And does it generate less noise, since it doesn't need as fast a fan? Which gets kinda important in the wee hours of the morning.

Then you don't know much. I run TWO Nvidia video cards, and when in full gaming mode I can't exceed 300 watts of power draw. These are real measured numbers, not the fake crap on spec sheets. This is in a low-end i7 3.2GHz setup with only 12GB of RAM and 4 WD Black drives.

The big factor for me is: how much heat does it put out? Texas in the summer can be brutal, and anything to keep my office half a degree cooler helps tremendously, especially in the era of multiple monitors. Higher efficiency = less waste heat.

Not to mention reduced heat output (and potentially less fan noise due to lower heat), which is important in many scenarios.

Plus you have to add in costs due to the extra air conditioning load in the summer time (gotta remove all of that heat), and subtract in the winter time to account for the fact that your furnace needs to do less work to keep your house warm.

600W is actually pretty enormous by modern standards. An i7 3770K will use 77W, a very high-end GPU 250W, and the motherboard and other bits about 50W... so even for very, very high-end systems, you're talking about 400W total consumption with everything under maximum load. Under most normal usage you're talking more like 100-200W.

That's a meaningless comparison tbh. The difference is likely that of 'el-cheapo' vs 'upper-mid-range'. The el-cheapo is probably not as stable when you get closer to its rated output.

An upper-mid-range 400W would probably have been fine.

Also, a general question on efficiencies: do higher-power-rated PSUs generally have higher efficiencies at lower power outputs? IOW, given two comparable-model 'high efficiency' PSUs, one rated at 1000W and the other at 500W, would the 1000W one be more efficient than the 500W one at, say, 250W? That could make the 'over the top' ones worthwhile even at lower power levels...

If you spend $15 on a PSU, don't expect it to run long, and when it does go, it could go in a big way.

Which also explains all the cheapskate DIYers who say they still get blue screens in Windows XP and above. Windows hasn't been unstable since ME. Power fluctuations or cheap/underrated caps on the motherboard are probably responsible for more blue screens than anything else these days.

I just checked my electric bill; I'm paying about $0.14 per kWh. That gives:

(1050 kWh / year) * ($0.14 / kWh) = $147 / year

A 90% efficient PSU is half as wasteful as an 80% PSU, and half of $147 is about $73. If you can pay $73 to upgrade from an 80% efficient PSU to a 90% efficient PSU, you'll get 100% return on investment in one year. That's ignoring the extra cooling demands of the lower-efficiency unit (and ignoring the decreased heating demands, because electric heat is freaking expensive, so $73 in electric heating would offset, what, $10 of gas heat?).
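The "half as wasteful" rule of thumb can be checked directly: for the same DC load, PSU waste heat is load/efficiency minus load. A sketch with a hypothetical 200 W average DC draw:

```python
def waste_watts(dc_load_w, efficiency):
    """Heat dissipated inside the PSU for a given DC load."""
    return dc_load_w / efficiency - dc_load_w

load = 200  # hypothetical average DC draw, watts
w80 = waste_watts(load, 0.80)   # ~50 W wasted
w90 = waste_watts(load, 0.90)   # ~22 W wasted
print(round(w90 / w80, 2))      # ~0.44: actually a bit better than "half"
```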

TL;DR: you're almost always better off buying the high efficiency PSU.

That was true in the past when the PSU wasn't a particularly valued component and the industry standard method of rating their power output was 'think of a number, any number. Now write that number on the side.'

It's *less* true these days if you're buying from one of the decent brands. The numbers they write on their spec sheets actually bear some kind of resemblance to reality, these days: you can actually accurately spec up your expected draw against the capabilities of a PSU and expect it to more or less work out. It's worth leaving a bit of safety room, but you don't really need 2X.

Yeah, I used to get the cheapest PSU I could. But after I somehow inexplicably fried some of my expensive components, like my GPU, I decided to drop in something a bit better.

When I dropped another $250 on a replacement GPU, I also decided to shell out real money for a nicer PSU and put my old PSU out to pasture... in my kids' cobbled-together box.

Ended up going with a SeaSonic, since that's one of the brands that tend to be recommended by the Ars Technica Budget / Hot Rod box guide.

I wish I could find it, but there was some PSU snob site that went into all of the power benchmarking and provided pagefuls of data and charts, like the other sites that benchmark CPUs and RAM. They managed to point out all the ways my old PSU was deficient and sorta almost turned me into a PSU snob as well.

That brings home another benefit of picking a high efficiency power supply: generally much higher quality, and specs that you can actually trust. For instance, compare the review of the Coolmax 750W [hardwaresecrets.com] with that of the Corsair VX450W [hardwaresecrets.com]. The el-cheapo 750W PSU blew up twice [hardwaresecrets.com] after they pulled just 500W, while the 450W one managed to provide a stable 572W [hardwaresecrets.com] before it shut down cleanly due to overload protection! So before buying a power supply it's worth reading a proper review of it, even if you only read the conclusion [hardwaresecrets.com] page [hardwaresecrets.com].

So just looking at how much is saved on electricity is missing the big picture.

I encountered a question on a (university!) CS exam asking how much swap space should be allocated for a particular system... As we did not know the workload, my answer was (in depth) "it depends"... NOPE, wrong: 2x RAM... WTF?!

I used to test server and PC power supplies for a living (until 2009). I do NOT recommend running at 50% load unless your PSU is a cheap turd and you are worried (rightfully so) about component failure. 80-90% load will give you better efficiency, a higher power factor, and less harmonics. Fyi, as a residential electricity customer you don't really have to worry about power factor or harmonics much but large companies can be charged by the utilities for abusing the infrastructure with a ton of shitty/under-utilized PSUs. Since the company I used to work for sold into enterprise, we were very interested in PSU performance and matching up components for efficiency.

At home, I run a decent 350W PSU now, and my system draws about 200W of DC power under load (i.e. gaming) with my components (single Intel 2500K CPU, 8GB RAM, ATI 7870 GPU, 1 HDD and 1 SSD) and around 130W when surfing the web or working. I literally couldn't find a decent, well-priced PSU with lower DC power output when I built the machine 18 months ago. It cracks me up when I see guys putting 700W power supplies into their gaming rigs that never draw more than 300W (and none seem to understand the difference between AC power draw from the wall and DC power draw of the components in their system, which is what PSUs are rated for). It's basically flushing money down the toilet in multiple ways.

It cracks me up when I see guys putting 700W power supplies into their gaming rigs that never draw more than 300W (and none seem to understand the difference between AC power draw from the wall and DC power draw of the components in their system, which is what the PSUs are rated for). It's basically flushing money down the toilet in multiple ways.

About sums it up for me. Time and time again, reviews show "at the wall" power draws for modern non-OC'd, non-dual GPU high end desktops being under 300 watts at peak.

That's carryover from the bad old days when way too many power supply vendors played fast and loose with the figures. If pushed much over 50% utilization, the supplied power started getting dirty. Couple that with crazy overclocking rendering the system over-sensitive and you really did need the P/S to be rated at double the actually required power.

Your HTPC server consumes 350W? What the hell do you have in that thing?

Mine consumes less than 65W running full blast, serves files and 1080p video. I'd say you'd save a hell of a lot more money by downsizing that HTPC rather than just getting a more efficient power supply.

What makes it different? From a designer's perspective, I would say something that is small and as quiet as possible. In the case of systems I designed, more often than not around Shuttle XPC chassis, fanless. There may have been barely detectable sounds coming from an external water pump running at half speed (although that was short-lived, as I soon discovered the joys of Peltier heat pumps and huge copper heatsinks), but the hard drive was insulated so well on rubber dampers and felt lining insid

I have a 2TB drive with well over 2 seasons of 5 different shows (I have 5 seasons of one of them) and it just breached 50%. I think you should probably find out why your video encoding software sucks balls, or keep fewer episodes.

I did things slightly differently. BTW, you didn't think this was a bitcoin story... BUT IT IS! lol. My bitcoin rig ran at 550W, and I had these calculations down, plus an actual meter, and they were all spot on. But let's say it's my gaming computer instead. That's around 240W peak of actual device pull. Let's say it's used for 6 hours a day at max load. Let's say I was going to get a piece of crap 76% efficient one, but I went with an 80+ Bronze, which happens to be 83%. That's 40.8 watts added in waste he

new efficiency @ load % - old efficiency @ load % = delta%
integrate over time (delta% * cost per kWh) until result = new unit cost (solve for t)
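With a constant load and a flat tariff, that integral collapses to a division. A sketch of the recipe above (the $0.12/kWh rate and $20 premium are made-up inputs; the 240 W / 76% / 83% / 6 h figures echo the example a few comments up):

```python
def payback_years(load_w, eff_old, eff_new, price_per_kwh, premium,
                  hours_per_day=24):
    """Years until the efficiency delta repays the extra unit cost."""
    delta_w = load_w / eff_old - load_w / eff_new  # wall-draw difference
    savings_per_year = delta_w / 1000 * hours_per_day * 365 * price_per_kwh
    return premium / savings_per_year

print(round(payback_years(240, 0.76, 0.83, 0.12, 20, hours_per_day=6), 1))
# ~2.9 years to recoup a $20 premium
```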

You're missing the savings on removing that excess heat from your house too (in climates where that is relevant).

In a cold climate where you are heating your house, unless you can get better $/unit heating out of something else, the "waste" energy is heating the house anyway so it doesn't matter much.

In a hot climate where you are cooling your house, every unit of heat that you put into the house has to be removed. Firstly from the computer by making the fans work harder, then from the house itself by makin

Most of the apartments I've lived in had electric baseboard heating. For some reason the building owners didn't want to let me install my own gas furnace. Maybe I should ask them if I can drill from the 20th floor down to the ground and a bit more so I can install a heat pump.

That may be true for the subset of "financial investment", but more generically:

an investment is something that returns more value than it costs.

By my definition, a car that depreciates is an "investment" because with it you were able to get a job and make more than the car cost, even if the car itself was a loss. The power supply is the same. If you count the added cost of an 80% efficient supply, you may never make back the difference, unless you count the air conditioning savings, and put a price on the

That's only true if you factor in the electricity bill alone. But there are also environmental reasons, and it's harder to put an unemotional price on those. This is sort of like the people who claim hybrid electric cars are a waste of money: they're only looking at the wallet and not the bigger picture. It's more than just saving a little electricity, too; there is also the slight increase in customer demand, which slightly strengthens the market forces pushing toward more efficient products in general.

Bought one Antec Earthwatts a long time ago. The PSU was not much more expensive than the others (good brands), so the savings are obvious. Still, the PSU is very quiet, which is the main reason why I bought it.

Bought one Antec Earthwatts a long time ago. The PSU was not much more expensive than the others (good brands), so the savings are obvious.

Another thing TFA doesn't take into account is that the 80-Plus certified supplies tend to have better components overall than non-certified supplies.

Read some of the reviews at Hardware Secrets [hardwaresecrets.com] and you'll see that it's not uncommon for a well-built "350W" power supply to be able to output 450W, while a crappy 350W supply can't even handle 300W.

If you reduce the brightness of an LCD screen's backlight, it will also lower power consumption. Mine uses 40 watts at full brightness and 20 watts dark. So if you shave off 10 watts, it may nearly equal the savings of a good PSU, but with no outlay.

Have you looked at the price difference between different efficiencies for the same wattage? They're usually minimal. So might as well vote with your wallet and go for the highest-efficiency one. There's no telling how electricity prices will evolve over time...

To make the maths easier, let's assume you can improve your efficiency by 25% (that's huge), assume you're loading it to 400 watts (also huge), and assume you run it 8 hours a day, 5 days a week, with 2 weeks off a year (running at full capacity).
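Finishing that arithmetic under stated assumptions ("25%" read as 25 percentage points, say 70% to 95%, with a hypothetical $0.12/kWh rate):

```python
hours_per_year = 8 * 5 * 50            # 2000 hours/year at full load
delta_w = 400 / 0.70 - 400 / 0.95      # ~150 W less drawn at the wall
kwh_saved = delta_w * hours_per_year / 1000
print(round(kwh_saved * 0.12, 2))      # ~36 dollars/year saved
```

Even with those deliberately generous inputs, a large PSU premium takes years to pay back.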

And from an AC we have the best answer. A cheaply made power supply can much more easily damage your computer's other parts. Factor in the cost of replacing them and the more expensive power supply pays for itself regardless of energy savings.

don't choose a cheap piece of shit just because it claims a higher '80 plus' certification level than a quality, name brand unit from a reputable company that might cost twice as much.

Yeah, this hits home. I just replaced my second failed Rosewill 80+ today (5-star reviews...). Visible build quality on the first two was great, but obviously the guts aren't so good. I'm gonna open it and look for mushroomed caps.

The third one, my only spare-on-hand is of such poor build quality that the metal conductors

Yeah, Antec is good generally; most of their stuff is made by Seasonic, as is OCZ's (generally), Mushkin's, and a few others'. I'd recommend looking through here [hardwaresecrets.com] and seeing who is making what; it can change sometimes between revisions. And generally the reviews are quite good, and each PSU gets a teardown, including what's being jammed inside the guts. So you have a fairly good idea of what components are being used.

These days, 80plus PSUs are very cheap. The only things cheaper are unreliable JUNK PSUs which won't last a year. Also, because of the legal terms of using the 80plus trademark, manufacturers seem to not inflate the wattage ratings on 80plus PSUs, while you can easily find $15 "2000watt" junk PSUs.

And besides all that, I'd pay the 80plus premium just for the heat/noise reduction. Combine with a WD "Green" hard drive (or SSD), low-power CPU, and a couple low-noise fans, and you've got a very low heat and

A PSU has a power efficiency curve that looks like this [anandtech.com]. That article also explains what I'm about to summarize:

Pick a PSU that is no more powerful than you need, to keep your system in the middle of that curve, for maximum efficiency. 100% margin is more than plenty, so if your components will use 250W max, you don't need a 900W PSU. Look for something in the 500 range, or even less if you pick a good-quality PSU.

You probably won't be able to make a cost argument for maximizing efficiency, but you can build a quieter system focusing on efficiency, and it's quite satisfying obsessing over something different.

A while ago I purchased an EZ-Watt meter to see how much power my system was consuming. I found that my system at max CPU and GPU load consumes about 350 W of power. So my question is: why would I buy a green 800 Watt power supply when my system only needs 300 W? It seems that it would be best to match the power supply to the system in order to maximize savings, since the efficiency of the power supply is calculated at its maximum rating. How much power does an 800 Watt power supply consume when the syst

It depends a lot on whether "about 350 W" is a maximum or an average and even more on what you've been doing with your computer while you measure. Measuring draw like that is a good idea, but it doesn't tell you everything. There very well may be usage patterns for components in your system that some software may cause that are higher than your normal usage. If you start using your computer a different way (say by running a demanding game which uses your CPU, hard drives, optical drives and gpu hard all at

So my question is why would I buy a green 800 Watt power supply when my system only needs 300 W?

Components degrade with time, specifically electrolytic capacitors. As the PSU ages, their ability to run at peak ratings diminishes. At best, you get excessive DC ripple that puts a strain on your motherboard components. *Always* purchase a PSU rated for at least 30% more power than what you need!

How much power does an 800 Watt power supply consume when the system is using only hundred t

If you look at efficiency graphs, you'll see that power supplies are typically the most efficient under moderate load: at low and high load the efficiency drops. A typical desktop or home server is idle most of the time, so idle efficiency will have a big impact on the total efficiency. If you over-dimension your power supply, your idle load might be 10% or less of the max rating, which is far from the optimum of the efficiency curve.

I'd recommend getting a power supply that can deliver a bit more than what you need, for example 450 W if you think you need 350 W max. A bit of margin is useful since you might not have found the actual worst case or you might want to add components later. Also it avoids poor efficiency at the high side of the curve when the system is under load.
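One way to sanity-check a sizing choice against a published efficiency curve is simple interpolation. The curve points below are invented for illustration, not measured data:

```python
# Hypothetical (load fraction, efficiency) points read off a review graph
CURVE = [(0.10, 0.78), (0.20, 0.85), (0.50, 0.88), (1.00, 0.85)]

def efficiency_at(load_w, rating_w):
    """Linearly interpolate efficiency at a given DC load."""
    frac = load_w / rating_w
    if frac <= CURVE[0][0]:
        return CURVE[0][1]
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if frac <= x1:
            return y0 + (y1 - y0) * (frac - x0) / (x1 - x0)
    return CURVE[-1][1]

# A 100 W light load sits mid-curve on a 450 W unit but near the
# inefficient bottom of the curve on a 900 W unit:
print(round(efficiency_at(100, 450), 3))  # ~0.852
print(round(efficiency_at(100, 900), 3))  # ~0.788
```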

Disagree about peak efficiency. In my experience testing PSUs, it is normally found around 90% load. Newer PSUs have gotten a lot better at enhancing efficiency at lower load levels, but PSUs still work most efficiently when running near the load they are designed for.

Newegg's calculator is a joke. It drastically overestimates requirements so they can pimp massive PSUs with higher profit margins. I suggest adding up the various component manufacturer specifications (i.e. max power draw of the MB, GPU(s), HDD(s), DIMM(s), and CPU(s)) and throw in 10-15 W for overhead, then buy a decent PSU with a load rated as close to that number as you can get. Even with a dual GPU setup, you are VERY unlikely to exceed 400W of DC power draw. My current mid-range single GPU system draws around 200W under load (gaming).
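The add-up-the-spec-sheets method, sketched (the wattages are placeholders; substitute your own components' rated maximums):

```python
# Placeholder max-draw figures from (hypothetical) component spec sheets
components_w = {
    "cpu": 95,
    "gpu": 150,
    "motherboard": 40,
    "dimms": 10,
    "drives": 20,
}
overhead_w = 15  # fans, USB peripherals, etc. (the 10-15 W rule above)

target = sum(components_w.values()) + overhead_w
print(target)  # 330 -> shop for a quality PSU rated near this number
```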

Disagree about peak efficiency. In my experience testing PSUs, it is normally found around 90% load. Newer PSUs have gotten a lot better at enhancing efficiency at lower load levels, but PSUs still work most efficiently when running near the load they are designed for.

Thank you for the correction. I was talking based on the 80+ certification requirements and hazy memory of an article I saw once. Glad to know the efficiency picture is better than I thought.

Unless you really need it, then choose something more modest than a honking 1000W PSU. Not a frag-fracking gamer? A 90W DC PSU should have enough juice for your 65W CPU. As PSU efficiency is measured in percentage, even a 50% inefficient 90W PSU will beat a 95% efficient 1000W PSU.

My experience is that nearly everyone overestimates their PSU needs and it becomes a game of "whose is bigger?". This is a stupid way to pick hardware. My desktop runs a 650, my ESX server with 24+ bays runs an 850. If I had a way better video card in the desktop I might move to a 750, and I wouldn't run dual cards.

So for 99% of people it probably won't make sense to upgrade a power supply just for efficiency.

But if for some reason you need a new power supply anyway, finding a good-quality (80+ Gold, etc.) unit on sale is totally reasonable... at this point most units worth trusting the rest of your gear to are probably 80+ anyway.

In my own case, I had been using an 80+ power supply that wasn't modular, and the cables were a hassle to manage... I wanted a modular power supply and also had no intention of risking a $200 processor and $300~ video card etc. to a generic/shoddy power supply, so I found the Seasonic X750 (80+ Gold) on sale for $100~ (which, if you look at Newegg, is cheaper than any 700-800 watt fully modular power supply currently).

Since I wanted/needed a fully modular 750~ish watt power supply, finding the X750 for $99 made sense, as it was the cheapest meeting those requirements... the 80+ is just a bonus... Seasonic's 5-year warranty and generally pretty good reputation for quality power supplies drove the choice more than the 80+ Gold.

The design choices that manufacturers make in order to meet these levels of efficiency have other impacts. Active power management, cooling fans that only run when needed, and higher quality components are all good reasons to consider a higher-efficiency-rated PSU. My computers often run 24x7 for years on end, so I tend to choose a decent PSU.

Also, just as a data point, I have a 4U box running a Xeon, 32gig of RAM, many cooling fans, 3x SAS cards, an SSD, and at least 20x HDD. It has a gold rated PSU listed as

Most PSUs that do not sport the 80+ badge are outright junk that doesn't respect environmental and safety norms in the first place, and will blow up in a variety of creative ways if you try to draw half of what is written as max wattage on the sticker. The 80+ badge weeds out most of the crap (not all, though).

I bought a midrange power supply (Antec Gold £150 job) for my gaming rig some years ago (it was an Athlon XP2400+), which said 750W on the box. With a 4-box RAID0 and GeForce 7600GT the power draw was something like half that. It's still running.

I built an identical box around the same time for someone else. He didn't see the point of a beefcake PSU so he said to use a cheap (read: £20) 350W brick. His computer lasted a month before the caps blew and took the motherboard with it.

If people use less power through more efficient devices there needs to be less power produced. Less power production means less pollution and less greenhouse gases. Environmental issues may be a contributing factor in the selection of a more efficient power supply.

(1.) Not die within a year of running at 50-75% load
(2.) Not take any other components of my computer with it.

Power supply problems are the most annoying to diagnose, because the symptoms usually show up in other components (like apparent RAM corruption, HDD stuttering, etc.). I would pay $50 extra for a power supply that is *not* 80-plus if it has stellar reliability, because it means I only have to build my computer exactly once. On that note, the Corsair HX [newegg.com] series power supplies have not only stellar reliability but are also pretty much silent. I refuse to buy anything else, and you can usually get them 20% off if you watch slickdeals.

Efficiency saves you money, while reliability saves you time *and* money. And time is a limited resource for some of us...

I have no tangible proof, but PSUs that have an 80+ certification are generally much better quality than those that aren't. The peace of mind of knowing that your PSU is likely to outlast the rest of your components is definitely worth it. Sometimes having your computer fail costs real-world money (or the in-game equivalent).

I have built passive-cooled machines since 2004 (or very nearly passive, with some machines having a single, huge, slow fan). The only way to make a PSU fanless is less wasted heat, or better efficiency. I don't care about a few wasted watts, when I have over half a kilowatt of computation going on, but I can't stand the noise of typical computer fans. High efficiency gear also tends to be very high quality for obvious reasons, so they last long. (I still have my first passive PSU from 2004, a precursor to the PicoPSUs.)

Get your head out of your ass. Most electric heating is done with heat pumps. A heat pump pumps more heat into your house than the electric energy it consumes (that's why it's called a pump). Heating by burning something is also more efficient than dissipating electric energy, because you're cutting out conversion (see Carnot efficiency) and transportation losses.

And in the summer, if the AC is on, inefficient appliances make you lose double: once by consuming more electricity than they should, and a second time because the AC needs to consume energy to pump the heat out of your house.
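That "double loss" is easy to quantify: with the AC running, every wasted watt also costs roughly 1/COP watts of pumping electricity. A sketch, assuming a cooling COP of 3 (a ballpark for illustration, not a measured figure):

```python
def effective_waste_w(psu_waste_w, ac_cop=3.0):
    """Wasted PSU watts plus the AC electricity needed to pump that heat out."""
    return psu_waste_w * (1 + 1 / ac_cop)

print(round(effective_waste_w(30), 1))  # 30 W of PSU waste costs ~40 W total
```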

Can you really qualify heating done with heat pumps as electric heating? My house is heated with hot water from a gas furnace recirculated using an electric pump. By your definition of electric heating, wouldn't that make my house electrically heated? Also, aren't there transportation and conversion losses from burning something for heat just as there are with electric heating?

And in the summer, if the AC is on, inefficient appliances make you lose double: once by consuming more electricity than they should, and a second time because the AC needs to consume energy to pump the heat out of your house.

I'm not sure what method you can possibly imagine for pumping heat out of your house that doesn't consume energy.

This is why "software engineer" is a term I will never use willingly; it is an insult to real engineers. Heat pumps do in fact put more heat into their hot side than they consume in work. They take heat from a low-temperature reservoir and send it to a high-temperature reservoir.

So they work well for heating a house as long as it's not cold outside. Probably not so good in a real winter.

Theoretically speaking, they just need to pump against a larger gradient if it's really cold. They will still have a benefit, only less.

Now, practically speaking, there are these nasty little engineering considerations. A practical heat pump has to be built for cold climates, and the heat pump/AC combos that are popular in the warmer parts of the US aren't, and can actually be slightly less efficient than a resistor if it's really, really cold outside.

Perhaps the "pump" part of heat pump completely eluded you, since they do not defy the first law of thermodynamics as you seem to be implying.

Heat pumps work by having a sink/source off of which they pump heat toward or away. Most of the ones I know happen to be geothermal, which work because the sink they are pumping from maintains a constant temperature year round underground. So during the summer, the heat they can extract from that source is cooler than the air above ground, but during the winter it is hotter. They do this by extracting heat from the source sink rather than producing it themselves.

So in that respect, they work much like the fan in your computer, since the air inside the case is much hotter when running than the air outside. The fan can displace the heat generated inside rather efficiently by pushing the hotter air out of the case while bringing cooler room air in, without requiring as much energy to power the fans as the equipment running inside uses; thus, like the grandparent said, the fans need less electric energy than what the computer itself uses. If this were not so, it would make a lot more sense to completely seal computer cases, since the cooling benefit from the fans wouldn't make up for the amount of dust they bring into the case during operation.

So the next time you're tempted to call bullshit on a well known physics principle, make sure you double check that you're not making some stupid mistake. Or else you'll end up looking rather foolish again when someone else points out how you don't know what you're talking about.

While GP is woefully incorrect and you're right to call him out on it, your explanation isn't right either. Heat pumps can in fact pump against a gradient, and are mostly used to pump heat from a cold to a hot place. Air-source heat pumps (i.e. coupled to the outside air rather than a geothermal reservoir) are used in parts of the US to heat houses in the winter and cool them in the summer.

They're also what makes a refrigerator work. A fridge pulls heat from a cold place (inside the fridge) to a warmer place (outside the fridge). The resulting decrease in entropy needs to be balanced by an equal or greater increase in entropy, which is accomplished by converting electricity to heat. Or, to avoid the thermodynamic jargon: you're pumping against a gradient, so you need to spend energy to do so. The heat produced at the back of your fridge is the sum of the heat that was pulled out of the interior of the fridge plus the heat-equivalent of the electricity the fridge consumed. This is also what an A/C does.

Now, if we turn the A/C inside out, so that it pumps heat from outside to inside, then you have the kind of heat pump we use to heat our homes in the winter. The sum of the heat that was pulled from outside and the heat-equivalent of the electricity the device consumes is larger than the heat-equivalent of the electricity alone, thus the pump brings more heat into your home than a resistor using the same amount of electricity. GP suggested generating electricity from this heat gradient, but the flaw in his thinking is that the heat pump, as well as any electricity generation device he can come up with, is bound by the Carnot efficiency, so you can never get more electricity out than you put in.

Yes, and the energy wasted by an inefficient power supply is likewise 100% converted to heat. But the thing is: a heat pump can go higher than 100%:
http://en.wikipedia.org/wiki/Heat_pump#Coefficient_of_performance_.28COP.29_and_lift [wikipedia.org]
It can do that because it's not a closed system - it pulls heat from outside (yes, even though it's colder outside) to inside in addition to dissipating electric energy.

As for the burning, well if you're lucky to live in a place of the world with a large percentage of rene

In Sweden, electric heating is illegal because it's inefficient. "Even with a 100% efficient electric heater, the amount of fuel needed for a given amount of heat is more than if the fuel was burned in a furnace or boiler at the building being heated. If the same fuel could be used for space heating by a consumer, it would be more efficient overall to burn the fuel at the end user's building."

Do you heat your house year round? It is often assumed that removing 3-4 watts of heat costs a watt of power. So if you cool your house for 3 months of the year and heat it for the rest, then that might be close to break even. There are many places where cooling is not installed, so it is not an option. There the power supply would only be beneficial if you can tolerate the heat. For example, a small apartment in New York City with no cooling, where the daytime temperature might be 85, and the computer might bri

Corsair's AX850 is a solid power supply OEM'd from Seasonic. But you can't cost justify buying one unless you have a truly ridiculous system. As of a few years ago, a good 500W power supply was already plenty to handle even three video card systems [tomshardware.com], and CPUs in particular have just reduced power requirements since. Newegg is showing me the AX850 as $189. You can get their similarly constructed 650W TX650M instead for $109. I was willing to pay whatever I had to in order to get the most reliable setup p

Wishing for mod points. +1.
I can count almost a half-dozen PC issues I troubleshooted (troubleshot?) over the years that were because of a flaky PSU. I only use top-end PSU's nowadays: PC Power and Cooling Silencer 750 Mark 1 on the media server (recycled from my old game rig), Cooler Master 1200W Silent Pro Gold (powering dual overclocked EVGA GTX 680's, 2 SSD's, 4x 7500 RPM HD's, Intel i5 2500k overclocked 45%, occasional hot-swap drives) on my current rig. More unnecessary crap to be installed shortly.