Posted
by
CmdrTaco
on Tuesday March 22, 2011 @11:24AM
from the direct-and-to-the-point dept.

An anonymous reader writes "Researchers at the University of Bath, UK are undertaking an in-depth study of energy consumption within the new network, with the aim of demonstrating that running a large network of devices on DC rather than AC is both more secure and more energy efficient. AC electric power from the grid is converted to DC and runs 50 specially adapted computers in the University Library. Students using the system have noticed that the new computers are more compact and much quieter than the previous systems. The immediate advantages of the new system are not only for the user but for the energy bill payer and the environment."

Edison: "Genius is one percent inspiration and ninety-nine percent perspiration."

Tesla: "If Edison had a needle to find in a haystack, he would proceed at once with the diligence of the bee to examine straw after straw until he found the object of his search. I was a sorry witness of such doings, knowing that a little theory and calculation would have saved him ninety percent of his labor."

Truthfully, both approaches are valuable, and we would be a poorer planet without either of these men.

Without Edison there would have been a lot fewer tortured puppies, cats, horses, and elephants. Edison arguably invented very little, instead taking the inventions of people who worked for him and claiming them as his own.

AC is still the prime motive force in electrical generation and always will be.

No, not necessarily.

We've already started moving away from AC for long-distance power transmission, using "HVDC" instead for things like 2MV transmission lines.

The main advantage of AC is that, with no semiconductor technology available, you can easily step it up and down between different voltages using an iron-core transformer, nothing more than a bunch of iron and some copper wire wrapped around it. High voltage is absolutely necessary for power transmission, because I^2*R losses are too high at lower voltages, but high voltage isn't usable by end-users because of safety and other concerns.

Nowadays, with power electronics (giant power transistors capable of handling thousands of volts and amps) and high-frequency switch-mode power conversion, that stuff is mainly obsolete, so it's fully possible to eliminate AC for power transmission, and even get better conversion efficiency than transformers. The only reason it's really still used is 1) our infrastructure already uses AC, so you can only replace it in certain places where it won't be too disruptive (like long-distance links), and 2) iron-core transformers are still much cheaper than electronic alternatives, so it's only economically feasible to switch to DC for certain large-scale projects, not for every transformer in a subdivision.

There's no technical reason that, in the future, DC couldn't become the standard, with electronic "transformers" stepping the voltage up and down as necessary.
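The I^2*R argument above is easy to check numerically. A quick back-of-envelope sketch (the line resistance and power figures below are illustrative assumptions, not from the article):

```python
# Why transmission voltage matters: for a fixed delivered power P over a
# line of resistance R, the current is I = P / V, so the resistive loss
# is P_loss = I^2 * R = (P / V)^2 * R.

def line_loss_watts(power_w, volts, line_ohms):
    current = power_w / volts          # current needed to carry this power
    return current ** 2 * line_ohms   # I^2 * R heating loss

P = 1_000_000.0   # 1 MW delivered
R = 10.0          # 10 ohms of line resistance

for v in (10_000.0, 100_000.0, 1_000_000.0):
    loss = line_loss_watts(P, v, R)
    print(f"{v / 1000:>6.0f} kV: loss = {loss / 1000:9.3f} kW ({100 * loss / P:.4f}% of P)")
```

Raising the voltage by 10x cuts the loss by 100x, which is the whole reason stepping voltage up and down (by transformer or by power electronics) is non-negotiable for long runs.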

I've always thought they were both right - sort of. Tesla is right about power distribution. Edison was very, very wrong, as he all too frequently was. But, D/C does have many uses once it's at a consumption point.

I never understood why we don't have converters in every house. Simply deliver the power as A/C and provide for A/C and D/C in every house. Keep A/C for things like laundry and dish washing and D/C for most everything else in the house. There would be many advantages to such a dual scheme. Just ima

Ummm...no. I'm pretty sure people understand how a power receptacle works.

Energy efficiency has little to do with the device itself.

Completely wrong, as you're not looking in the right place. Massive amounts of power are wasted on inefficient power converters. Many are frequently in the 80% range. Part of this is that higher efficiency demands higher prices. So, frequently, to save money, low-efficiency converters are provided with consumer goods. Even many PCs have efficiencies which range from the very high 80s to the very high 90s. The difference is almost always dictated by

You are not quite comparing apples to apples. 12 VDC at 10 A would only be 120 W of power - not enough to run most desktops. At 120 VAC, this would only be 1 A.

120VAC like used in US houses is supplied at something like 15 to 20A. (I live in Australia - household power here is 240v, 10A.)

If you needed to supply enough DC for several computers, TV's, and pretty much anything which doesn't have a high power motor or heating element in it, you would need to be able to supply a lot more than 120 Watts. My PC + monito

120VAC like used in US houses is supplied at something like 15 to 20A. (I live in Australia - household power here is 240v, 10A.)

Electrical supply in Australia is 230V/50Hz. Residential switchboards have an 80A fuse to the mains and the individual circuits have 5/10/15/20A fuses/breakers depending on the age of the house and the expected circuit usage. In my current house the lights are wired into a 5A circuit, living areas are on 10A circuits, kitchen and laundry are a 15A circuit and the garage is a 20A circuit.

DO NOT DO THIS!!! If you have cuts and/or your hands are wet/moist, you could still get a tingle. Car batteries are made to deliver hundreds of amps, plenty to seriously hurt you if something goes wrong (e.g. your ring/watch/other jewelry makes contact between your hand and the ground of the car).

To see what a car battery can do, put starter cables on the battery, then push the other two ends together with a long wooden or plastic stick. Close the hood first, just in case your battery explodes. It might

You're missing the point. Much of our electronics run on 5V DC. If you put a big 5V converter on the side of your house, and a bus system to connect this to all your 5V gadgets, the losses would be greater than just having all those wall-warts. Even worse, your gadgets wouldn't work, because the voltage drop between your converter and your gadgets would be so great, you'd end up getting 4V at the gadget, and worse, the drop would vary depending on how much current that gadget is drawing (and other gadgets on that run).

If you want to be more efficient, the answer is simple: throw away all those wasteful transformer-based wall-warts, and replace them with high-quality switching wall-warts instead. They're lighter and also have better efficiency, both when under load and when not loaded. The problem is that switching wall-warts cost more than the crappy Indian and Chinese-made transformer-based ones, so gadget makers don't usually bother to include them.
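The voltage-drop problem described in the first paragraph can be roughed out from copper's resistivity; the run length, wire cross-section, and load current below are assumptions for illustration:

```python
# Resistance of a copper run: R = rho * L / A, doubled for the return
# conductor. A house-scale 5 V bus loses a painful fraction of its
# voltage even over modest distances.

RHO_CU = 1.68e-8  # ohm*m, resistivity of copper

def round_trip_drop(length_m, area_mm2, amps):
    area_m2 = area_mm2 * 1e-6
    r = RHO_CU * length_m / area_m2 * 2   # out-and-back resistance
    return amps * r                       # V = I * R

# 15 m of 1.5 mm^2 wire feeding a 2 A gadget from a 5 V bus:
drop = round_trip_drop(15, 1.5, 2.0)
print(f"drop = {drop:.2f} V -> gadget sees {5 - drop:.2f} V")
```

With those numbers the gadget sees well under 4.5 V, and the drop moves around with load, which is exactly the regulation problem the parent describes.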

There is no evidence or reason for DC to be more "secure". If some lame argument about it being harder to bring your own power source / utilise their outlets when you have a custom system is put forward, then, well... no.

I can understand the efficiency argument to a certain extent, although if a workstation needs enough power that a fanless AC PSU is unsuitable then the more efficient AC PSUs will be enjoying enough load to reach over 80% efficiency. Are the centralised rectifiers + wires + in-computer DC-to-DC converters as efficient?

I think they're referring to secure as meaning less downtime - the individual computers are less likely to have power supplies die, since there aren't any moving parts. Also, part of the system involves a UPS - so fewer issues there than previously (although there's no reason you need DC for that to work).

The thing that confused me was the statement that the system was "faster". Maybe they're just talking about the fact that they got new computers? The whole article reeks of badly uninformed reporting, gloss

Definitely true. Battery banks tend to need regular, and often costly, maintenance. And you can still cause lots of damage to them by tripping them or incorrectly swapping switchgear/breakers, etc. I've done my time repairing those, and concur.

That is only true if you use batteries for UPS systems. There are a multitude of other ways to store energy in a system besides batteries. You could, for instance, use supercaps, a flywheel, or gravitational potential energy.
Of course, it is more common to use batteries, which is what they mention using in the article.

Yeah, generally the more you spend, the higher quality the design. It's not that you can't design, build, and sell a 99% efficient AC supply; it's just that you can't do so and survive in this weird confuseopoly market where the only thing that matters is

It's not that you can't design, build, and sell a 99% efficient AC supply; it's just that you can't do so and survive in this weird confuseopoly market where the only thing that matters is price

Let's see if it really makes sense to get that 99% AC supply, since you think it's such a brilliant idea.

My Core2Duo system w/ graphics card chews up a whopping 225 watts under full load (measured with wattmeter), with an 80% efficiency PSU. That means the PSU is wasting an astonishing 225 *.20 = 45 watts. The kWh rate here is around $0.062 after generation and distribution. That means per month, it costs me about 8.37 cents for having such an inefficient PSU. I purchased it about 5 years ago (June 2006
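The parent's waste arithmetic, with the usage hours pulled out as an explicit assumption (the 225 W is taken as the wall-side measurement, so waste = draw * (1 - efficiency); the 24/7 figure below is my assumption, not the parent's):

```python
# Monthly cost of PSU conversion losses, parameterized by hours of use.

def monthly_waste_cost(wall_watts, efficiency, rate_per_kwh, hours_per_month):
    waste_w = wall_watts * (1 - efficiency)   # power burned in the PSU
    return waste_w / 1000 * hours_per_month * rate_per_kwh

waste_w = 225 * (1 - 0.80)
print(f"wasted power: {waste_w:.0f} W")
# Running flat out 24/7 (~720 h/month) at $0.062/kWh:
print(f"24/7 cost: ${monthly_waste_cost(225, 0.80, 0.062, 720):.2f}/month")
```

For what it's worth, the parent's 8.37 cents/month only works out if the machine sees roughly 30 hours of full load per month; at 24/7 full load the same waste costs about $2/month. Either way, the point stands that the absolute dollars are small.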

That is the claim, but many fail it and don't actually come close until 50% load or higher; a few sites have done real-world testing and proven it. The other thing is that people always way overestimate the draw of components or simply go for the "bigger is better" mentality. I've seen so many 1 kW PSUs in systems drawing 200 watts or less. The wiki page gets into it all a little bit: http://en.wikipedia.org/wiki/80_PLUS [wikipedia.org]

The only thing inside a computer that actually runs on AC is the computer's power supply. The power supply regulates this to DC voltages! The power supply is also quite bulky and noisy compared to the other components.

"Initial tests show that the system in Bath emits approximately half as much energy as heat than the previous AC powered system while running much faster."

Yes, I'm sure it'll generate less heat when most of that heat comes from converting AC to DC, but why the hell would it run faster when everything else in the computer is still the same?

the project team moved the one tonne AC converter through the University library and into the roof space, removing and rebuilding walls to transport it

It would have been cheaper to just use 50 energy-efficient laptops. You'd get even more power savings, and if you wanted to completely remove the heat from the transformers, just put them all in a cabinet that vents outdoors, and extend the DC power plugs.

"Initial tests show that the system in Bath emits approximately half as much energy as heat than the previous AC powered system while running much faster."

Yes, I'm sure it'll generate less heat when most of that heat comes from converting AC to DC, but why the hell would it run faster when everything else in the computer is still the same?

Because they're comparing it to a previous (older) system, not the same system powered by a local AC/DC power supply. An apples-and-oranges comparison.

The speed comes from the new computers. Generally, when you purchase new computers, the hardware is better than the previous ones. Add to that you don't have the Windows bloat of years of updates and installs / uninstalls, and possibly running Windows 7 versus the old XP, and you get a faster machine.

When people ask me if they need a new computer or would the one they have work, I respond that any new computer will seem better. After the honeymoon period, though, when something breaks or you install th

There's something to be said for DC distribution within data center racks, but building a plug-in DC infrastructure seems like a PR stunt. They need a whole rack of power conversion gear to serve 50 desktop computers.

Google at one point proposed that rackmount computers should be built to run on 12VDC only, so you could have a single 12VDC supply in the rack and get rid of the individual power supplies for the server. Whatever happened to that?

Much industrial automation gear and military equipment runs off 24VDC. That's low enough that you don't have a shock hazard, but high enough that the wire sizes are reasonable.

I had debated putting a 12vdc power supply in my home computer and running it to a 12v battery with a power/charging circuit on that. I never understood having UPS systems that convert AC to DC to charge the batteries then switching back to AC to power the computer which just converts it back to DC.

When I first had to deal with telephone equipment, I came across the -48 VDC power standard for things like SONET nodes, digital cross connects, channel banks, and telephone switches. I believe this is due to cathodic protection [wikimedia.org] of buried copper cables.

You can find -48 VDC rectifiers, AB fuse panels (think redundant DC power supplies) and lots of telecom gear in racks that is powered with -48 VDC.

Researchers at the University are undertaking an in-depth study of energy consumption within the new network, with the aim of demonstrating that running a large network of devices on DC rather than AC is both more secure and more energy efficient.

The new DC network also offers greater security. DC power supply units have a simpler design, with fewer parts that could fail and need replacing. The system at the University also charges a number of batteries when usage levels are low to allow the system to run independently from the grid for up to eight hours should a cut in power be experienced.

The above two paragraphs are the only ones I could find in TFA that mention security. I gotta ask -- can anyone speculate how centralizing the PSU would lead to a more secure system? Is it possible that there is a regional definition of "secure" to mean "very reliable" or "very available"? As in, we have "secured" a constant municipal water supply?

If you're looking at the McCumber cube [wikipedia.org], then yes, availability is one of the three aspects we're trying to protect in security (along with confidentiality and integrity).

Most "security"-obsessed people these days come from the "keep the bad people out" mentality, even at the expense of making it so obnoxious for the authorized users that they can barely do their jobs. But a complete model of security includes that people who are supposed to be able to use the system are able to use it when they want.

Not content with lowering power usage and reducing energy loss, the University hopes to extend the environmental credentials of the new network by installing mini wind-turbines or solar panels, both of which output a DC current and therefore don’t require inefficient conversion from AC to DC.

My school physics may be a bit rusty, but I would assume wind turbines produce either pulsating DC or AC, and hence the current has to be converted before use by electronics.

The major drawback to DC power is in the wiring. Direct current requires larger gauge wiring than AC power, which increases material costs considerably. In general, DC power is economical only if the wiring between the computers and the DC source is less than 35 feet in length. More than that, AC power becomes more economical.

I'm not sure how you came to that conclusion. AC suffers from several effects that make it less efficient and/or more expensive over long distances.

For DC, the power delivered is V*I. For AC, it's similar except the V is really Vrms - you must insulate for Vpeak, but you only get Vrms * I power. For sinusoidal AC, the difference is a factor of 1.414.

With AC circuits that have non-zero reactance, you must choose a conductor that can carry Imax, but the power delivered to the load is only Vrms * Imax * cos(phi), phi being the phase angle between the voltage and current.

AC circuits suffer from the skin effect [wikipedia.org] where the power travels more on the surface of the conductor rather than equally throughout its cross-section. This requires a larger solid or stranded conductor than would be required for DC.

AC has a few things going for it - the ease with which voltage can be transformed, the ease of generation with rotating generators, and ability to drive large, multiphase motors efficiently.
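The RMS-versus-peak and power-factor points above can be checked with a couple of one-liners (sinusoidal AC assumed; the 120 V / 10 A / 30-degree numbers are just for illustration):

```python
import math

# Two AC penalties: insulation must withstand Vpeak while power is
# delivered at Vrms, and with a nonzero phase angle only the cos(phi)
# fraction of Vrms * Irms does real work.

def peak_from_rms(v_rms):
    return v_rms * math.sqrt(2)   # sinusoidal waveform assumption

def real_power(v_rms, i_rms, phi_deg):
    return v_rms * i_rms * math.cos(math.radians(phi_deg))

print(f"120 Vrms needs insulation rated for {peak_from_rms(120):.0f} V peak")
print(f"at 30 deg phase: {real_power(120, 10, 30):.0f} W real out of {120 * 10} VA")
```

The sqrt(2) ~ 1.414 factor is the one the grandparent mentions; DC has neither penalty, since voltage and current are constant and in phase by definition.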

With AC circuits that have non-zero reactance, you must choose a conductor that can carry Imax, but the power delivered to the load is only Vrms * Imax * cos(phi), phi being the phase angle between the voltage and current.

With respect to RMS vs peak, you are thinking of the *voltage* rating (what the insulation can withstand), NOT the current rating. Economically, the insulation rating is not low-hanging fruit for optimization - the copper conductor is by far the bulk of the cost and this cannot be downsi

No it doesn't. It's the voltage that matters, not the type of current. It just so happens that most industrial DC applications - e.g. metro trains - use lower voltages than their AC equivalents, which means higher currents for the same power delivery and hence thicker cabling.

The major drawback to DC power is in the wiring. Direct current requires larger gauge wiring than AC power, which increases material costs considerably. In general, DC power is economical only if the wiring between the computers and the DC source is less than 35 feet in length. More than that, AC power becomes more economical.

FTFA:

the project team moved the one tonne AC converter through the University library and into the roof space, removing and rebuilding walls to transport it

Somehow, I suspect that the cable run to the individual machines is more than 35 feet.

His figure of 35 feet is of course completely made up, or at best applies only to one very specific situation. It's a rather complex, non-linear problem that depends on current level, local union labor contracts, price of copper wire, UPS and battery capacity, etc.

Obviously, if you are charging an iPod at a zillionth of an amp after a 12 V to 5 V converter, you can run that through thousands of feet of small-gauge (cheap) speaker wire before the voltage drop will matter. And if you're doing the thousand-watt gamer PC or NAS farm, you'll need something approaching welding cable to keep the voltage drop low enough. In between, well, it's in between. But by no means as simple as a 35-foot cutoff.

Not really. It depends on the current drawn through the wire. For power P (constant for the computer, more or less) required at a voltage V, you need I=P/V amps. You're not going to distribute 3 V or 5 V, which is what your ICs want, I hope! You could distribute 120 V DC with the same size wiring you use for the usual AC connection. You could send around 1,200 V (DC or AC) and use 1/100 the copper. (Power lost to heating goes as I**2.) The high voltage limit is set by safety and cost of DC-DC convert
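The "it depends on the current" point can be made concrete: for a fixed power and a fixed drop budget, the copper cross-section you need scales roughly with 1/V^2. A sketch with illustrative numbers (300 W over 20 m, 3% drop budget - all assumptions):

```python
# Copper area needed to deliver power P at voltage V over length L while
# keeping the round-trip drop under max_drop_frac * V.
# From drop = I * rho * 2L / A  =>  A = I * rho * 2L / drop.

RHO_CU = 1.68e-8  # ohm*m

def copper_area_mm2(power_w, volts, length_m, max_drop_frac=0.03):
    amps = power_w / volts
    max_drop = volts * max_drop_frac
    return amps * RHO_CU * 2 * length_m / max_drop * 1e6

for v in (5, 12, 120, 1200):
    print(f"{v:>5} V: {copper_area_mm2(300, v, 20):9.3f} mm^2 for 300 W over 20 m")
```

A 5 V feed needs hundreds of mm^2 of copper where a 120 V feed needs under half a square millimeter, which is why any fixed-distance cutoff is meaningless without stating the voltage and current.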

It was before its time, I guess. This article reminded me of the same thing. Con Edison offered reliable, safe and efficient power. It won't kill you like AC power when you accidentally touch the wire.

In a single room or even perhaps a floor of an office building I guess I could see DC distribution. It would tend to reduce the power supply losses. Laptops are already doing total DC-DC conversion for the different voltages they need and there probably isn't much of a reason you couldn't run 12 volts to each computer and have it convert it over to 5 and 3.3. I would think your benefits would be significantly less if you were running 100 volts DC and requiring it all to be downconverted as DC-DC conversi

Today if you blow a power supply (one of the most common computer failures) you lose one computer. If you blow the power supply for the office floor you might lose 100 or 200 computers.

For reasons which are a long story, I have had several servers up and running on 12V for many years now. The powerstream guys are pretty much the gold standard of ATX 12 volt power supplies, as far as I know:

Note that these are "honest wattages", not the "marketing wattages" seen in the AC power industry. The price of a 300 watt DC supply seems high compared to a 100 watt AC supply from China that has a sticker claiming 300 watts. However, it's not too bad compared to an AC supply that actually only provides 300 watts despite having a sticker labeled 800 watts or a million watts or whatever marketing felt necessary. Also the powerstream supplies, to the best of my knowledge, are some of the few computer power supplies you can buy that do not have forged FCC and UL registries, which is worth something to me. In summary: expensive, but strongly recommended based on years of experience.

Anyway, what happens when the primary rectifier goes down is that my battery bank will run the Asterisk PBX and friends for something like half a day, during which time I can source a generator and charger, or perhaps casually purchase a new supply, etc. Also I have multiple supplies, any of which could theoretically power the whole works (at a cost of high heat and much shorter capacitor lifetimes, etc.). So you Y-cable them to run multiple plants off one supply. Guess what: the same Y cable can be used to run multiple plants off one battery, if one fails. Etc.

Theoretically, I could run the entire phone system off an idling car, assuming you have enough gas in the tank. Unfortunately my entire plant draws just a little too much for the cigarette lighter plug, probably 15 amps total. If I could invest in new phones / new servers / etc and get total plant draw down to 5 amps, not only would my batteries be 1/3 cheaper or last 3 times longer in an outage, but I could also run the works conveniently off a car cig lighter port.

Obviously if you have zero battery capacity then you are instantly in deep doo doo, but given three or so figures of amp-hours you're good to go for a very long time.

Wire everything with Anderson Powerpoles, exactly like the ham radio guys, so you can use their DC products, and keep a stock of extension cords and Y cables and other gadgets. Use fuses, and as a subset of that rule, only use automotive fuses because they are infinitely available. Use 12 volts as your standard because you probably own a mobile 12 volt generator (aka your car). Perhaps if you're in the .mil and have a 24 volt Humvee, do 24 V instead, whatever.
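The runtime claims above boil down to amp-hours divided by load amps. A rough sketch (the 50% usable fraction is my assumption for lead-acid longevity, and this ignores Peukert-effect capacity loss at high discharge rates, so treat it as optimistic):

```python
# Back-of-envelope battery runtime for a 12 V DC plant.

def runtime_hours(capacity_ah, load_amps, usable_fraction=0.5):
    # only discharge lead-acid partway if you want it to survive cycling
    return capacity_ah * usable_fraction / load_amps

print(f"200 Ah bank at 15 A: {runtime_hours(200, 15):.1f} h")
print(f"200 Ah bank at  5 A: {runtime_hours(200, 5):.1f} h")
```

This matches the parent's arithmetic: cutting the plant from 15 A to 5 A triples the runtime (or lets the bank be a third the size).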

Initial tests show that the system in Bath emits approximately half as much energy as heat than the previous AC powered system while running much faster.

If you mean "much cooler", you already said that. If you mean "much faster", you should probably sign up for that physics (or electronics) course.

I'm betting the new systems were much faster because they were, well, newer than the old ones, and the fact that they ran faster is completely unrelated to the fact that the cable running into the box that provides power carries DC rather than AC. But what do you expect from a propaganda-laden puff piece released by a university PR department, scientific accuracy and truth?

With DC power, the electrons get to run laps, and every time they get to your computer, they can do a little bit of work, spreading it out among all the electrons. With AC power you got those electron thingies racing back and forth and back and forth, but never getting anywhere. Only the few electrons near the computer actually do any helpful work, and they get worn down really quickly, so they stop working as efficiently, and the CPU slows down, and it's just generally bad.

It's not clear to me how this is any better than spec'ing normal machines with a decent power supply. Anything rated "80 Plus Gold" will convert to DC with roughly 87-90% efficiency depending on the power draw.

Is this just a matter of replacing old inefficient noisy machines with newer efficient (and thus quieter) ones?

My W510 draws only around 50 watts, but its power brick is rated for 120 watts. My desktop is likewise over-provisioned as well to account for power consumption spikes.

However, by combining the AC-to-DC conversion of 100 computers, the over-provisioning factor can be much, much less, because it's very unlikely that the computers enter power spikes at the exact same time.

My server at home is begging for a DC conversion, for example, as are my switches and other gear.
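The over-provisioning argument above can be illustrated with a toy simulation - machines that spike independently rarely spike together, so the shared supply's required headroom sits far below the sum of individual peaks (the idle/peak wattages and spike probability are made-up numbers):

```python
import random

# N machines idle at 50 W but spike to 120 W independently 10% of the
# time. Compare the worst theoretical case against observed maxima.

random.seed(42)

def total_draw(n_machines, idle=50.0, peak=120.0, p_spike=0.10):
    return sum(peak if random.random() < p_spike else idle
               for _ in range(n_machines))

samples = [total_draw(100) for _ in range(10_000)]
print(f"worst case if all 100 spike together: {100 * 120:.0f} W")
print(f"observed maximum over 10k draws:      {max(samples):.0f} W")
```

Under these assumptions the observed peak stays far below the 12 kW worst case, so a shared supply sized for the statistical peak (plus margin) beats giving every box its own 120 W brick.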

The idea is great, but like so many things, we are entrenched in our AC power systems. So until we come up with a common implementation of the "AC to DC" power supply for everything and everything comes with an option to plug in "DC" I will have to wait.

There are DC power systems for servers and such today, but such things are pretty much special order and expensive. I hope that it catches on at a level which

Third parties would spring up to provide cables to connect the router you already have to this DC outlet in place of the wall-wart.

I sort of did something like this once; for some reason or another I had an ethernet switch without a working transformer. I simply chopped off the DC power plug, soldered it onto the 12v pins on a 4-pin molex, and plugged it into my PC power supply. As long as the PC was up, so was the network! I might even have run a small LAN party this way for a weekend.

So they went AC-DC to try and get back in (the) black? My concern is that the initial conversion would cost a touch too much, and it ain't no fun waiting for the energy savings to cover the investment - the down payment blues. Still, in my experience the power supply is often the point of failure that finally kills the whole computer, so goodbye & good riddance to bad luck.

DC datacenters have been around longer than AC ones. All major telcos globally use DC distribution for their networking and communications equipment. They always have, and likely always will. The newer datacenter companies are just having to learn for themselves why it makes sense.

...running a large network of devices on DC rather than AC is both more secure and more energy efficient. AC electric power from the grid is converted to DC and runs 50 specially adapted computers in the University Library.

Computers already run on DC -- the only question is where the conversion takes place. The downside to having a single converter (rectifier) is that you have a single point of failure, but obviously you can place it away from the actual computers to reduce noise and such.

Just a couple of days ago I was talking with a friend about how my company already does this with some of its servers.

I know that we use AC for transmission because it loses less power over long distances than DC, but is there any reason why we don't have DC converters installed in the electrical panels of homes other than the fact that many appliances currently require AC? More and more appliances seem to be using DC lately, requiring wall warts. If we could convince more manufacturers to produce DC a

We all have houses full of DC powered devices but no DC power. How many wasted AC/DC converters is that per year? Computer power supplies, etc. wasting 30-50%+ of the input AC due to inefficiencies and poor sizing for the task. Office buildings full of DC powered computers and gear and yet we still just keep on going with pure AC power.

On my Atom based fileserver, with a variable speed power supply fan the only noise I could hear was the tiny fan that cools the chipset (the CPU is fanless but the chipset has its own dedicated fan). It got so annoying that I unplugged it and set up a quiet 80mm fan in the case to blow on the chipset heatsink.

There are plenty of silent PC options that don't require running special wiring just for DC power. In fact, some thin clients will power themselves over PoE (which I guess is technically a form of DC

AC is much easier to transport. DC resistance in the AC world is impedance. As impedance is complex, if you choose the correct frequency and voltage, you can move power very, very far with extremely little loss. You can't do that with DC.

Also, when houses were first getting power, all they had were lights and motors, all of which run fine on AC.

When electronics started to come out, a house only had a handful: TV, radio, that was it.

Now, I would say that a significant part of consumption is DC at a home, but there is no standard in the house for a parallel DC infrastructure. Would you do -48V like the phone companies? 12V? How would the in wall wiring work?

IANAP, but it's quite simple. Every time alternating current changes direction, some energy is lost to overcome the capacitance in the wires.

But when it comes to cost, the equipment to change AC voltage is much simpler/cheaper than the equipment for DC. So it's a trade off of the money saved through increased efficiency and additional equipment costs.

It's funny that you'd bring that up. Yes, actually Physicists and noted researchers are observing subtle but significant changes in the way that electricity flows through conductive materials. Electrons used to be so-called "longer" a few decades ago (end to end, mind you), so you'd actually get a fewer number of them traveling through the wire to you per second.

Now that the magnetic poles of the earth are shifting, and we're (slowly) approaching the geomagnetic "pole-arity" reversal (a little physics humor

Physicists and noted researchers are observing subtle but significant changes in the way that electricity flows through conductive materials. Electrons used to be so-called "longer" a few decades ago (end to end, mind you), so you'd actually get a fewer number of them traveling through the wire to you per second.

Ah! but DO the electrons flow through the wire or just shuffle back and forth as the holes flow through the wire? I've seen these very same people get into fist fights over that question too.

I believe the telecom industry has known this for years. Since telecom equipment in COs can be run on both DC based and AC based power, consumption levels can be monitored between the AC and DC based offices. AC/DC - Rock on.

Not exactly - it's because of the requirement that equipment operate during power failures. Hence the large banks of lead-acid batteries in older switching centers.