Smarter power when the supply changes

In furtherance of my prior ideas on smart power, I want to add another -- the concept of backup power.

As I wrote before, I want power plugs and jacks to be smart, so they can negotiate how much power the device needs and how much the supply can provide, and then deliver it.

However, sometimes what the supply can provide changes. The most obvious example is a grid power failure. It would not be hard, in that event, to have a smaller, low-capacity backup system in place, possibly just batteries. When the main power fails, the backup system would send messages to indicate just how much power it can deliver. Heavy-power devices would just shut off, but might ask for a few milliwatts to maintain internal state. (I.e., your microwave oven clock would not need an internal battery to retain the time of day and its memory.) Lower-power devices might be given their full power, or they might even offer a set of power modes they could switch to, and the main supply could decide how much power to give each device.
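As a sketch of how such a backup allocation might work (the device names, the mode lists, and the greedy priority-order policy are all my invention, not a spec):

```python
# Sketch of backup-power allocation: each smart device advertises the
# discrete power modes it supports (in watts, highest first), and the
# backup supply walks the devices in priority order, granting each its
# best mode that still fits the remaining budget.

def allocate_backup(devices, budget_w):
    """devices: list of (name, [modes in watts, descending]), in priority order.

    Returns {name: watts granted}; a device that fits no mode gets 0 (shut off).
    """
    allocation = {}
    remaining = budget_w
    for name, modes in devices:
        chosen = 0
        for mode in modes:          # modes are sorted high to low
            if mode <= remaining:
                chosen = mode
                break
        allocation[name] = chosen
        remaining -= chosen
    return allocation

devices = [
    ("fridge",    [150, 0.05]),     # full cycle, or milliwatts to keep state
    ("router",    [10]),
    ("microwave", [1100, 0.005]),   # clock-only mode in a power failure
]
print(allocate_backup(devices, 200))
```

With a 200 W backup budget, the fridge and router run normally while the microwave drops to its clock-only milliwatt mode -- exactly the "ask for a few milliwatts to maintain internal state" behavior described above.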

Of course, devices not speaking this protocol would just shut off. But things like emergency lights need not be their own system -- though there are reasons for still having that in a number of cases, since one emergency might involve the power system being destroyed. However, battery backup units could easily be distributed around a building.

In effect, one could have a master UPS, for example, that keeps your clocks, small DC devices and even computers running in a power failure, but shuts down ovens and incandescent bulbs and the like, or puts devices into power-saving modes.

We could go much further than this, and consider a real-time power availability negotiation, when we have a power supply or a wire with a current limit. For example, a device might normally draw 100 mW, but want to burst to 5 W on occasion. If it has absolutely zero control over the bursts, we may have to give it a full 5 W power supply at all times. However, it might be able to control the burst, and ask the power source if it can please have 5 W. The source could then accept that and provide the power, or perhaps indicate the power may be available later. The source might even ask other devices if they could briefly reduce their own power usage to provide capacity to the bursting device.
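A minimal sketch of that request/grant negotiation (the class and method names are invented; a real protocol would also carry timing and urgency information):

```python
# A power source that tracks committed wattage and grants a burst request
# only if headroom remains -- the core of the negotiation described above.

class PowerSource:
    def __init__(self, capacity_w):
        self.capacity_w = capacity_w
        self.committed_w = 0.0

    def request(self, watts):
        """Grant the request if it fits under capacity; otherwise deny."""
        if self.committed_w + watts <= self.capacity_w:
            self.committed_w += watts
            return True
        return False

    def release(self, watts):
        """Device finished its burst; return the headroom."""
        self.committed_w -= watts

src = PowerSource(capacity_w=6.0)
assert src.request(0.1)       # baseline 100 mW draw: granted
assert src.request(4.9)       # burst to 5 W total: granted
assert not src.request(2.0)   # a second device's burst: denied for now
src.release(4.9)              # burst over
assert src.request(2.0)       # now it fits
```

A denial here need not be final -- as the post suggests, the source could instead answer "available later," or go ask other devices to shed load first.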

For example, a computer that only uses a lot of power when it's in heavy CPU utilization might well be convinced to briefly pause a high-intensity non-interactive task to free up power for something else. In return, it could ask for more power when it needs it. A clothes dryer, oven, furnace or other such item could readily take short pauses in its high-power-drain activities -- anything that uses a cycle rather than 100% on can do this.

This is also useful for items with motors. A classic problem in electrical design is that things like motors and incandescent lightbulbs draw a real spike of high current when they first turn on. This requires fuses and circuit breakers to be "slow blow" because the current is often briefly more than the circuit should sustain. Smart devices could arrange to "load balance" their peaks. You would know that the air conditioner compressor would simply never start at the same time as the fridge or a light bulb, resulting in safer circuits even though they have lower ratings. Not that overprovisioning for safety is necessarily a bad thing.
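One trivially simple way to "load balance" those startup peaks is to stagger the start times (the two-second gap here is an arbitrary assumed figure; real devices would negotiate it):

```python
# Stagger device start times so no two high-inrush loads (motors,
# incandescent bulbs) energize at the same instant. Illustrative only.

def stagger_starts(requests, min_gap_s=2.0):
    """Assign each requesting device a start time at least min_gap_s
    after the previous one.

    requests: device names in the order they asked to start.
    Returns {name: start_time_in_seconds}.
    """
    schedule = {}
    t = 0.0
    for name in requests:
        schedule[name] = t
        t += min_gap_s
    return schedule

print(stagger_starts(["ac_compressor", "fridge", "hall_light"]))
```

The compressor starts immediately, the fridge two seconds later, the light two seconds after that -- so the circuit never sees more than one inrush surge at a time.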

This also would be useful in alternative energy, where the amount of power available changes during the day.

Of course, this also applies to when the price of power changes during the day, which is one application we already see in the world. Many power buyers have time-based pricing of their power, and have timers to move when they use the power. In many cases whole companies agree their power can be cut off during brown-outs in order to get a cheaper price when it's on. With smart power and real-time management, this could happen on a device by device basis.

These ideas also make sense in Power over Ethernet (which is rapidly dropping in price), one of the first-generation smart power technologies. There the amount of power you can draw over the thin wires is very low, and management like this can make sense.

Comments

Smarter power is an interesting concept for small scale loads (laptops, cell phones, and all of the other dozens of electronic doo-dads that people are carting around these days). But I don't see it scaling well at all for loads in the microwave oven/air conditioner/clothes dryer size range. In addition to intelligence (computation) which is getting cheaper with time, a smart power outlet needs sensing and control elements.

Sensing of large currents in high voltage circuits doesn't benefit much from Moore's law. At present, a sensor that can measure 0 to 15 amps at 120 volts AC (a standard AC outlet) costs significantly more than the outlet itself. Although every few years something new comes out that lowers costs by a notch or two, it will be a while before the cost becomes "negligible".

Worse yet, _controlling_ large currents in high voltage circuits is even more resistant to Moore's law. Yes, MOSFETs have been steadily improving, but the cost for a switch that can control a couple of kilowatts drops by perhaps half every decade at best. It will be at least several more decades before that cost gets low enough to even consider a switch in every power outlet.

(Of course, like any technology, there will always be niche applications where the benefits outweigh the costs. I'm talking about mainstream uses.)

It's not just the cost of the semiconductor switch either. Assume you have a 20A 200V MOSFET, and it's carrying 10A. If that power is traveling through 50 feet of wire from a distribution panel, that is at least ten microhenries of inductance. MOSFETs keep losses low by switching quickly from on to off or vice versa. You can't quickly switch 10A in a multi-microhenry inductor without causing voltage transients in the kilovolt range - poof goes your 200V FET.
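To make that arithmetic concrete (the inductance and current come from the comment above; the 100 ns switching time is an assumed figure for a fast MOSFET turn-off), the induced voltage is V = L × dI/dt:

```python
# The kilovolt-transient arithmetic worked out: switching current through
# wiring inductance induces a voltage spike V = L * dI/dt.

L_wire = 10e-6        # ~10 microhenries for 50 ft of branch-circuit wire
delta_i = 10.0        # amps being switched off
switch_time = 100e-9  # assumed ~100 ns MOSFET turn-off

v_spike = L_wire * delta_i / switch_time
print(f"{v_spike:.0f} V")  # roughly 1000 V -- far past a 200 V FET's rating
```

Slowing the switch down reduces the spike but increases switching losses in the FET, which is exactly the tension the comment describes.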

When you start talking about any serious amount of power, you also have to consider safety. Almost every semiconductor power switch I'm aware of (20 years in industrial power electronics and motor control) fails shorted, not open. So you'll still probably need fuses or circuit breakers as a backup system.

The "benefit" of reducing inrush for motors and light bulbs is also of value only in niche applications. The ancient technology of spinning generators, transformers, and copper wires is remarkably tolerant of momentary overloads. An overload of 10 times rated current for a few cycles is nothing. 150% for a minute is also well within the capability of the system. It is only when you introduce electronics into the power handling path that you start seeing failures during momentary overloads.

I'm beginning to ramble, but you get my drift. "Electronics" and "power" are two very different fields, and it is very rare that you can take something that applies to electronics, and scale it up to residential or worse industrial power levels without completely unexpected side effects.

I won't disagree with much of what you said, but you're incorrect about the cost of sensing large AC current. That's very cheap and has been for a long time. You simply place a loop around the conductor and that induces a smaller current in the loop proportional to the current in the main conductor. These sensors cost pennies.

Controlling large currents is another matter, but it is now quite common for modern stoves, fridges, dryers etc. to have electronic soft-touch controls.

I think the main application in large-current devices would be when running on reduced power (backup batteries) or alternative power (solar) where you must be much more careful about load.

> the cost of sensing large AC current. That's very cheap and
> has been for a long time.

Cheap relative to the cost of an ordinary duplex outlet? $1.69 at Lowes. If I wanted to nitpick, I'd say since the outlet is duplex, TWO sensors need to be cheap compared to $1.69.

> You simply place a loop around the conductor and that induces
> a smaller current in the loop proportional to the current in
> the main conductor.

It sounds like you are describing a current transformer, but that's not how they work. You place a closed magnetic core (ferrite or iron-powder toroid, or steel laminations) around the wire, and then wrap many turns of wire around the core. If you have 500 turns of wire around the core, then the current in that winding is 1/500 of the main current.
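The turns-ratio relationship is easy to state in code (the 50-ohm burden resistor here is an assumed example value; real CTs are specified with a matched burden):

```python
# Current-transformer sensing as described above: with N secondary turns,
# the sensed current is the line current divided by N, and a small burden
# resistor converts it to a measurable voltage.

def ct_secondary_current(primary_a, turns):
    """Ideal current-transformer relation: I_secondary = I_primary / N."""
    return primary_a / turns

i_sense = ct_secondary_current(10.0, 500)  # 10 A load, 500:1 CT
burden_ohms = 50.0                          # assumed burden resistor
v_out = i_sense * burden_ohms
print(i_sense, "A ->", v_out, "V")          # 20 mA -> 1.0 V
```

That 1 V signal is what the A-to-D converter in a monitoring system would actually read.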

If you are describing something else that exists today, please let me know (point me to a supplier's website or something). I use current sensors of all sizes, and would like to know if I'm missing something.

Iron cores and wire windings don't benefit from semiconductor scaling, so even with automation there will be some minimum cost to make such a transformer. An alternative technology would be a semiconductor Hall sensor near a conductor. If the geometry of the conductor and sensor are known and fixed, you can scale the Hall output to get current. As time goes by, Hall sensors and the associated circuitry _will_ get cheaper.

> I think the main application in large-current devices would be
> when running on reduced power (backup batteries) or alternative
> power (solar) where you must be much more careful about load.

Granted. I would claim that those are the "niche" applications where I said such technology could offer benefits that outweigh the price penalty. I just don't see it happening on a wide scale until the total cost is less than $1.00 per outlet, and I don't see that happening for quite a long time.

I must admit that I'm biased, because I make my living figuring out what can be designed, built, and sold over a time-frame of a couple of years, using today's technology. You are looking farther out, and considering things that might need breakthroughs to be practical. Different world-views.

For two reasons. First, I was expecting the smart devices that could engage in a power management protocol like this would be aware of their own power consumption, and be making requests for power from the power distribution system.

Detection of actual current consumed (in case they report incorrect numbers or are dumb devices) is more appropriately done on a per-circuit basis, and inside the power distribution panel where the circuit breakers are. In fact, what would make sense is a smart circuit breaker which actually measures the current flowing through it and reports it somehow.

One guy I know has built a complete power control system for his house by putting inductive rings (sub-$1) around all the conductors coming out of the circuit breakers. These are connected to a multichannel A to D box so his computer can read the current at any time through any circuit. He learned a lot about all the power consumption in his house and is productizing it. (You can also sense current using the Hall effect, but I think that's more expensive than using a core when you know you are dealing with 60hz sine wave AC.)

Today a typical kitchen uses several circuits, with dedicated ones for the oven, microwave, appliances and often the fridge. One could certainly imagine a system using fewer circuits because the fridge simply avoids turning on the compressor while the microwave is going, or the coils in the oven only heat while the fridge compressor doesn't run etc. This is not about measuring current, actually. If the devices make a mistake and draw too much current, the breaker trips, as always.

It sounds like you aren't really talking about "making requests from the power system", since by definition a request can be denied. Denying requests for power on a per-outlet basis requires power switching at the outlet, which was my initial problem with the concept.

The much less expensive system you are now suggesting has only per-circuit sensing, and only overload protection. It is incapable of physically "denying" a request by turning off the power to the requesting device. (Of course, it can trip on overload, shutting down the entire circuit, but that's not the goal here.)

What it can do, is deny a request by "telling" the requestor that the power is unavailable, and trusting the requestor to act accordingly. That works very well, for smart devices which probably already have power switching internally, and therefore have zero added cost for the power switch.

Not so well for dumb devices. If you plug a dumb toaster or hair dryer into the outlet, the system may be able to turn off smart devices after it sees the toaster load appear, but it can't keep the user from turning on the toaster in the first place.

Eventually, we could migrate toward more smart devices. But with toasters and such having average lifetimes of 5 years, 10 years, or even longer, "dumb" legacy devices will need to be supported somehow.

In addition, some devices (like hair dryers or toasters) would have a consumer backlash if you said "sorry, you can't use this now". The system would have to prioritize loads so that non-critical ones could be turned off while more "critical" ones run. (A toaster is hardly critical in the usual sense, but as a "foreground" application, it needs to take priority over "background" operations like the hot water heater or refrigerator compressor.)

This is really more of a peak shaving application. As such, you don't even really need to do it on a per-circuit basis, but on a total household demand basis.

Protecting the copper wires in an individual circuit from "starting surges" or other very short duration overloads is a non-issue - they can take it.

Protecting the wires from sustained overloads could have some benefit, but not much. Most people expect to be able to do several different things at once in their kitchens. Having the microwave turn off while they make toast, or the toaster refuse to run while the coffee is brewing, just isn't going to fly with the average consumer. The wires simply need to be sized to handle the kind of loads that the user is going to demand.

One of the reasons that electrical codes require multiple circuits in kitchens is exactly this consumer expectation. They know that people are going to run multiple loads at once, and they know that if nuisance trips happen, people are going to do unsafe things to avoid them. (The traditional penny in the fuse holder, or in more modern systems replacing a breaker with a higher rating without upgrading the wires, or running an extension cord thru the doorway from another room.)

Where this scheme can help is at the whole-house level. It is very reasonable to turn off the water heater, electric dryer, or air conditioning compressor while the toast is toasting. That is where I think your system has real merit. Unfortunately, except for people on backup, solar, or other off-grid power, there is very little incentive to do any form of peak shaving today. Before smarter power distribution inside the house happens on a large scale, we need smarter power measurement at the meter, and utility rate structures that provide a financial incentive for users to limit their peak load.
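A sketch of that whole-house peak shaving (the load names, wattages, and shedding order are invented for illustration; a real system would rank loads by user-set priority):

```python
# Whole-house peak shaving: when total demand exceeds a target, pause
# deferrable "background" loads in a fixed order until demand fits.

DEFERRABLE = ["water_heater", "dryer", "ac_compressor"]  # shed in this order

def shed_loads(loads, target_w):
    """loads: {name: watts currently drawn}. Returns the set of names to pause."""
    shed = set()
    total = sum(loads.values())
    for name in DEFERRABLE:
        if total <= target_w:
            break
        if name in loads:
            total -= loads[name]
            shed.add(name)
    return shed

loads = {"toaster": 900, "water_heater": 4500, "lights": 300, "dryer": 5000}
print(shed_loads(loads, 6000))  # pauses the water heater and dryer
```

The "foreground" toaster and lights keep running untouched; the water heater and dryer simply resume once the peak passes, which is the invisible behavior the scheme aims for.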

While at first I was thinking about this for low-power devices with limited current circuits (such as devices powered by 2.5 watt USB or 8 watt PoE) I think it has application in large power too.

What I propose is indeed for smart devices that ask before they draw extra current, and which also communicate the urgency of their ask, and can deal with messages that say they can't have the current, or that they will have it at a certain time.

The goal, however, is for it to be invisible to the user. Many devices run on a cycle, including refrigerators, microwaves at less than full power, and most heating devices including stoves, ovens, dryers and the like. It's possible for them to coordinate their cycles according to power capacity, which could allow you to run a modern kitchen -- transparently -- with far less peak capacity or far thinner wire. Some devices, while not normally designed to cycle are capable of temporarily turning off or down for short periods. This is not going to be always invisible but it would be mostly invisible.
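A toy schedule shows the idea of coordinating duty cycles against a reduced capacity (all wattages, slot counts, and the greedy placement policy are invented for illustration):

```python
# Interleave the "on" periods of duty-cycled appliances so the combined
# draw never exceeds a reduced peak capacity in any time slot.

def interleave(devices, slots, capacity_w):
    """Greedy slot assignment.

    devices: list of (name, watts, on_slots_needed).
    Returns the per-slot total load, or raises if the cycles can't fit.
    """
    load = [0.0] * slots
    for name, watts, needed in devices:
        placed = 0
        for i in range(slots):
            if placed == needed:
                break
            if load[i] + watts <= capacity_w:
                load[i] += watts
                placed += 1
        if placed < needed:
            raise ValueError(f"cannot schedule {name}")
    return load

# Oven and fridge each need half the slots. Interleaved, the peak is
# 2000 W instead of the 2800 W they'd need if both ran simultaneously.
print(interleave([("oven", 2000, 3), ("fridge", 800, 3)], 6, 2000))
```

Both appliances still get their full duty cycle; the user sees nothing, but the wiring only ever has to carry one of them at a time.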

While I did not have major plans for the stupid devices, I believe they can be handled precisely because wires can handle short surges. So plug in a toaster and possibly the fridge or oven takes a pause in its cycle and uses more power later when the toast is done. Plug in something that sucks all the power for a long time and you can no longer be invisible, you'll have to signal an alarm, or possibly have the power to cut off the dumb, high-use device. But ideally that would be very rare.

Now, are we in such dire need of wiring the whole kitchen with just 8kw instead of 20kw? No -- unless we're running off alternative power, or in a power outage or brown-out. This could make a lot of sense inside an RV, for example, or any off-grid house. It could also make a lot of sense, as I noted, if coordinated with the master power company.

Power company costs are highly associated with peak capacity. There are costs per kwh, sure, but a lot of the cost is in peak kw. A power grid able to -- largely invisibly -- control loads, particularly air conditioning loads, would gain a lot. Of course, the more homes on the grid, the more there is the inherent balancing of randomness.

Back at the low power devices using very thin wires (like PoE) the need is more direct. You could run more devices on the same wires with this approach.