Lower operating voltages give the biggest bang for the buck when longer operating life in battery-powered devices is your goal. But those lower voltages also present power-supply designers with new challenges as they strive to wring every last watt-hour from the battery.

Two major factors contribute to the power-conversion circuit's ability to extend battery life. The obvious metric to check is the basic conversion efficiency when the power supply is operating. Not so obvious may be the voltage at which the power supply stops operating.

Although every engineer has an intuitive grasp of how efficient energy transfer affects operating life and heat buildup, extracting this information from some semiconductor data sheets requires far too much guesswork. Efficiency data is most often presented at fixed input voltages, but designers of battery-operated equipment never have the luxury of assuming a relatively fixed input voltage. As the battery discharges, the input voltage declines substantially, altering the efficiency of the conversion circuit. What is needed is an average operating-efficiency number.

When using a low-dropout regulator (LDO), the conversion efficiency is approximately VOUT/VIN, with the power dissipated in the pass element proportional to VIN - VOUT, so estimating average efficiency over the life of the battery is relatively simple. Inductor-based switching regulators have efficiency that varies considerably as the input voltage changes, making it difficult to produce an average operating-efficiency number.
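
As a rough illustration, the sketch below averages the ideal LDO efficiency, VOUT/VIN, over a hypothetical single-cell Li-ion discharge curve; the voltage samples and the 150-mV dropout figure are assumptions for illustration, not measured data.

```python
# Sketch: average LDO efficiency over a battery discharge, assuming the
# ideal linear-regulator efficiency eta = VOUT / VIN (power lost in the
# pass element is proportional to VIN - VOUT). Values are illustrative.

V_OUT = 3.3       # regulated output rail (V)
V_DROPOUT = 0.15  # assumed LDO dropout voltage (V)

# Hypothetical single-cell Li-ion terminal voltages sampled at equal
# capacity increments, from 4.2 V (full) down toward 3.0 V (empty).
discharge_curve = [4.2, 4.1, 4.0, 3.9, 3.8, 3.7, 3.6, 3.5, 3.4, 3.2, 3.0]

# The LDO only operates while VIN >= VOUT + dropout (3.45 V here).
usable = [v for v in discharge_curve if v >= V_OUT + V_DROPOUT]
avg_eff = sum(V_OUT / v for v in usable) / len(usable)

print(f"usable points: {len(usable)}/{len(discharge_curve)}")
print(f"average efficiency while operating: {avg_eff:.1%}")
```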

All too often, today's devices force users to discard, or at least change, batteries "before their time." Regardless of the amount of energy remaining in the battery, when the power supply stops working, so do you. The critical factor here is dropout voltage. For instance, a lithium-ion (Li-ion) battery can be safely operated between about 4.2 V and 2.5 V. If an LDO needed to produce a 3.3-V output, it might stop working at 3.45 V, assuming a 150-mV dropout voltage. In the case of a coke-anode Li-ion battery, over 40 percent of the energy is left unused. A three-cell alkaline pack still contains 33 percent of its energy under the same conditions.
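
One hedged way to estimate the stranded energy: given a discharge table of cell voltage versus delivered capacity, count the capacity that falls below the LDO's minimum input voltage. The table below is an assumed, roughly graphite-like curve, not a datasheet.

```python
# Sketch: fraction of battery capacity stranded below an LDO's minimum
# input voltage (VOUT + dropout). The voltage/capacity pairs are an
# assumed, illustrative Li-ion discharge table, not measured data.

V_MIN_IN = 3.3 + 0.15  # 3.3-V rail plus 150-mV dropout: 3.45-V cutoff

# (cell voltage in V, cumulative capacity delivered in mAh), hypothetical
discharge_table = [
    (4.20,   0), (4.05,  60), (3.90, 150), (3.75, 260),
    (3.60, 380), (3.45, 470), (3.30, 540), (3.00, 600),
]

total_mah = discharge_table[-1][1]
# Capacity delivered before the cell sags to the cutoff voltage:
usable_mah = max(c for v, c in discharge_table if v >= V_MIN_IN)

print(f"stranded capacity: {1 - usable_mah / total_mah:.0%} of {total_mah} mAh")
```

A steeper, coke-anode-like curve would push the stranded fraction far higher, consistent with the 40-percent figure above.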

Another factor driving the engineer's focus on dropout voltage is the effect highly variable or pulsed loads have on the output voltage of the battery. Depending on the equivalent series resistance (ESR) of the battery, output voltage can drop substantially as the applied load changes.

Nowhere is a pulsed load more prevalent than in cellular telephones. The increasingly popular time-division multiple access (TDMA) protocols used in GSM (Europe), PCS (United States) and PDC (Japan) all require rapid pulsing of the cell phone's power amplifier (PA). As the PA draws 1,000 to 1,500 mA, the input voltage to the power supplies for the digital signal processor, frequency synthesizers, mixers, low-noise amplifiers and audio and display circuitry may dip 0.5 V or more with Li-ion batteries. Although Li-ion chemistry provides the highest available energy density by both weight and volume, its ESR can be much higher than that of NiCd or NiMH batteries, and it increases as the cell discharges.
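
The sag itself is just Ohm's law across the battery's internal resistance, as the minimal sketch below shows; the per-chemistry ESR figures are assumptions for illustration, not vendor specifications.

```python
# Sketch: instantaneous battery-terminal sag when the PA pulses on,
# modeled as V_loaded = V_open_circuit - I_pulse * ESR. The ESR values
# below are assumed for illustration, not taken from any datasheet.

def terminal_voltage(v_open_circuit, i_pulse_a, esr_ohms):
    """Terminal voltage under a pulsed load, simple series-ESR model."""
    return v_open_circuit - i_pulse_a * esr_ohms

I_PA = 1.2  # PA pulse current (A), mid-range of the 1.0 to 1.5 A figure

for chemistry, esr in [("Li-ion", 0.40), ("NiMH", 0.10), ("NiCd", 0.05)]:
    v = terminal_voltage(3.6, I_PA, esr)   # 3.6-V nominal open circuit
    print(f"{chemistry:6s} ESR={esr:.2f} ohm -> dips to {v:.2f} V "
          f"({3.6 - v:.2f} V sag)")
```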

A power supply's ability to operate with a very low minimum input voltage becomes critical for extending the phone's talk time. When the input voltage (3.6-V nominal from Li-ion) is close to the desired output voltage, using an LDO to generate those voltages is a fairly high-efficiency proposition. For a 3.3-V rail, an LDO achieves about 88 percent average efficiency over the portion of the battery discharge cycle it can use. As output-voltage requirements drop, efficiency declines proportionally.

But do we care? In fact, empirical evidence suggests we don't. Users today enjoy over 200 hours of standby time on cellular telephones, and during the vast majority of that time the PA is inactive. Market data suggests that users will not pay more for more standby time; what they want, and will pay for, is more talk time. The PA dominates talk-time power consumption by an 8:1 to 10:1 ratio. If all the other circuitry in the phone is consuming only 10 percent of the power, increasing the conversion efficiency of that portion does not noticeably affect talk time.

So what is the limiting factor on talk time? Today, many transmit power amplifiers, which are typically connected directly to the battery, shut down around 3 V. However, since an LDO cannot boost, any supply rail above about 2.9 V drops out even earlier. That may be acceptable, except that these steady-state assumptions do not account for the load profile of TDMA phones. As the power amplifier pulses on, drawing an amp or more, the ESR of the battery causes a transient voltage drop that can knock an LDO out of regulation long before it would fail under a lower, steady load.
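
Putting dropout and ESR together, a rail stays regulated through a PA burst only if the sagged battery voltage still clears VOUT plus the dropout voltage. A sketch of that margin check, again with assumed ESR, burst-current and dropout values:

```python
# Sketch: which LDO rails survive a PA burst at a given state of charge.
# A rail stays in regulation only while the sagged battery voltage
# exceeds VOUT + dropout. All component values here are assumptions.

ESR = 0.40        # assumed Li-ion ESR (ohms)
I_PA = 1.2        # assumed PA burst current (A)
V_DROPOUT = 0.15  # assumed LDO dropout (V)

for v_batt in (4.2, 3.8, 3.6, 3.4):
    v_sagged = v_batt - I_PA * ESR
    for v_rail in (3.3, 2.9, 2.5, 1.8):
        ok = v_sagged >= v_rail + V_DROPOUT
        print(f"battery {v_batt} V -> {v_sagged:.2f} V during burst: "
              f"{v_rail} V rail {'OK' if ok else 'drops out'}")
```

Under these assumptions, even a nominally healthy 3.6-V battery sags below the 3.45-V minimum of a 3.3-V LDO rail during a burst, which is the steady-state blind spot the text describes.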

As it rises, the ESR leaves the power supply facing very low input voltages even though the battery still holds a large amount of energy. For example, one commercially available 600-mAh graphite-anode battery has an 830-mΩ ESR at 3.5 V, dropping the output voltage to 3 V under a 1,200-mA load. At that 3.5-V point, 12 percent of the battery capacity remains unused. Other Li-ion chemistries, such as coke-anode materials, can leave a much higher percentage of capacity unused.

We have found that from a power-supply perspective, a low minimum input voltage (dropout voltage) has the biggest effect on talk time, since it utilizes the battery more completely. But as output voltages decline, the conversion efficiency of an LDO-based power-supply architecture becomes a much bigger factor. A 1.8-V rail yields an awful 49 percent average efficiency using an LDO alone. Not only is low dropout important, but improving conversion efficiency now has a 5 to 10 percent effect on talk time.
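
A crude model shows where that 5 to 10 percent comes from: the battery feeds the PA directly and the rest of the phone through the converter, so talk time scales inversely with total battery draw. The 90/10 split below follows the text; the efficiency values are illustrative.

```python
# Sketch: sensitivity of talk time to converter efficiency when the PA
# dominates. Battery draw = PA draw + (other-circuit draw scaled by the
# efficiency change); talk time ~ 1 / total draw. The 90/10 split is the
# article's figure; the efficiency values are illustrative assumptions.

P_PA = 0.90      # PA share of battery power (connected directly)
P_OTHER = 0.10   # battery power into the converter at baseline efficiency

def talk_time_gain(eta_old, eta_new):
    """Relative talk-time change from improving converter efficiency."""
    old_draw = P_PA + P_OTHER
    new_draw = P_PA + P_OTHER * (eta_old / eta_new)
    return old_draw / new_draw - 1.0

# 3.3-V rail: already ~88% efficient, so little to gain.
print(f"88% -> 95%: {talk_time_gain(0.88, 0.95):+.1%} talk time")
# 1.8-V rail: 49% with an LDO alone; a more efficient converter helps.
print(f"49% -> 85%: {talk_time_gain(0.49, 0.85):+.1%} talk time")
```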

One method cellular-telephone power-supply designers use to combat this problem is to front the LDOs with either inductive boost or inductive buck regulators. The full battery capacity can be used by boosting a Li-ion battery above its fully charged voltage (about 4.2 V) and then using LDOs, but this results in poor conversion efficiency, especially at lower output voltages. Buck-only solutions suffer from a higher dropout voltage, especially when used to generate voltages between 3.3 and 3 V. So, although good efficiency can be obtained during operation, a significant portion of the battery's energy remains untapped, shortening the potential operating life.

Using step-up/step-down, or buck-boost, converters solves both problems: an intermediate output voltage can be picked that allows efficient conversion with trailing LDOs, yet utilizes the full battery capacity because of the very low minimum input voltage. For instance, choosing a 3-V output at point "A," between the output of the buck-boost regulator and the inputs of the LDOs, will support the 2.5-V requirements of tomorrow's systems. It will operate at relatively high efficiency and let users get maximum talk time by fully utilizing the battery.
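
A quick feasibility check of a candidate intermediate rail might look like the sketch below; the 150-mV dropout and the downstream rail set are assumptions for illustration.

```python
# Sketch: checking a candidate intermediate rail (point "A" in the text)
# between a buck-boost front end and trailing LDOs. Each downstream rail
# needs VOUT + dropout of headroom, while a lower intermediate voltage
# raises LDO efficiency. Dropout and rail values are assumptions.

V_A = 3.0          # candidate intermediate rail from the buck-boost stage
V_DROPOUT = 0.15   # assumed LDO dropout (V)
rails = [2.5, 1.8] # assumed downstream rails the LDOs must supply

for v in rails:
    headroom = V_A - (v + V_DROPOUT)
    eta = v / V_A  # ideal LDO efficiency from the intermediate rail
    status = "OK" if headroom >= 0 else "INSUFFICIENT"
    print(f"{v} V rail: headroom {headroom:+.2f} V ({status}), "
          f"LDO efficiency ~{eta:.0%}")
```

The trade-off is visible in the numbers: lowering point "A" improves LDO efficiency on every downstream rail but eats into dropout headroom.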

Only now are switched-capacitor architectures becoming available that dynamically move between buck and boost modes, depending on the input voltage and output load. Their average operating efficiency compares quite favorably with that of single-ended primary-inductance converters (SEPICs). By using multiple gain modes to optimize efficiency, one buck-boost switched-capacitor converter achieves 82 to 84 percent average efficiency over the discharge profile of both coke- and graphite-anode Li-ion batteries when producing a 3-V output. Since its input-voltage range encompasses the entire operating range of the battery, the full battery capacity can be delivered to the loads.
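
The gain-mode idea can be sketched as follows, under the idealization that a switched-capacitor stage with ratio G delivers at best VOUT/(G * VIN) efficiency, so the controller picks the smallest gain that still reaches the target. The gain set {2/3, 1, 3/2} is an assumption for illustration, not the actual part's modes.

```python
# Sketch: gain-mode selection in an idealized switched-capacitor
# buck-boost converter. An ideal charge pump with ratio G has at best
# eta = VOUT / (G * VIN), so pick the smallest G with G * VIN covering
# the target plus regulation headroom. The gain set is an assumption.

from fractions import Fraction

GAINS = [Fraction(2, 3), Fraction(1), Fraction(3, 2)]  # assumed modes
V_OUT = 3.0      # target output (V)
HEADROOM = 0.05  # assumed regulation margin (V)

def best_mode(v_in):
    """Lowest gain whose unregulated output still covers the target."""
    for g in sorted(GAINS):
        if float(g) * v_in >= V_OUT + HEADROOM:
            return g
    return None  # battery too low even in the highest boost mode

for v_in in (4.2, 3.9, 3.6, 3.3, 3.0, 2.7, 2.5):
    g = best_mode(v_in)
    if g is None:
        print(f"VIN={v_in} V: no mode reaches {V_OUT} V")
        continue
    eta = V_OUT / (float(g) * v_in)
    print(f"VIN={v_in} V: gain {g} -> efficiency ~{eta:.0%}")
```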

SEPIC converters require two inductors (or a coupled inductor), while buck-only or boost-only converters require one. Removing the need for inductors in portable systems not only saves space and allows thinner designs, but also lowers manufacturing costs and eases difficult procurement issues in high-volume production.