I understand how DC-DC buck converters save power when compared with a linear regulator, but do they save power when compared to not using power conversion altogether? In my specific case, I have a microcontroller that outputs 3.3V to a sensor device which could run at 2.5V. If I step the voltage down from 3.3V to 2.5V, am I saving power and allowing my battery to last longer? Or does the overall power use remain the same?

\$\begingroup\$Start thinking in terms of power and not voltage. Check the power consumption of your device when fed with different voltages.\$\endgroup\$
– Eugene Sh. Aug 8 '18 at 17:55

\$\begingroup\$First figure out how much less power the sensor would use when operated at 2.5V. Then figure out the quiescent power consumption and efficiency of the step-down converter. However, it looks like there may be an error in at least the wording of your question: generally a microcontroller should not be supplying power to a sensor. If your sensor requires so little power that an MCU I/O pin can properly supply it, a step-down regulator may not be worthwhile.\$\endgroup\$
– Chris Stratton Aug 8 '18 at 18:29

\$\begingroup\$DC-DC converters rapidly switch on and off as necessary to reach the desired output voltage, whereas a linear regulator just loses all the power across whatever the voltage drop is. So if you step down 9V to 2.5V with a linear regulator and the device draws 100mA, that's (9 − 2.5)V × 0.1A = 0.65W lost in the regulator. But if you drop 3.3V to 2.5V and it draws 5mA, that's only (3.3 − 2.5)V × 0.005A = 4mW. So, like anything, it depends.\$\endgroup\$
– squarewav Aug 8 '18 at 18:56

\$\begingroup\$Why would you use a buck converter instead of a voltage divider to drop 0.8V? Active devices use more energy than passive devices to get the same output...\$\endgroup\$
– SamR May 20 at 2:01

I understand how DC-DC buck converters save power when compared with a linear regulator, but do they save power when compared to not using power conversion altogether?

This is a little bit complicated. The perceived power savings is, in the first place, a misinterpretation. You can think of voltage conversion as running a process on the electricity, and there are different processes you can use to accomplish this task. Each process has a different power cost associated with it. That cost is expressed in datasheets as efficiency: the amount of power that comes out of the device compared to the amount that goes in.

So the perceived power savings is simply the result of comparing one large cost (the linear regulator) with a smaller cost (the switching regulator). If you don't change the voltage at all, there is no inherent cost at all.

All of this said, much of the time when we change voltages it is to avoid power loss. The longer the distance you need to transmit power, and the smaller the wires you would like to transmit it over, the more likely it becomes worthwhile to change to a higher voltage, paying a cost in power to avoid a larger cost in transmission losses. There are many factors at play here: cost of conductor and insulation, mounting equipment, cost of safety, aside from the actual worth of the electricity being transmitted.

Some devices (LEDs for example) have an efficiency tradeoff with voltage, becoming less efficient as voltage increases. Some devices consume so little power in the first place that it would take extenuating circumstances to make the savings from a change in voltage worthwhile. There's a good chance this is where your device falls, so it is probably worthwhile to measure its power consumption and make a decision based on that. There is a very good chance you will save some amount of power by decreasing the voltage to it, and also a very good chance the amount saved will not be worth your engineering time, the additional component cost, or the weight in the finished device. Saving 10% of total device power will noticeably increase your battery life, but if you're only saving 30µW on that single component while your whole project is using 1W total, you won't perceive the difference.
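To put that last point in numbers, here is a back-of-envelope sketch; the 1W total and 30µW saving come from the paragraph above, and the 10Wh battery capacity is an assumed figure:

```python
# Back-of-envelope check: is a 30 uW saving noticeable on a ~1 W project?
total_power_w = 1.0      # whole-project draw (figure from the text above)
saving_w = 30e-6         # power saved on the one sensor (figure from the text above)
battery_wh = 10.0        # battery capacity (assumed for illustration)

hours_before = battery_wh / total_power_w
hours_after = battery_wh / (total_power_w - saving_w)
gain_pct = 100 * (hours_after / hours_before - 1)
print(f"runtime gain: {gain_pct:.4f} %")   # ~0.0030 %
```

A 0.003% runtime gain is far below what you could ever measure, let alone perceive, which is the whole point of "compared to what?".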

\$\begingroup\$Is this an answer or a question? If you have a question you should post it as a question. But to answer: there is no such thing as "not using power conversion altogether". Even if you're just powering an LED through a resistor, there is still a voltage drop (supply minus LED voltage) which must dissipate power (as heat in said resistor).\$\endgroup\$
– squarewav Aug 9 '18 at 2:57

\$\begingroup\$He is asking how much power would be saved by operating a device at 2.5V rather than 3.3V, and asks specifically about not using power conversion. There are many devices that will operate well across such a small range. I suppose I should have also mentioned that skipping power conversion can be an option at larger voltage differences when parts rated for different voltages are available. The point of this is to illuminate the source of the misconception: "compared to what?"\$\endgroup\$
– K H Aug 9 '18 at 3:03

I have found quite often that the consumed current is more or less constant for electronic components such as microcontrollers and sensors.

The reason why the voltages were higher before is that the operating voltages of the transistors were higher.

Consumption in a microprocessor/microcontroller is largely determined by the clock rate, because of the charging/discharging of the internal (stray) capacitances: mostly on the clock, but also on any other changing signal.

The energy required to charge and then discharge a capacitor is \$C \cdot V^2\$ per cycle (the energy stored in the capacitor is half of that; the other half is lost while charging it).

You can see that this energy is proportional to the square of the voltage. So changing from 3.3V to 2.5V saves about 43% of dynamic power, since \$(2.5/3.3)^2 \approx 0.57\$.
If you obtain the 2.5V through a DC-DC with an efficiency of 80%, the net saving shrinks to about 28%, since \$0.57 / 0.8 \approx 0.72\$ of the original power is still drawn from the 3.3V rail.
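As a quick sketch of that arithmetic; the capacitance and clock frequency below are made-up placeholder values, and they cancel out of the ratios anyway:

```python
# Dynamic CMOS power scales as C * V^2 * f.
# Compare running at 3.3 V directly vs 2.5 V behind an 80%-efficient DC-DC.
C = 100e-12   # effective switched capacitance, farads (placeholder)
f = 8e6       # clock frequency, hertz (placeholder)

def dynamic_power(v):
    return C * v**2 * f

p33 = dynamic_power(3.3)
p25 = dynamic_power(2.5)
print(f"core power ratio 2.5V/3.3V: {p25 / p33:.3f}")     # 0.574 -> ~43% core saving

eff = 0.80
p_drawn = p25 / eff   # power pulled from the 3.3 V rail through the converter
print(f"net ratio with 80% DC-DC:   {p_drawn / p33:.3f}") # 0.717 -> ~28% net saving
```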

Some microcontrollers have an internal linear regulator and/or a DC-DC. The linear regulator makes sure that the core always runs at the same voltage, which is another reason why consumption depends more on the current than on the supply voltage.
If the microcontroller already has an internal DC-DC, adding an external one is likely inefficient.

Static power is computed differently: it is approximately \$V \cdot I\$, so with constant current, dropping from 3.3V to 2.5V saves about 24% of static power.
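A minimal sketch of the static case, assuming the current really does stay constant across the two voltages (the 1mA figure is hypothetical):

```python
# Static power is roughly V * I; with the current held constant,
# the saving is just the voltage ratio.
i_static = 1e-3   # hypothetical constant static current, amps
p_33 = 3.3 * i_static
p_25 = 2.5 * i_static
print(f"static saving: {100 * (1 - p_25 / p_33):.1f} %")   # 24.2 %
```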

But you also need to take the converter's own static losses into account: how much does the DC-DC consume when there is no load, for instance?

So you need to make a power budget:
- How much power do you need (consider frequency profiles, etc.);
- What is your static and dynamic power (dynamic power depends on frequency);
- What is the efficiency of the DC-DC (look into its static losses);
- Consider all this at the different voltages (since current consumption can change with voltage).
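The checklist above can be sketched as a toy power budget; every number below is a placeholder to be replaced with measured values:

```python
# Toy power budget: dynamic + static load power, pushed through the DC-DC.
f = 8e6                    # clock, Hz (placeholder)
c_eff = 100e-12            # effective switched capacitance, F (placeholder)
v = 2.5                    # operating voltage
i_static = 0.5e-3          # static current, A (placeholder)
eff = 0.80                 # DC-DC efficiency (placeholder)
p_dcdc_quiescent = 50e-6   # converter's own no-load draw, W (placeholder)

p_dynamic = c_eff * v**2 * f          # 5.00 mW
p_static = v * i_static               # 1.25 mW
p_input = (p_dynamic + p_static) / eff + p_dcdc_quiescent
print(f"dynamic {p_dynamic*1e3:.2f} mW, static {p_static*1e3:.2f} mW, "
      f"drawn from supply {p_input*1e3:.2f} mW")
```

Repeat the same budget at each candidate voltage and frequency, then compare the "drawn from supply" figures.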

Also, do not forget that you can quite often lower the operating frequency of your microcontroller, which can be a great way to save power. Measure how you are consuming power by putting a "small" resistor in series on your regulator output and monitoring the voltage across it on an oscilloscope. Choose the resistor's value so that the voltage drop is small enough not to stop your circuit from working, but large enough to measure without too much noise (I've used \$10\Omega\$ resistors for that).
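Turning such a shunt measurement into current and power is just Ohm's law; the 10Ω value matches the resistor mentioned above, while the scope reading and supply voltage are assumed:

```python
# Convert a series-shunt scope reading into load current and power.
r_shunt = 10.0     # ohms, in series with the regulator output
v_shunt = 0.050    # volts measured across the shunt (assumed reading)
v_supply = 2.5     # volts at the load (assumed)

i_load = v_shunt / r_shunt    # Ohm's law: 5 mA
p_load = v_supply * i_load    # 12.5 mW
print(f"I = {i_load*1e3:.1f} mA, P = {p_load*1e3:.2f} mW")
```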

Do you need it/what do you need?:
There is of course also the cost of the DC/DC, the cost of designing it in, the extra point of failure, the space it occupies, etc. It's a tradeoff of all of that and the need for a longer battery life with regards to the expected impact on sales and profits.
I once went through the "trouble" of selecting an SD card and optimizing the processor's frequency and the SD-card operations in an application that measured 24h/day with LEDs and recorded those measurements on a 4GB SD card. I went from about half a day of autonomy to one week by optimizing the program and by selecting and understanding the best SD card. But it was a requirement!