When it comes to DC power supplies, what happens as you try to draw more current than the supply is rated for? Does the power supply slowly taper off its current until it maxes out, or does it just stop abruptly?

A PSU is rated for its power, which is basically the maximum voltage and current it can source.

Any supply will drop its output voltage when you try to draw more current than it can source. The amount of voltage sag depends on the excess current drawn, which in turn depends on the load resistance.
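As a rough sketch of that sag, an unregulated supply can be modeled as an ideal source behind an internal series resistance (the 12 V / 0.5 ohm figures below are just illustrative, not from any particular supply):

```python
def output_voltage(v_open: float, r_internal: float, r_load: float) -> float:
    """Thevenin model: the internal resistance and the load form a
    voltage divider, so the terminal voltage sags as R_load falls."""
    return v_open * r_load / (r_internal + r_load)

# Illustrative 12 V supply with 0.5 ohm internal resistance:
for r_load in (12.0, 6.0, 1.0):
    v = output_voltage(12.0, 0.5, r_load)
    print(f"R_load={r_load:>4} ohm -> V_out={v:.2f} V, I={v / r_load:.2f} A")
```

Heavier loads (smaller R_load) draw more current through the internal resistance, so more voltage is lost inside the supply and less appears at the terminals.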

Some supplies have current limiting and short-circuit protection built in.
With a current-limit function you can set the current the supply will source, anywhere from a given value up to its maximum capacity.
And if you happen to short-circuit the output, the protection circuit kicks in, shutting down the supply or limiting its voltage to approximately 0 V or some other safe level.

And there are those without these functions. Some will basically have a fuse that blows when the rated current is exceeded; others will simply develop a fault if the appropriate protection is not built in.

Supplies come in many configurations, so the answers will vary. Isolation transformers can supply large short-circuit currents. Regulation may or may not employ current limiting. Generally, you choose the supply that fulfills your needs.

If a supply is not adequately protected against overload, "develop a fault" may equate to it overheating or even catching fire.

You should never risk overloading a power supply unless you know that it has a safe and effective protection system, such as an inbuilt current limiter or cut-out. Do not assume that every power supply is protected.

Electromagnetic "wall wart" type supplies often include a resettable thermal fuse in their transformer windings as their only means of overload protection. They are also unregulated, so the voltage floats and sags quite a bit with load variation. But what kind of supply were you asking about?


So, power supplies are limited either by their architecture (what the internal components themselves can handle), or by the fact that the voltage drop inevitably creates a maximum? What causes the voltage to drop as you try to draw too much current? Can you safely draw more current than rated if you draw it at a lower voltage? Can you give me a more realistic model of voltage/power sources, kinda like how real batteries effectively have a series resistor in place?

Thanks for the quick replies and help everyone! I appreciate it greatly!

Could you explain how they achieve this current limiting?


Exactly what supply do you mean? A good lab supply will simply go into constant-current mode and reduce its output voltage. It will not hurt the unit. It has built-in control circuitry that always keeps the current at the point set by the front-panel knob.
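The constant-voltage/constant-current behavior of a bench supply can be sketched as follows (an idealized model with a purely resistive load; the set-points are made up for illustration):

```python
def bench_supply(v_set: float, i_limit: float, r_load: float):
    """Idealized CV/CC bench supply: holds v_set until the load would
    demand more than i_limit, then holds i_limit and lets voltage sag."""
    i_cv = v_set / r_load              # current the load would draw in CV mode
    if i_cv <= i_limit:
        return v_set, i_cv             # constant-voltage region
    return i_limit * r_load, i_limit   # constant-current region

print(bench_supply(12.0, 2.0, 10.0))   # light load: (12.0, 1.2), CV mode
print(bench_supply(12.0, 2.0, 1.0))    # heavy load: (2.0, 2.0), CC mode
```

Note how in the CC region the output voltage becomes whatever value keeps the current at the limit, which is exactly the "reduce its output voltage" behavior described above.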

Some types of power supply can deliver more current at lower voltages, for instance by using re-configurable transformers, or by use of certain types of switching converters. Many more variable-voltage supplies have the same current capacity whatever voltage is selected. The only way to be sure is to read the specification for the supply concerned.

The maximum current available from a power supply can be set in one of several ways. Simpler unregulated power supplies show some obvious drop in output as the load current increases, due basically to internal volt-drops, but the ultimate limit is normally imposed by excessive heating. A thermal cut-out or a fuse may (or may not) be included to call a halt before a new supply or indeed a visit from the fire brigade is required.

More advanced regulated power supplies may show very little reduction in output as the current rises until some maximum value, beyond which the voltage drops much more quickly. Not all are protected against overload, but many are, like the lab supplies referred to by another poster. Intentional current limiting may be at a constant (possibly adjustable) maximum level as in many lab supplies, but some other units have "fold-back" current limiting where a reduced current is delivered if the output is short-circuited.
You really need to look up the manufacturers' information for the supply concerned to find out what overload behaviour is specified - and even then caution may be advisable.
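As an illustration of fold-back limiting, the allowed current can be modeled as a ceiling that shrinks as the output voltage collapses (the linear taper and the numbers here are assumptions for the sketch; real foldback curves depend on the design):

```python
def foldback_limit(i_max: float, i_sc: float, v_out: float, v_nom: float) -> float:
    """Fold-back current limit: i_max is allowed at nominal output voltage,
    tapering linearly down to i_sc into a dead short (v_out = 0)."""
    return i_sc + (i_max - i_sc) * (v_out / v_nom)

print(foldback_limit(3.0, 0.5, 12.0, 12.0))  # full output: 3.0 A allowed
print(foldback_limit(3.0, 0.5, 0.0, 12.0))   # short circuit: only 0.5 A
```

The point of folding back is that a shorted output forces the regulator to drop its full input voltage, so reducing the current in that condition keeps the internal dissipation survivable.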


The current is limited by the components used.
Like...

The transformer
It can be designed in several ways, for any given output voltage and any given current. BUT remember: the power out equals (ideally) or will be less than the power supplied.
Power is P = VI; combined with Ohm's law, V = IR, this gives P = I²R.

The components
Here you are mainly referring to a series pass element, which performs the regulation. It acts as a series resistance ahead of the load, dividing the voltage and dissipating the excess power. You can feel this as heat generated by the unit.

And of course the series element can be of any wattage, since there are plenty to choose from on the market.
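The power-balance point above puts an upper bound on the secondary current: for a transformer designed to handle a given input power, at best I_max = P_in × efficiency / V_out (the 60 W and 90 % figures below are made up for illustration). This is a design-time bound for the transformer, not something a given finished supply will deliver at every voltage setting:

```python
def max_secondary_current(p_in: float, efficiency: float, v_out: float) -> float:
    """Power out can at best equal power in; real transformers lose some
    power as heat, so I_max = P_in * efficiency / V_out."""
    return p_in * efficiency / v_out

# Hypothetical transformer handling 60 W at 90 % efficiency:
print(max_secondary_current(60.0, 0.9, 12.0))  # roughly 4.5 A at 12 V
print(max_secondary_current(60.0, 0.9, 5.0))   # roughly 10.8 A at 5 V
```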


The current is limited by the components generating it. If the circuit has no designed-in current limit, it may blow a component if the current gets too high.

Don't really understand what you mean by "the fact that voltage drop inevitably creates a maximum".

No, in general you can't draw more current at a lower voltage. In fact, for the common type of linear regulated supply, the internal dissipation and heat generated actually go up when drawing a given current at a lower voltage. That's because the regulating element has to drop more voltage from its unregulated internal source. Many linear regulators therefore reduce the current limit as the output voltage is reduced, to keep the internal dissipation from getting too high.
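That dissipation argument is simple arithmetic on the pass element; here is a sketch with an assumed 18 V unregulated rail and a 1 A load (figures are illustrative only):

```python
def pass_element_dissipation(v_unreg: float, v_out: float, i_load: float) -> float:
    """A linear regulator drops (v_unreg - v_out) across its series pass
    element, so at fixed load current the dissipation RISES as the
    output voltage is turned down."""
    return (v_unreg - v_out) * i_load

# Unregulated rail of 18 V, load drawing 1 A:
print(pass_element_dissipation(18.0, 15.0, 1.0))  # 3.0 W at 15 V out
print(pass_element_dissipation(18.0, 5.0, 1.0))   # 13.0 W at 5 V out
```

Same load current, more than four times the heat in the regulator at the lower output setting, which is why the current limit often shrinks as you turn the voltage down.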