No, neither perfect voltage sources nor perfect current sources exist. Everything has what might be called an output impedance (or Z). Think of output Z as a hypothetical but very real-acting phantom resistor in series with the output of an ideal voltage source. Sometimes this is even called ESR, or Equivalent Series Resistance.
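To make that concrete, here is a minimal sketch of the idea in Python: an imperfect source modeled as an ideal voltage source in series with an output Z (a simple Thevenin model). The component values are made up for illustration.

```python
# Sketch: a real source modeled as an ideal voltage source
# in series with an output impedance Z (Thevenin model).
def loaded_voltage(v_ideal, z_out, r_load):
    """Voltage actually seen across the load (simple divider)."""
    return v_ideal * r_load / (z_out + r_load)

# A 12 V source with 0.05 ohm output Z (battery-like) barely sags
# into a 10 ohm load...
print(loaded_voltage(12.0, 0.05, 10.0))   # ~11.94 V
# ...while the same 12 V behind 100 ohms of output Z sags badly.
print(loaded_voltage(12.0, 100.0, 10.0))  # ~1.09 V
```

The lower the output Z relative to the load, the more "voltage-source-like" the supply behaves.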

Something with a really low output Z acts like a voltage source: big lead-acid batteries or LiPos, for example. Something with a really high output Z (approaching infinity, in fact) acts more like a current source. In fact, if you had a supply voltage approaching infinity and an output Z that scaled up in proportion to it, you would indeed have an "ideal" theoretical current source, with its current set by the ratio of that voltage to that Z.
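A quick sketch of that high-Z limit, with illustrative numbers I picked for the example: a big voltage behind a big output Z delivers a current that barely changes even when the load changes a lot.

```python
# Sketch: a high voltage behind a huge output Z behaves like a
# current source: the load current barely depends on the load.
def load_current(v_source, z_out, r_load):
    """Current through the load, by Ohm's law on the whole loop."""
    return v_source / (z_out + r_load)

# 1000 V behind 1 megohm targets roughly 1 mA.  Swing the load
# from 100 ohms to 1000 ohms (10x!) and the current moves by
# well under one percent.
print(load_current(1000.0, 1e6, 100.0))
print(load_current(1000.0, 1e6, 1000.0))
```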

However, most real-world supplies are neither, as they have some finite output voltage and output Z; infinite and zero resistance do not really exist. The idea of Z in this sense is more of a mathematical concept than anything, but it is useful to understand in power supply design, impedance matching or bridging, and many other things. In fact, the same core concepts apply to all forms of energy, be it thermal (temperature, heat, and thermal resistance), fluid dynamics (pressure, flow, and backpressure), and so on.

You can just as easily configure a power supply to have a "constant current" output or a "constant voltage" output, or even both, where if the load attempts to draw more current than the power supply's current limit, the voltage sags so that the current stays regulated at the set value. In fact, this is what most lab power supplies do! Just set the maximum voltage and maximum current limits, and whether the supply sits in CV or CC mode depends on the input Z of the load!
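That CV/CC crossover can be sketched in a few lines. This is a toy model for a purely resistive load, not how any particular bench supply is implemented internally:

```python
# Sketch of how a bench supply picks CV vs CC mode: hold the set
# voltage until the load would draw more than the current limit,
# then let the voltage sag so the current stays at the limit.
def supply_output(v_set, i_limit, r_load):
    """Return (voltage, current, mode) for a resistive load."""
    i_cv = v_set / r_load                   # current if we held v_set
    if i_cv <= i_limit:
        return v_set, i_cv, "CV"            # light load: constant voltage
    return i_limit * r_load, i_limit, "CC"  # heavy load: current limited

print(supply_output(5.0, 1.0, 10.0))  # light load -> CV mode
print(supply_output(5.0, 1.0, 2.0))   # heavy load -> CC mode, voltage sags
```

Note that the load's resistance alone decides which mode the supply lands in, which is exactly the point above.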

Hopefully all that made sense. To answer your question more directly: the simple answer is that the claim is not true. We tend to use constant-voltage-ish power supplies mostly as a matter of convention, but constant current sources are used a LOT in many applications as well. Just look inside any analog amplifier and you will see current sources, current sinks, and current mirrors, and the same goes for digital chips!

LED drivers are almost always constant current, because powering LEDs with constant voltage is bad for a few reasons. For one thing, LEDs are super picky about the voltage across them: even half a volt too high and you might blow the LED, and half a volt too low and it may not even turn on! Another reason is that as the LED warms up, that already-tight voltage window shifts: the forward voltage drop gets lower, the current gets WAY higher, and you can see where this is headed (thermal runaway). That is why the LED driver I pulled out of a former street lamp has a current output rating of 1500mA but an output voltage rating of 120V–480V.
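To see just how picky LEDs are, here is a sketch using the Shockley diode equation. The saturation current and ideality factor are made-up illustrative values, not any real LED's datasheet numbers, but the exponential shape is the real story:

```python
import math

# Sketch: Shockley diode equation, I = I_S * (exp(V / (n*V_T)) - 1).
# i_s (saturation current) and n (ideality factor) are hypothetical
# values chosen only to illustrate the exponential behavior.
def led_current(v, i_s=1e-12, n=2.0, v_t=0.02585):
    return i_s * (math.exp(v / (n * v_t)) - 1)

i_nominal = led_current(3.0)
i_high = led_current(3.1)          # just 0.1 V more across the LED
print(i_high / i_nominal)          # current jumps by roughly 7x
```

A tenth of a volt turning into a several-fold current increase is exactly why a constant-current driver, not a constant-voltage one, is the right tool.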

A current source is used when you want to control things from a current-focused paradigm. For instance, when handling current-dependent devices (diodes, certain sensors, etc.), it makes more sense to operate from the perspective of a current source rather than a voltage source.

It's consistent values that change the world from random to predictable. By using a current source, we contain some or most of that randomness, allowing us to predict the behavior of the device with higher accuracy, and thereby to produce useful, predictable items and take useful, accurate data.

You get unique, usable electrical properties from a constant current source. And any sufficiently high voltage behind a sufficiently high output resistance is, for all intents and purposes, a constant current source.

There is no perfect voltage source, there is no perfect current source.

As an example, you shouldn't drive LEDs with constant voltage; drive them with constant current.