Most digital cameras these days have CMOS sensors - Complementary Metal-Oxide-Semiconductor. Strictly speaking, CMOS is a chip-fabrication technology rather than a material - the light-sensitive element is silicon - but it's the design that has emerged over several decades of trial and error as the one whose sensitivity to light most nearly approximates human vision. A camera is, after all, a human vision simulator.

CMOS is less sensitive than human vision in one way: we still (just about) outperform it in low-light situations. But CMOS is *more* sensitive than human vision in another: it responds to a broader range of wavelengths.

[Important point: color = wavelength.]

Humans, for whatever evolutionary reasons, can only see the wavelengths 400 nm - 700 nm. This range is described, rather circularly, as 'the visible spectrum.' Another way of saying that is that only photons of these wavelengths activate our retinas and, in turn, the optic nerve; other wavelengths are absorbed as dumb energy. But CMOS reacts to photons from about 320 nm - 1100 nm, meaning it can 'see' infrared and ultraviolet - the zeroth and eighth colors of the rainbow.
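The overlap between the two ranges can be put in a few lines of code. This is just a toy sketch using the approximate band edges quoted above - real sensor response tapers off gradually rather than cutting off sharply:

```python
# Which detector 'sees' a photon of a given wavelength (nm)?
# Band edges are the rough figures from the text, not hard physical constants.

def band(wavelength_nm):
    human = 400 <= wavelength_nm <= 700    # the visible spectrum
    cmos = 320 <= wavelength_nm <= 1100    # bare silicon CMOS sensor
    if human:
        return "visible (human + CMOS)"
    if cmos:
        return "UV (CMOS only)" if wavelength_nm < 400 else "IR (CMOS only)"
    return "invisible to both"

print(band(550))   # green light -> visible (human + CMOS)
print(band(365))   # near-UV     -> UV (CMOS only)
print(band(950))   # near-IR     -> IR (CMOS only)
```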

But in order to more closely mimic human vision, camera manufacturers install filters inside the camera to block these wavelengths. Get inside your camera and remove those filters, and you have a full spectrum camera.

When transposed back into the visible - i.e., when displayed on the camera's screen - full spectrum images look pretty weird, distorted by the large amount of infrared knocking around. But if you then put a lens filter on the front of the camera, one that only lets through the wavelengths you want - a UV pass filter, for instance - then what you've got is a UV camera. That's how the sausage is made. (Here's a pretty good how-to.)
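The filter-stacking idea above boils down to interval intersection: each element in the optical path passes a band of wavelengths, and the camera ends up seeing whatever survives all of them. Here's a minimal sketch - the passbands are illustrative round numbers, not measured filter curves:

```python
# Each element passes a (low, high) band of wavelengths in nm;
# the camera sees the intersection of all the bands in the path.

def intersect(band_a, band_b):
    lo = max(band_a[0], band_b[0])
    hi = min(band_a[1], band_b[1])
    return (lo, hi) if lo <= hi else None   # None = nothing gets through

sensor = (320, 1100)     # bare CMOS, internal filters removed
uv_pass = (300, 400)     # a hypothetical UV pass lens filter

print(intersect(sensor, uv_pass))   # (320, 400): a UV camera
```

Swap the UV pass filter for the stock internal filter (roughly 400-700 nm) and you're back to an ordinary visible-light camera, which is exactly what removing it undoes.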

More on this to come. Meantime, I'm working with the National Council on Skin Cancer Prevention, the American Academy of Dermatology and Nivea (biggest skincare brand in the world), among others, to get Westerners to the point where they stop doing this, and sunscreen becomes as standard a thing as brushing your teeth.