Camera Sensors At Work: how your digital camera turns light into an image

How sensors work

The sensor is made up of millions of light-sensitive units, often referred to as pixels, but at this stage more accurately called photosites.

These can measure as little as 0.004mm across (around 1/16th of the width of a single human hair).

Each one creates its own electrical signal in proportion to the brightness of the part of the image that it covers.

[Image: individual pixels enlarged from the above image]

But these individual photosites can’t see colour – only luminance. To produce a full-colour image, each photosite has a miniature coloured filter, either red, green or blue.

A pixel with a green filter will only see colours that have some green light in them. But as practically all colours can be made by mixing red, green and blue light together, this still provides valuable information.
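The idea above can be sketched in a few lines of Python. This is a toy model (the function name and the simple three-value representation of light are illustrative assumptions, not how a real sensor is modelled): a filtered photosite simply reports the strength of its own colour channel in the incoming light, so a green-filtered site still responds to yellow light, because yellow is a mix of red and green.

```python
# Toy model: light is an (R, G, B) triple of intensities, and a
# filtered photosite reads out only its own channel.
def photosite_reading(light_rgb, filter_channel):
    """Return the brightness a photosite with the given colour filter records."""
    channel_index = {"r": 0, "g": 1, "b": 2}[filter_channel]
    return light_rgb[channel_index]

yellow = (1.0, 1.0, 0.0)  # yellow light = red + green mixed

# A green-filtered photosite still produces a strong signal for yellow light,
# even though it cannot tell yellow apart from pure green on its own.
print(photosite_reading(yellow, "g"))  # → 1.0
print(photosite_reading(yellow, "b"))  # → 0.0
```

Note that the green-filtered site alone can't distinguish yellow from pure green; that ambiguity is exactly what the neighbouring red- and blue-filtered photosites resolve, as described next.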

The clever bit is how neighbouring photosites work together to assign an accurate colour to each pixel in the final image.

A green-filtered photosite can't record red or blue light directly, but those missing values can be estimated from its neighbouring red- and blue-filtered photosites.

Known as ‘demosaicing’, this interpolation process makes an informed guess at the full colour of each and every pixel in the image.
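The simplest form of that informed guess can be sketched with NumPy. This is a minimal illustration, not any camera's actual pipeline: it assumes the common Bayer filter layout (a checkerboard of green sites interleaved with rows of red and blue) and fills in each pixel's missing channels by averaging the nearest photosites that did record that channel, a technique usually called bilinear demosaicing. Real cameras use far more sophisticated, edge-aware interpolation.

```python
import numpy as np

def bayer_masks(h, w):
    """Boolean masks marking which photosites carry R, G and B filters
    (GRBG layout: green on the checkerboard, red and blue filling the gaps)."""
    r = np.zeros((h, w), bool)
    g = np.zeros((h, w), bool)
    b = np.zeros((h, w), bool)
    g[0::2, 0::2] = True   # green: even rows, even columns
    g[1::2, 1::2] = True   # green: odd rows, odd columns
    r[0::2, 1::2] = True   # red:   even rows, odd columns
    b[1::2, 0::2] = True   # blue:  odd rows, even columns
    return r, g, b

def demosaic(raw):
    """Fill in each pixel's missing colours by averaging the known
    samples of that colour in its 3x3 neighbourhood."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    for channel, mask in enumerate(bayer_masks(h, w)):
        known = np.where(mask, raw, 0.0)   # samples this filter recorded
        count = mask.astype(float)
        acc = np.zeros_like(known)
        cnt = np.zeros_like(count)
        for dy in (-1, 0, 1):              # sum over the 3x3 neighbourhood
            for dx in (-1, 0, 1):          # (wrapping at the edges, for brevity)
                acc += np.roll(np.roll(known, dy, 0), dx, 1)
                cnt += np.roll(np.roll(count, dy, 0), dx, 1)
        out[:, :, channel] = acc / np.maximum(cnt, 1)
    return out

# Uniform grey light: every photosite reads 0.5 regardless of its filter,
# so demosaicing should reconstruct equal R, G and B at every pixel.
raw = np.full((4, 4), 0.5)
rgb = demosaic(raw)
```

Under uniform light this toy reconstruction is exact; the interesting (and hard) cases are sharp edges and fine detail, where naive averaging smears colours and real demosaicing algorithms earn their keep.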