Spectral Cognisance 3 – When Black turns Red

We have established that emission and absorption spectra are quite important in image sensing.

Why is this so?

First, a monochrome image sensor cannot distinguish between wavelengths; all that matters is the number of photons absorbed. There are, however, some more subtle effects.

Color sensors, by contrast, can distinguish wavelengths.

Different media have different sensitivities to wavelength – e.g. image sensor A vs. B, human eyes, film… I will discuss some of this in the 4th post in this series.

Blue photons deliver more energy and have a shallower absorption depth than “red” photons. This will make a difference in a later post on Anti-Blooming and Shutter Efficiency.

Today, I will focus on the color reproduction of a silicon sensor under a given spectral emission content. I have been in imaging long enough to see the advent of mainstream machine vision color image sensors, readily available from all major camera vendors – including us. Coming from a largely black-and-white world, I found there were a few things that needed to change when moving to color.

Return once again to our spectral plot. This time I changed the monochromatic sensor absorption curve (QE) to a typical response curve with RGB (Red-Green-Blue) filters on the array.

Figure 1: RGB Sensor Response and Black Body Emission (3500K)

There are two points I want to consider here:

1) You can clearly see that there will be many more photons passing the Red filter than the Blue filter, as shown by the overlapping areas of the emission and QE curves.

2) If you look closely, you can also see that many photons will pass ALL filters to different degrees for different wavelengths. This is most pronounced for wavelengths above ~700nm.
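Both points can be sketched numerically. The toy model below folds a 3500 K black-body photon spectrum with made-up Gaussian R/G/B filter curves – the centers, widths and NIR "leak" are my assumptions for illustration, not real QE data:

```python
import math

# Toy model of Fig. 1: a 3500 K black-body photon spectrum folded with
# made-up Gaussian R/G/B filter curves. Centers, widths and the NIR
# "leak" are assumptions for illustration, not real QE data.
H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann
T = 3500.0  # black-body temperature in kelvin

def photon_flux(lam):
    """Relative black-body photon flux (Planck's law / photon energy)."""
    return lam**-4 / (math.exp(H * C / (lam * K * T)) - 1.0)

def filt(lam, center, width, leak):
    """Gaussian passband plus a broadband leak above 700 nm."""
    g = math.exp(-(((lam - center) / width) ** 2))
    return min(1.0, g + (leak if lam > 700e-9 else 0.0))

# center, width, assumed NIR transmission per filter (in meters):
filters = {"R": (620e-9, 40e-9, 0.8),
           "G": (540e-9, 40e-9, 0.6),
           "B": (460e-9, 40e-9, 0.5)}

lams = [400e-9 + i * 1e-9 for i in range(701)]  # 400-1100 nm grid
signals = {name: sum(photon_flux(l) * filt(l, c0, w, leak) for l in lams)
           for name, (c0, w, leak) in filters.items()}

# Point 1: the red channel collects the most photons under a 3500 K source.
# Point 2: above 700 nm every filter still transmits (the "leak").
print({k: round(v / signals["B"], 2) for k, v in signals.items()})
```

With these assumed curves the red channel ends up well ahead of blue, and all three filters still pass photons at, say, 800 nm – exactly the two effects visible in Figure 1.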

The first point – different "available" intensities for different colors – results in a certain color balance of the image: reds and yellows will be more pronounced than blues. As humans, having lived with sunlight and light bulbs for a long time, we will still perceive an image balanced this way as "white". There is one caveat: an image sensor tends to be less responsive in blue than in red, so the incoming blue signal is reduced even further. Therefore, "white balancing" (making neutral objects in a digital image appear white) usually increases the "blue gain" by approx. 2x over the red, to compensate not for the lighting but for the non-uniform sensor response.
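A minimal sketch of that white-balancing step – the raw channel averages are invented values, chosen so the blue gain comes out at roughly 2x the red:

```python
# A sketch of white balancing: per-channel gains derived from a neutral
# (gray) reference. The raw channel averages are invented values mimicking
# a warm light source on a sensor that is weaker in blue than in red.
raw_means = {"R": 0.80, "G": 0.55, "B": 0.40}  # hypothetical raw averages

# Normalize so that green keeps unity gain (a common convention):
gains = {ch: raw_means["G"] / v for ch, v in raw_means.items()}

def white_balance(pixel):
    """Apply the channel gains to one (R, G, B) pixel."""
    return tuple(gains[ch] * p for ch, p in zip("RGB", pixel))

# The blue gain ends up about 2x the red gain, as described above:
print(round(gains["B"] / gains["R"], 2))
# A gray patch that came off the sensor with a red cast now reads neutral:
print(white_balance((0.80, 0.55, 0.40)))  # ~ (0.55, 0.55, 0.55)
```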

The second point was of major interest for me when we developed our first color camera.

A field test with a new camera delivered the following image.

Figure 2: Standard Color Image

At first glance, it seemed nothing was wrong with this image. Except… weren’t those old DALSA caps black???

Figure 3: Color Image with IR-cut Filter

Yes indeed, they were!

What happened?

The image in Fig. 2 was taken with the camera "as-is", meaning the sun's black body emission (~5000K) is allowed to reach the RGB sensor unfiltered. Review what we discussed with Fig. 1 and you will see that there are a lot of deep-red photons that appear in the image as an indistinguishable color: they pass through the R, G and B filters to similar degrees. Hence a black object, reflecting no photons in the visible spectrum, can still reflect deep-red photons (>700nm), which in turn create signal under all filters, with the R filter showing the dominant signal. The black cap turns red.
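The cap effect can be sketched with a coarse two-band model. All numbers below are invented round figures – a visibly black object with strong near-infrared reflectance, which is a common property of dark fabrics:

```python
# Sketch of the "black cap turns red" effect. The cap reflects almost no
# visible light but is highly reflective in the near infrared (NIR).
# All numbers are invented round figures, not measured data.
visible_photons = 100.0  # relative photon count, 400-700 nm
nir_photons = 60.0       # relative photon count, >700 nm

cap_reflectance = {"visible": 0.02, "nir": 0.7}

# Assumed per-channel transmission as (visible band, NIR leak):
transmission = {"R": (0.9, 0.9), "G": (0.9, 0.6), "B": (0.9, 0.5)}

def signal(channel, ir_cut):
    """Sensor signal for the cap, with or without an IR-cut filter."""
    t_vis, t_nir = transmission[channel]
    s = visible_photons * cap_reflectance["visible"] * t_vis
    if not ir_cut:  # without the filter, the NIR reflection leaks through
        s += nir_photons * cap_reflectance["nir"] * t_nir
    return s

no_cut = {ch: signal(ch, ir_cut=False) for ch in "RGB"}
with_cut = {ch: signal(ch, ir_cut=True) for ch in "RGB"}
print("no IR-cut:", no_cut)      # R dominates -> the cap looks reddish
print("with IR-cut:", with_cut)  # all channels tiny and equal -> black
```

Without the IR-cut filter the red channel dominates even though the object is visibly black; with it, all three channels drop to the same small value and the cap renders as neutral black.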

In addition, if you look at the plants in the foreground, you can see that some of the green/yellow information is distorted. The plants seem to reflect some of the deep-red spectrum, where all RGB filters transmit, distorting the nice green into a brownish yellow.

Once we add an IR-cut filter in Fig. 3, effectively removing all spectral content above ~650 nm for the sensor, we obtain a nice image with good color reproduction. The next step is to add color processing to the data stream to achieve the highest possible color fidelity.

This raw image shows, however, that the information content from the sensor is already quite decent when an appropriate spectral content is offered to it. And this is right in line with the theme of this series.

Next, I will go into a bit more detail on human color perception in “Is Lux short for Luxury?”

Until then,

Matthias

About Matthias

Born in the early seventies in Northern Germany I graduated in Physics at the University of Ulm. I worked for the BOSCH automotive group in Stuttgart before joining DALSA in 2000. Since then I have worked on layout, design, project lead, architecture and program management for various DALSA CMOS image sensors. I enjoy creating sensors and seeing them through production and implementation at the customer end.
