The human eye can differentiate about 10 million colors. A 24-bit display mode can produce about 16 million. Why, then, do operating systems have 32-bit and higher display modes if the apparent quality is the same as a 24-bit mode?

My two guesses: Just because they can, or...marketing.
–
Virtuosi Media Dec 6 '12 at 2:32


A guess is that there are parts of the spectrum that the human eye can differentiate more precisely. Higher color depth might not be noticeable in some parts of the spectrum but needed in others.
–
Fabian Zeindl Dec 10 '12 at 19:47

6 Answers

It's not clear exactly how many colors humans can see. For example, the table at the top of this page about the number of colors distinguishable by the human eye cites various academic papers as saying anything from "more than 100,000" to "roughly 10 million." In any case, the number of colors visible to humans appears to be lower than the number of different colors a computer screen can represent; because the exact count of distinguishable colors isn't known, the people who designed the screen formats presumably erred on the safe side.

The way memory is laid out in a computer, data is easiest to store and quickly access when the memory units are powers of two. This constraint is why we have a full byte (2^3 bits) per color channel rather than just 6 or 7 bits. The same preference for powers of two plays a role in the decision to include a fourth channel: four channels make for a much more uniform memory layout, and thus significantly faster processing, than three channels would, especially given how heavily graphics cards are optimized for specific kinds of parallel processing.
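For illustration, here's a minimal Python sketch of how a 32-bit RGBA pixel is commonly packed into a single machine word. (The exact channel order varies between APIs; ARGB order is assumed here.)

```python
# Sketch: packing one RGBA pixel into a single 32-bit word.
# With four 8-bit channels, every pixel starts on a 4-byte boundary,
# which is friendlier to hardware than the 3-byte stride of plain RGB.

def pack_rgba(r, g, b, a):
    """Pack four 8-bit channel values into one 32-bit integer (ARGB order)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba(pixel):
    """Recover the four 8-bit channels from a 32-bit pixel."""
    return ((pixel >> 16) & 0xFF,   # red
            (pixel >> 8) & 0xFF,    # green
            pixel & 0xFF,           # blue
            (pixel >> 24) & 0xFF)   # alpha

pixel = pack_rgba(255, 128, 0, 255)   # opaque orange
assert unpack_rgba(pixel) == (255, 128, 0, 255)
```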

Now I come to the crux of your question. You point out that the 24 bits used to store the red, green, and blue channels are already enough on their own to produce more colors than we can see. That makes the final byte (the alpha value) look a bit superfluous. But the alpha channel has great usability value for programmers: it stores the pixel's opacity, so adjusting one value scales the whole pixel's contribution when compositing, rather than requiring code that adjusts each of the three color channels independently. Among other things, the alpha value drastically simplifies the math needed for greenscreening, blending images, and setting transparency. Fewer operations doesn't just mean the code can be written faster; it takes less time to debug and runs faster as well.
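For instance, the classic "over" blend that the alpha channel enables reduces to one weighted sum per channel. A sketch, with alpha normalized to the range [0, 1]:

```python
# Sketch of the Porter-Duff "A over B" compositing step that the
# alpha channel enables: each output channel is a weighted average
# of foreground and background, weighted by the foreground's alpha.

def blend_over(fg, bg, alpha):
    """Blend a foreground RGB tuple over a background RGB tuple."""
    return tuple(round(alpha * f + (1 - alpha) * b)
                 for f, b in zip(fg, bg))

# A 50%-opaque white pixel over black comes out mid-grey:
blend_over((255, 255, 255), (0, 0, 0), 0.5)  # (128, 128, 128)
```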

In addition to all of the above, there is some evidence that we see some frequencies better than others (e.g. we are better at differentiating greens than blues and reds, because of the frequencies the cones in our eyes respond to). So the colours we can differentiate are not spread evenly over the whole visible spectrum.
–
adrianh Dec 6 '12 at 9:29
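One place this uneven sensitivity shows up is in the standard luma weights used to convert RGB to greyscale (the Rec. 601 coefficients), where green counts far more than red or blue:

```python
# Rec. 601 luma weights: green dominates because the eye is most
# sensitive in that part of the visible spectrum.

def luma(r, g, b):
    """Approximate perceived brightness of an RGB value (Rec. 601)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Pure green reads as brighter than pure red, which reads as
# brighter than pure blue, even though all three are "full intensity":
print(luma(0, 255, 0) > luma(255, 0, 0) > luma(0, 0, 255))  # True
```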


These points are all valid, but the main reason is what MSalters described.
–
oefe Mar 6 '13 at 21:25

You're incorrectly assuming that the distribution of those colors over the gamut matches the human eye. The distribution of the 16 million colors is chosen for technical simplicity, ignoring even the difference in sensitivity for red and green.

For the same reason, there's a sizeable part of the gamut which many monitors can't display at all (around 15% is typical).

24-bit isn't really 16 million perceptually distinct colors. It's three single-color channels at different intensities, which your eye/brain interprets as one color. So, try this exercise: show all 256 different "reds" with the other channels at 0. You'll find out that 8 bits per color x 3 actually isn't that much...
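The exercise can be sketched in a few lines of Python (printing the values here rather than drawing swatches, since actually displaying them depends on your environment):

```python
# Sketch of the exercise: enumerate all 256 "reds", with the green
# and blue channels held at 0. Shown on screen as a ramp, only 8 bits
# per channel produces visible banding between neighbouring steps.

reds = [(r, 0, 0) for r in range(256)]

print(len(reds))            # 256 distinct red levels in total
print(reds[0], reds[255])   # (0, 0, 0) (255, 0, 0)
```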

Your problem is that you are thinking in terms of individual colors, which doesn't always mean what you think, because those counts ignore the colors achieved by blending and mixing. Your eye can see something like 2.4 million colors, but that does not take into account shades and tones of those colors, which puts us more in the 100-million range. Play around with a 16-bit Photoshop image for a while; there are trillions of colors available. See below:
"an 8-bit image, which would be "2 to the exponent 8", or "2 x 2 x 2 x 2 x 2 x 2 x 2 x 2", which gives us 256. That’s where the number 256 comes from.

Don’t worry if you found that confusing, or even worse, boring. It all has to do with how computers work. Just remember that when you save an image as a JPEG, you’re saving it as an 8-bit image, which gives you 256 shades each of red, green, and blue, for a total of 16.8 million possible colors.

Now, 16.8 million colors may seem like a lot. But as they say, nothing is big or small except by comparison, and when you compare it with how many possible colors we can have in a 16-bit image, well, as they also sometimes say, you ain’t seen nothin’ yet.

As we just learned, saving a photo as a JPEG creates an 8-bit image, which gives us 16.8 million possible colors in our image.

That may seem like a lot, and it is when you consider that the human eye can’t even see that many colors. We’re capable of distinguishing between a few million colors at best, with some estimates reaching as high as 10 million, but certainly not 16.8 million. So even with 8-bit JPEG images, we’re already dealing with more colors than we can see. Why, then, would we need more colors? Why isn’t 8-bit good enough? We’ll get to that in a moment, but first, let’s look at the difference between 8-bit and 16-bit images.

Earlier, we learned that 8-bit images give us 256 shades each of red, green and blue, and we got that number using the expression "2 to the exponent 8", or "2 x 2 x 2 x 2 x 2 x 2 x 2 x 2", which equals 256. We can do the same thing to figure out how many colors we can have in a 16-bit image. All we need to do is calculate the expression "2 to the exponent 16", or "2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2", which, if you don’t have a calculator handy, gives us 65,536. That means that when working with 16-bit images, we have 65,536 shades of red, 65,536 shades of green, and 65,536 shades of blue. Forget about 16.8 million! 65,536 x 65,536 x 65,536 gives us an incredible 281 trillion possible colors!"
http://www.photoshopessentials.com/essentials/16-bit/
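The arithmetic in the quoted passage checks out; here it is worked through in Python:

```python
# Verifying the bit-depth arithmetic from the quoted passage.

shades_8bit = 2 ** 8             # shades per channel in an 8-bit image
colors_8bit = shades_8bit ** 3   # three channels: red, green, blue

shades_16bit = 2 ** 16
colors_16bit = shades_16bit ** 3

print(shades_8bit)     # 256
print(colors_8bit)     # 16777216 -- the "16.8 million" figure
print(shades_16bit)    # 65536
print(colors_16bit)    # 281474976710656 -- roughly 281 trillion
```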