However, in the dress image there is only one clear black reference: the patterned fabric in the lower left behind the dress, which is nearly pure black. The camera auto-focused on this object rather than on the foreground dress or other background elements, which increases the chances the camera selected it as the reference for black balance.

It must be an exposure error, because all the clues in the image tell us the camera chose accurate hues to represent white and black. The exposure error is that the camera fired its flash, and the flash severely overexposed (washed out) the dress in the foreground. This turned the dark, rich blue into a whitish pale blue, and the satiny, reflective black into gray with a golden cast from the overhead halogen lights. It was a substantial exposure error driven by the scene being extremely backlit by daylight from the window: the camera saw the foreground dress as very dark, so it opened the iris wide and fired the flash, blowing out the colors of the dress.
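Overexposure of this kind can be sketched numerically. The toy model below (not the actual camera pipeline, and the sample colors are assumptions chosen for illustration) treats overexposure as a linear gain on RGB values with clipping at the 8-bit ceiling; a dark blue slides toward pale blue as its blue channel clips, and a near-black becomes mid-gray:

```python
def overexpose(rgb, gain):
    """Scale an (R, G, B) triple by `gain`, clipping each channel at 255.
    A crude stand-in for flash overexposure, not a real camera model."""
    return tuple(min(255, round(c * gain)) for c in rgb)

dark_blue = (40, 50, 110)    # hypothetical rich, dark blue of the real dress
satin_black = (20, 20, 20)   # hypothetical near-black of the lace

washed = overexpose(dark_blue, 3.0)
grayed = overexpose(satin_black, 3.0)
print(washed)   # (120, 150, 255): blue channel clips, reading as pale blue
print(grayed)   # (60, 60, 60): black lifts to gray (the golden cast would
                # come from the warm halogen light, which is not modeled here)
```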

2. Why do people perceive the photo differently?

First, it’s important to note that blue/black seers are not any more “correct” than white/gold seers just because the dress is actually blue and black. Why? Because the shade of blue that blue/black seers perceive in the original, inaccurate photo is not the same shade of blue they see in secondary, accurate photos of the dress. If you are a blue/black seer, compare them now. The inaccurate photo shows a faded pale blue, while the real dress is a deep, rich blue. So, both groups failed to see the true colors of the dress in the original, inaccurate photo. Both groups are wrong; the only difference between them is the direction in which their visual systems erred.

Blue/Black

Their perceptual system interprets the original, inaccurate photo as a dress that is dramatically overexposed. Their perception is that the environment is extremely bright, producing a photo with very low contrast (hence no true blacks on the dress), and that the dress is front-lit by bright, warm light such as sunlight or halogen. They mentally subtract that warm light, leaving gray, which they then mentally darken to black because they interpret the image as overexposed. They assume there is no blue in the illumination, so the bluish tint must belong to the dress itself.
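That subtract-then-darken inference can be sketched as simple channel arithmetic. This is a toy illustration, not a vision-science formula, and every color value below is an assumption: a pale-blue pixel from the photo, minus an assumed warm front light, lands on a muted blue, which darkening (to compensate for the assumed overexposure) pushes toward deep blue:

```python
def discount_illuminant(rgb, light):
    """Subtract an assumed light color per channel, floored at 0.
    A crude model of 'discounting the illuminant', for illustration only."""
    return tuple(max(0, c - l) for c, l in zip(rgb, light))

def darken(rgb, factor):
    """Scale all channels down, modeling compensation for overexposure."""
    return tuple(round(c * factor) for c in rgb)

pale_blue = (150, 160, 200)   # hypothetical pixel from the photo's "blue" stripes
warm_light = (90, 80, 50)     # assumed warm front light (sun or halogen)

after_sub = discount_illuminant(pale_blue, warm_light)
inferred = darken(after_sub, 0.5)
print(after_sub)  # (60, 80, 150): warm cast removed, a muted blue remains
print(inferred)   # (30, 40, 75): darkened toward the deep blue these viewers report
```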

White/Gold

Their perceptual system interprets the original, inaccurate photo as a dress that is properly exposed in a normally lit indoor environment with a mix of indoor lighting and daylight through windows. This leads them to interpret the tan/brown color as gold, and, because they assume the light carries a bluish daylight cast, they subtract blue, leaving white.
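The same toy arithmetic works for this interpretation, just with a different assumed illuminant (again, all color values here are hypothetical): subtracting a bluish daylight cast from the photo's pale-blue stripe leaves a near-neutral color that reads as white, while the warm tan stripe keeps its warmth and reads as gold:

```python
def discount_illuminant(rgb, cast):
    """Subtract an assumed color cast per channel, floored at 0.
    A toy model of the visual system 'removing' a bluish daylight tint."""
    return tuple(max(0, c - k) for c, k in zip(rgb, cast))

bluish_stripe = (150, 160, 200)   # hypothetical pixel from the photo's stripes
tan_stripe = (120, 100, 70)       # hypothetical pixel from the lace bands
daylight_cast = (30, 40, 90)      # assumed bluish tint from the window light

white_read = discount_illuminant(bluish_stripe, daylight_cast)
gold_read = discount_illuminant(tan_stripe, daylight_cast)
print(white_read)  # (120, 120, 110): roughly neutral, read as white
print(gold_read)   # (90, 60, 0): red/green dominate, read as gold
```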

Both groups are wrong, but is one ‘less’ wrong than the other?

Yes. The original image is wildly inaccurate with ambiguous color cues, but more of those cues point toward white/gold than blue/black, even though white/gold is not, in reality, the correct interpretation. What are those cues?

The background indicates the image was taken indoors, not outdoors.

The image is strongly backlit with daylight, implying the foreground is darker than the background.

The gradient on the floor behind the dress gets increasingly darker as it comes closer to the dress and camera. This implies the dress is not in blindingly bright light but that the brightest light in the environment is behind it.

The pattern of highlights on the dress has no telltale signs that the photo is lit by flash. When an image is overexposed by flash, the brightest areas are often flat or peak white.

The shading and gradients of color on the dress indicate the image is not overexposed.

The black and white patterned fabric behind the dress in the lower left corner shows a black that is much darker than the darkest shade on the dress, indicating that the photo is properly exposed.

I’m speculating, but perhaps blue/black seers are more used to being outdoors and more often take photos that are overexposed. I also suspect that when they first viewed the photo, their eyes went directly to the dress and interpreted its likely color before assessing the background environment and other cues in the image. White/gold viewers may be the opposite. As for myself, when I first looked at the photo I spent a fraction of a second establishing the context and environment before parsing the dress itself. So, this may have something to do with differences between “whole scene” parsers and “object of focus” parsers.

How to see the dress the other way

To interpret the color of the dress as blue/black requires seeing the dress as extremely washed out or super bright with low contrast. To try to see the dress this way, look at the lower right corner of the dress where it is darkest and block the upper parts of the image with your hand. Imagine it is illuminated from the front by a searingly bright light focused only on the dress that is causing the colors to be faded and blacks to be lighter. I trick my eyes into seeing the image as a blue/black dress illuminated from the front by golden sunlight, interpreting the whitish pale blue as blown out shiny dark blue and the tan/brown as washed out black reflecting the gold hue of the bright light.

To interpret the dress as white/gold, concentrate on the black and white patterned fabric behind the dress in the lower-left corner of the photo. Cover up the rest of the photo except for the daylight in the upper-right corner. Imagine yourself walking outside on a bright sunny day in a shopping district. You walk inside a store and it is dark; you remove your sunglasses, but the store is still dark. As you walk further into the store, your eyes begin to adjust to the interior darkness and it appears normal. When you turn back toward the door, the daylight coming through the windows seems very bright compared to all the darker merchandise in the store. Now, look at the dress.