These two cameras were announced by Nikon just this month, and they appear to be nearly identical. However, the D800E costs around $300 more than the D800, and I don't know why, since the two seem to be exactly the same. I read this could be related to anti-aliasing, but I don't understand what that is. In any case, these two are now the highest-resolution full-frame-sensor cameras, at 36+ MP. Does anybody know why there's that price difference between them?

3 Answers

Whenever you digitize something, there is going to be some amount of information lost. When the original is reconstructed, that loss of information may lead to results that have little to do with the original signal. That applies to sound, electronic signals and to light patterns projected onto an imaging sensor.

As long as the things we digitize are larger (have a lower frequency) than the resulting digital signal, then the original can be reconstructed with at least decent fidelity. (The maximum frequency that can be faithfully digitized must be less than half of the sampling frequency. It might help to look at the Wikipedia entry for Nyquist frequency.)
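To make the Nyquist limit concrete, here is a minimal Python sketch (the frequencies and sample count are my own illustrative choices, not anything from the camera): a 7-cycle signal sampled at 10 samples per unit produces exactly the same samples as a 3-cycle signal, so the two are indistinguishable after digitization.

```python
import math

fs = 10.0              # sampling rate; the Nyquist limit is fs / 2 = 5
f_true = 7.0           # signal frequency, above the Nyquist limit
f_alias = fs - f_true  # 3.0 -- the frequency the samples will masquerade as

# Sample both waves at the same 20 instants.
true_wave  = [math.cos(2 * math.pi * f_true  * n / fs) for n in range(20)]
alias_wave = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(20)]

# The two sample sequences are identical: from the samples alone you
# cannot tell the 7-cycle signal from a 3-cycle one.
assert all(abs(a - b) < 1e-9 for a, b in zip(true_wave, alias_wave))
```

The reconstructed "3-cycle" signal is the one-dimensional analogue of a moiré pattern: structure that was never in the original.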

When we try to take digital samples of objects with fine patterns, like regularly-spaced lines, the sensor might not be able to keep up, and when the picture is reconstructed we wind up with a moiré pattern, which generally shows up as an area of false colours in a digital image. Instead of the fine pattern, you'll get a splotch of colour that isn't in the original, or lines running at opposite angles to the lines in the original pattern.

To get around the moiré problem, most small-format (full-frame 35mm and smaller) digital cameras incorporate an optical low-pass filter into the sensor assembly. Essentially, it's a filter that blurs the image somewhat so that there are no harsh transitions at a finer level of detail than the camera can accurately reconstruct from the sensor recording. The "ordinary" D800 works in exactly that way.
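A toy sketch of why the blur helps (the stripe pattern and sampling step are invented for illustration): sampling a fine stripe pattern directly produces false, coarser stripes, while averaging neighbouring points first, a crude stand-in for the optical low-pass filter, loses the fine detail but produces no false pattern.

```python
# A fine black/white line pattern: one line every 2 "scene" units.
scene = [0, 1] * 30

step = 3  # one photosite every 3 scene units -> the pattern is beyond Nyquist

# Sampling the raw scene yields an alternating pattern 3x coarser than the
# real one -- the 1-D analogue of moiré.
raw_samples = [scene[i] for i in range(0, len(scene) - 1, step)]

# Crude "optical low-pass filter": average each point with its neighbour
# (a 2-tap box blur) before sampling.
blurred = [(scene[i] + scene[i + 1]) / 2 for i in range(len(scene) - 1)]
lpf_samples = [blurred[i] for i in range(0, len(blurred), step)]

print(raw_samples[:6])  # [0, 1, 0, 1, 0, 1] -- stripes that aren't in the scene
print(lpf_samples[:6])  # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5] -- flat gray, no false pattern
```

The trade-off the answer describes is visible here: the filtered samples are "mushier" (everything collapses to mid-gray), but they no longer invent structure.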

With the sensor resolution now sitting at over 36MP, though, there are a lot fewer instances where the detail you are trying to record cannot be resolved and reconstructed accurately -- especially if you are working in a studio situation and can change things if you bump into the Nyquist limit and create moiré. (Changing the magnification to make the pattern larger so that it can be resolved properly, or smaller so that it doesn't really resolve optically due to limits of the lens, or changing the depth of field, are all ways of attacking the problem.) In order to get the maximum image resolution, then, it might be worthwhile forgoing the low-pass filter, as medium-format DSLRs (and a few high-end cameras, like the Leica M9) do.

Now, you might think that taking something out of the camera should cost less than putting it in, and you'd be right. The D800E doesn't exactly leave out the low-pass filter; it has a sandwich of filters instead. There is still a thin low-pass filter, but it's backed up by another thin filter that largely undoes the effect. That allows the cameras to be produced with the same basic tooling and tolerances. Leaving the low-pass filter out of the equation would make the sensor thinner, and require different mounting and alignment to keep the focal plane in the same position relative to the lens mount flange and the reflex mirror. The extra $200-300 for the modified sensor is probably a lot cheaper than a whole different tooling setup for the body castings.

The upshot is that the D800E should be able to take sharper and more detailed images, but it does that at the risk of creating moiré patterns in areas of fine detail. Both cameras may have the same number of pixels, but the D800's pixels will be "mushy" when compared to those from the D800E.

Nice explanation, but let's also keep "mushy" in perspective. 36 Mpix over 24x36 mm is 200 pixels/mm, or 100 "lines"/mm. The mushiness would be at that level, which is less mushiness than most lenses introduce.
– Olin Lathrop Feb 12 '12 at 15:59


Human eyes, by the way, don't have this problem, because our "photosites" are arranged at random rather than in a grid.
– mattdm Feb 12 '12 at 16:08

@mattdm: The below quote suggests that there is a LPF in the human eye lens. "The avoidance of aliasing is also important in the design of the eye. For example, in the human eye the lens filters out any spatial variations finer than 60 cycles/degree. The Nyquist theorem tells us in this case that the photoreceptor spacing should be at least 120 sample/degree, which is exactly what we have! " redwood.berkeley.edu/bruno/npb261/aliasing.pdf
– Unapiedra Feb 12 '12 at 16:20

And here: research.opt.indiana.edu/Library/imageProcessing/… (It says some aliasing occurs if the lens is bypassed, and on the whole system only on the periphery.) This would speak against the random arrangement thesis. Do you have a source for the random arrangement of cones in the human eye? I know it's not uniformly distributed but would be curious to know more.
– Unapiedra Feb 12 '12 at 16:34

The chance of moiré with the D800E is very low because of the high pixel density. It could still be a problem in, say, fashion photography, where fabric structures can be finer than the pixel pitch of the D800E.

So I'll go for the slightly more expensive D800E, and I will be curious to see how it compares with the Leica S2.

The D800E doesn't have an optical low-pass filter (LPF). The filter sits on the chip before the photosensitive diodes.

I suspect the D800E is more expensive because of its lower production volume (think an order of magnitude lower); maybe it is also more expensive because it targets very specialised users, and maybe even as a deterrent. The D800E will produce moiré or aliasing in almost all photos, and if you don't understand what is going on, you might blame the camera for that.

Aliasing

Wikipedia has some further explanation (WP:Aliasing), but the principle is rather straightforward: you have a signal (the image). The true signal has various frequency components, and you are now sampling it (putting it into pixels, with each pixel holding one value). From a single pixel you can't tell whether the gray value in that part of the image is constant or changing really quickly. The pixel value is simply the average of the true values over the area the pixel covers.
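That averaging step can be sketched in a few lines of Python (the `pixelate` helper and the toy intensity values are my own illustration): a pixel covering a rapidly alternating region reports exactly the same value as one covering flat mid-gray, so the fine variation is unrecoverable.

```python
def pixelate(signal, pixel_width):
    """Average a 1-D intensity signal over each pixel's footprint."""
    return [
        sum(signal[i:i + pixel_width]) / pixel_width
        for i in range(0, len(signal), pixel_width)
    ]

flat  = [0.5] * 8                  # constant mid-gray
rapid = [0, 1, 0, 1, 0, 1, 0, 1]   # detail finer than one pixel

print(pixelate(flat, 4))   # [0.5, 0.5]
print(pixelate(rapid, 4))  # [0.5, 0.5] -- indistinguishable from flat
```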

The sensor can thus only sample up to a specific frequency (the sampling frequency). The Nyquist criterion then says that you can only recover signal content up to half the sampling frequency. To avoid aliasing, you only let frequencies below that limit reach the sensor.

D800E vs D800

The D800 is a very expensive, professional-oriented device, so I am clearly not giving advice on which one you should buy. (The answer is neither, if you have to ask.) But I can say why the D800E exists. Virtually all small-format cameras on the market include an LPF. The D800 has 36 million pixels. That is so many that one could decide the anti-aliasing filter isn't needed in hardware and instead deal with any artefacts in software.

There are a number of medium format digital cameras with lower pixel counts than the D800 that don't have an OLPF (the Hasselblad H4D-31, the Pentax 645D, the Leica S2, the Phase One/Leaf 22, 25 and 33MP backs, and so on). Moiré isn't always an issue; and when it is, there's a new tool in the version of CaptureNX included with the E model to deal with it. And yes, there is still an OLPF splitter in the D800E, but it's paired with a recombiner immediately behind it.
– user2719 Feb 12 '12 at 15:13