Topic: That famous AA filter... (Read 26709 times)

I was thinking about the AA (Anti Aliasing) filter. DSLRs have it while MFDBs don't. I presume that DSLRs have it for some reason, especially as they are not exactly free. Would be nice to have some explanation about the need for AA filtering.

I do understand so much:

1) AA filters are needed because the Bayer pattern on the sensor will produce color moiré if the lens resolves above the Nyquist limit.
2) AA filters do reduce sharpness.
3) Some sharpness can be regained using sharpening techniques.

1) AA filters are needed because the Bayer pattern on the sensor will produce color moiré if the lens resolves above the Nyquist limit.

It is not the Bayer interpolation that 'produces' aliasing effects, although it may accentuate them. It is the nature of the digital sampling process per se. Color moiré is the most obvious aliasing effect, but not the only one you can experience. So-called 'false resolution' (i.e. false detail) is another.
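A quick numerical sketch of that 'false resolution' effect (plain NumPy, hypothetical numbers, nothing camera-specific): a row of pixels sampled at 100 samples per unit length simply cannot tell a 70-cycle pattern from a 30-cycle one, so detail beyond Nyquist records as coarser false detail.

```python
import numpy as np

fs = 100.0                 # sampling rate of our hypothetical sensor row
f_fine = 70.0              # detail beyond the Nyquist limit (fs / 2 = 50)
f_alias = fs - f_fine      # 30 cycles: where the fine detail folds back to

n = np.arange(200)
t = n / fs
fine = np.cos(2 * np.pi * f_fine * t)
coarse = np.cos(2 * np.pi * f_alias * t)

# The sampled values are identical: the sensor records the 70-cycle
# pattern as a false 30-cycle one ("false resolution").
assert np.allclose(fine, coarse)
```

Note that no colour filter array is involved here at all; this is pure sampling, which is the point above.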

« Last Edit: December 31, 2008, 05:31:21 AM by NikosR »


Nikos

Slough

Quote from: Erik Kaffehr

I was thinking about the AA (Anti Aliasing) filter. DSLRs have it while MFDBs don't. [...] I would appreciate some good technical comments.

1) To add to the previous post, moiré occurs due to the discrete sampling of the image by a regular array of pixels. When the image also contains a pattern, e.g. a checkerboard design on a fabric, and the spacing of the pattern is close to that of the pixels, something odd happens and you get a moiré pattern. You can understand it if you imagine photographing a grid pattern where the spacing between the grid lines on the sensor is just a bit bigger than the spacing between pixels. The Bayer matrix introduces another complexity, which means you get colour effects too.

3) Not really. Once resolution has been lost, it has been lost. Sharpening accentuates edges, which increases perceived sharpness.

Personally I would rather have no AA filter, since my interest is nature, and regular patterns are rare, except perhaps insect eyes.

If you have edges, you have high detail. Regular high detail shows up as moiré when you don't have an OLPF. With Bayer-pattern sensors, you can also get spurious colours on any sharp edge.

A zone plate is "wonderful" for showing aliasing artifacts. Here we see chroma aliasing, moiré and Bayer-reconstruction artifacts on a 1D Mk III. It's not often that you get such patterns in real life, but you can see the effects in all sorts of images and subject matter. The stronger the OLPF, the fewer of these artifacts you'll see, but the softer the overall image will be.

It's a complex issue, and one where there are many factors at work. At one extreme, you have no OLPF at all, which practically guarantees artifacts in most in-focus photography where sharp edges are visible. At the other extreme, you could filter the image so that you never, or practically never, see any artifact from aliasing at all. The image will be softer, though.

So it's a balancing act that the camera designer needs to work with to ensure the end result is suitable for the intended use of the camera.

But it's also worth remembering that, just as an image lacking specific details can never have those details magically added afterwards through clever post-processing, once aliases contaminate an image they cannot easily be removed, and the original underlying image certainly cannot be restored.

Quote from: Erik Kaffehr

I was thinking about the AA (Anti Aliasing) filter. DSLRs have it while MFDBs don't. [...] I would appreciate some good technical comments.

The main reason why DSLRs have an AA filter and MFDBs don't is that people using MFDBs are usually professionals while DSLR users usually aren't. People using MFDBs almost exclusively shoot RAW and post-process their shots, so they can apply a low-pass software filter where a moiré effect is visible while retaining maximal sharpness where there is none (with JPEG, the compression is lossy and post-processing with a low-pass software filter might not be possible). They can also try different demosaicing algorithms; some of these are quite effective at reducing moiré, but cannot be implemented in-camera because they are too slow or require too much memory.
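To make that "low-pass software filter" concrete, here is a minimal sketch (plain NumPy; the function names and the BT.601 matrices are my choices, not any raw converter's actual code) of blurring only the chroma planes, so colour moiré is suppressed while luma detail is left untouched:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    # BT.601 full-range luma/chroma split (one common convention)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.stack([r, g, b], axis=-1)

def box_blur(chan, radius=2):
    # separable box blur, reflecting at the edges
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    pad = np.pad(chan, radius, mode='reflect')
    tmp = np.apply_along_axis(lambda v: np.convolve(v, kernel, 'valid'), 1, pad)
    return np.apply_along_axis(lambda v: np.convolve(v, kernel, 'valid'), 0, tmp)

def suppress_chroma_moire(rgb, radius=2):
    """Low-pass only the chroma planes; luma (the detail) is untouched."""
    y, cb, cr = rgb_to_ycbcr(rgb)
    return ycbcr_to_rgb(y, box_blur(cb, radius), box_blur(cr, radius))
```

Real converters are cleverer than this, but the underlying trade - giving up chroma resolution, which the eye barely notices, to kill colour moiré - is the same.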

However, there are DSLRs that do not have low-pass filters. The quite famous and unsuccessful Kodak DCS-14n was one, and the Leica M8 (not a DSLR, but not an MFDB either) doesn't have one (though Leica had little choice on this issue). So DSLR manufacturers do not all have the same policy regarding low-pass filters; some have even used anisotropic low-pass filters (not the same strength in the horizontal and vertical directions) - Pentax, I think, did it at least once.

If you can read french, there is an explanation of the demosaicing process, moiré effect and low pass filter here.

Thanks for the French article. It also has links to two English-language articles related to the issue. Unfortunately I don't read French, but I can nevertheless grasp the idea.

Erik

Quote from: james_elliot

The main reason why DSLR have an AA and MFDB don't is mainly because people using MFDB are usually professional while DSLR users usually aren't. [...]

I hoped that you would "chime in" on this discussion! Thanks for posting!

As far as I understand, modern interpolation algorithms can reduce aliasing, but they may demand a great deal of computing power. On the other hand, I would suggest that it should be possible to regain most of the information lost to OLPF filtering, if the PSF (Point Spread Function) is known, using deconvolution.

Best regards, Erik

Quote from: Graeme Nattress

If you have edges, you have high detail. Regular high detail shows up as moire when you don't have an OLPF. [...] once aliases contaminate an image, they can not be easily removed, and certainly the original underlying image cannot be restored.

Quote from: Erik Kaffehr

As far as I understand modern interpolation algorithms can reduce aliasing [...] it should be possible to regain most of the information lost due to OLPF filtering, if the PSF (Point Spread Function) is known, using deconvolution.

The problem of using deconvolution to retrieve information lost to low-pass birefringent optical filters has already been tackled, but I don't think it is in the mainstream of demosaicing research. I just added two more articles to my web site. One of them (an issue of Electronic Imaging) might be of interest to you.

Once aliasing corrupts a signal, there's no real way to get back to the original signal; it's corrupt and gone. Chroma moiré in a Bayer-pattern sensor can be removed to an extent by using uncorrupted luma as a guide to reconstruct a reasonable facsimile of what the chroma would have been, and this can work quite well indeed. Removing luma aliasing really doesn't work, because if the aliases are bad enough they "fold back" into the image at a low frequency. That low frequency has a long wavelength and hence shows up as large-area artifacts that are hard to detect and remove convincingly.

That is why we always look to stop aliasing artifacts entering a system before they have a chance to corrupt the precious signal.

It is very processing intensive to deconvolve the OLPF, but certainly possible, whereas no such process exists for de-aliasing an aliased image. As mentioned in the links above, chroma aliasing can be effectively "dealt with" if you're clever in the demosaicing.
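As a rough illustration of what "deconvolving the OLPF" means, here is a 1-D Wiener-deconvolution sketch. The 3-tap blur standing in for the filter's PSF is entirely made up for the example, not any real camera's filter:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=100.0):
    """Frequency-domain Wiener deconvolution for a known, shift-invariant PSF."""
    H = np.fft.fft(psf, n=len(blurred))        # transfer function of the blur
    B = np.fft.fft(blurred)
    # Wiener filter: conj(H) / (|H|^2 + 1/SNR); the 1/SNR term keeps noise
    # from exploding where H is small.
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(B * W))

# Toy example: a known 3-tap blur applied (circularly) to a sharp signal.
# Note this PSF has no exact null; where a PSF does have a null (e.g. a
# birefringent pair at one-pixel spacing nulls Nyquist exactly), the
# frequencies at the null are gone for good and no SNR setting recovers them.
rng = np.random.default_rng(0)
signal = rng.standard_normal(256)
psf = np.array([0.2, 0.6, 0.2])
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(psf, 256)))

restored = wiener_deconvolve(blurred, psf, snr=1e6)
assert np.max(np.abs(restored - signal)) < 1e-2
```

The assumptions (PSF known exactly, blur shift-invariant, noise well behaved) are exactly where real-world attempts get hard, which is presumably why this hasn't become mainstream.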

Now, where I get concerned is not so much the still-image situation, but when images move. Aliasing and moiré are more visible in motion than in stills, because they move in the opposite direction to the movement of the object or camera. This causes all manner of issues down the distribution chain, which often relies on motion-estimation-based compression.

Maybe someone knowledgeable can comment on the significance lenses might have in the decision of DSLR manufacturers to use AA filters while MFDB manufacturers by default don't. A lens can effectively act as an AA filter, depending on the relationship of its transfer function to the sampling frequency (i.e. pixel pitch).

If the sensor is of such high resolution that the lens cannot deliver significant contrast at that resolution, then an optical low-pass filter is not needed. However, that does not mean you'll get a sharp image at the pixel level, as that only really occurs when you allow aliasing into a system. But you won't be limiting the resolution of the system further than necessary. It's pretty obvious to me that higher and higher resolutions are probably the way forward. It makes sense to take resolution beyond what is strictly necessary so we can properly over-sample.

My favorite explanation is by Graeme Nattress: when the lens resolves more detail than the sensor can capture and that extra detail is *not* filtered out by an AA OLPF, you're trying to fit more resolution than the pixel dimensions can handle. In real life, when you put more than a pint of beer in a pint pot, the beer spills out and makes a mess. In image processing, the beer folds back on itself and corrupts the image - aliasing, but just another mess of beer really. Aliases are spurious false data not correlated with the actual detail in the image. They're effectively noise - extra "detail" that's not actually part of the image.

I would add that a lot of photographers like aliasing artifacts. Most that shoot Sigma Foveon, for example, call the images "crisp", "sharp", etc., but what they're actually seeing is aliasing. You can even get aliases with a camera that has an OLPF, by just downsampling with a bad algorithm, such as nearest neighbor.
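The downsampling point above is easy to demonstrate numerically (a plain NumPy sketch; the brick-wall filter is chosen for clarity, not because any resizer works exactly this way). Decimating without a low-pass filter manufactures false detail, while filtering first simply discards what the smaller image cannot hold:

```python
import numpy as np

n = np.arange(512)
# A grating at 180 cycles / 512 samples: fine, but resolvable at full size
fine_detail = np.cos(2 * np.pi * 180 * n / 512)

# "Nearest neighbor" 4x downsample: just keep every 4th sample.
# 180 cycles over the new 128 samples exceeds the new Nyquist (64) and folds back.
naive = fine_detail[::4]

# Proper downsample: low-pass below the new Nyquist first, then decimate.
spectrum = np.fft.rfft(fine_detail)
spectrum[64:] = 0                    # brick-wall filter at the new Nyquist
proper = np.fft.irfft(spectrum, 512)[::4]

assert np.abs(naive).max() > 0.9     # strong false low-frequency pattern
assert np.abs(proper).max() < 1e-6   # the unresolvable detail is simply gone
```

The naive result looks "crisp" for exactly the reason described above: it is full of aliases, not real detail.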

To me, aliasing looks fake and computer-generated, like fonts that are not anti-aliased. Correctly filtered images appear more natural and desirable to my eye.


--Daniel

Slough

It is very processing intensive to deconvolve the OLPF, but certainly possible,

Is that possible in the sense that it has been done with a commercial camera, or in the sense of being theoretically possible but not proven? My own experience with several commercial deconvolution software tools is that the results are not worth bothering with. Of course these tools might have been pants, especially given the cost, though they did take a long time to process. It is possible that adding in knowledge of the specific camera OLPF would improve the results since these were generic tools. Not knowing anything about the underlying physics of an OLPF, I wonder whether the blurring is due to a random feature of the OLPF, in which case surely deconvolution would not work.

Curiously, I have just read that, according to one source anyway, Nikon apply additional blurring between image capture and RAW file creation, using some hardware/software algorithm. As to whether or not this is accurate information ...

I have a Kodak SLR/n which does not have an AA filter. I had to stop using it for jobs because moiré was a big problem, and I have not seen any program that can remove moiré completely. I also have a couple of Canons with AA filters, and moiré is no longer a problem. There are buts, though: I think the Kodak images, despite fewer pixels and less detail, have more pop. The reason, I think, is that the AA filter degrades the picture and levels the field across the frame between sharp and unsharp areas; the non-AA pictures hold that differential better. Just my opinion, not from any specific tests.

The OLPFs I've used are the birefringent-crystal type: a constant and even attenuation of high frequencies, with no random factors, so they should be de-convolvable. I'm not an expert on this type of processing, so I don't know who's doing it, or whether anyone is doing it specifically for an OLPF, but it's certainly doable.
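For what it's worth, a single birefringent plate that displaces the image by one pixel behaves like convolution with a two-point kernel, which is easy to model. This is an idealized 1-D sketch, not any specific camera's filter:

```python
import numpy as np

# One birefringent plate splits light into two spots one pixel apart:
# equivalent to convolving the image row with the kernel [0.5, 0.5].
psf = np.array([0.5, 0.5])

# Worst-case detail: an alternating pixel-level pattern (the Nyquist frequency)
nyquist_pattern = np.array([1.0, 0.0] * 8)
filtered = np.convolve(nyquist_pattern, psf, mode='valid')

# The filter averages each pair of neighbours, wiping the pattern out entirely
assert np.allclose(filtered, 0.5)

# Lower frequencies are only attenuated: |MTF(f)| = |cos(pi * f)| for this
# kernel, falling smoothly from 1 at DC to an exact zero at Nyquist (f = 0.5).
freqs = np.linspace(0, 0.5, 6)
mtf = np.abs(np.cos(np.pi * freqs))
```

The exact zero at Nyquist is the one caveat for deconvolution: everything else is a fixed, deterministic attenuation that can in principle be inverted, but whatever lands on the null is gone.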

Aliasing does give an effect on some edges that people like. What happens is that the detail that cannot be sampled gets folded back into the image and records as false detail. To some eyes, this can look good.

However, a similar look can be achieved by another sampling process often used in image processing - downsampling an image - by choosing an appropriately "aliasy" downsample filter.

telyt

... However there are DSLR that do not have low-pass filters. The quite famous and unsuccesful DCS-14n was one, and the leica M8 (not a DSLR, but not an MFDB either) hasn't either (but Leica had not much choice regarding this issue).

Maybe someone knowledgeable can comment on the significance lenses might have on the decision of dSLR manufacturers to use AA filters while MFDB manufacturers by default don't. [...]

I think the decision regarding DSLRs is simply marketing. A prime example was Kodak's 14n. All previous Kodak DSLRs were sold either without AA filters or with AA filters as an option. Those were radically high-priced systems compared to today's DSLRs, and users were pretty much exclusively serious pros, as they are with current medium format backs. When Kodak brought out the 14n at a much lower price point than previous cameras, they aimed at a completely different market but kept the no-AA-filter approach. The camera was blasted in early reviews because of aliasing and moiré issues. Those of us who had used many previous Kodaks knew how to deal with it and loved the camera. New users hated it and panned it royally in various reviews. It was the final nail in Kodak's pro camera coffin. The majority of buyers of most current DSLRs are not pros... and even most of the pros using DSLRs would be put off by having to constantly deal with color moiré issues if their routine subject matter was the sort that often caused it (bridal veils?).

The higher the resolution of the camera, the less an AA filter is needed. My first Kodak, a DCS410, was 1.5 megapixels with no AA filter. You got color noise on just about anything you pointed it at. Training on that thing made the sort of noise a 14n produced child's play to deal with. I still prefer the look of a no-AA-filter camera. I can deal with the few instances where it poses a problem. I'd like to see it offered as an option on more DSLRs, but I'm probably an oddity in the market. They're not designing cameras for me.

The problem of using deconvolution methods to retrieve information lost by low-pass bi-refringent optical filters has already been tackled, but I don't think it is on the mainstream of demosaicing research.

Do you have a reference for that? I'd be interested.

Quote from: Daniel Browning

I would add that a lot of photographers like aliasing artifacts. Most that shoot Sigma Foveon, for example, call the images "crisp", "sharp", etc., but what they're actually seeing is aliasing. [...]

The term "aliasing" has two common usages. One is the stair-stepping of diagonal lines, also called "jaggies". The other, which is what I believe Graeme is referring to, is the shifting of the spatial frequency of a signal by a multiple of the sampling frequency k_max, due to the fact that the sampling cannot distinguish signals of frequency k and k-k_max when k>k_max. For some pretty pictures, see
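That indistinguishability can be checked directly. Keeping the post's notation (k_max the sampling frequency, samples taken at positions n/k_max for integer n):

cos(2π (k − k_max) n / k_max) = cos(2π k n / k_max − 2π n) = cos(2π k n / k_max)

since cosine is 2π-periodic and n is an integer. The two frequencies produce literally identical sample values, which is why no amount of post-processing can tell them apart.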

I would think the main reason for the AA filter is to combat moiré and color aliasing. Though the latter can be mitigated by post-processing techniques, the former is quite nasty and nearly impossible to back out, since the information that would allow one to determine that an oscillation of luminance at frequency k-k_max really should have been at frequency k is irretrievably lost.

On the other hand, there is substantial room for improvement in demosaicing algorithms. Many of the ones I see break down badly on texture near the Nyquist frequency, and if they don't produce moiré as a result, they introduce maze artifacts and other structures that are just as bad. AA filters help here, but I think there is progress to be had on the processing side.

I think the decision regarding DSLRs is simply marketing. [...] I still prefer the look of a no AA filter camera.

I have to agree. I started with the DCS 760, and it could alias quite badly pointed at the right subject, but much of the time it was not a huge problem, if at all. With some of the later converters like ACR it was rare to see moiré at all, and simple to deal with. The 14n wasn't so bad either but had other issues. The 14nx/SLR/n was an improvement and I rarely saw moiré - and I got moiré in similar places (always material) with a 1Ds2, much to my surprise. With a 1Ds3 (AA filter) and an Imacon 22MP back (none), the back produces more detail thanks to no AA filter, plus some moiré, while the 1Ds3 produces very slightly less detail and no visible moiré; it's debatable how useful that tiny edge of detail is. I have wondered if the AA filter in the Canons doesn't sometimes help hide noise a little as well.

http://www.maxmax.com/hot_rod_visible.htm
You can also get a few models converted to non-AA by this company. Given how tiny the artifacts are in the new 24-25MP cameras, I think it might be a good "upgrade" once they start to support these newer models. (Wish they'd hurry up!)

Note - the Fuji S5 Pro with this modification seems to be a great performer, since its sensor doesn't really need an AA filter as it already brackets and blends the shot. Oh - speaking of which, using a program like the Zero Noise software mentioned on this site in a couple of threads, you can bracket a shot on a camera without an AA filter and pretty much eliminate all of the ugly problems.