From time to time I have used Magic Filters, and I have been following the posts by James and AndreWebbs on using filters in Little Cayman: http://wetpixel.com/...mp;#entry130769

I would like some opinions on the various advantages and disadvantages of filters vs. post-processing. I like to dive with strobes, even in shallow clear water, so that I have the option to use them for fill light etc. if I choose. If I commit to an on-camera filter, the filter often messes with the nicely balanced colour my strobes usually provide. I show an example of this from when I forgot to remove my Magic Filter from my 10.5 at Shark Shootout last year.

I then show a couple of before-and-after examples, taken at Sting Ray City, of simple post-processing using the white balance tool in Aperture and choosing a white point with the eyedropper tool. These post-processing results look similar to what I might have obtained with a Magic Filter, I think. I would venture that one could obtain similar results, or any desired result, by simulating the desired colour filter in Photoshop, Aperture, etc.

I see two disadvantages of filters: first, one is committed for the duration of the dive with any in-housing filter; second, a filter, by adding another layer of glass, gelatin or whatever, will increase exposure and possibly degrade the quality of the image, however slightly. Are filters necessary now that we have Photoshop etc.? Does anyone have any problem with my reasoning, contrary opinions, or ideas on advantages of filters over post-processing, apart from the fact that they can of course save the post-processing step?

Possibly Alex will weigh in on this.

Regarding your two listed disadvantages of filters, the first is no different than many of the other things we commit to for an entire dive. Yes, internal filters can't be changed during a dive but neither can the lens. Some rigs, notably video and PnS setups, have filters that are removable during the dive. Regardless, this is a consideration that we are accustomed to.

The second I mostly disagree with. Yes, a filter is potentially an image-degrading element, but it is also an image-enhancing element or you shouldn't be using it. Saying that it causes an increase in exposure is a matter of perspective. A filter does remove light, but only the light that you don't want. In other words, a filter enables your camera to achieve proper exposure on the light you do want, whereas PP simply compensates for poor exposure in some color channels. All that light that is making your exposures easy is your enemy, not your friend.

At a high level, you should never plan on using postprocessing to fix exposure problems in images, and bad color balance IS ultimately an exposure problem. Every stop of postprocessing used is one stop of performance degradation imposed on your camera and, while today's cameras are pretty good, there are limits to how far you can push them before image quality suffers. In practice, relatively minor color balance issues are not worth the trouble to fix in any other way than post. There is no substitute for getting the light correct going into your lens!

Filters aren't just for fixing color balance, either; they offer creative opportunities in mixed-light situations. When we shoot wide angle with strobes, filters offer us the opportunity to control background color independently of our foreground subjects. There is a common belief that filters and strobes are incompatible, but that is untrue. Many color-balancing filters don't have convenient complementary strobe filters, but it is possible to assemble useful complementary sets, as James has recently demonstrated in Little Cayman.

Finally, I wouldn't say that filters save any postprocessing steps. Underwater shots require some care in post, and the amount of work is basically the same in either case. If what you are doing in post is massive color correction followed by coping with huge amounts of resultant color-channel noise, then I suppose that is so, but that should be evidence that you should have been using filters in the first place!

I think you miss my point; post-processing should allow you to accomplish everything a filter does, just at a different stage in final image creation, and neither is necessarily better. By its very nature, a filter will actually remove some of the light reaching your sensor, whereas doing it in post allows you to control precisely how you want your image to look. Furthermore, working in raw will allow you to make non-destructive changes to your image. My point is that I see no real advantage in using a filter vs. post, unless one again makes the 'purist' argument that a photograph captured exactly at the time of exposure is somehow superior to one manipulated in the digital or traditional darkroom. Your argument that post-processing somehow compensates for poor exposure in certain channels makes little sense. One is simply capturing all the available light at the time of exposure and then manipulating it in post, rather than capturing just the light you want at the time of exposure and filtering out the rest.

Postprocessing and filtering do not do the same things. For relatively minor adjustments, either can be used with comparable results as you have shown. Filtering works by selectively removing light so if a filter is working the way you want then it is removing light that you don't want. That is a positive thing, not a negative one as you have asserted.

In an ideal world, digital cameras would have infinite dynamic range, infinite resolution, true color representation and no quantization loss. In that naive but common view of how digital works, postprocessing can do everything that filtering can without penalty. In that sense you are correct but that is not what we have.

What really happens in a digital camera is that the image is quantized in intensity and color as well as in two dimensions. Cameras are of limited dynamic range and have a remarkably limited color system modelled after the deficiencies of our eyesight. A huge amount of information is irrecoverably lost in the digital conversion of the image. When the plan is to do large color corrections in post, as you are suggesting, rather than optically using filters, what you are doing is further discarding what data is left. For minor adjustments that isn't a problem but for major ones it is.
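The tonal cost of a large digital correction can be sketched with a little arithmetic (hypothetical 12-bit numbers, invented for illustration, not taken from any particular camera): a channel recorded 4 stops down only ever occupies the bottom 1/16 of the code range, and multiplying it back up in post cannot create tones that were never captured.

```python
# Sketch (hypothetical numbers): a 12-bit sensor has 4096 linear codes.
levels = 4096

# A properly exposed channel can use every code.
properly_exposed = set(range(levels))

# A channel underexposed by 4 stops records only the bottom 1/16 of the
# range, i.e. codes 0..255.
underexposed = {v // 16 for v in range(levels)}

# Multiplying by 16 in post rescales those codes but adds no new ones.
pushed = {v * 16 for v in underexposed}

print(len(properly_exposed))  # 4096 distinct tones available
print(len(pushed))            # only 256 distinct tones survive the push
```

The surviving tones are also spread out over the full range, which is exactly the banding and posterization described below.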

Looking at this another way, let's say we are shooting an ambient-light shot at moderate depth without filters. In this arbitrary, hypothetical case, we may have red light suppressed 6 stops from the surface, yellows suppressed 3 stops, greens at full brightness, and blues 1 stop down. The photographic result depends on the subject matter, but it's easy to see that the red channel in our sensor may well be 4 stops underexposed, the blue channel one stop underexposed, and the green channel properly exposed, assuming we nailed our exposure. Of course, we may well be tempted to underexpose the water and make matters even worse. If this sounds far-fetched, I assure you it is not. Light conditions at 60'/20m are worse than this!

Hopefully you can see from this example that what you are doing without filters is severely underexposing reds in order not to blow out greens, then cranking up reds in post in a desperate attempt to color balance after the fact. You should know that underexposing 4 or 6 or 8 stops results in a totally unusable image, yet this is EXACTLY what you are doing when shooting unfiltered ambient light anywhere but the shallowest water. The result will be heavy chroma noise and banding/tonality problems. You will also get unreliable color due to metamers.

Instead, ideally you want to expose reds, blues, and greens as fully as you can by suppressing the overabundant greens (and, to a lesser extent, blues, cyans, oranges, and yellows) so that all sensors "fill up" regardless of color. You do this by using the proper filters. With a good filter matched to the type of water, time of day, and depth of shooting, you can achieve surprisingly good color, although you may need a high ISO to achieve it. Video shooters with the right gear can get great color in excess of 100 feet because those cameras have remarkably powerful white balance systems. Post-processing cannot do that because it is working with the data that's left, not all the data there was.

As I said before, filters also provide creative options in mixed-light shooting not easily achieved in post processing. Some people, myself included, prefer blue water and strongly desire to suppress greens in the background. Others may choose to suppress background color, enhance background color, or perhaps even enhance the green water in unique locations. All this is possible with complementary filtering. It is essentially impossible with postprocessing. These techniques do not require matching filters to conditions, but they do require matching filters on lens and strobe. Not all strobes, or lenses, take filters well.

Hopefully you see that I didn't miss your point. Filters ARE "necessarily better" than post, but they are not desirable or appropriate for all conditions. It is hard to justify the use of filters for macro, for low-light black-and-white imaging, for shallow water, or for wide angle where the color the photographer gets without filters is simply what he wants. Filters are a useful tool to have in the toolbox, not always interesting to photographers, but one for which there isn't a substitute. No videographer would do without them.

Thanks for that. I understand your point at depths where exposure essentially has to be 'pushed', let's say in the reds, to create our desired image, to the extent that the remaining colours would be blown out of the sensor's range if no filter were used. Effectively, then, you are increasing exposure to compensate for the light that has been filtered out... true.
However, when dealing with images similar to those I have shown (15 feet here, for example), it appears to me that as long as I stay within the dynamic range of the sensor, acceptable ISO parameters, etc., I should be able to obtain similar results with filters or post-processing. Where I may benefit from a filter in these circumstances, though, is with high-dynamic-range images, where I actually want to selectively reduce the amount of light hitting the sensor.

The tools we use for controlling light underwater are pretty blunt instruments. The effects of 15 feet of blue water are relatively mild compared to what we frequently deal with, and a good filter for that application (probably a CC20R) is too weak for many to bother with (depending on your goals). If I were doing a dive strictly for 15-foot ambient shots and I had a filter, I'd use it, but I understand why others wouldn't. The choice between filters and PP for these sub-2-stop problems is legitimate; it's the bigger adjustments where filters shine.

I haven't looked at the Magic filters, though I intend to. Choosing a complementary filter for them may be difficult. What makes a good ambient-light filter, though, is not necessarily what you want for mixed-light shots. The advantage of a Magic filter with complementary strobe filters is being able to do both kinds of shooting at the same time. In that case, you still have to decide how much warming you want to apply to your background. Matching complementary filters isn't done from a fixed formula!

OK Craig, another question for you, as you obviously understand this very well.
When shooting RAW, one is still able to adjust white balance in camera. My understanding of white balance control in camera is that one is adjusting the relative gain of the various color channels to each other.
In that context, then, one should be able to achieve the same effect as filters, to some extent, by adjusting white balance to be more sensitive to the colors that are missing at depth. Say, turn the WB dial to a higher K number underwater to compensate for the relative increase in blue, or leave it on auto? One would also not suffer the penalty of increased noise etc. in the other channels by doing this (again, only to a limited degree). Set me straight here.

The white balance settings do not affect the data in a RAW capture. They will affect the image review on the screen, possibly the histogram displayed, and the setting will be embedded as metadata. It is possible that white balance could be tied into the analog ISO circuitry and therefore have the effect you describe, but I'm not aware of any dSLR that does it that way.

White balance works by manipulating the digital data after capture, so it has the same effects as any other manipulation in post. When it is applied during RAW conversion, it occurs before the color-space conversion, so there is a theoretical benefit, but the difference is slight. Our goal is to get our colors as correct as possible before the conversion to digital.
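A minimal sketch of that model (assumed for illustration; real raw converters work in more elaborate color spaces): white balance is essentially a per-channel multiply applied to data already captured, so it can rescale a starved channel but cannot recover detail that was never recorded or that clips on the way up.

```python
def apply_wb(rgb, gains, max_val=4095):
    """Per-channel gain with clipping: a toy model of white balance."""
    return tuple(min(int(v * g), max_val) for v, g in zip(rgb, gains))

# A 12-bit pixel whose red channel was starved by the water.
raw_pixel = (200, 3000, 2500)

# A large red gain "corrects" the balance, but the red data is still
# just the original 200 recorded levels stretched out.
print(apply_wb(raw_pixel, (8.0, 1.0, 1.2)))  # (1600, 3000, 3000)

# Gains can also push a channel into clipping.
print(apply_wb((1000, 1000, 1000), (8.0, 1.0, 1.0)))  # (4095, 1000, 1000)
```

This is why the filter route is different in kind: it changes which light reaches the sensor, while this multiply only reweights what was already stored.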

As a practical matter, white balance controls are somewhat limited in their range in dSLRs. When you shoot underwater, you quickly get filtering effects too severe for the white balance systems to cope with. It's not that they couldn't (and video systems are better in this respect), it's that the software engineers that designed these white balance systems aren't thinking of that situation.

Wow, this is a great discussion about filters! Loftus has brought forth many of the basic misconceptions about filter photography and Craig has addressed them very well. Thanks Craig for taking up a lot of time writing this all down. I appreciate it too.

My view about filters is a simple one - they keep my sensor from getting overwhelmed by blue and green light resulting in more of the "good stuff" getting mixed into the final image. When you take a photo w/ a dark reef and a sunball in the same photo, the information you capture of the dark reef is limited by the blow-out factor above.

Now I've done many dives w/ complementary filters, a few dives w/ the magic filter, and many w/ no filters. I like bright dramatic colors in the foreground so I'll probably stick w/ the complementary setup unless I start shooting different subjects.

The Magic filter (although it has magenta in it) has a quite different filtering effect from an M30CC and therefore could not be used in the same way to affect water colour and control cyan burn-out and halos.

The Mf is designed to work with the camera's WB for available-light shooting and is not designed to affect water colour like this.

To quote from the Mf website:

The other question we get asked a lot is: what is the ideal complementary strobe filter for use with the Magic? In other words, what is the opposite filter to the Magic filter, which I could fit on my strobes to give spectrally balanced flash fill?

The answer is there isn't one! And I didn't realise this until I tried it. The correct complementary strobe filter for the Magic filter actually has to be the opposite of the combined effect of the Magic filter and the WB adjustment made by the camera. And of course this varies considerably with depth.

In other words, the correct flash filter must spectrally balance the strobe light with the ambient light at the depth you are photographing. And therefore the technique becomes very impractical because you would need to change the strobe filter each time you changed depth.

If you want to experiment with this technique I would recommend a strong blue filter, one of the best being a 38A, which should work at around 10m depth.

If you want to try complementary filters, I would suggest starting with the ones that James used.

Good question Arnon. As Alex has said, his particular "mix" of filters is not made for complementary filter photography.

If you read Craig's original article you can see that he recommends adding a warming filter to the magenta. If you mix two discrete filters like that it may be possible to find a good complement. I haven't tried it yet but I think Craig is doing some experiments now.

* Great point that a filter subtracts light that, regardless of color, the camera needs to account for in deciding exposure.

HOWEVER...

* A filter is a one-trick pony. Unless you plan to shoot at the same depth for the whole dive, the filter is either subtracting too much or too little of the "wrong" light. In a perfect world, the camera would be able to introduce a variable-density colored filter based on light readings. But that's a bit tricky.

Maybe someone can build one on a rotating wheel that would sit in front of the lens... then you could dial-a-filter...

While it's true that a filter is only optimal for one depth or configuration, depending on what you are trying to do, if the range in which you shoot spans a 4-stop variation in white balance, then a filter that corrects two stops allows you to split the difference rather than leave the camera optimized for the surface, where you rarely shoot.

When I shot video, I typically used a filter just weak enough that I could white balance at the surface. With the cameras available at the time I could use the same filter and white balance successfully to 60 or 70 feet. Without the filter, 30 feet would be the limit.

Of course, all this talk assumes it's easily done. In reality, water has continuously varying effects, and choosing an optimal filter is an impossible dream. It's more reasonable to think in terms of using filters in a manner that improves results rather than maximizes the potential of the equipment.

In a perfect world, the camera would be able to introduce a variable-density colored filter based on light readings. But that's a bit tricky.

In a way, that's what white balance is of course. It's conceivable that the analog gain stages could be varied for each color but then WB would impact RAW and it would only achieve maximum benefit on a camera like the 5D that has a range of ISOs that perform more or less identically. For a camera like that I'd prefer more ADC resolution than analog white balance. That way I get the extra information back into the RAW file. :-)

I don't have the depth of experience that James and Craig do, and I have learned a lot from Craig's posts (along with Alex's). I have had some success with compensating and correcting filters, and lately with the Magic Filter.

It is interesting to me, and may be to others, that the use of filters to achieve light balance optically (rather than in PP) is related to the "expose to the right" idea (discussed extensively at Luminous Landscape, among other places). Not everyone agrees with this concept; I have found it to help a lot with lots of images.

Basically, the problem is that the sensor is linear, whereas perceptual vision (the eye) and editing (e.g., PS) are logarithmic. What that means is that the brightest stop in the camera covers fully half of the digitized levels in the editing image. For a 12-bit raw, that's 2048 of the available 4096 tones! Or, put another way, the brightest stop gets 2048 levels of representation, and the lowest useful stop (typically 4 stops dimmer) gets but 128.
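The per-stop arithmetic is easy to check: in a linear 12-bit raw, each stop down from the brightest occupies half as many codes as the one above it.

```python
bits = 12
total = 2 ** bits  # 4096 codes in a 12-bit linear raw

# Codes available to each of the top five stops, brightest first:
# each stop down gets half the codes of the stop above.
per_stop = [total // 2 ** (n + 1) for n in range(5)]

print(per_stop)       # [2048, 1024, 512, 256, 128]
print(sum(per_stop))  # 3968; everything dimmer shares the last 128 codes
```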

So what? You may want to compensate your exposure so that the histogram is as far right as possible without clipping (this pulls out more detail in textured shadows as well as in highlights), because if you don't expose to the right, you stand more chance of poorly resolving tonal contrasts or posterizing the image. The "to the right" image may look overexposed, but it can be corrected in raw conversion with less loss, and so it will often have smoother tonal variations, at least in my experience (which, in this context, is mostly landscape vis-à-vis underwater).

So how is this related to filters? A reddish filter gives you a better chance of exposing the red sensors "to the right" without blowing out the blues. If the reds are not "exposed to the right" their tonal variation would be less smooth -- and there is nothing that PP can do to fix it.

I think the only way you could get as much in PP as with a filter is if cameras were built with a "Truly Magic Filter" that allowed you to simultaneously expose ALL the different color channels to the right, and then you had a way (e.g., neutral target) to reset the white balance correctly.

If you have more dynamic range and more bits, it helps make PP more like a filter in terms of possible fidelity. But at the end of the day, you'll do best by "almost clipping" as many of your channels as possible -- for any bit depth, any DR, and any PP, so long as sensors are linear and our vision is not. Gamma != 1.