4 Answers

An 8-bit image will most likely have its pixel values truncated. What you need to do is interpolate the missing low-order bits before applying a colour transformation. There is no way to know what the missing values are; you would have to make certain assumptions, such as smoothness.

To answer your question: I don't know of any image editor that offers this feature. Why not? Speed, for one: the curves/levels command in most photo editors is fast enough to work in real time, so you can see the effect on your image. You could have a separate function that operates only when not previewing, but this would be a pretty high-level feature, and you'd have to implement it for every operation that scales pixel values. Finally, it's a problem that goes away entirely if you have a camera which supports RAW and adopt a 16-bit workflow.

You can always do the interpolation yourself, e.g. by converting to 16-bit, resampling at a higher resolution, then downsampling. An alternative scheme that will probably work just as well is to randomize the low-order bits (by adding a small amount of noise). Or use a combination of resampling and randomization.
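A rough sketch of the randomization idea in Python with NumPy (the array contents and seed here are arbitrary assumptions, not part of any editor's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def to_16bit_with_noise(img8):
    # Expand 8-bit values into 16-bit space: 0..255 -> 0..65280.
    img16 = img8.astype(np.uint16) << 8
    # Fill the unknown low-order 8 bits with uniform noise, so a later
    # steep curve adjustment dithers the quantization gaps instead of
    # amplifying them into visible banding.
    noise = rng.integers(0, 256, size=img8.shape, dtype=np.uint16)
    return img16 | noise

img8 = np.arange(256, dtype=np.uint8).reshape(16, 16)   # toy "image"
img16 = to_16bit_with_noise(img8)
```

Truncating back (`img16 >> 8`) recovers the original 8-bit values exactly, so the added noise only ever perturbs the sub-8-bit precision that was missing anyway.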

Don't be too put off by the look of the histogram in your image. The quantization error of that operation is unlikely to be noticeable if there is any noise in the image, provided you don't have any areas with shallow colour gradients (in cases with smooth transitions and little noise, however, interpolation would work very well).

From a theoretical point of view there is no "true" way of getting those values back. Convolution kernels (like Gaussian smoothing, or an edge-preserving kernel) and dithering make assumptions in order to bring them back; rank filters (like the median) may not.

However, there is no way to know what those values were before quantization, so which convolution-based filter you prefer is a subjective judgement. You might need to use multiple filters and mask your image according to where you want each one applied.

The best way to overcome this is to process in 16-bit mode, so that the "zebra-shaped" value space gets resampled when going back to 8 bits. This should eliminate most skipped values in 8-bit space, unless you added a lot of contrast/gain.
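A toy illustration of the difference, using a synthetic ramp as assumed stand-in data: applying gain to an already-quantized 8-bit ramp leaves many output levels unused (the comb/"zebra" histogram), while the same gain applied to a 16-bit quantization of the same scene still reaches essentially every 8-bit output level after the final reduction.

```python
import numpy as np

ramp = np.linspace(100, 155, 4096)                  # synthetic scene data

# 8-bit path: quantize first, then apply 4x gain -> combed histogram.
r8 = np.round(ramp).astype(np.uint8).astype(np.float64)
direct = np.clip((r8 - 100) * 4, 0, 255).round().astype(np.uint8)

# 16-bit path: quantize to 16 bits, apply the gain there, then reduce
# to 8 bits only at the very end.
r16 = np.round(ramp * 257).astype(np.uint16).astype(np.float64)
hi = np.clip((r16 - 100 * 257) * 4, 0, 65535)
via16 = np.round(hi / 257).astype(np.uint8)
```

`np.unique(direct).size` stays at the 56 input levels, while `np.unique(via16).size` covers the stretched 0..220 range almost completely.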

I understand all this, and it's useful background to the question, but doesn't answer it....
– mattdm, Nov 5 '12 at 21:21

I guess I could formulate it as an answer, but it would be too short: "Even if they did do something, it would be theoretically wrong," meaning that they would choose a filter they believe works in mainstream situations, but you are the better judge of that for your particular image. Even worse, their choice might ruin your chance to apply the best filter afterwards (by smoothing the details you want to keep, or adding aliasing artifacts).
– Michael Nielsen, Nov 5 '12 at 22:21

How about a choice of filters, then, as is common with scaling? I've done the very crude experiment of doubling the size of my image with bicubic, adjusting the curves, and then scaling back down, with reasonably good results. I'm not buying the "ruin your chance" argument in these modern days of non-destructive image editors.
– mattdm, Nov 8 '12 at 13:32

You don't ruin your chance if you do experiments yourself, but if the developer hides a function inside another one that you can't switch off, they do ruin your chance to fix it in your own way. You could resample the picture with spline fitting, changing the order of the spline depending on the amount of detail. Your bicubic scaling is good when the "banding" is within 1-2 pixels and the texture isn't affected too much by it. Otherwise HermiteXn might work. If you know you have a smooth region you might just use a mean filter the size of your bands. Local contrast enhancement gives you some, too.
– Michael Nielsen, Nov 8 '12 at 19:04


I don't think it really matters that you can't recover the "correct" intermediate values, all that matters is how the result looks. A little fine grain is usually preferable to noticeable banding...
– Matt Grum, Nov 30 '12 at 12:08

Can you explain the "don't use levels" comment? As I understand it, it's actually a crude interface to exactly the same thing. In my example adjustment above, the curve (a straight line in this case) is exactly the same as what one would get by dragging the black and white point sliders inward on the levels tool.
– mattdm, Dec 1 '12 at 0:37

I have just found through use that dragging the midpoint slider in levels is a harsh adjustment to images. My comment isn't based on the math or algorithm behind the process. I would be curious to know whether they are the same thing, and what use there would be for two interfaces to the same thing.
– underarock, Dec 1 '12 at 2:30

Yes, adjusting the midpoint slider by more than a tiny bit introduces a very strong curve. There's more than one interface because the curves tool is harder for many people to understand. (If you have Gimp, it's actually easy to see what the effect of a levels adjustment would be as a curve, because the levels dialog has a button which switches to the curves dialog with the same adjustment. If you don't have Gimp, post a new question asking about this and I'll post an explanation and some examples as an answer.)
– mattdm, Dec 1 '12 at 5:20