The idea is that you create a histogram of the colors in a particular image, then calculate the mean pixel color and the standard deviation around that mean. You then take mean - (n * stddev) as a 'false' minimum pixel value and mean + (n * stddev) as a 'false' maximum. You map that false minimum to image-output color zero and the false maximum to image-output color 255, then interpolate the remaining colors into the 'stretched' color space, producing higher contrast in the densest portion of the color space.
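A rough sketch of the statistics part in plain Java (computeFalseRange is a made-up name, and this works on a raw pixel array rather than the real GeoTools image classes):

    // Sketch: compute mean and std deviation of a band's pixel values,
    // then derive the "false" min/max at n std deviations around the mean.
    static double[] computeFalseRange(int[] pixels, double n) {
        double sum = 0;
        for (int p : pixels) sum += p;
        double mean = sum / pixels.length;

        double sqDiff = 0;
        for (int p : pixels) sqDiff += (p - mean) * (p - mean);
        double stddev = Math.sqrt(sqDiff / pixels.length);

        return new double[] { mean - n * stddev, mean + n * stddev };
    }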

Gabriel Roldan
added a comment - 03/Feb/09 7:59 AM

Is this the same as the histogram support added in SLD? See <http://docs.codehaus.org/display/GEOTOOLS/Raster+Symbolizer+support>, in particular the example called "Portrayal using an SLD with no Color Map, Gray Channel selection and Histogram Contrast Enhancement".

Saul Farber
added a comment - 03/Feb/09 2:45 PM

Gabriel,
No, this is not quite the same thing, although it's close. I believe the code to look at is ContrastEnhancementNode.java. ContrastEnhancementNode defines a number of allowed "contrast enhancement" types.
To solve this bug, a new contrast enhancement type needs to be added, called (roughly) "histogram-mean-stretch", which will also take a parameter: the number of std devs to stretch by (not sure how to tie this into the SLD renderer; perhaps as attributes? operation="HISTOGRAM-MEAN-STRETCH" stdevs="2").

The implementation will do the following calculation for each band (a rough sketch follows the list):

1. Calculate the mean value across all pixels in the image, as well as the std deviation around that mean.
2. Create a new "false min/max" at N std deviations above and below the mean.
3. Consider each pixel in the original image:
a. If its value is below or equal to the false min, send it to zero (or whatever the min for the band is).
b. If its value is greater than or equal to the false max, send it to 255 (or whatever the max for the band is).
c. If its value is between the false min and false max, its new value is (total pixel range / (false max - false min)) * (pixel val - false min) -- basically you're "stretching" the values between the false min and max across the whole color range.
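To make step 3 concrete, here's a minimal Java sketch for a single 8-bit band (illustrative only -- stretchBand is a made-up name, it takes the false min/max from the statistics sketch above, and a real implementation would plug into the rendering pipeline rather than loop over a raw array):

    // Step 3: map each pixel into the stretched 0..255 output range,
    // given falseMin/falseMax = mean -/+ n std deviations.
    static int[] stretchBand(int[] pixels, double falseMin, double falseMax) {
        double scale = 255.0 / (falseMax - falseMin); // total range / (false max - false min)
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            double p = pixels[i];
            if (p <= falseMin) {
                out[i] = 0;                  // 3a: at or below false min -> band minimum
            } else if (p >= falseMax) {
                out[i] = 255;                // 3b: at or above false max -> band maximum
            } else {
                out[i] = (int) Math.round(scale * (p - falseMin)); // 3c: linear stretch
            }
        }
        return out;
    }

With stdevs="2", roughly 95% of a normally distributed band falls between the false min and max, so only the extreme tails get clamped.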