Lending you a hand with image processing – basic techniques 1

This is a PA ulnar deviation x-ray of my left wrist from last month, which gives a good view of the scaphoid bone from above.

The bone is chipped in the area pointed to by the arrow, due to a fall that occurred 20 or so years ago. Somewhere in there, there’s also a tiny detached fragment of cartilage that calcified (as seen in a CT scan at the time). I was lucky, because for people aged 17-40 the typical result of a fall on an outstretched hand is a scaphoid fracture, which is known to heal unpredictably. Lately, however, due to tendonitis, the fragment too is acting out. I’m left-handed, so this is causing some trouble, and that’s why the recent x-rays.

Family doctor, radiologist, and specialist all agreed there’s nothing I can do about the fragment, but taking care of the tendonitis with physiotherapy should improve the situation. The long-term outcome will depend on whether the gliding surface of the cartilage was damaged at the time of the fall, in which case I’d get arthritis, and there’s no way to know that unless I went in for an endoscopic procedure with a microcamera.

Nothing I can do. That’s when I decided I WAS going to do something, at least with these x-rays. The idea stems from some good examples of image processing using a hand x-ray I saw some time ago in The Image Processing Cookbook by John C. Russ. I plan a whole (semi-regular) series taking examples from that book and from The Scientist and Engineer’s Guide to Digital Signal Processing. I will start with some basic image display, processing, and analysis techniques using my hand x-ray and other images, and then work my way up to a few more advanced techniques. I will post my Matlab code, which was written in Matlab version 2007a; you will need this or a later version to run it, but not the Image Processing Toolbox.

So yes, I am literally lending you a hand with image processing.

In this first post I will start by importing one of the x-ray images, then try a number of displays to see if they enhance details in the image. I will not be using the image above but rather one that shows more details of the hand and fingers. You can download it here if you want to replicate exactly what I did. To import it into Matlab just use the import command in the file menu.
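If you prefer to do the import in a script, imread (part of base Matlab, no toolbox required) does the same job. A minimal sketch, assuming the downloaded file was saved as 'hand.jpg' (the filename is hypothetical, adjust it to whatever you saved):

```matlab
%% import the x-ray from file
% 'hand.jpg' is an assumed filename; change it to match your download
hand_rgb = imread('hand.jpg'); % reads the image as an M-by-N-by-3 uint8 array
```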

Let’s start by displaying the original in grayscale. Here’s the code below. The first few lines convert the data from RGB truecolor to intensity values, which are more compact (a single matrix), easier to manipulate, and can still be displayed with different colormaps (for more on displaying data as images in Matlab read this excellent blog series).
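The conversion and display could look something like the sketch below. This is not the author's exact code (which was linked separately); it assumes the RGB array from the import step is called hand_rgb. Since an x-ray scan has essentially identical R, G, and B channels, a simple average across the third dimension is enough, with no need for the Image Processing Toolbox's rgb2gray:

```matlab
%% convert RGB truecolor to a single intensity matrix
% the three channels of a grayscale scan are equal, so averaging them
% collapses the M-by-N-by-3 array into one M-by-N matrix of intensities
hand = mean(double(hand_rgb), 3);

%% display in grayscale
figure1 = figure;
imagesc(hand);      % map intensities to the current colormap
colormap(gray);     % grayscale display, as on a radiologist's light panel
colorbar;
axis image off;     % square pixels, no axis ticks
title('X-ray - left wrist','FontSize',14);
```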

Here’s the resulting figure. This is how radiologists and doctors in the emergency room look at x-rays (for more on that read this post), the reasons being that it requires virtually no processing, the films are simple to view on bright light panels, and fractures stick out nicely.

But we are not interested in this image for diagnostic purposes only, so let’s see if we can enhance this display. Very often the first thing tried is to substitute pseudo-colors for the intensity values. The rationale is that the human eye in normal conditions can detect far more color hues (from many hundreds up to millions) than shades of gray, and can also pick out gradual variations in color more easily. I will try this next and use a rainbow colormap. To call the rainbow colormap in Matlab (called jet) and change the title we modify two lines in the code above to:
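The two modified lines would look something like this (the exact title wording is my guess):

```matlab
colormap(jet); % swap the grayscale colormap for the rainbow (jet) colormap
title('X-ray - left wrist, rainbow pseudo-color','FontSize',14);
```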

Looks a bit weird? Not surprisingly. It is unfortunate that many of the standard colormaps, like rainbow and spectrum, do more harm than the good mentioned above: they often obscure some details and confuse the image by introducing artifacts. I will tackle this topic in depth in a series of future posts, and I will give some alternatives to the standard colormaps. For now, if you’re interested in the subject, please read this note by the IBM Research Centre, particularly the first two examples, which highlight how the rainbow colormap creates these perceptual artifacts. In addition, rainbow is very confusing for people with color deficiencies, which occur in only 0.4% of women, but in up to 8% of Caucasian men. There’s an excellent simulation of these confusing effects in Fig. 1 of this article in the American Geophysical Union’s EOS publication.

As suggested by Dr. Russ in The Image Processing Cookbook, “there is no general rule for how people interpret different colors and therefore the successful use of pseudo-color displays is difficult”. A more successful approach is to create a pseudo-relief image by substituting elevation for intensity, and then combining it with illumination. Surfaces are part of everyone’s everyday experience: we all intuitively recognize and interpret the effects of direct, reflected, and scattered light on surfaces, and the effect of changes in its direction. It is the same ability that allows us to navigate around objects and tell the difference between a bump and a hole in the ground (we may decide to walk on the bump, but we will avoid the hole). Steve Lynch provides in this paper a compelling empirical demonstration (through a survey) of why pseudo-surface is more successful than color, followed by a sound scientific explanation based on the theory of human trivariate color vision.

Here is the result. To me this is a far superior image to both the color version and the original. Notice how details really jump out of the page.

Can we do even better? I sometimes like to create a separate matrix for the elevation, to increase its dynamic range and enhance the relief effect. This gives finer control than just using the daspect command, and it has given me good results with topographic and geophysical data. Let’s try it and see what happens with the hand:

%% create elevation matrix
elev=log2(hand+1)*10; % arbitrary elevation matrix (the scaling was found
                      % through trial and error)
% N.B. the function surf used below would have taken the variable hand as
% input for both relief and color, but I like more control than what is
% afforded by the command daspect alone
%% display result as 3D surface with phong lighting
figure3=figure;
surf(elev,hand,'FaceColor','texturemap','EdgeColor','none');
% used elev for relief and hand for the color intensity to be mapped
% to the bone colormap
colormap(bone);
colorbar;
set(gca,'YDir','reverse');
daspect([0.5 0.5 1]);
view(0,90);
lighting phong;
material shiny;
h=lightangle(45,60); % creates a light at azimuth 45, elevation 60
%% some figure and house cleaning
set(figure3,'Position',[640 300 950 708]);
set(figure3,'Color',[0.9 0.9 0.9]);
set(figure3,'OuterPosition',[636 396 958 790]);
title('Render of X-ray - left wrist','FontSize',14);
axis off
axis tight
clear n k xray figure1 figure2 figure3;

And here’s the result.

To me this really stands out even more. But it also brought up the low-intensity areas corresponding to the fleshy part of the hand, which now looks like the surface of the sea as seen from above when hit by sunlight. You know, all that glare? Is this perhaps distracting from the task of trying to interpret what you are looking at, i.e. the bones?
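One quick way to probe whether the glare is a distraction is to tame the specular highlights with Matlab's material command. This is just a sketch of an experiment, not part of the original code; it assumes the surface figure from the code above is still the current figure:

```matlab
%% reduce specular glare on the fleshy areas
material dull; % suppress specular reflection on the current surface
% compare with the 'material shiny' setting used above: the bones keep
% their relief, but the sea-surface glare on the flesh should fade
```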

I clap my hands to you!
Beautiful and interesting post, it reminded me of my university days, when I did quite a lot of image processing. And I’m sincerely impressed by your clarity.
Keep it up with the good work!

Thanks for the encouragement Umberto. And for the opportunity to acknowledge my first Matlab mentor. It was – this dates us both – version 4.2c. I don’t think I was a good linear algebra student but I certainly came away from our informal lessons convinced of Matlab’s enormous potential. Matteo

Matteo,
This is just wonderful. Keep up the thoughtful work, and thanks for sharing your code too! Excellent. I think it would be helpful if you put the last two images side by side to make a comparison. Additionally, what if you subtracted them? In both cases, I find the induced texture very intriguing. I love this idea of casting the image back into an augmented grayscale so the remaining colors in the visible spectrum can be reserved for additional attributes or layers, if need be. I think map makers and graphics folks jump to the full color spectrum (and linear gradients) too quickly, and consume the entire visible band for displaying information.

Thanks Evan,
These are really helpful comments. I added the two images side by side as you suggest.
Subtraction is a great idea, I hadn’t thought about it. My 3rd post in the series is going to probably be on creating a bumpmapped display in Matlab, but I think I will definitely include the difference in my 4th post of the series, which I am planning to use to showcase display functionalities in ImageJ, a fantastic open source image processing program (http://rsbweb.nih.gov/ij/docs/concepts.html).

I have never “blogged” before, mainly because most of my computer time is dedicated to sitting in front of a PACS station looking at diagnostic imaging studies (radiology resident). If I had a colour radiograph to look at every once in a while, it would make my day a little brighter. You have pretty good insight into the scaphoid fracture, given that you are likely not a physician, btw. As a note, I really like the pseudo-elevation version as it brings out the trabeculae of the shaft of the bone, which we look at when assessing bone mineral density. Otherwise, you are right, the spatial resolution takes a hit. Thanks for the study break. Christina.

It would be interesting to see a series of images with the lighting at different azimuths. It’s a well-known property of lighting that it highlights features perpendicular to the direction of the light. I wonder if other lighting directions might show up other features.
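This experiment is easy to run on the code from the post: since lightangle returned the light handle h, the light can be swept around the hand without recreating the figure. A sketch, assuming the surface figure above is the current figure:

```matlab
%% sweep the light azimuth to look for direction-dependent features
% assumes h is the light handle returned by lightangle in the code above
for az = 0:45:315
    lightangle(h, az, 60); % move the light to a new azimuth,
                           % keeping the elevation fixed at 60 degrees
    title(sprintf('Light azimuth = %d degrees', az));
    pause(1); % pause to inspect each lighting direction
end
```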

Steve you are absolutely right, that is often the case.
For this image though I chose the lighting that achieved highest visual contrast between highs and lows, and no other direction seemed to highlight other features.
I will be posting soon a series on acquisition, enhancement, and visualization of exploration gravity data with examples to illustrate use of directional lighting and directional derivatives for interpretation.
Matteo

Great post! I like how you first explain it in the text, then show it in code and finally display and discuss the results. Makes it interesting to read through and play with the image data. I might try this in GNU Octave.


Go ahead if you want to use my code, modify it, improve it, for non-commercial AND for commercial use. You are also welcome to download and reuse my media files - unless otherwise stated. With both code and images, please give full and clear credit to Matteo Niccoli as the author and mycarta.wordpress.com as the source.
WordPress bloggers are welcome to reblog my posts. For republishing outside of WordPress or any other request, please e-mail me at: matteo@mycarta.ca