Do properly calibrated Voyager images have a correct, intended brightness, or just a correct brightness relative to other Voyager images? The way I've been processing them into color has involved contrast stretching the brightest image and applying an identical stretch to the other, relatively dimmer images of the same target. When I apply the same stretch to Titan as I would to Saturn, Titan appears very dim, as it should be relative to Saturn.
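In Python/NumPy terms, the shared-stretch approach looks roughly like this (just a sketch - the function name and percentile bounds are made up, and a real version would want to handle saturated pixels first):

```python
import numpy as np

def stretch_group(images, low_pct=0.1, high_pct=99.9):
    """Contrast-stretch a set of images of the same target with ONE
    set of levels, so the relative brightness between frames survives.
    The percentile bounds are taken from the brightest frame only."""
    # pick the brightest image as the reference for the stretch
    ref = max(images, key=lambda im: np.percentile(im, high_pct))
    lo = np.percentile(ref, low_pct)
    hi = np.percentile(ref, high_pct)
    scale = 255.0 / max(hi - lo, 1e-9)
    return [np.clip((im.astype(float) - lo) * scale, 0, 255).astype(np.uint8)
            for im in images]
```

So a dim Titan frame stretched with Saturn's levels stays dim, as it should.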

Also, is there a correct way to balance the colors in a false color image? UV images of Jupiter are pretty dim relative to the visible light images... do those UV images have to be brightened?

I've just been contrast stretching every image as bright as it can go, which I'm sure is wrong... the colors of the composites often look odd, e.g. Io (though the approach and rotation movies of Jupiter, Uranus, and Neptune look okay, I think). Good idea though to apply the same stretch to every image - wish I had thought of that.

At the moment the channel weights would need to be tweaked manually, so there needs to be a more automatable way to balance the colors. I was thinking of having a reference image for each target with good colors, then adjusting the channel weights to try to approximate the correct spectrum. This would be nice for Neptune, e.g., since then you could colorize the movie to match the nicest blue images.
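As a sketch of the reference-image idea (the helper is hypothetical - it matches per-channel mean brightness against a reference image, where a real version would probably fit against a measured spectrum):

```python
import numpy as np

def match_channel_weights(channels, reference):
    """Scale each filter image so its disk-averaged brightness matches
    the corresponding channel of a reference color image.
    `channels` and `reference` are dicts: filter name -> 2-D array."""
    weights = {}
    for name, img in channels.items():
        mean = img[img > 0].mean()                        # ignore black sky
        ref_mean = reference[name][reference[name] > 0].mean()
        weights[name] = ref_mean / mean if mean > 0 else 1.0
    balanced = {name: img * weights[name] for name, img in channels.items()}
    return balanced, weights
```

The weights solved for one good Neptune frame could then be reused across the whole movie.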

But I don't know about the UV filter - those images are pretty dim. I guess it depends on what you want it to look like?

And the other problem with brightening images is the areas in the corners that are sometimes whited out, or other noise in the image - they throw the contrast stretching off. I'd also like to automate ignoring those hot pixels, since a lot of the images are affected. The fix probably involves detecting the big gap between the brightest pixels and the mass of the rest of the histogram - will have to fiddle around with that.
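One way to sketch that gap detection (the function and the gap_factor of 5 are my own guesses and would need tuning on real frames):

```python
import numpy as np

def robust_max(img, gap_factor=5):
    """Stretch ceiling that ignores isolated hot pixels: walk down the
    brightest distinct values and clip everything above a big jump."""
    vals = np.unique(img.ravel())       # sorted distinct pixel values
    top = vals[-20:]                    # only distrust the very top
    for i in range(len(top) - 1, 0, -1):
        if top[i - 1] > 0 and top[i] > gap_factor * top[i - 1]:
            return top[i - 1]           # hot pixels start above this value
    return vals[-1]
```

On a clean frame it just returns the true maximum; on a frame with a few blown-out pixels it returns the top of the main histogram instead.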

You'll always need fudge factors and color channel mixing to get realistic color. It should be possible to 'standardize' these for at least all images of Jupiter and possibly for all of the Jovian system images (I'm working on this). Io is an exception - it's impossible to get good color for Io from Voyager images due to Io's weird spectrum and Voyager's lack of a red filter.

Automatically contrast stretching individual images without taking adjacent images (or images taken of the same area with a different filter) into account is usually a bad idea.

QUOTE (Brian Burns @ Aug 18 2016, 10:02 PM)

And the other problem with brightening images is the areas in the corners that are sometimes whited out, or other noise in the image - they throw the contrast stretching off. I'd also like to automate ignoring those hot pixels, since a lot of the images are affected. The fix probably involves detecting the big gap between the brightest pixels and the mass of the rest of the histogram - will have to fiddle around with that.

In the calibrated images, a special value (32767) is used for saturated pixels. If this can be automatically ignored (non-saturated values are much lower) or somehow masked out this shouldn't be a problem.
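In NumPy that masking is a one-liner on a boolean array - something like this sketch (the white-out choice for saturated pixels is just one option; they could also be filled from an overlapping frame):

```python
import numpy as np

SATURATED = 32767  # sentinel value for saturated pixels in the calibrated files

def stretch_ignoring_saturation(img):
    """Compute stretch levels from valid pixels only, then paint the
    saturated pixels pure white so they don't skew the histogram."""
    valid = img != SATURATED
    lo = img[valid].min()
    hi = img[valid].max()
    out = np.clip((img.astype(float) - lo) / max(hi - lo, 1), 0.0, 1.0)
    out[~valid] = 1.0                   # saturated areas come out white
    return (out * 255).astype(np.uint8)
```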

QUOTE

Io is an exception - it's impossible to get good color for Io from Voyager images due to Io's weird spectrum and Voyager's lack of a red filter.

That's too bad - I guess you could borrow Galileo's red channel for the global map somehow, but then it wouldn't quite be a Voyager movie, though it might look nice. I guess I'll just have to see how it comes out.

Some of the Io images do have a nice gold color though, which gets somewhat close, e.g. these two, which I had manually tweaked (and the small one has a nice blue volcano plume if you zoom in):

These could be used to seed the global map, and then maybe be used to balance out the remaining images somehow.

QUOTE

In the calibrated images, a special value (32767) is used for saturated pixels. If this can be automatically ignored (non-saturated values are much lower) or somehow masked out this shouldn't be a problem.

Ah, that's good to know - that should help with those big white areas. In some of the images I was looking at though there was also a band of noise along the bottom (the last few rows), with pixels scattered across the histogram, which would throw it off also. Might need to do the denoise step first, though had wanted to save that till later.

I think what I'll try to do is just brighten each image individually, and then when creating composites try to balance the channels to match the expected spectrum - then I wouldn't have to worry about denoising the images first, or keeping groups of images levelled the same amount, which would be pretty tricky given the variety of composite and single-channel images that were taken. Not sure how well this would work yet though.

So the compositing step would push the images to the global map, then pull in missing data, then try to balance the spectrum - and it might need some manual tweaking for the first global color image, e.g. for Neptune to set the colors how you wanted it to look.

I'm still working on getting distant targets to work with ISIS - i.e. when a target doesn't take up the whole image frame and the pointing is off, as with Voyager, the different ISIS camera alignment programs won't work. At the moment I can shift the image so the target appears where the pointing kernel expects it to be and then run cam2map correctly, but this loses information as the target fills more of the frame. So I'm trying to rotate the camera matrix with a new ISIS program so it will point correctly at the target, but so far I haven't been able to get it to work - it winds up pointing way out of the image frame.

Another problem is the amount of time needed to run cam2map, which projects from an image to a cylindrical map, using attached SPICE data - for a 1600x800 map it takes about 40 seconds on my i3 laptop - if I were to run this on every image it would take 73000*40/60/60/24 = 34 days. I looked through the code and it seems to be written more for clarity than speed, which is fine - I do need a faster computer though. I'd initially tried doing the projections in Python with OpenCV using the remap function, and at least for the simple case of a distant spherical target it would project to a 1600x800 map in 15 seconds - which would take a total of 13 days. I think cam2map is also doing corrections for the Voyager camera distortions though.
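For the simple distant-sphere case, the projection geometry is roughly the following (a NumPy sketch with made-up names, assuming an orthographic view of the target, no Voyager camera distortion correction, and nearest-neighbor sampling - the same kind of coordinate arrays could be handed to OpenCV's remap for proper interpolation):

```python
import numpy as np

def sphere_to_cylindrical(img, cx, cy, r, out_w=1600, out_h=800):
    """Project a distant spherical target (orthographic approximation,
    sub-observer point at lon=0, lat=0) onto a cylindrical map."""
    # longitude/latitude grid for the output map
    lon = np.linspace(-np.pi, np.pi, out_w)
    lat = np.linspace(np.pi / 2, -np.pi / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)
    # unit-sphere coordinates; +Z points toward the observer
    X = np.cos(lat) * np.sin(lon)
    Y = np.sin(lat)
    Z = np.cos(lat) * np.cos(lon)
    visible = Z > 0                              # far side maps to nothing
    # orthographic projection back into the source image
    src_x = np.clip((cx + r * X).astype(int), 0, img.shape[1] - 1)
    src_y = np.clip((cy - r * Y).astype(int), 0, img.shape[0] - 1)
    out = np.zeros((out_h, out_w), dtype=img.dtype)
    out[visible] = img[src_y[visible], src_x[visible]]
    return out
```

Since the whole thing is vectorized it should run in well under a second per frame, though of course it skips everything cam2map does properly.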

So I was thinking of cheating a bit and using predefined cylindrical maps for the missing info - it wouldn't work with Jupiter or Neptune due to cloud movements etc, but maybe it would work for things like Io, Europa and Triton. For Jupiter maybe I could run cam2map - for Voyager 1 and 2 it would be on the order of 30k images - with a faster computer using all cores maybe it wouldn't be too bad.
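The all-cores idea could be as simple as a multiprocessing pool driving one cam2map run per cube (a sketch - the from=/map=/to= parameter style is the usual ISIS convention, but check your version's docs, and the map file name here is made up):

```python
import subprocess
from multiprocessing import Pool

def cam2map_cmd(cube, mapfile="equirect.map"):
    """Build the cam2map command line for one cube."""
    out = cube.replace(".cub", ".proj.cub")
    return ["cam2map", f"from={cube}", f"map={mapfile}", f"to={out}"]

def project_one(cube):
    # each run is independent, so one failure doesn't block the rest
    return subprocess.run(cam2map_cmd(cube)).returncode

def project_all(cubes, workers=4):
    """Fan the runs out across CPU cores; with N cores the single-core
    time estimate should drop by roughly a factor of N."""
    with Pool(workers) as pool:
        return pool.map(project_one, cubes)
```

With, say, 8 cores the 34-day estimate comes down to around 4 days, which starts to look feasible.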

I still have the issue of correcting the camera pointing for closeup images ahead of me, though as far as I understand it the ISIS program jigsaw should handle those cases. Then I'd be able to pull the missing info back from the map and colorize the images - fortunately map2cam runs quickly. Not sure how fast jigsaw runs though... I'm guessing it's slow, since it solves for the camera matrix iteratively.

I'll also need to handle the different map scales - e.g. at the highest resolution the map of Jupiter would be enormous, so I'll need to focus on just the relevant regions.

Was also thinking of using the cylindrical maps to animate the flybys, e.g. simulating the view as if Voyager was taking pictures of each target the whole time - would slow things down so the terrain would go by slowly underneath. Might be good for making a short movie of all the flybys.
