Posted
by
timothy
on Thursday August 16, 2012 @10:35AM
from the ocean-front-property dept.

New submitter bbianca127 writes "Curiosity sent a picture down to us, and it looks a lot like Earth. Actually, the picture's color quality has been changed — to human eyes, the landscape would look a lot more reddish. Still, it looks remarkably like the southwestern United States (bringing to mind the Arrested Development quote about how Lucille Bluth would rather be dead in California than alive in Arizona)." Definitely a different sense of the place than the one given by the reddish-brown posters I remember from elementary school.

You are incorrectly confusing the name of an obscure logical fallacy with a simple English phrase.

The name of the logical fallacy itself comes from an archaic use of "beg" meaning assume/demand, still seen in "begging your pardon", "the committee begs to report", "beg to differ". Specifically, to take for granted without justification. Moreover, whenever anyone uses the phrase in the context of the logical fallacy, it is almost always to name the fallacy: "That argument is 'Begging The Question.'"

Yes, and I humbly apologize. I was under the (incorrect) belief that the original poster had used the phrase correctly. h4rr4r corrected me, I looked it up (as I should have done in the first place), and found that he was right and I was wrong. I'd withdraw the earlier comment if I could, but /. doesn't let you do that, unfortunately...

It's no shame to be wrong and learn something new. If I had mod points I'd give you +1 "Learned something instead of defending wrong position with poorly considered arguments." I can't remember, was that one of Slashdot's categories?

Oh, give me land, lots of land under starry skies
Don't fence me in
Let me ride through the wide open country that I love
Don't fence me in
Let me be by myself in the evenin' breeze
And listen to the murmur of the cottonwood trees
Send me off forever but I ask you please
Don't fence me in

Interesting perspective. I'm a hiker and I'd love to hike Mars. All these photos are tantalizing, to imagine some of the great vistas available, which only a robot can see for the present.

Some day the Sierra Club will be trying to protect areas of the planet, to keep open and undeveloped. Trails will descend into Valles Marineris and there will be campgrounds. No scorpions, no rattlesnakes. A trail or two will ascend Olympus Mons and even in daylight you will be able to see the brighter stars and constellations. The hint for every martian Geocache will be under a pile of rocks. It'll be a glorious place to wander. I'm seriously envious of those who will enjoy all Mars has to offer, aside from just another place for the human race to populate and industrialize.

> Interesting perspective. I'm a hiker and I'd love to hike Mars. All these photos are tantalizing, to imagine some of the great vistas available, which only a robot can see for the present.

When I was about ten or twelve years old, I saw a full-page ad in some magazine that showed a bunch of guys dressed all in black wheelie-ing and skidding BMX bikes on the surface of Mars. For some reason, that ad really captured my imagination, and I've wanted to go mountain biking on Mars ever since. Just imagine the big air (err...okay, more nearly "vacuum" than air, but I digress) you could catch in 1/3 g! :) And yeah, I totally second trails on Mons Olympus!

The atmosphere is so thin it's basically vacuum, so the view of the stars should be pretty good. If we could engineer cottonwood trees that thrive in vacuum, high radiation, temperatures as low as -150 Celsius, and no water, we'd be good there too. Of course then we'd have to engineer humans that didn't suffer bone decalcification due to the low gravity...

> The atmosphere is so thin it's basically vacuum, so the view of the stars should be pretty good. If we could engineer cottonwood trees that thrive in vacuum, high radiation, temperatures as low as -150 Celsius, and no water, we'd be good there too. Of course then we'd have to engineer humans that didn't suffer bone decalcification due to the low gravity...

Snarky as your comment may have been meant, I think you need to check your numbers again on what constitutes "so thin it's basically vacuum."

Mars has an average surface atmospheric pressure of 0.636 kPa; Earth has 101.325 kPa. So yes, while it is about 160 times thinner, that's still pretty thick, especially if dust is kicked up. After all, remember that with roughly 1/3rd the gravity, much less air friction, and no moisture, dust particles can stay aloft for quite some time.

And then, compare that to the Moon, with a pressure of 10^-7 kPa (10^-4 Pa): Mars still has a 6.36-million-times denser atmosphere. And compared to interplanetary space, that's still practically solid, as space has 400,000 times less pressure.

In other words: if Mars counts as a near-vacuum despite having nearly 10^17 times more molecules per cm^3 than interplanetary space, then a snail counts as nearly light-speed, even though light moves at 3*10^10 cm/s.
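A quick sanity check of those ratios, plugging in the figures quoted above:

```python
# Approximate average surface pressures, all in kPa, as quoted in the comment.
mars = 0.636        # Mars average surface pressure
earth = 101.325     # Earth standard sea-level pressure
moon = 1e-7         # lunar "atmosphere"

# Earth's atmosphere vs. Mars', and Mars' vs. the Moon's
print(f"Earth/Mars: {earth / mars:.0f}x")   # roughly 160x
print(f"Mars/Moon:  {mars / moon:.2e}x")    # roughly 6.4 million x
```

The numbers bear out the comment: "160 times thinner" and "6.36 million times denser" both fall straight out of the quoted pressures.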

That's because that photo is the white-balanced version! A white-balanced photo is what the scene on Mars would look like if you literally took the scene, cut out that whole area of ground, transported it to Earth and viewed it under the Earth's sky.

As much as I love the awesome idea of moving a chunk of terrain between planets, I'm going to shoot for an informative mod and answer the question.

There is a sundial mounted on Curiosity [nasa.gov], with a few colored stripes on it. Those stripes' colors (red, green, blue, and yellow) were recorded under Earth's lighting. Now that those same stripes are on Mars, their apparent color change in new pictures is the result of Mars' different lighting. By comparing the stripes' pictures, an appropriate transformation can be determined, then applied to other pictures to compensate for the change in lighting.

We are sure because we're assuming that those stripes' actual colors haven't changed significantly during flight or landing.
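In code, the simplest version of that stripe-based correction is a per-channel (diagonal, von Kries-style) gain fitted to the calibration colors. The stripe values below are invented for illustration, and NASA's actual pipeline is certainly more involved; this is just a sketch of the idea:

```python
import numpy as np

# Hypothetical reference colors (RGB, 0-1) of the four calibration stripes
# as recorded under Earth lighting.
earth_ref = np.array([[0.80, 0.15, 0.12],   # red stripe
                      [0.15, 0.70, 0.20],   # green stripe
                      [0.10, 0.20, 0.75],   # blue stripe
                      [0.85, 0.80, 0.15]])  # yellow stripe

# Simulated observation of the same stripes under a reddish Martian cast.
mars_obs = earth_ref * np.array([1.25, 0.90, 0.60])

# Least-squares per-channel gain that maps observed colors back to reference.
gains = (mars_obs * earth_ref).sum(axis=0) / (mars_obs ** 2).sum(axis=0)

def white_balance(pixels):
    """Apply the diagonal correction to RGB pixels in [0, 1]."""
    return np.clip(pixels * gains, 0.0, 1.0)
```

With a pure per-channel cast like the simulated one, the fitted gains are exactly the inverse of the cast, so the corrected stripes land back on their Earth reference colors.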

The thing is, If I were on Mars, the colors wouldn't look like they look here anyway -- because of the lighting that you mentioned. I'd rather see what it would look like on an alien world in its native lighting conditions, not rebalanced to look like it had our light conditions.

You would, but geologists wouldn't. They are used to what rocks and minerals look like under our own earthly lighting. As such it makes sense to adjust the color of the image to match earth-normal lighting conditions.

Actually, you probably would see it more like the white balanced photo than the regular one. Your brain is very good at auto white balance.

Perceptual re-balancing is very different from absolute colorimetric re-balancing, which is what is used here. A late-evening shot (which this basically is) looks very different when you balance it against a Gretag Macbeth [wikipedia.org] card than if you balance it according to human perception.

NASA's goal here is clearly to make the pictures as useful as possible to those who study them, not to give the public a "true" image of what we would perceive if we were there. I think there should be room for both.

Should be pointed out that human eyes automatically white balance anyway. It's the reason white always looks white to the naked eye, whether under tungsten or halogen light, even though the two look different in photographs.
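As a crude software analogue of that adaptation, the classic "gray world" auto-white-balance assumes the scene's average color should come out neutral gray, and scales each channel accordingly (a sketch, not what any particular camera actually ships):

```python
import numpy as np

def gray_world(img):
    """img: HxWx3 float array in [0, 1]. Returns a white-balanced copy."""
    means = img.reshape(-1, 3).mean(axis=0)   # average of each channel
    gains = means.mean() / means              # push each channel toward gray
    return np.clip(img * gains, 0.0, 1.0)
```

Feed it an image with a uniform color cast and the three channel averages come out equal, much like the eye "forgetting" the tungsten tint after a few minutes.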

Need to be careful here; what you say happens, definitely, but it's not the eyes that do it. It's one of those "zomg my brain adjusted the data to a known pattern" type things.

Kind of like a sommelier's nose, you can train yourself to see the differences in white points without having to place swatches next to each other, and it's very useful when switching through several temperatures of light sources for simulation purposes. What sucks is once you do, you can't turn it off. ;)

The colors in this image are not what a human standing on Mars would see — the presence of dust in the atmosphere would make the scene appear much redder. Instead, the pictures have been white-balanced to show how it would appear under typical Earth lighting conditions. This will help the Earth-centered geologists who are trained to recognize features based on how they look using more familiar light.

Like you, I too would like to see what it would look like if I was actually standing on Mars. However, the APOD website [nasa.gov] describes what is probably the same photo as in the Wired article (Surprise! I didn't RTFA yet), which contains this blurb: "Images from Mars false-colored in this way are called white balanced and [are] useful for planetary scientists to identify rocks and landforms similar to Earth." So while you and I might appreciate the novelty of seeing what Mars would actually look like to our own eyes, the white-balanced versions clearly earn their keep.

Even on Earth we have this issue (I've made a fairly healthy living navigating from color space to color space and light source to light source over the years). People seem to forget that our own sunlight can vary with time of day, geographical location, cloud cover, etc., and indoor lighting is the beast with a billion backs. Even your own eyes can betray you, needing a moment to adjust, and often one eye sees color slightly differently from the other.

Color scientists have had an absolute color and light source standard to measure against (CIE LAB) for 40+ years; Mars (or anywhere in the universe that receives light in the visible spectrum) fits just dandy into this model for color transformations; it's just a bit further away than usual. The less light there is to measure, the smaller the total color gamut will be, but you can extrapolate pretty well, if you don't mind some +/- errors along the way.
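For the curious, here's a condensed sketch of the standard sRGB → CIE L*a*b* conversion (D65 white point) that this kind of device-independent work rests on; the formulas are the published CIE ones:

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* (D65 reference white)."""
    def linearize(c):                      # undo the sRGB gamma curve
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = map(linearize, (r, g, b))
    # linear sRGB -> XYZ (D65 matrix)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> Lab, normalized to the D65 white point
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Pure white comes out at L* ≈ 100 with a* and b* near zero, and black at L* = 0, which is the quick sanity check color folks use on any Lab implementation.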

Typically, a true simulation would need several hundred color swatches for analysis, plus an iterative scanning approach to nail down the color gamut points that are furthest away (say, blues could be further off than reds, so require more attention for a transform). Still, for a general "this is approximately how it'd look on Earth," a 4-swatch RGBY set is close enough.

It's something like the difference between a precision of tenths and a precision of hundred-thousandths when all you're doing is counting apples. You may be plus or minus a tenth of an apple, but so what?

The only thing that's a little surprising is that they didn't include a calibrated black strip, but I suppose they didn't really need to account for the variation between deep shadow areas or very dark objects in this case.

Also a little interesting: Typically people want to match colors across color spaces; as in "I want my print to look like my monitor!!!".

This case is the opposite: the goal is to punch the saturation, contrast, and luminance up to that of a randomly chosen Earth standard. We want to take the equivalent of a printed image (small color gamut) and see what it looked like on a monitor (large color gamut) prior to printing.

In general sweeping terms, this is pretty easy to do, provided an educated guess i

On the rover are color calibration targets (here is the one for the rover's arm's instruments [nasa.gov]). We know exactly what the colors of those targets are supposed to look like, when imaged by the cameras on the rover, under normal Earth-like lighting conditions. By looking at how those targets appear in the images we get back under Mars lighting conditions, we can do two things:

1) Learn a lot about the lighting conditions on Mars.
2) Correct the images we get back to compensate for that Mars lighting.

One is white-balanced and one is not. The white-balanced version represents what the scene would look like to human eyes under an Earth sky; the other represents what the scene would look like to human eyes on Mars.

The point of using white-balanced photos is that geologists are used to looking at rocks on Earth. So when a geologist wants to judge rock characteristics using color, it helps to white-balance it so the color is similar to what it would be if looking at those rocks on Earth.

Which is useful, because it lets us see things in a more familiar frame of reference. Under the Martian atmosphere, things will look more alien to us, making normal stuff seem worthy of extra interest. Making the images more Earth-like helps us pick out what is actually interesting to look at and what to ignore.

How about doing the reverse, i.e. adjusting the white balance of photos taken on Earth to look like they were taken on Mars? Can this be done accurately if we take the picture of Curiosity's sundial as a Martian reference?
I think it would be very interesting to see earthly scenes the way they would look on Mars!
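In principle, yes: if you approximate the Mars-to-Earth correction as per-channel gains (the numbers below are hypothetical, not derived from the actual sundial images), then "Mars-balancing" an Earth photo is just dividing by those gains:

```python
import numpy as np

# Hypothetical per-channel gains that would map a Mars image toward
# Earth-normal lighting (boost blue, cut red).
earth_correction = np.array([0.8, 1.1, 1.6])

def marsify(img):
    """img: HxWx3 float array in [0, 1]. Apply the inverse correction,
    pushing the scene toward a reddish Martian cast."""
    return np.clip(img / earth_correction, 0.0, 1.0)
```

The clipping is the catch: wherever a channel saturates (like red in a bright Earth scene), the inverse transform loses information, which is one reason a real simulation would be rougher than the forward correction.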

If you don't mind being unable to take color shots of relatively fast-moving things, you can use a conventional greyscale sensor, swap color filters between frames, and then crunch the result into a color image (or, if you have the space and don't mind a moderately complex optics package, you can have three greyscale sensors, each with a fixed color filter). If you need a color image within one frame, you use a fixed Bayer (or similar) filter and demosaicing. That eats nontrivial resolution compared to the pure greyscale or swapped-filters strategy; but you get everything in one shot and fewer moving parts. Then you have the somewhat oddball Foveon approach, where your greyscale sensors are stacked vertically and use the different rates of absorption in silicon of different frequencies to do the filtering...

In very broad terms, they all have the 'greyscale sensors and filters' strategy; but there are a fair few ways to go about it. If you count chemical and biological sensors, you are more likely to find sensor elements that are actually tuned to a specific wavelength, rather than filtered to it; but the final image is still a matter of crunching together results from individual elements that are really only giving you intensity data for a relatively narrow slice of frequencies.
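The Bayer-filter strategy described above can be sketched in a few lines. This toy RGGB mosaic shows which single channel each sensor pixel actually records; a real camera's demosaic step would then interpolate the two missing channels at every pixel (this sketch leaves that part out):

```python
import numpy as np

def bayer_mosaic(img):
    """Simulate an RGGB Bayer sensor: img is HxWx3; the returned HxW
    'raw' frame keeps only one color channel per pixel."""
    h, w, _ = img.shape
    raw = np.zeros((h, w))
    raw[0::2, 0::2] = img[0::2, 0::2, 0]   # red sites
    raw[0::2, 1::2] = img[0::2, 1::2, 1]   # green sites (even rows)
    raw[1::2, 0::2] = img[1::2, 0::2, 1]   # green sites (odd rows)
    raw[1::2, 1::2] = img[1::2, 1::2, 2]   # blue sites
    return raw
```

Note that green gets twice the sites of red or blue in an RGGB pattern, which matches the eye's greater sensitivity to green; it's also why demosaicing "eats nontrivial resolution," since two-thirds of the color data at each pixel was never recorded.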

> Definitely a different sense of the place than the one given by the reddish-brown posters I remember from elementary school.

That's because the picture has been altered to remove the red haze, in order to produce an image that more closely resembles a landscape on Earth.

From the article:

> The colors in this image are not what a human standing on Mars would see — the presence of dust in the atmosphere would make the scene appear much redder. Instead, the pictures have been white-balanced to show how it would appear under typical Earth lighting conditions. This will help the Earth-centered geologists who are trained to recognize features based on how they look using more familiar light.

Earth is a big place. You can pretty much guarantee that any rocky planet will have parts that look like other rocky planets. When will we get any science? We KNOW the place is a reddish, dusty, rocky desert. Move on.

Gosh, some guy on Slashdot wants to move on. Hey, everybody! Stop testing the MSL! Forget all the calibration tests. Drop the checkout sequences. No need to make sure anything is working right. This guy said go. Just apply all available current to all the wheels. No, there's no time to make a traversability map. All power to the forward shields! Damn the torpedoes! Full speed ahead!

Seriously, are you trolling, or do you simply not understand that this IS scientific information about Martian terrain, geology, soil, tectonics, atmosphere, etc.? With respect to Earth, it tells us a lot about the extent of the Goldilocks zone, mainly because the other two terrestrial planets, Mercury and Venus, don't seem to have terrain like the Earth's.

Do you think there is just one kind of dusty, rocky desert? Go to the Atacama desert, and then to the Gobi desert, and to the Sahara. Tell me if you think they are the same.

> The colors in this image are not what a human standing on Mars would see — the presence of dust in the atmosphere would make the scene appear much redder. Instead, the pictures have been white-balanced to show how it would appear under typical Earth lighting conditions.

So the story is that a photo of Mars that has been adjusted so it looks like Earth to make it easier for geologists to interpret... looks like Earth. Wow.

When the spectrum of ambient light does not match that of "white" light (which is simply the particular spectrum we evolved to perceive), the eye's photoreceptors become disproportionately fatigued, and perception of the light's color drifts toward white. You can experience this phenomenon yourself if you light a room entirely with red party lights. Soon, your red photoreceptors will become fatigued and the colors of objects in the room begin to appear more normal. I think explorers on Mars would experience the same effect. So photos like this are actually how it would look to them.

Yyyyeah...they're not "altering" the photo. What they're doing is balancing the color so that people can know what they are seeing. The reason for this is that the Martian atmosphere has radically different color properties from that of our own. What this means is that visible observations cannot be made reliably: for example, a red rock on Mars may not actually be red as we understand the color, and so conclusions geologists make based on a color may be erroneous, because they are basing those conclusions on colors observed under Earth's sky.

Any proposals on what to do with images produced by instruments that sample outside of the human visual range? The guys down at legal said that I'm not allowed to use true-color displays for anything higher energy than longwave UV anymore... Not my fault what happened to those kids.

You know, geology is a science too. And geologists like to look at rocks. Most of them spend a lot of time on Earth, so they get used to looking at rocks under the kind of lighting found here on Earth.

That's why the photo has been adjusted to account for differences in martian lighting -- So that scientists looking at it can pick out details that they recognize.

Not to pick on you, but I'd say that you are perhaps working yourself into a tizzy over nothing. The difference between the photos of Mars from the 70's and what we are getting from the rovers now is hardly the result of NASA "lying...with photoshopped [sic] pictures." It's the result of better technology providing a more accurate representation of what we would actually see if we were there (or in the case of the white-balanced image in TFA, what we would see if that landscape were on earth, which can be useful for certain kinds of scientific investigation).

There is indeed a very, very fine line between simply processing a digital image and "Photoshopping" a digital image, but I would argue that these images are on the processing side of that line, rather than the "Photoshopped" side of the line. Consider this: my Canon Powershot -- admittedly, a much, much simpler device than Curiosity's cameras, I imagine -- doesn't produce RAW images; it processes every RAW image into a JPG. That introduces aberrations (JPG uses lossy compression after all, among other inaccuracies). Is that an "unscientific...photo alteration?"

Also, a lot of the photos we see from Spirit, Opportunity and now Curiosity are digitally stitched mosaics. For example, if you look at this photo [nasa.gov], you can clearly see the boundaries of many of the individual photos. Are you going to get uptight because this wasn't a single photo, but rather was digitally "altered?"

If this kind of processing irks you, then I humbly suggest that you take your own digital camera and do some experimentation. Go indoors and shoot a handful of photos at different times of the day, with and without indoor lighting. Do the colors match what you see with your eyes? What if you display the images on a different monitor? If you have the ability to shoot photos in RAW and JPG formats, compare them both with what you see. Now play with some of the settings on your camera. My Powershot has settings for natural (sunlight) lighting, incandescent lighting, fluorescent lighting, tungsten lighting, etc. These software filters adjust the white balance to the kind of lights that are being used inside your house because the CCD in a camera doesn't react to all frequencies of light in the same way your eye does. In fact, most digital cameras include an IR-cut filter over the CCD because the CCD is much more sensitive to IR light than your eyes. Is that hardware filter "altering" the photo? Your eye won't detect those frequencies of light, but it's really there, and the filter is removing it from the photo.

> The colors in this image are not what a human standing on Mars would see — the presence of dust in the atmosphere would make the scene appear much redder. Instead, the pictures have been white-balanced to show how it would appear under typical Earth lighting conditions. This will help the Earth-centered geologists who are trained to recognize features based on how they look using more familiar light.

They have the images which aren't color corrected. But for certain kinds of science, it's more useful to work with the white-balanced versions.

This is not like Hubble images where they're assigning colors to radio / infrared / ultraviolet / xray frequencies that your eyes can't even see. The difference between the two images in this case is similar to what you get every day by putting on or taking off sunglasses, or looking out your window at midday vs. near sunset. Colors are shifting all the time, for the most part you are insensitive to it. Most people taking pictures don't even bother to use a gray card to get correct(?) white balance.

Speaking as someone who moved to Arizona so I could study Mars... that photo does not have NEARLY enough things that will poke you, scratch you, sting you, bite you, poison you, or wait patiently for you to die so they can feast on your still-warm remains.