Posted
by
samzenpus
on Monday May 14, 2012 @10:49AM
from the realer-than-real dept.

Diggester writes "The satellite, known as Elektro-L No.1, took an image from its stationary point over 35,000 kilometers above the Indian Ocean. This is the most detailed image of the Earth yet available, capturing the Earth in a single shot with 121-megapixels. NASA satellites use a collection of pictures from multiple flybys stitched together. The detail in the pic is just amazing."

That's only a square about 11,000 pixels on a side. At 300 dpi, a laser printer would produce a printout roughly one yard/one meter across.
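Checking the arithmetic (121 megapixels happens to be exactly 11,000 pixels squared):

```python
import math

pixels = 121_000_000          # advertised pixel count
side_px = math.isqrt(pixels)  # 11,000 px per side for a square image
dpi = 300                     # typical laser-printer resolution

side_in = side_px / dpi       # printed side length in inches
side_cm = side_in * 2.54

print(f"{side_px} px per side -> {side_in:.1f} in ({side_cm:.1f} cm) at {dpi} dpi")
```

That works out to about 36.7 inches, or 93 cm per side, so "one yard/one meter" is right on.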

At a slightly higher resolution that would fill a metric A0 sheet. Printers that big do exist, but they're fairly expensive. Best to upload it to your local print/office shop and let them print it instead of doing it yourself.

A photo image like this typically stores 24 bits of RGB color per pixel (one of about 16.7 million colors). A color laser printer dot, by contrast, is one of just four ink colors (cyan, magenta, yellow, or black). You can compare the pixels, but you shouldn't compare them one to one.
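For scale, that pixel format also implies a hefty uncompressed file:

```python
pixels = 11000 * 11000        # ~121 MP
bits_per_pixel = 24           # 8 bits each for R, G, B
colors = 2 ** bits_per_pixel  # ~16.7 million distinct values

size_mb = pixels * bits_per_pixel / 8 / 1e6
print(f"{colors:,} colors, ~{size_mb:.0f} MB uncompressed")
```

Around 363 MB of raw RGB before any compression, which is why these things ship as JPEGs.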

I specified that based on visual-acuity limits. There's a lot of optical theory explaining why more than 300 dpi is mostly useless for toner on paper. Unless your eye's lens is 10 times larger than the average human's, or your retinal cell layout is unlike that of all known humans, it is not optically possible to resolve 3000 dpi or whatever on paper under normal conditions and lighting. Depending on how close you can hold the paper before you can no longer focus on it, and to a lesser extent on how bright the light is (the iris acting as a little pinhole camera), humans top out around 300 dpi.
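The back-of-envelope version: taking the common figure of roughly one arcminute for normal visual acuity, the finest resolvable detail scales with viewing distance, so the maximum useful dpi falls out directly:

```python
import math

# ~1 arcminute is the usual figure for normal (20/20) visual acuity
acuity_rad = math.radians(1 / 60)

def max_dpi(viewing_distance_in):
    """Smallest resolvable feature at this distance, expressed as dots per inch."""
    feature_in = viewing_distance_in * math.tan(acuity_rad)
    return 1 / feature_in

for d in (12, 18, 24):
    print(f"{d} in: ~{max_dpi(d):.0f} dpi")
```

At a close reading distance of about 12 inches this gives roughly 290 dpi, which matches the "humans top out around 300 dpi" claim; hold the page farther away and the useful resolution drops even lower.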

Now, projected through transparencies on an old-fashioned overhead projector, higher resolution does help, if you sit close to the screen. There are also ugly aliasing and anti-aliasing artifacts that can be avoided with higher resolution and real vector scaling. And higher resolution allows better/smoother color mixing, in that blurring together 2**8 pixels of 2**16 colors is more or less equivalent to one 2**24-color pixel. There are also relative brightness/consistency effects: a "line" that varies from 8 to 9 pixels wide looks a lot less consistent than a line that varies from 85 to 86 pixels at ten times the resolution; just look at the percentage variation of one pixel. If the lighting is really bad, there are strange shadow effects where you can perceive more than 300 dpi if the shadows land just right. There are also some strange toner-based textural issues where the plastic surface of thinner lines literally looks different, and some 3-D effects of toner on paper. So more than 300 dpi is not a complete waste of time, just mostly a waste with average pictures under average conditions. It would be extremely hard to justify more than 1200 dpi even in the weirdest corner cases.

Even though you can't tell the difference between 300 and 600 dpi black text on white paper, you sure as hell can tell the difference between 85 lpi (laser printer halftones) and 175 lpi (glossy magazine halftones). And you cannot pull off said 175 lpi (even 150 lpi) with less than 1200 dpi, 2400 dpi being recommended. The 600 dpi printer just isn't exact enough.
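The arithmetic behind that: a binary printer builds each halftone cell out of (dpi/lpi) dots per side, so the number of renderable tone levels is roughly (dpi/lpi) squared, plus one for the empty cell:

```python
def gray_levels(printer_dpi, screen_lpi):
    """Approximate tone levels a binary printer can render per halftone cell."""
    return (printer_dpi / screen_lpi) ** 2 + 1

print(round(gray_levels(600, 150)))   # 17 levels: visible banding in smooth tones
print(round(gray_levels(1200, 150)))  # 65 levels
print(round(gray_levels(2400, 175)))  # ~189 levels: magazine-smooth halftones
```

A 600 dpi engine at 150 lpi can only manage 17 gray levels per cell, nowhere near enough for photographic tones, which is exactly why 1200-2400 dpi is needed to pull off 150-175 lpi screens.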

I wonder if DPI is like "megapixels" or "Sears air compressor horsepower", where the engineering definition no longer bears any relationship to the marketing definition.

For example, marketing could sell four toner colors at 100 dpi as "400 dpi"; after all, it's four toners at 100 dpi each, right? Something like that would explain why the OP's printer can't successfully output at better than 100 dpi despite marketing claiming 600 dpi. Hell, I could s

Agreed: why not just set the infrared "vegetation" band to some hue near green so that it at least looks a little like the real thing? Or maybe just leave the IR pass out altogether? I like my NASA-made "ghettopixel" Blue Marble image much better, thanks.

Looking carefully, I think it's not chromatic aberration; it's a slight change in lighting conditions, cloud cover, and (perhaps) satellite orientation during the shot. I expect that each color-filter image was taken separately, several minutes apart. Any movement or color change between exposures leads to the edge effects you see.

At least it doesn't have that fake, way-too-thick-and-bright "atmosphere" that the more naturally colored NASA image has (the famous one centered on North America).

Well, if you look at how much is covered in concrete and asphalt, then have a look at where the Mississippi dumps into the Gulf, coupled with how drinkable most river water is... it's a dead rock with sick black oceans.

When you zoom in a little on google maps, all the green stuff is life, all the grey stuff is cancer.

I'm not a photography expert, especially when it comes to infrared imagery, but are there RAW files of this sort of thing where the post-processing could be done in an image editor? I think some people would prefer green for vegetation. This looks like we have plenty of Spice reserves.

Also, I looked at the zoomable image, zoomed all the way in, and... saw mostly macroblocks? Is that still "amazing detail" in some sense that eludes me?
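In principle, yes: if the per-band raw frames were available, the recombination could be done with something as simple as NumPy. A minimal sketch, with four hypothetical band arrays standing in for the data (the names and layout here are made up, not the actual Elektro-L raw format):

```python
import numpy as np

# Hypothetical per-band arrays scaled to 0..1; the real raw format will differ.
rng = np.random.default_rng(0)
red, green, blue, nir = rng.random((4, 64, 64))

# Option 1: a natural-looking composite that drops the IR pass entirely
natural = np.dstack([red, green, blue])

# Option 2: blend the near-IR signal into the green channel,
# so vegetation (bright in NIR) reads as green instead of rust
veg_green = np.clip(0.5 * green + 0.5 * nir, 0.0, 1.0)
greened = np.dstack([red, veg_green, blue])
```

The 50/50 blend weights are arbitrary; anyone doing this for real would tune them by eye against a known-green region.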

That particular Gigapan upload was 1.12 Gpix, which suggests they did some sort of interpolation to make it appear more grandiose. And the rust-orange is because that is the most creative thing the Russians could think of to do with the IR band (heck with making it some shade of dark green...).

Yeah, I noticed the same thing. I would have thought that with 1 km resolution you might be able to pick out a vague smudge where some of the larger cities in India and China face the satellite, but no, just massive amounts of chromatic aberration from the imaging method used. Clearly roads are going to be out, so I tried again with the 1080p video clip, thinking that maybe Mumbai or Shanghai would show up as a brighter spot in the darkness; after all, you can see city lights on much

One reason the NASA global-coverage image sets that were released in 2002 (with updates starting in 2005) have become the de-facto standard source is that: 1) anyone can download them; and 2) they're in the public domain, so anyone can use them for any purpose. You can get a bunch of versions here [nasa.gov] and from the Visible Earth site linked at the bottom of that page.

This one looks cool, but further use will be limited if the only thing I can do with it is look at it in this online zooming browser.

Also, there seems to be a lot of chromatic distortion on the image. Check out the clouds - there are three separate registrations for each color in the cloud image. Were their optics not calibrated, or did they take each color picture separately?

I came into the comments to say this. Holy hell, the chromatic aberration on that image is absolutely terrible. It looks a lot like they took the different color channels separately (that would explain why the clouds, which are moving, were especially bad), and TFA says the pictures take ~30 minutes each, so that's the only explanation that makes sense to me.

For processing such an image for publicity release, it'd be customary to estimate motion vector fields between each pair of consecutively taken images, and apply motion compensation to register the clouds with minimal aberration. They apparently didn't do that.
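Full motion-vector-field estimation is a bigger job, but the simplest form of the registration step, recovering a single global translation between two exposures by phase correlation, fits in a few lines of NumPy. This is a sketch of the general technique, not what the mission processing actually does:

```python
import numpy as np

def phase_correlate(a, b):
    """Estimate the integer (dy, dx) translation between two same-size images."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12        # normalize -> phase-only correlation
    peak = np.unravel_index(np.argmax(np.fft.ifft2(cross).real), a.shape)
    # Peaks past the midpoint correspond to negative (wrap-around) shifts
    return tuple(int(p) - s if p > s // 2 else int(p) for p, s in zip(peak, a.shape))

# Demo: shift a random "cloud field" by (3, -5) and recover the offset
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(phase_correlate(shifted, img))   # -> (3, -5)
```

Real cloud motion varies across the frame, so a production pipeline would do this per block (or use dense optical flow) and warp each channel before compositing.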

If I'm understanding the article correctly, it sounds like they sent the raw data to "an educator named James Drake" on request. Presumably he's the one who did the overlay, but possibly doesn't have any specialist background in this area, so did it the quick-and-dirty way.

If only there was an article [theverge.com] somewhere that described how they made the image...

The image certainly looks different than what we're used to seeing, and that's because the camera aboard the weather satellite combines data from three visible and one infrared wavelengths of light, a method that turns vegetation into the rust color that dominates the shot.

That's what happens when you spring for a high end body (to keep up with the Joneses) and then cheap out on crap lenses - and then don't bother creating a lens profile in Lightroom to correct CA in post.;)

I suspect it is from how the image was composited. The article, if you'd bothered to read it, indicates that the camera takes shots using four filters: R, G, and B, plus an infrared filter. The image you see above is the composite of those four exposures (with the infrared given a reddish-brown tint, which makes all the vegetation look brown), and there may well be some registration error that wasn't accounted for.

The "Blue Marble" image you're pointing at is based on EOS (Terra/Aqua) imagery. The most recent NASA Blue Marble (Blue Marble 2012 [nasa.gov]) is a composite based on the new NPP Suomi spacecraft, with approximately a 1-km pixel resolution.

As to "accurate"... I think the Blue Marble images (based on the visible-light band sensors of their respective spacecraft) are closer to what a naked eye in orbit would perceive than the Russian imagery, which seems to include false-color infrared. But "naked eye in orbit" is scientifically less useful than the multi-spectral IR and visible all of these spacecraft can sense.

Correction: the spatial resolution of the NPP Suomi (VIIRS instrument) imagery is about 500 meters per pixel, not 1km. 1km is the approximate resolution of the MODIS instrument for the previous-generation Blue Marble pictures (Terra and Aqua spacecraft).

I really dislike the 2012 Blue Marble, due to the very visible stripes where it's been quilted. It may have far more pixels, but I think the original 1972 Apollo 17 image is far more visually impressive.

"The image certainly looks different than what we're used to seeing, and that's because the camera aboard the weather satellite combines data from three visible and one infrared wavelengths of light, a method that turns vegetation into the rust color that dominates the shot."

I expected the weather (by that I mean clouds) to move faster. I may just be used to seeing weather maps on TV update in 4-hour slides. I'm sure the 100 mph winds I'm looking at are fast enough at that scale. :)

The cameras used in the Apollo program included a 70mm Hasselblad. IIRC, years ago as digital cameras struggled to pass the 2-3 megapixel range, it was said that to be equivalent to 35mm you'd need 15-18 megapixels. That was, I believe, to match the grain density of 64 or 100 speed film. So scale that up about 4x in area to go from 35mm to 70mm. I'd say those Hasselblads did just fine.
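Rough numbers, assuming the standard 36x24 mm still frame for 35mm and a 56x56 mm frame for a 6x6 medium-format Hasselblad:

```python
frame_35mm = 36 * 24   # mm^2, standard 35 mm still frame
frame_70mm = 56 * 56   # mm^2, 6x6 cm medium-format frame on 70 mm film

scale = frame_70mm / frame_35mm
print(f"area scale: {scale:.1f}x")                           # ~3.6x, close to "about 4x"
print(f"equivalent: {15 * scale:.0f}-{18 * scale:.0f} MP")   # ~54-65 MP
```

So by that grain-density yardstick, each Apollo Hasselblad frame carried on the order of 55-65 megapixels of information.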

Our current series of geostationary weather satellites operated by NOAA has been taking images at 1 km resolution for the visible band and 4 km for four IR bands since 1995. The primary difference with Elektro is that it has more bands: two visible bands at 1 km and 8 IR bands at 4 km (which is why it looks blocky when you zoom in). A description of that imager can be found here:
http://database.eohandbook.com/database/instrumentsummary.aspx?instrumentID=784 [eohandbook.com]
The image referenced in the article is a false color composite, which has been a common product from weather satellites (geostationary and otherwise) since we started using them decades ago. It shows vegetation more than we have seen from GOES because it has a near-IR band. GOES typically takes "full disk" images every three hours.
The US has a new platform going up in 2016 with 16 bands - visible bands are 0.5 km and IR are at 1 km. That sensor will not be able to do true color (some of us fought hard for that...) but it can be simulated to an extent (the sensor will have red and blue wavelength sensing abilities, with a near-IR band allowing use of a look-up table to generate green; the surface under thin clouds, around coastal areas, and some other cases don't look quite right). Japan has bought the same sensor from the same vendor but swapped out a band and replaced it with green, so they will have true color images at roughly 22,000x22,000 pixels in the 2014-2015 time frame. This new sensor can take "full disk" images every 15 minutes (that is the scan schedule set for the US, it could go faster than that).
The US took true color images from a geostationary camera on ATS-3 in the late 1960s. As far as I know no one has taken true color images from the geostationary orbit since.
I haven't looked closely at Elektro data, but the loop I've seen indicates light leaking into the telescope as the sun starts to light the Earth in the east (i.e., sunrise); it looks like a lens flare. Many weather satellites have issues like this to some extent, but in this case it was more pronounced than I've usually seen.
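The "look-up table to generate green" trick mentioned above can be sketched, very roughly, as a weighted mix of the bands the sensor does have. The weights below are purely illustrative, not the real operational coefficients:

```python
import numpy as np

def simulate_green(red, blue, nir, w=(0.45, 0.10, 0.45)):
    """Crude stand-in for the operational lookup-table approach: synthesize
    a green band as a weighted mix of red, blue, and near-IR reflectances.
    The weights are illustrative only, not real sensor coefficients."""
    wr, wb, wn = w
    return np.clip(wr * red + wb * blue + wn * nir, 0.0, 1.0)

# Toy bands: a vegetated pixel is dark in red/blue but bright in near-IR,
# so the synthesized green comes out strong, as desired.
r = np.full((2, 2), 0.3)
b = np.full((2, 2), 0.2)
n = np.full((2, 2), 0.8)
print(simulate_green(r, b, n))
```

The hard cases the comment mentions (surfaces under thin cloud, coastal water) are exactly where a fixed mix like this breaks down, which is why the real product uses a tuned look-up table instead of one linear formula.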

There has to be one in there somewhere! Quick, get an army of volunteers to go over it!
Not that I am trying to keep anyone from playing Diablo 3 tonight after midnight. Has nothing to do with this... honest.

That light-dark cycle has been going on for billions of years, ceaselessly, perfectly. An amazing machine.

The importance of perspective is underscored as well. From the geostationary satellite, it looks as though the Earth is still. And it is, from that perspective. From the perspective of other celestial bodies, however, the Earth is moving.

Kudos to the Russkies for capturing this perspective and to James Drake for creating the video.

The rust is annoying, though, because they're compressing four wavelengths into three. An image with only the RGB would look nicer. They could store the fourth, IR channel as an alpha channel...
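Packing the fourth band into the alpha channel is straightforward in principle; a toy sketch with made-up 8-bit band arrays (the real data layout would come from the raw files):

```python
import numpy as np

# Hypothetical 8-bit band arrays; values are placeholders, not real data.
h, w = 2, 2
red, green, blue, ir = (np.full((h, w), v, dtype=np.uint8)
                        for v in (200, 150, 100, 50))

# RGB composite for display, with the IR band carried along as alpha
rgba = np.dstack([red, green, blue, ir])
print(rgba.shape)   # (2, 2, 4)
```

Most viewers would then render the RGB normally while the IR data rides along losslessly for anyone who wants to remap it later.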

No matter which way you "look" at something you are either compressing or ignoring some quality of light. The "art" of astrophotography is therefore about how much information you intend to leave out and how much you squeeze into the narrow bands of light we humans can perceive. If you are not happy with the rendering, you might be able to source the uncompressed scientific data -- which will still only ever contain partial-information due to optical, CCD and other limitations -- and render it yourself [spacetelescope.org]... Assuming Roskosmos make their equivalent of FITS data available to the public like NASA does.

You put the U.S. into such a panic about falling behind in science and technology that they funded my science education.

I couldn't have done it today. No more free tax-funded education. We have to go out and buy our education on the free market. No more free tuition at City College. You have to be rich to study engineering in America now.
