“The self-same atoms which, chaotically dispersed, made the nebula, now, jammed and temporarily caught in peculiar positions, form our brains; and the ‘evolution’ of brains, if understood, would be simply the account of how the atoms came to be so caught and jammed.” –William James

Up in the heavens, there are planets, stars, and galaxies all clearly visible in the night sky.

Image credit: Dan & Cindy Duriscoe, FDSC, Lowell Obs., USNO.

But those stars weren’t always there, and they won’t be there forever. The other class of object in the night sky — the nebulae — comes in two types. On one hand, there are the nebulae that result from the death throes of stars, dying either in a supernova explosion or in a gentler blowing-off of their outer layers in a planetary nebula.

This release of around 50% of the mass of the star into interstellar space puts enough hydrogen gas out into space that some day down the road, this gas could get another chance to burn as fuel in the nuclear furnace of stars.

But the other type of nebula — including the most famous of nebulae — represents a race to form that immediate next generation of stars.

Image credit: Mike Hankey of http://www.mikesastrophotos.com/.

Of course, this hardly looks like a famous nebula; like practically all nebulae visible through a small telescope, it appears like a faint, fuzzy cloud, with perhaps a red tint to it if you can gather enough light.

Of course, you might recognize this — far more well-known — view of this nebula a little bit better.

Image credit: J. Hester & P. Scowen, STScI, ESA, NASA.

These famous gaseous structures — the Pillars of Creation — are located at the heart of the Eagle Nebula, and tell part of the story of where new stars in the Universe come from.

Practically every galaxy in the Universe, including our own Milky Way, has significantly more hydrogen gas in it than it has stars, in terms of mass. Most of that gas is diffuse, but in a few locations, the gas has clumped together into large molecular clouds, some of which we can see.

Image credit: NASA and The Hubble Heritage Team (STScI/AURA).

Over time, this cool gas will collapse under its own gravity, contracting into denser and denser regions.

When the temperature of those most dense regions inside rises to the critical value necessary to initiate nuclear fusion, a new star is born, and then the great cosmic race begins in earnest.

From deep inside these clouds of interstellar gas, gravity works to pull every atom in that it can and form more and larger stars. But the stars themselves emit intense, ultraviolet light, evaporating and ionizing the surrounding gas and blowing it out into the interstellar medium.

Eventually, after maybe 10% of the gas has formed objects like stars and planets that cannot be blasted apart by mere radiation, the stars will inevitably win.

So when you look up at the night sky and see those faint nebulae, with their reddish hues from the recombination radiation of hot, UV-ionized hydrogen, you are watching the last stages of that great cosmic star-formation race.

Image credit: John Nassr at Stardust Observatory.

So why, then, do the most famous pictures of these nebulae not look red at all, but rather colorized in this multichromatic fashion?

Image credit: T. A. Rector & B. A. Wolpa, NOAO, AURA.

This false coloring is done by imaging the nebula through three different narrow-band filters, with each band sensitive to the light that’s emitted from a particular element. Although the light coming from the hydrogen atoms far outstrips the light from all other elements, and is intrinsically red, it is shown in the above composite in green, while oxygen (in blue) and sulfur (in red) are more heavily weighted in order to balance out the false color displayed in the final image.
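The compositing step itself is simple channel arithmetic. Here's a minimal numpy sketch of the idea (the channel assignment — SII to red, Hα to green, OIII to blue — is the common "Hubble palette"; the percentile stretch is one illustrative choice, not the exact processing used for the published images):

```python
import numpy as np

def hubble_palette(s2, ha, o3):
    """Map three narrow-band exposures to an RGB composite
    (SII -> red, H-alpha -> green, OIII -> blue). Each channel is
    stretched independently, so the weaker lines aren't drowned
    out by the dominant H-alpha signal.

    s2, ha, o3: 2-D float arrays of equal shape (raw counts).
    Returns an (H, W, 3) array scaled to 0..1.
    """
    def stretch(channel):
        lo, hi = np.percentile(channel, (1, 99))
        return np.clip((channel - lo) / (hi - lo + 1e-12), 0.0, 1.0)

    return np.dstack([stretch(s2), stretch(ha), stretch(o3)])

# Toy data: H-alpha far brighter than the other two lines,
# as in a real emission nebula.
rng = np.random.default_rng(0)
ha = rng.random((4, 4)) * 100.0
s2 = rng.random((4, 4)) * 5.0
o3 = rng.random((4, 4)) * 10.0

rgb = hubble_palette(s2, ha, o3)
print(rgb.shape)  # (4, 4, 3)
```

The independent stretch is the key design choice: it's why a faint sulfur line can hold its own against hydrogen in the final picture, at the cost of realistic color.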

On the minus side, the human eye would never see anything like this from looking at any part of the Eagle Nebula.

On the plus side, the false coloration definitely brings out the contrast of the gaseous regions, including the 4-light-year-long Pillars of Creation, and also the even larger “Fairy” of the Eagle Nebula, both of which are in the final stages of forming new stars inside while they are slowly obliterated by both internal and external ultraviolet radiation.

Image credit: The Hubble Heritage Team, (STScI/AURA), ESA, NASA.

These structures — known as Evaporating Gaseous Globules, or EGGs — are the locations of the last stars that will form in these great nebular complexes. The race between gravity and photoevaporation is no contest in every known nebula: perhaps 90% of the gas fails to make it into a star or planet. When all of the EGGs are gone, it’s just a matter of time before the rest of the gas remnants are boiled away by the newly formed stars, until only a brilliant star cluster is left.

Film was more sensitive to the red, Hydrogen-alpha end of the spectrum than to the blue-green of Nitrogen/Oxygen III. But most of the light output by such nebulae is in the OIII range, where film emulsion lacks sensitivity, whilst our eyes have the opposite response: that’s where they’re most sensitive.

Most especially with the Orion Nebula, with good skies and an 8-inch or larger Newtonian, you can get a very, very tiny hint of pink in the wings; most pictures you see are aping the colour cast of the original film views.

But visually with an 8″ Newtonian you’ll see a lot more bluish grey, from the OIII emissions in Orion than any form of red.

The Horsehead Nebula, being dark (it’s a dark nebula) against a weak and deep-red background (where your eye is insensitive), makes for a very difficult view.

Wow, that is an interesting astrophotography fact that I did not know, about the old film and the sensitivity to red.

Thanks for sharing! Other than an amateur astronomy night or two, I’ve only ever looked through scopes much smaller or much larger than 8″; it’s weird to have familiarity with up-to-4″ scopes and 18-24″ scopes, but nowhere in between!

So are the pillars empty voids where gas has collapsed, surrounded by a sea of gas we can see through? Or is it the opposite, with the pillars being dense with gas, surrounded by mostly open space that has been cleared by the burning stars?

It took a little searching. I needed to enter both “Milky Way” and “Eagle nebula” to get the answer.

But I found this, “The distances to the M16 Eagle Nebula and the M17 Omega Nebula are not known with precision. There is little doubt that these clouds of star formation lie farther away than the more brilliant Great Orion Nebula, the star-forming nebula that’s visible to the unaided eye in the winter sky. When you look at M16 and M17, you’re gazing at deep-sky wonders in the next spiral arm inward: the Sagittarius arm of the Milky Way galaxy.

The M16 Eagle Nebula lies at an estimated distance of 7,000 to 9,000 light-years, and the M17 Omega Nebula is thought to be around 5,000 light-years away. In contrast, the Orion Nebula resides within the Orion spiral arm (the same spiral arm as our solar system) at some 1,300 light-years distant.” EarthSky.org

So I’ll rewrite the idea and put it on wiki, along with anything else you educate me about regarding the proper place of the Eagle Nebula in our Milky Way Galaxy. Hmm, it’s been a while since I’ve put something on wiki (it works, though); I’ll figure it out.

How does a modern CCD with Bayer filter compare? I know the Bayer has double the green sub-pixels to account for our eye’s sensitivity in that range, but is the underlying CCD biased?

Would an H-a filter let you see the red in Orion visually?

Similarly would it help with the Horsehead? Or is it not an issue of background light washing out the red and rather just the low amounts of red light we aren’t sensitive to, so the filter would just represent a net loss?

The Bayer mask on any CMOS/CCD digital imager (even, I believe, the Foveon sensor) requires a low-pass filter in front of the sensor to allow the spot of light to spread and illuminate several pixels (else a red star landing on a blue Bayer cell would give no output, and any point image would be monochromatically red, blue or green). All the monochrome CCDs do the same, but mostly so you can remove moiré and similar artifacts from the image.

But apart from that, the CCD behind the mask is identical (except the Foveon sensor which has the pixel actually change sensitivity with wavelength for each layer registering a colour).

Putting in a Ha filter would help remove any polluting light from stars, nebulae, UFOs or skyglow, but wouldn’t make the actual image any brighter, so you still need to collect a lot of light to make your eye register the image.

Since the light from the horsehead nebula is entirely Ha, there’s no loss in the filter (well, maybe 2% loss), and most of the light from Orion is at the ~500nm of OIII, so it WILL be dimmer, and by quite a bit.

The Ha filter is often called the horsehead filter since the horsehead nebula is one of the few strong pure Ha sources amateurs can easily get a look at.

Lastly, on the Bayer mask, there is quite a lot of overlap between the red, blue and green channels on a digital colour mask, so your green pixel will still pick up ~20% of the intensity of mainstream blue or red light falling on it. A Ha-filtered image will register a tiny bit of green but no blue. However, you can use the blue as your dark image, which is kind of handy…
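That "blue as your dark image" trick can be sketched in a few lines of numpy. The numbers here (a 200-count Hα signal, a 50-count dark level, Gaussian noise) are made up purely for illustration; the point is that under a Ha filter the blue pixels record essentially no sky signal, so their median estimates the dark level for that same exposure:

```python
import numpy as np

rng = np.random.default_rng(1)
dark_level = 50.0  # thermal/bias counts (hypothetical figure)

# Under a Ha filter: red pixels see signal + dark + noise,
# blue pixels see essentially dark + noise alone.
red = 200.0 + dark_level + rng.normal(0, 5, (100, 100))
blue = dark_level + rng.normal(0, 5, (100, 100))

# Use the blue channel's median as a per-frame dark estimate.
calibrated = red - np.median(blue)
print(float(np.median(calibrated)))  # close to the 200-count signal
```

The appeal is that the "dark frame" is taken simultaneously with the light frame, on the same sensor at the same temperature, rather than as a separate exposure.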

The filters visually do nothing other than remove extra light you’re NOT interested in. This helps you perceive the image, but doesn’t make it much easier.

Last thing. With a proper astronomical CCD sensor you’re looking at up to 90% quantum efficiency (you have a 90% chance of catching EACH PHOTON as a signal) for monochrome peak, a peak of ~65% for a colour astronomical CCD sensor, and ~20% or less for a DSLR or compact/webcam sensor.

Add to that the active cooling (you reduce the noise by half for each 7 degrees C of cooling applied), and there is a very big difference.
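Putting the figures from the last two paragraphs together gives a quick back-of-envelope comparison (the 90%/20% quantum efficiencies and the noise-halving-per-7 °C rule are the round numbers quoted above, not precise specs for any particular sensor):

```python
def photons_detected(photons_in, quantum_efficiency):
    """Expected photons registered as signal, given the sensor's QE."""
    return photons_in * quantum_efficiency

def dark_noise_factor(cooling_deg_c, halving_deg_c=7.0):
    """Relative dark noise after cooling, using the rule of thumb
    that noise halves for each `halving_deg_c` of cooling."""
    return 0.5 ** (cooling_deg_c / halving_deg_c)

# 1000 photons land on a pixel:
print(photons_detected(1000, 0.90))  # 900.0 on a mono astro CCD
print(photons_detected(1000, 0.20))  # 200.0 on a DSLR-class sensor

# Cooling the astro CCD 35 C below ambient:
print(dark_noise_factor(35.0))  # 0.03125 -> 32x less dark noise
```

So a cooled monochrome astro CCD catches roughly 4–5× the photons and can run with a tiny fraction of the thermal noise — which is the "very big difference" in practice.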

The sensor on an astronomical CCD will be ~1-2MPix, on a sensor 9mm across, for about the same price as an APS-C 12MPix Canon DSLR (23mm across).

Compacts (including the bigger fixed-lens SLR-type) will range from a 6mm sensor at 8MPix to an 18mm one at 20MPix. The light gathered per pixel here is obviously even worse.
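You can see the light-per-pixel point by turning the sensor widths and pixel counts above into rough pixel pitches. This sketch assumes a 4:3-ish rectangular sensor just to get the horizontal pixel count; the figures are the approximate ones quoted above:

```python
import math

def approx_pixel_pitch_um(sensor_width_mm, megapixels, aspect=4 / 3):
    """Rough pixel pitch: for a rectangular sensor of the given
    aspect ratio, horizontal pixel count ~ sqrt(MPix * aspect)."""
    pixels_across = math.sqrt(megapixels * 1e6 * aspect)
    return sensor_width_mm * 1000.0 / pixels_across

# (name, sensor width in mm, megapixels) -- figures quoted above
for name, width_mm, mpix in [
    ("astro CCD", 9.0, 1.5),
    ("APS-C DSLR", 23.0, 12.0),
    ("compact", 6.0, 8.0),
]:
    pitch = approx_pixel_pitch_um(width_mm, mpix)
    # light gathered per pixel scales with the pitch squared
    print(f"{name}: ~{pitch:.1f} um pitch, ~{pitch**2:.0f} um^2 per pixel")
```

Run the numbers and the ranking comes out astro CCD > APS-C DSLR > compact in light collected per pixel, with the compact an order of magnitude behind — which is the point being made.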

Aye, narrow band filters are often something bought for a group and shared round.

Mind you, you can get pretty close with a Wratten 29A filter (IIRC, it’s the deepest red, often used as gel film in darkrooms). It passes pretty much everything from ~670nm to way past 750nm, and as far as your eyes are concerned, that’s pretty much the same thing.

Another thing about electronic cameras’ sensitivity to red light is that the sensors themselves are sensitive enough to infrared light that there’s generally a filter between the lens and sensor to eliminate most of that infrared, and it generally takes some of the red with it. When getting a good DSLR for astrophotography, the serious astrophotographer will usually remove that filter (or have it removed); for a while, one of the Canons was even available in a special astrophotography version that came without that filter. (I think it was either a 5 or a 20?)

OK, just checked up on that, and it was the EOS 20Da (the “a” for “astrophotography”; it’s otherwise mostly like an EOS 20D), sold from 2005 to 2006. What I didn’t know is that they have recently come out with a successor, finally, the EOS 60Da. Guess which camera it’s otherwise like?