Stars

Magnitudes
The magnitude scale was invented by an ancient Greek astronomer named Hipparchus
in about 150 BC. He ranked the stars he could see in terms of their brightness,
with 1 representing the brightest and 6 representing the faintest. Modern
astronomy has extended this system to stars brighter than Hipparchus' 1st-magnitude
stars and to ones much, much fainter than 6th.

As it turns out, the eye senses brightness logarithmically, so each increase of
5 magnitudes corresponds to a decrease in brightness by a factor of 100. The
absolute magnitude is the magnitude a star would have if viewed from a
distance of 10 parsecs, or some 32.6 light years. Obviously, Deneb is
intrinsically very bright to ...
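The logarithmic relation above can be sketched in a few lines of code. This is an illustrative snippet, not from the essay itself; the function name `flux_ratio` is my own.

```python
import math

def flux_ratio(delta_m):
    """Brightness (flux) ratio corresponding to a magnitude difference.

    A difference of 5 magnitudes is defined as exactly a factor of 100,
    so each single magnitude step is a factor of 100**(1/5), about 2.512.
    """
    return 100 ** (delta_m / 5)

# A 5-magnitude difference is exactly a factor of 100 in brightness:
print(flux_ratio(5))  # → 100.0
# A single magnitude is roughly a factor of 2.512:
print(flux_ratio(1))
```

Note that a *larger* magnitude number means a *fainter* star, so this ratio tells you how much brighter the lower-magnitude star is.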


A magnitude difference of 5 has been set to exactly a factor of 100 in
intensity. Absolute magnitudes describe how bright a star would appear from a
standard distance, arbitrarily set as 10 parsecs, or about 32.6 light years.
Stars can be as bright as absolute magnitude -8 and as faint as absolute
magnitude +16 or fainter. There are thus (a very few) stars more than 100 times
brighter than Sirius, while hardly any are known fainter than Wolf 359.

Star, large celestial body composed of gravitationally bound hot gases
emitting electromagnetic radiation, especially light, as a result of nuclear
reactions inside the star. The sun is a star. With the sole exception of the sun,
the stars appear to be fixed, maintaining the same pattern in the skies year
after year. In fact the stars are in rapid motion, but their distances are so
great that their relative changes in position become apparent only over the
centuries. The number of stars visible to the naked eye from earth has ...
