The Magnitude System

The magnitude of an astronomical object is simply a measure of its brightness. It allows us to assign a numerical value to an object's brightness so we can compare it with other objects of interest, or determine whether its brightness changes over time. The magnitude system was originally devised by Greek astronomers, who divided the stars visible to the naked eye into six groups based on how soon (in multiples of ten minutes) after sunset each star appeared. Mathematically, the magnitude system is logarithmic. To make life a little trickier, the system is also inverted: the brighter an object is, the lower its magnitude. Thinking back to the original definition, a first magnitude star would be seen within the first ten minutes after sunset, whereas sixth magnitude stars (usually thought of as the limit of a good observing site combined with good eyesight) are the last to appear, one hour after sunset, i.e. when the sky is first properly dark.

A rule of thumb here is that a difference of 5 magnitudes corresponds to stars that differ in brightness by a factor of exactly 100, so a difference of 1 magnitude corresponds to a brightness ratio of about 2.512 (the fifth root of 100).
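This rule of thumb can be sketched in a few lines of code. The function below is a simple illustration, not part of any astronomy library: it converts a magnitude difference into the corresponding brightness ratio.

```python
import math

def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference.

    Each magnitude step corresponds to a factor of 100**(1/5),
    which is about 2.512, so 5 magnitudes give exactly a factor of 100.
    """
    return 100 ** (delta_mag / 5)

# A 5-magnitude difference is exactly a factor of 100 in brightness:
print(brightness_ratio(5))   # 100.0
# A 1-magnitude difference is the fifth root of 100:
print(brightness_ratio(1))   # ~2.512
```

Note that the ratio multiplies: a 10-magnitude difference is a factor of 100 × 100 = 10,000 in brightness.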

Since the invention of the telescope, and especially since the progress made in astronomy over the last 100 years or so, the magnitude scale has been extended: we now know of objects much fainter than the naked-eye limit of 6, and of objects brighter than 1st magnitude. For example, the star Vega (alpha Lyrae) is assigned a magnitude of zero, so an object brighter in the night sky than Vega has a negative magnitude and a fainter one a positive magnitude. Sirius, in Canis Major, is the brightest star in the night sky and has a magnitude of -1.46.

Using modern telescopes with cameras/CCDs, we can measure the number of photons arriving at our telescope during an exposure and then convert this value into a magnitude. This value is known as the object's apparent magnitude and is denoted by the lower-case letter 'm'. It is a measure of how bright an object appears to us on Earth.
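As a rough sketch of that conversion, the snippet below turns a background-subtracted photon count into a magnitude using the standard logarithmic (Pogson) relation. The function name and the use of a count rate per second are illustrative choices, and no zero point is applied here, so the result is only meaningful relative to other measurements from the same instrument.

```python
import math

def instrumental_magnitude(counts, exposure_time=1.0):
    """Magnitude from a background-subtracted photon count.

    Applies m = -2.5 * log10(count rate). Without a zero-point
    calibration this is an instrumental magnitude: it is only
    comparable to other values from the same telescope, camera,
    filter and software setup.
    """
    rate = counts / exposure_time
    return -2.5 * math.log10(rate)

# Ten times more counts means 2.5 magnitudes brighter, i.e. a
# LOWER magnitude value, reflecting the inverted scale:
m_faint = instrumental_magnitude(1000, exposure_time=10)
m_bright = instrumental_magnitude(10000, exposure_time=10)
print(m_faint - m_bright)  # 2.5
```

Dividing by the exposure time matters when comparing frames of different lengths: doubling the exposure doubles the counts but should not change the magnitude.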

When you measure the apparent magnitude of an object from a CCD (or even a photographic plate), this is known as an instrumental magnitude. It is the simplest value for the magnitude that you can measure, but it doesn't allow comparison with other people's measurements since, as the name suggests, it depends upon the instruments used to measure it: the telescope, CCD camera, filters, and so on. In fact, it also depends on the software used and on any parameters an individual user selects within that software. To compare your magnitude values with other people's, you need to calibrate your results so that you are comparing like with like. You can do this by observing your object and a standard star.
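The standard-star step can be sketched as a simple zero-point correction. This is a minimal illustration with made-up numbers: a full photometric reduction would also correct for atmospheric extinction and colour terms, which are ignored here.

```python
def calibrate(m_inst_target, m_inst_standard, m_catalogue_standard):
    """Tie an instrumental magnitude to the standard scale.

    Observing a standard star of known catalogue magnitude gives the
    zero-point offset between your instrument's scale and the
    standard one; applying the same offset to the target's
    instrumental magnitude gives a calibrated apparent magnitude.
    """
    zero_point = m_catalogue_standard - m_inst_standard
    return m_inst_target + zero_point

# Hypothetical numbers: the standard star measures -7.5 on our
# instrumental scale but is catalogued at magnitude 12.0, so the
# zero point is 19.5. A target measured at -5.0 then calibrates to:
print(calibrate(-5.0, -7.5, 12.0))  # 14.5
```

Because both stars are measured with the same instrument and settings, the instrument-dependent offset cancels out, which is exactly why a standard star makes results comparable between observers.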