THE ASTRONOMICAL BRIGHTNESS SCALE

Astronomers use a "magnitude" scale to classify objects such as stars
and planets according to their perceived brightness. The first such
scale we know of was devised by the Greek astronomer Hipparchus
in the second century BC.

The faintest stars we can see with the naked eye on a dark night
have an astronomical magnitude of +6, whereas Sirius, the brightest
star in the night sky, has a magnitude of about -1.5. The fainter an
object, the more positive its magnitude; the brightest objects have
increasingly negative magnitudes.

The magnitude scale is logarithmic (because this is roughly how the
eye perceives light): each step of one magnitude corresponds to a
brightness ratio of about 2.512, the fifth root of 100. This value
was adopted to keep the modern scale consistent with that of
Hipparchus, in which first-magnitude stars are about 100 times
brighter than sixth-magnitude stars: a span of five magnitudes
corresponds to a brightness ratio of exactly 100.
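
As a quick numerical check of this relationship, the short Python
sketch below converts a magnitude difference into a brightness
ratio. The function name and example values are ours, chosen for
illustration, not taken from any standard library.

    def brightness_ratio(m_faint, m_bright):
        """Return how many times brighter an object of magnitude
        m_bright appears than an object of magnitude m_faint."""
        # One magnitude step is a factor of 100 ** (1 / 5) ~= 2.512,
        # so a difference of dm magnitudes is 100 ** (dm / 5).
        dm = m_faint - m_bright
        return 100.0 ** (dm / 5.0)

    # Sirius (about -1.5) versus a just-visible star at +6:
    # a difference of 7.5 magnitudes.
    print(brightness_ratio(6.0, -1.5))    # ~1000

So Sirius appears roughly a thousand times brighter than the
faintest stars the eye can detect. The formula that relates
magnitude to brightness or luminosity is: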