Historically, magnitudes ranged from first to sixth and were assigned by eye by Ptolemy. His first magnitude was the equivalent of saying "first size", so it was the brightest, and sixth magnitude was what he assigned to the dimmest stars visible to the naked eye.

Nowadays, we use roughly the same scale but with real numbers, ranging from &minus;26.74 (the Sun) upward. The star Vega was defined to have apparent magnitude exactly 0.0 (and colour 0.0 in all filters, for that matter), and then we use the formula

$m_1 - m_\mathrm{ref} = -2.5 \log_{10}\left({I_1 \over I_\mathrm{ref}}\right)$

to define every other apparent magnitude relative to that one by measuring intensities.
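As a minimal sketch of how that formula is applied (the function name here is just illustrative, not any standard library API):

```python
import math

def apparent_magnitude(intensity_ratio, m_ref=0.0):
    """Magnitude of a source whose measured intensity is
    `intensity_ratio` times that of the reference (Vega, m = 0.0
    by definition in this scale)."""
    return m_ref - 2.5 * math.log10(intensity_ratio)

# A source 100 times fainter than Vega is 5 magnitudes dimmer,
# which is the defining property of the modern scale:
print(apparent_magnitude(0.01))  # → 5.0
```

Note the sign convention: fainter objects have *larger* magnitudes, a historical quirk inherited from Ptolemy's "first size = brightest" ordering.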

So to answer you: yes, we measure apparent magnitudes rather accurately. For stars other than the Sun, they range from &minus;1.46 for Sirius up to about +8 on a perfect night for a trained naked eye (typically +6 for a normal eye on a normal night, far from city lights).
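To get a feel for what that range means, you can invert the formula above to turn a magnitude difference into an intensity ratio (again, a hypothetical helper, not a standard function):

```python
def intensity_ratio(m_bright, m_faint):
    """How many times brighter a source of magnitude m_bright is
    than one of magnitude m_faint (smaller magnitude = brighter),
    from inverting m1 - m2 = -2.5 log10(I1 / I2)."""
    return 10 ** (0.4 * (m_faint - m_bright))

# Sirius (-1.46) versus a barely visible +6.0 star:
print(intensity_ratio(-1.46, 6.0))  # roughly 960 times brighter
```

So the naked-eye range alone spans roughly a factor of a thousand in brightness, which is why a logarithmic scale is convenient here.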