I remember one of the hardest things for me to understand when I first began studying astronomy was grasping the "magnitude" of stars. Magnitude -- or more properly -- apparent magnitude is a number that describes how bright a star or object in the sky appears.
|Star Magnitude Scale|
Image by Astroplot
|Star Magnitude Chart|
Image by Royal Astronomical Society of Canada
As telescopes and binoculars were invented and turned to the skies, we realized that there were whole classes of stars we couldn't see. All of a sudden there were 7th magnitude stars, and 8th magnitude, and so on. Now, with binoculars on a clear night, you might be able to see stars up to magnitude 9. With a telescope and a clear night, perhaps even 10, 11, or 12. Earth's biggest telescopes can see stars of magnitude 22 (with a 24-inch aperture) or magnitude 27 (with an 8-meter aperture). And once you are out of Earth's atmosphere, the Hubble Space Telescope can see down to magnitude 32.
Of course, all these numbers are pretty meaningless without some more precise way of defining them. As we developed ways of measuring how much light arrives instead of just eyeballing it, we could reclassify stars more exactly. The earliest astronomers estimated that 1st-magnitude stars were twice as bright as 2nd, and so on. Remarkably, without any tools to really measure light intake, they were pretty close. It turns out that to preserve the classifications of the early astronomers, a factor of about 2.5 per magnitude is required instead of 2. Though the whole system could have been scrapped and redone, astronomers chose a scale that mimics the original by using a logarithmic formula.
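On the modern scale the step is pinned down precisely: five magnitudes are defined to be exactly a factor of 100 in brightness, which makes one magnitude step the fifth root of 100, about 2.512. A quick check:

```python
# One magnitude step on the modern scale is the fifth root of 100,
# so five steps multiply out to exactly a factor of 100 in brightness.
step = 100 ** (1 / 5)
print(round(step, 3))    # 2.512
print(round(step ** 5))  # 100
```

This is why "about 2.5" works so well as a mental shortcut: 2.512 per magnitude is close enough for rough comparisons.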
First, an arbitrary star had to be chosen as the starting point for the scale. Astronomers chose the bright star Vega, set its magnitude to 0, and compared the brightness of every other star to it. If Vega was 2.5 times brighter than a star, that star was rated magnitude 1. There are roughly a dozen stars with magnitudes near 1, and they correspond fairly consistently to the earliest classification of 1st-magnitude stars.
Stars that were 2.5^2 or around 6 times fainter than Vega were classified as 2's. There are roughly 50 stars with a magnitude of around 2.
The 175 or so stars that were 2.5^3 or 15 times fainter than Vega were classified as 3's.
The 500 or so stars of fourth magnitude are 2.5^4 or roughly 1/40th the brightness of Vega.
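The brightness drop for each magnitude class above can be tabulated in a couple of lines, using the article's approximate factor of 2.5 per magnitude:

```python
# Brightness of each magnitude class relative to Vega (magnitude 0),
# using the approximate per-magnitude factor of 2.5 from the text.
for m in range(1, 5):
    print(f"magnitude {m}: 1/{2.5 ** m:.1f} of Vega's brightness")
# A 4th-magnitude star comes out to roughly 1/40th of Vega's brightness.
```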
All stars' apparent magnitudes can then be classified by comparing their brightness to Vega: m = -2.5 log (brightness relative to Vega), so a star that is about 2.5^m times fainter than Vega has magnitude m.
Not all stars fit exactly into a whole-number classification, but with this new scaling system one can define a star's brightness precisely in between magnitudes, and so stars can be given magnitudes of 1.3, or 4.2. You can always compare a star's brightness to Vega's by using the factor 2.5^m. For instance, the North Star has a magnitude of 1.98, and so Vega is 2.5^1.98 or about 6.1 times brighter.
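The relationship works in both directions, so it is easy to sketch a pair of helper functions (the names here are my own) that convert between a magnitude and a brightness ratio relative to Vega:

```python
import math

def magnitude_from_ratio(brightness_vs_vega):
    # m = -2.5 * log10(brightness relative to Vega); Vega itself is 0.
    return -2.5 * math.log10(brightness_vs_vega)

def ratio_from_magnitude(m):
    # Inverse: how bright the object is compared to Vega.
    return 10 ** (-m / 2.5)

# The North Star (Polaris) has magnitude 1.98, so Vega is about
# 6.2 times brighter on the exact scale. (The article's 2.5**1.98,
# about 6.1, uses the rounded factor 2.5 rather than 2.512.)
print(round(1 / ratio_from_magnitude(1.98), 1))  # 6.2
```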
There are some stars that are brighter than Vega -- for instance, the brightest star, Sirius, is 3.6 times brighter than Vega. To determine its magnitude, compute -2.5 log (3.6), which is -1.4, and so some stars have negative magnitudes. In fact, some planets, such as Venus, can get even brighter and have lower magnitudes still. Turning to the Moon, its magnitude can get as low as -12.74, which means its brightness is 2.5^-12.74 times Vega's -- or put the other way, it is 2.5^12.74 times brighter than Vega. And shining more than 40 billion times brighter than Vega, our Sun during the day chimes in at -2.5 log (40 billion), about -26.5.
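These negative magnitudes can all be checked with the same formula. The brightness ratios used here (3.6 for Sirius, 40 billion for the Sun) are the rounded figures from the text, so the results reflect that rounding rather than precise measured values:

```python
import math

def magnitude(brightness_vs_vega):
    """Apparent magnitude of an object from its brightness relative to Vega."""
    return -2.5 * math.log10(brightness_vs_vega)

print(round(magnitude(3.6), 1))    # Sirius, 3.6x brighter than Vega: -1.4
print(round(magnitude(40e9), 1))   # the Sun, ~40 billion x brighter: -26.5

# Going the other way: the full Moon at magnitude -12.74 is
# 10 ** (12.74 / 2.5), roughly 125,000 times brighter than Vega.
print(round(10 ** (12.74 / 2.5)))
```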
To finish, here's a map of the Big and Little Dippers, to help you become a little more familiar with some of the numbers involved. Try to identify for yourself which stars are the brightest and faintest, and compare them with the numbers listed below.
|Magnitude of stars in Ursa Major and Ursa Minor|
Image by AstroBob