Star Light, Star Bright: Student Sheet(s)
Magnitude is the degree of brightness of a star. In about 150 B.C., the ancient Greek astronomer Hipparchus invented the magnitude scale. He ranked the stars he could see by their brightness, with 1 representing the brightest and 6 representing the faintest. In 1856, the British astronomer Norman Pogson proposed a quantitative scale of stellar magnitudes, which was adopted by the astronomical community. He noted that the Earth receives 100 times more light from a first-magnitude star than from a sixth-magnitude star; thus a difference of five magnitudes corresponds to a 100:1 ratio of incoming light energy, which is called luminous flux.
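For reference, Pogson's definition can be written as a simple relation between a magnitude difference and a flux ratio. This formula is not printed on the original sheet; the symbols m and F below are generic labels for magnitude and luminous flux.

\[
m_1 - m_2 = -2.5 \log_{10}\!\left(\frac{F_1}{F_2}\right),
\qquad
\frac{F_2}{F_1} = 100^{(m_1 - m_2)/5}
\]

Setting m_1 - m_2 = 5 gives F_2/F_1 = 100, the 100:1 ratio described above, and a single magnitude step corresponds to a brightness factor of about 2.512.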
What are apparent and absolute magnitudes?
Apparent magnitude is how bright a star appears to us in the sky; the scale is somewhat arbitrary. Absolute magnitude is how bright a star would appear from a standard distance, arbitrarily set at 10 parsecs, or about 32.6 light-years. Stars can be as bright as absolute magnitude -8 and as faint as absolute magnitude +16 or fainter. As it turns out, the eye senses brightness logarithmically, so each increase of 5 magnitudes corresponds to a decrease in brightness by a factor of 100.
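The two quantities can be connected by the standard distance-modulus relation, a well-known formula that is not part of the original sheet; here d denotes the star's distance in parsecs.

\[
m - M = 5 \log_{10}\!\left(\frac{d}{10\ \mathrm{pc}}\right)
\]

For a star at exactly 10 parsecs the logarithm is zero and m = M; for more distant stars the apparent magnitude is larger (fainter) than the absolute magnitude.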