Star Light, Star Bright

Student Sheet(s)


Background Information

Magnitude is the degree of brightness of a star.  The ancient Greek astronomer Hipparchus invented the magnitude scale in about 150 B.C.  He ranked the stars he could see by their brightness, with 1 representing the brightest and 6 the faintest.  In 1856, British astronomer Norman Pogson proposed a quantitative scale of stellar magnitudes, which was adopted by the astronomical community.  He noted that the Earth receives 100 times as much light from a first-magnitude star as from a sixth; thus a difference of five magnitudes corresponds to a 100:1 ratio of incoming light energy, which is called luminous flux.
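
As a quick check on the arithmetic, the sketch below works out Pogson's rule in Python.  The function name flux_ratio is ours, chosen for illustration; it is not part of any standard library.

```python
# Pogson's rule: a difference of 5 magnitudes corresponds to a
# 100:1 ratio of luminous flux, so each single magnitude step
# is a factor of 100 ** (1/5), about 2.512.

def flux_ratio(m_faint, m_bright):
    """Return how many times more light is received from the
    brighter star, given two magnitudes (larger = fainter)."""
    return 100 ** ((m_faint - m_bright) / 5)

print(flux_ratio(6, 1))  # 100.0  -> first vs. sixth magnitude
print(flux_ratio(2, 1))  # ~2.512 -> one magnitude step
```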

What are apparent and absolute magnitudes?  Apparent magnitude is how bright a star appears to us in the sky; the scale is somewhat arbitrary.  Absolute magnitude is how bright a star would appear from a standard distance, arbitrarily set at 10 parsecs, or about 32.6 light-years.  Stars can be as bright as absolute magnitude -8 and as faint as absolute magnitude +16 or fainter.  As it turns out, the eye senses brightness logarithmically, so each increase of 5 magnitudes corresponds to a decrease in brightness by a factor of 100.
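
The link between the two magnitudes follows from the 10-parsec definition above: m - M = 5 log10(d / 10 pc), the so-called distance modulus.  The worksheet does not state this formula explicitly, so the sketch below is our illustration of it, using the Sun's values quoted later in the Questions section.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and
    distance in parsecs, via the distance modulus
    m - M = 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The Sun: m = -26.72 seen from 1 AU, which is about 4.848e-6 pc.
print(f"{absolute_magnitude(-26.72, 4.848e-6):.2f}")
# -> 4.85, consistent with the sheet's value of 4.8
```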


Materials


Procedure

  1. Using the data in the table under Chart below, make a bar graph of each star's distance in light-years.
  2. Make a line graph of the stars' absolute and apparent magnitudes.
  3. Make a line graph of the stars' distance in light-years and apparent magnitude.  (A plotting sketch follows this list for reference.)
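
For classes that graph with software instead of by hand, here is a minimal matplotlib sketch of steps 1 and 2.  Because the chart is not reproduced on this sheet, the three stars and their values below are approximate, well-known placeholders; substitute the data from the Chart section.

```python
import matplotlib.pyplot as plt

# Illustrative values only -- replace with the Chart data.
stars = ["Sirius", "Vega", "Betelgeuse"]
distance_ly = [8.6, 25, 640]          # approximate
apparent_mag = [-1.46, 0.03, 0.5]     # approximate
absolute_mag = [1.4, 0.6, -5.9]       # approximate

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Step 1: bar graph of distance in light-years
ax1.bar(stars, distance_ly)
ax1.set_ylabel("Distance (light-years)")

# Step 2: line graph of absolute vs. apparent magnitude
ax2.plot(stars, absolute_mag, marker="o", label="Absolute")
ax2.plot(stars, apparent_mag, marker="o", label="Apparent")
ax2.invert_yaxis()  # lower magnitude = brighter
ax2.set_ylabel("Magnitude")
ax2.legend()

plt.tight_layout()
plt.show()
```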

Chart


Questions

  1. Which star is the brightest?
  2. Which star is the dimmest?
  3. What is the difference between apparent and absolute magnitude?
  4. Explain why the Sun has an apparent magnitude of -26.72 but an absolute magnitude of only +4.8.
  5. What does a lower magnitude value mean?
  6. Write a one paragraph summary beneath each of your graphs.
  7. Are the brightest stars closer to or farther from the Earth?