Stellar Magnitudes

Magnitude is the unit used in astronomy to describe a star's brightness in a particular portion of the electromagnetic spectrum. Stars emit different amounts of radiation in different regions of the spectrum, so a star's brightness will differ from one part of the spectrum to the next. An important field of research in modern astronomy is the accurate measurement of stellar brightness in magnitudes in different parts of the spectrum.

How bright it looks: apparent magnitude

Greek astronomer Hipparchus (c. 166–125 BC) devised the first magnitudes in the second century BC. He classified stars according to how bright they looked to the naked eye: the brightest stars he called "1st class" stars, the next brightest "2nd class," and so on down to "6th class." In this way all the stars visible to the ancient Greeks were neatly classified into six categories.

Modern astronomers still use Hipparchus' categories, though in considerably refined form. With modern instruments astronomers measure a quantity called V, the star's brightness in the visual portion of the spectrum. Since visual light is what human eyes most readily detect, V is analogous to Hipparchus' classes. For example, Hipparchus listed Aldebaran, the brightest star in the constellation Taurus (the Bull), as a 1st class star, while today astronomers know that for Aldebaran V = 0.85. Astronomers often refer to a star's visual brightness as its apparent magnitude, which makes sense since this describes how bright the star appears to the eye (or the telescope). One may hear an astronomer say that Aldebaran has an apparent magnitude of 0.85.

Hipparchus' scheme defined from the outset one of the quirks of magnitudes: they run backwards. The fainter the star, the larger the number describing its magnitude. Therefore, the Sun, the brightest object in the sky as seen from the Earth, has an apparent magnitude of -26.75, while Sirius, the brightest star in the sky other than the Sun and visible on cold winter nights in the constellation Canis Major (the Big Dog), has an apparent magnitude of -1.45. As another example, the full Moon has an apparent magnitude of about -12.6. The faintest star one can see without optical aid is about +5 (or +6 with sharp eyes), and the faintest objects visible to the most powerful telescopes on the Earth have an apparent magnitude of about +30.

How bright it really is: absolute magnitude

More revealing than apparent magnitude is absolute magnitude, which is the apparent magnitude a star would have if it were ten parsecs from the Earth (a parsec is a unit of distance equal to about 19 trillion mi [31 trillion km]). This definition is important because apparent magnitude can be deceiving. One knows that a lit match is not as bright as a streetlight, but if one holds the match next to the eye, it will appear brighter than a streetlight six blocks away. That is why V is called apparent magnitude: it is only how bright the star appears to be. For example, the Sun is a fainter star than Sirius. Sirius emits far more energy than the Sun does, yet the Sun appears brighter to humans on the Earth because it is so much closer. Absolute magnitude, however, reveals the truth: the Sun has an absolute magnitude of +4.8, while Sirius is +1.4 (remember, smaller numbers mean brighter, not fainter).
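The definition above amounts to the standard distance-modulus relation, M = m - 5 log10(d / 10 pc), where d is the star's distance in parsecs. The minimal Python sketch below checks the two figures quoted in the paragraph above; the distances used (the Sun's distance of 1 AU expressed in parsecs, and roughly 2.64 parsecs to Sirius) are values supplied for the example, not taken from the article.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    # Distance-modulus relation: M = m - 5 * log10(d / 10 pc)
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# The Sun: apparent magnitude -26.75 at 1 AU, about 4.848e-6 parsec (assumed distance).
print(round(absolute_magnitude(-26.75, 4.848e-6), 1))  # ~ +4.8

# Sirius: apparent magnitude -1.45 at roughly 2.64 parsecs (assumed distance).
print(round(absolute_magnitude(-1.45, 2.64), 1))        # ~ +1.4
```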

The nature of the magnitude scale

In 1856, British astronomer Norman Robert Pogson (1829–1891) noticed that Hipparchus' 6th class stars were roughly 100 times fainter than his 1st class stars. Pogson did the sensible thing: he redefined the stars' V brightness so that a difference of five magnitudes was exactly a factor of 100 in brightness. This meant that a star with V = 1.00 appeared to be precisely 100 times brighter than a star with V = 6.00. One magnitude is then a factor of about 2.512 in brightness. Try it: enter 1 on a calculator and multiply it by 2.512 five times; one has just gone from first magnitude to sixth (or eighth to thirteenth, or minus seventh to minus second, or any other such combination).
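Put another way, a difference of m magnitudes corresponds to a brightness factor of 100^(m/5), so a single magnitude is 100^(1/5), or about 2.512. A short Python sketch of that arithmetic, using magnitudes quoted elsewhere in this article:

```python
def brightness_ratio(mag_faint, mag_bright):
    # Pogson's definition: five magnitudes is exactly a factor of 100 in brightness.
    return 100 ** ((mag_faint - mag_bright) / 5.0)

print(brightness_ratio(6.00, 1.00))     # 100.0  (five magnitudes)
print(100 ** (1 / 5.0))                 # ~2.512 (one magnitude)
print(brightness_ratio(-1.45, -26.75))  # ~1.3e10: the Sun outshines Sirius by roughly ten billion
print(brightness_ratio(30.0, -26.75))   # ~5e22: Sun vs. the faintest objects seen, "ten billion trillion"
```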

Looking back to the numbers above, one sees that the Sun (V = -26.75) has an apparent visual brightness 25 magnitudes greater than Sirius. That is five factors of five magnitudes, or 100 × 100 × 100 × 100 × 100: ten billion! And the difference in apparent brightness between the Sun and the faintest object humans have ever seen (using the Hubble Space Telescope) is more than 56 magnitudes, or a factor of ten billion trillion.

Magnitudes in modern astronomy

In the 140 years since Pogson created the modern magnitudes, astronomers have developed many different brightness systems, and the most popular are less than 50 years old.

For example, in 1953, American astronomer Harold Lester Johnson (1921–1980) created the UBV system of brightness measurements. V, as noted above, is measured over that portion of the electromagnetic spectrum to which human eyes are most sensitive. B is the star's brightness in magnitudes measured in the blue part of the spectrum, while U is the brightness in the ultraviolet, a spectral region human eyes cannot detect. There are many other brightness measurement systems in use, so many that astronomers often disagree on how measurements in one system should be converted to another.

KEY TERMS

Absolute magnitude: The apparent brightness of a star, measured in units of magnitudes, at a fixed distance of 10 parsecs.

Apparent magnitude: The brightness of a star, measured in units of magnitudes, in the visual part of the electromagnetic spectrum, the region to which human eyes are most sensitive.

Brightness: The amount of energy a star radiates in a certain portion of the electromagnetic spectrum in a given amount of time, often measured in units of magnitudes.

Magnitude: The unit used in astronomy to measure the brightness of a star. One magnitude corresponds to a factor of 2.512 in brightness, so that five magnitudes equals a factor of 100.

Accurate measurement of stellar brightness is important because subtracting the brightness in one part of the spectrum from the brightness in another part reveals important information about the star. For many stars the quantity B-V gives a good approximation of the star's temperature. And it was established in 1978 that the quantity V-R, where R is the brightness in the red part of the spectrum, can be used to estimate a star's radius. This is important, because advances in scientific understanding of the stars require knowledge of basic parameters like temperature and radius, and careful measurements of brightness can provide some of that information.
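As a concrete illustration of the color indices described above, the sketch below computes B-V and V-R from a star's magnitudes and converts B-V to a rough surface temperature. The sample magnitudes are hypothetical values for a roughly Sun-like star, and the temperature formula is not from this article: it is one published empirical approximation (Ballesteros, 2012), used here only as an example of how B-V relates to temperature.

```python
def color_index(mag_short, mag_long):
    # A color index is simply the difference of two magnitudes, e.g. B - V or V - R.
    return mag_short - mag_long

def temperature_from_bv(b_minus_v):
    # Rough effective temperature in kelvin from B - V.
    # Empirical fit from Ballesteros (2012); an assumption, not the article's method.
    return 4600.0 * (1.0 / (0.92 * b_minus_v + 1.7) + 1.0 / (0.92 * b_minus_v + 0.62))

# Hypothetical B, V, R magnitudes for a roughly Sun-like star.
B, V, R = 5.5, 4.85, 4.5
print(color_index(B, V))                              # B - V = 0.65
print(round(temperature_from_bv(color_index(B, V))))  # ~5800 K, close to the Sun's temperature
print(color_index(V, R))                              # V - R, used to estimate radius (see text)
```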

See also Spectral classification of stars.

Resources

BOOKS

Arny, Thomas. Explorations: An Introduction to Astronomy. Boston, MA: McGraw-Hill, 2006.

Chaisson, Eric. Astronomy: A Beginner's Guide to the Universe. Upper Saddle River, NJ: Pearson/Prentice Hall, 2004.

Krumenaker, Larry, ed. The Characteristics and the Life Cycle of Stars: An Anthology of Current Thought. New York: Rosen Publishing Group, 2006.

Kundt, Wolfgang. Astrophysics: A New Approach. Berlin and New York: Springer, 2005.

Sherrod, P. Clay. A Complete Manual of Amateur Astronomy. New York: Dover, 2003.

Zeilik, Michael. Astronomy: The Evolving Universe. Cambridge and New York: Cambridge University Press, 2002.

Jeffrey C. Hall

Stellar Magnitudes


Magnitude is the unit used in astronomy to describe a star's brightness in a particular portion of the electromagnetic spectrum. Stars emit different amounts of radiation in different regions of the spectrum, so a star's brightness will differ from one part of the spectrum to the next. An important field of research in modern astronomy is the accurate measurement of stellar brightness in magnitudes in different parts of the spectrum.


How bright it looks: apparent magnitude

The Greek astronomer Hipparchus devised the first magnitudes in the second century B.C. He classified stars according to how bright they looked to the eye: the brightest stars he called "1st class" stars, the next brightest "2nd class," and so on down to "6th class." In this way all the stars visible to the ancient Greeks were neatly classified into six categories.

Modern astronomers still use Hipparchus' categories, though in considerably refined form. With modern instruments astronomers measure a quantity called V, the star's brightness in the visual portion of the spectrum. Since visual light is what our eyes most readily detect, V is analogous to Hipparchus' classes. For example, Hipparchus listed Aldebaran, the brightest star in the constellation Taurus (the Bull), as a 1st class star, while today we know that for Aldebaran V = 0.85. Astronomers often refer to a star's visual brightness as its apparent magnitude, which makes sense since this describes how bright the star appears to the eye (or the telescope). You will hear an astronomer say that Aldebaran has an apparent magnitude of 0.85.

Hipparchus' scheme defined from the outset one of the quirks of magnitudes: they run backwards. The fainter the star, the larger the number describing its magnitude. Therefore, the Sun, the brightest object in the sky, has an apparent magnitude of -26.75, while Sirius, the brightest star in the sky other than the Sun and visible on cold winter nights in the constellation Canis Major (the Big Dog), has an apparent magnitude of -1.45. The faintest star you can see without optical aid is about +5 (or +6 if your eyes are very sharp), and the faintest objects visible to the most powerful telescope on Earth have an apparent magnitude of about +30.


How bright it really is: absolute magnitude

More revealing than apparent magnitude is absolute magnitude, which is the apparent magnitude a star would have if it were ten parsecs from the Earth (a parsec is a unit of distance equal to about 19 trillion mi [31 trillion km]). This is important because apparent magnitude can be deceiving. You know that a lit match is not as bright as a streetlight, but if you hold the match next to your eye, it will appear brighter than a streetlight six blocks away. That's why V is called apparent magnitude: it is only how bright the star appears to be. For example, the Sun is a fainter star than Sirius! Sirius emits far more energy than the Sun does, yet the Sun appears brighter to us because it is so much closer. Absolute magnitude, however, reveals the truth: the Sun has an absolute magnitude of +4.8, while Sirius is +1.4 (remember, smaller numbers mean brighter, not fainter).


The nature of the magnitude scale

In 1856, the British scientist N. R. Pogson noticed that Hipparchus' 6th class stars were roughly 100 times fainter than his 1st class stars. Pogson did the sensible thing: he redefined the stars' V brightness so that a difference of five magnitudes was exactly a factor of 100 in brightness. This meant that a star with V = 1.00 appeared to be precisely 100 times brighter than a star with V = 6.00. One magnitude is then a factor of about 2.512 in brightness. Try it: enter 1 on a calculator and multiply it by 2.512 five times; you've just gone from first magnitude to sixth (or eighth to thirteenth, or minus seventh to minus second, or any other combination).

Looking back to the numbers above, we see that the Sun (V = -26.75) has an apparent visual brightness 25 magnitudes greater than Sirius. That's five factors of five magnitudes, or 100 × 100 × 100 × 100 × 100: ten billion! And the difference in apparent brightness between the Sun and the faintest object humans have ever seen (using the Hubble Space Telescope) is more than 56 magnitudes, or a factor of ten billion trillion.


Magnitudes in modern astronomy

In the 140 years since Pogson created the modern magnitudes, astronomers have developed many different brightness systems, and the most popular are less than 50 years old.

For example, in 1953, H. L. Johnson created the UBV system of brightness measurements. We've already met V, which is measured over that portion of the electromagnetic spectrum to which our eyes are most sensitive. B is the star's brightness in magnitudes measured in the blue part of the spectrum, while U is the brightness in the ultraviolet—a spectral region our eyes cannot detect. There are many other brightness measurement systems in use, so many that astronomers often disagree on how measurements in one system should be converted to another.

Accurate measurement of stellar brightness is important because subtracting the brightness in one part of the spectrum from the brightness in another part reveals important information about the star. For many stars the quantity B-V gives a good approximation of the star's temperature. And it was established in 1978 that the quantity V-R, where R is the brightness in the red part of the spectrum, can be used to estimate a star's radius. This is important, because advances in our understanding of the stars require knowledge of basic parameters like temperature and radius, and careful measurements of brightness can provide some of that information.

See also Spectral classification of stars.

Resources

books

Introduction to Astronomy and Astrophysics. 4th ed. New York: Harcourt Brace, 1997.

Kaufmann, William. Discovering the Universe. New York: W. H. Freeman, 1990.

Mitton, Simon P., ed. The Cambridge Encyclopedia of Astronomy. Cambridge: Cambridge University Press, 1977.

Sherrod, P. Clay. A Complete Manual of Amateur Astronomy. New York: Dover, 2003.


Jeffrey C. Hall

KEY TERMS

Absolute magnitude

—The apparent brightness of a star, measured in units of magnitudes, at a fixed distance of 10 parsecs.

Apparent magnitude

—The brightness of a star, measured in units of magnitudes, in the visual part of the electromagnetic spectrum, the region to which our eyes are most sensitive.

Brightness

—The amount of energy a star radiates in a certain portion of the electromagnetic spectrum in a given amount of time, often measured in units of magnitudes.

Magnitude

—The unit used in astronomy to measure the brightness of a star. One magnitude corresponds to a factor of 2.512 in brightness, so that five magnitudes = a factor of 100.

Stellar Magnitudes


Of principal importance to general astronomical observation is the observable brightness of the stars. Magnitude is the unit used in astronomy to describe a star's brightness. Although stellar magnitude in the visible spectrum dictates which stars can be observed under particular visible-light conditions (variable due to time of observation, moon phase, atmospheric conditions, and the amount of light pollution present), magnitude also describes the relative amount of electromagnetic radiation observable in other regions of the electromagnetic spectrum (e.g., the x-ray region of the spectrum).

Stars emit different amounts of radiation in different regions of the spectrum, so a star's "brightness" or magnitude will differ from one part of the spectrum to the next. An important field of research in modern astronomy is the accurate measurement of stellar brightness in magnitudes in different parts of the spectrum.

The Greek astronomer Hipparchus devised the first magnitudes in the second century B.C. He classified stars according to how bright they looked to the eye: the brightest stars he called "1st class" stars, the next brightest "2nd class," and so on down to "6th class." In this way, all the stars visible to the ancient Greeks were neatly classified into six categories.

Modern astronomers still use Hipparchus' categories, though in considerably refined form. With modern instruments astronomers measure a quantity called V, the star's brightness in the visual portion of the spectrum. Since visual light is what our eyes detect, V is analogous to Hipparchus' classes. For example, Hipparchus listed Aldebaran, the brightest star in the constellation Taurus, as a 1st class star, and modern astronomers measure Aldebaran's V at 0.85. Astronomers often refer to a star's visual brightness as its apparent magnitude, a description of how bright the star appears to the eye (or the telescope).

Hipparchus' scheme defined from the outset one of the quirks of magnitudes: they list magnitude inversely. The fainter the star, the larger the number describing its magnitude. Therefore, the Sun, the brightest object in the sky, has an apparent magnitude of -26.75, while Sirius, the brightest star in the sky other than the Sun and visible on cold winter nights in the constellation Canis Major, has an apparent magnitude of -1.45. The faintest star you can see without optical aid is about +5 or +6 in a very dark sky with little light pollution. The faintest objects visible to the most powerful telescopes on Earth have an apparent magnitude of about +30.

More revealing than apparent magnitude is absolute magnitude, the apparent magnitude a star would have if it were ten parsecs from the Earth (a parsec is a unit of distance equal to about 19 trillion mi [31 trillion km]). This is important because apparent magnitude can be deceiving. You know that a penlight is not as bright as a streetlight, but if you hold the penlight near your eye, it will appear brighter than a streetlight six blocks away. That's why V is called apparent magnitude: it is only how bright the star appears to be. For example, the Sun is a fainter star than Sirius; Sirius emits far more energy than the Sun does, yet the Sun appears brighter because it is so much closer. The Sun has an absolute magnitude of +4.8, while Sirius is +1.4.

In 1856, the British scientist N. R. Pogson noticed that Hipparchus' 6th class stars were roughly 100 times fainter than his 1st class stars. Pogson redefined the stars' V brightness so that a difference of five magnitudes was exactly a factor of 100 in brightness. This meant that a star with V = 1.00 appeared to be precisely 100 times brighter than a star with V = 6.00. One magnitude is then a factor of about 2.512 in brightness.

The Sun (V = -26.75) has an apparent visual brightness 25 magnitudes greater than Sirius. The difference in apparent brightness between the Sun and the faintest object humans have ever observed (using the Hubble Space Telescope) is more than 56 magnitudes.

In the 140 years since Pogson created the modern magnitudes, astronomers have developed many different brightness systems. In 1953, H. L. Johnson created the UBV system of brightness measurements. B is the star's brightness in magnitudes measured in the blue part of the spectrum, while U is the brightness in the ultraviolet spectral region. There are many other brightness measurement systems in use.

Accurate measurement of stellar brightness is important because subtracting the brightness in one part of the spectrum from the brightness in another part reveals important information about the star. For many stars the quantity B-V gives a good approximation of the star's temperature. It was established in 1978 that the quantity V-R, where R is the brightness in the red part of the spectrum, can be used to estimate a star's radius.

See also Cosmology; Quantum electrodynamics (QED); Seeing.