The apparent magnitude (m) of a celestial object is a number that measures its brightness as seen by an observer on Earth, adjusted to the value it would have in the absence of the atmosphere. The brighter an object appears, the lower its magnitude value (an inverse relation). The Sun, at apparent magnitude -26.7, is the brightest object in the sky. The magnitude scale is logarithmic: a difference of 1 in magnitude corresponds to a change in brightness by a factor of the fifth root of 100, or about 2.512.
The measurement of apparent magnitudes or brightnesses of celestial objects is known as photometry. Apparent magnitudes are used to quantify the brightness of sources at ultraviolet, visible, and infrared wavelengths. An apparent magnitude is usually measured in a specific passband corresponding to some photometric system such as the UBV system. In standard astronomical notation, an apparent magnitude in the V ("visual") filter band would be denoted either as mV or often simply as V, as in "mV = 15" or "V = 15" to describe a 15th-magnitude object.
History
The scale used to indicate magnitude originates in the Hellenistic practice of dividing stars visible to the naked eye into six magnitudes. The brightest stars in the night sky were said to be of first magnitude (m = 1), whereas the faintest were of sixth magnitude (m = 6), which is the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered twice the brightness of the following grade (a logarithmic scale), although that ratio was subjective as no photodetectors existed. This rather crude scale for the brightness of stars was popularized by Ptolemy in his Almagest and is generally believed to have originated with Hipparchus.
In 1856, Norman Robert Pogson formalized the system by defining a first magnitude star as a star that is 100 times as bright as a sixth-magnitude star, thereby establishing the logarithmic scale still in use today. This implies that a star of magnitude m is about 2.512 times as bright as a star of magnitude m + 1. This figure, the fifth root of 100, became known as Pogson's Ratio. The zero point of Pogson's scale was originally defined by assigning Polaris a magnitude of exactly 2. Astronomers later discovered that Polaris is slightly variable, so they switched to Vega as the standard reference star, assigning the brightness of Vega as the definition of zero magnitude at any specified wavelength.
Apart from small corrections, the brightness of Vega still serves as the definition of zero magnitude for visible and near infrared wavelengths, where its spectral energy distribution (SED) closely approximates that of a black body at a temperature of about 11,000 K. However, with the advent of infrared astronomy it was revealed that Vega's radiation includes an infrared excess, presumably due to a circumstellar disk consisting of dust at warm temperatures (but much cooler than the star's surface). At shorter (e.g. visible) wavelengths, there is negligible emission from dust at these temperatures. In order to properly extend the magnitude scale further into the infrared, however, this peculiarity of Vega should not affect the definition of the magnitude scale. Therefore, the magnitude scale was extrapolated to all wavelengths on the basis of the black body radiation curve for an ideal stellar surface at 11,000 K uncontaminated by circumstellar radiation. On this basis the spectral irradiance (usually expressed in janskys) for the zero magnitude point, as a function of wavelength, can be computed. Small deviations are specified between systems using measurement apparatuses developed independently so that data obtained by different astronomers can be properly compared, but of greater practical importance is the definition of magnitude not at a single wavelength but applying to the response of standard spectral filters used in photometry over various wavelength bands.
With the modern magnitude systems, brightness over a very wide range is specified according to the logarithmic definition detailed below, using this zero reference. In practice such apparent magnitudes do not exceed 30 (for detectable measurements). The brightness of Vega is exceeded by four stars in the night sky at visible wavelengths (and more at infrared wavelengths) as well as the bright planets Venus, Mars, and Jupiter, and these must be described by negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has an apparent magnitude of -1.4 in the visible. Negative magnitudes for other very bright astronomical objects can be found in the table below.
Astronomers have developed other photometric zeropoint systems as alternatives to the Vega system. The most widely used is the AB magnitude system, in which photometric zeropoints are based on a hypothetical reference spectrum having constant flux per unit frequency interval, rather than using a stellar spectrum or blackbody curve as the reference. The AB magnitude zeropoint is defined such that an object's AB and Vega-based magnitudes will be approximately equal in the V filter band.
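In equation form (a standard way to write this convention, not stated explicitly above), the AB magnitude of a source with flux density fν per unit frequency can be expressed as

\[ m_{\mathrm{AB}} = -2.5 \log_{10}\!\left(\frac{f_\nu}{3631\ \mathrm{Jy}}\right), \]

so a hypothetical source with a constant flux density of 3631 Jy at all frequencies has mAB = 0 in every band.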
Calculations
As the amount of light actually received by a telescope is reduced by transmission through the Earth's atmosphere, any measurement of apparent magnitude is corrected for what it would have been as seen from above the atmosphere. The dimmer an object appears, the higher the numerical value given to its apparent magnitude, with a difference of 5 magnitudes corresponding to a brightness factor of exactly 100. Therefore, the apparent magnitude m, in the spectral band x, would be given by

\[ m_x = -5 \log_{100}\!\left(\frac{F_x}{F_{x,0}}\right), \]

which is more commonly expressed in terms of common (base-10) logarithms as

\[ m_x = -2.5 \log_{10}\!\left(\frac{F_x}{F_{x,0}}\right), \]

where Fx is the observed flux density using spectral filter x, and Fx,0 is the reference flux (zero-point) for that photometric filter. Since an increase of 5 magnitudes corresponds to a decrease in brightness by a factor of exactly 100, each magnitude increase implies a decrease in brightness by the factor of the fifth root of 100, approximately 2.512 (Pogson's ratio). Inverting the above formula, a magnitude difference m1 - m2 = Δm implies a brightness factor of

\[ \frac{F_2}{F_1} = 100^{\Delta m / 5} = 10^{0.4\,\Delta m} \approx 2.512^{\Delta m}. \]
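As an illustration of these relations, the following Python sketch (the function names and example values are chosen here for illustration only) converts an observed flux into a magnitude and a magnitude difference back into a brightness factor:

```python
import math

def apparent_magnitude(flux, flux_zero_point):
    """Apparent magnitude from an observed flux density and the zero-point
    flux of the same photometric band: m_x = -2.5 * log10(F_x / F_x0)."""
    return -2.5 * math.log10(flux / flux_zero_point)

def brightness_factor(delta_m):
    """Brightness ratio implied by a magnitude difference delta_m = m1 - m2."""
    return 100 ** (delta_m / 5)          # equivalently 10 ** (0.4 * delta_m)

# A source 100 times fainter than the zero-point reference is 5 magnitudes fainter:
print(apparent_magnitude(1.0, 100.0))    # 5.0
# One magnitude corresponds to Pogson's ratio, the fifth root of 100:
print(brightness_factor(1.0))            # ~2.512
```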
Example: Sun and Moon
What is the ratio in brightness between the Sun and the full Moon?
The apparent magnitude of the Sun is -26.74 (brighter), and the mean apparent magnitude of the full Moon is -12.74 (dimmer).

Difference in magnitude:

\[ x = m_1 - m_2 = (-12.74) - (-26.74) = 14.00 \]

Brightness factor:

\[ v_b = 2.512^{x} = 2.512^{14.00} \approx 398{,}107 \]

The Sun appears about 400,000 times brighter than the full Moon.
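A quick numerical check of this example, reusing the relation above (a minimal sketch, not part of the original text):

```python
delta_m = -12.74 - (-26.74)      # magnitude difference between the Moon and the Sun
ratio = 100 ** (delta_m / 5)     # brightness factor
print(round(ratio))              # ~398107, i.e. roughly 400,000
```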
Magnitude addition
Sometimes one might wish to add brightnesses. For example, photometry on closely separated double stars may only be able to produce a measurement of their combined light output. How would we reckon the combined magnitude of that double star knowing only the magnitudes of the individual components? This can be done by adding the brightnesses (in linear units) corresponding to each magnitude:

\[ 10^{-0.4\,m_f} = 10^{-0.4\,m_1} + 10^{-0.4\,m_2}. \]

Solving for mf yields

\[ m_f = -2.5 \log_{10}\!\left(10^{-0.4\,m_1} + 10^{-0.4\,m_2}\right), \]

where mf is the resulting magnitude after adding the brightnesses referred to by m1 and m2.
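A small Python sketch of this combination rule (the function name is illustrative only):

```python
import math

def combined_magnitude(m1, m2):
    """Magnitude of the summed light of two sources with magnitudes m1 and m2."""
    return -2.5 * math.log10(10 ** (-0.4 * m1) + 10 ** (-0.4 * m2))

# Two equal components of magnitude 3.0 together appear about 0.75 mag brighter:
print(combined_magnitude(3.0, 3.0))   # ~2.25
```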
Absolute magnitude
Since flux decreases with distance according to the inverse-square law, a particular apparent magnitude could equally well refer to a star at one distance, or a star four times brighter at twice that distance, and so on. When one is not interested in the brightness as viewed from Earth but in the intrinsic brightness of an astronomical object, one refers not to the apparent magnitude but to the absolute magnitude. The absolute magnitude, M, of a star or astronomical object is defined as the apparent magnitude it would have as seen from a distance of 10 parsecs (about 32.6 light-years). The absolute magnitude of the Sun is 4.83 in the V band (yellow) and 5.48 in the B band (blue). In the case of a planet or asteroid, the absolute magnitude H instead means the apparent magnitude it would have if it were 1 astronomical unit from both the observer and the Sun, and fully illuminated (a configuration that is only theoretically achievable, with the observer situated on the surface of the Sun).
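The definition can be turned into a short calculation. The following Python sketch (the function name and the conversion of 1 AU to parsecs are assumptions for illustration, not taken from the text above) recovers the Sun's absolute V magnitude from its apparent magnitude and distance:

```python
import math

def absolute_magnitude(apparent_m, distance_pc):
    """Absolute magnitude of an object with apparent magnitude apparent_m
    at a distance of distance_pc parsecs: M = m - 5 * log10(d / 10 pc)."""
    return apparent_m - 5 * math.log10(distance_pc / 10)

# The Sun (m_V = -26.74) at 1 AU, about 4.848e-6 pc, gives M_V of roughly 4.83:
print(absolute_magnitude(-26.74, 4.848e-6))
```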
Standard reference values
The scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is approximately 19.05.
A common misconception is that the logarithmic nature of the scale is because the human eye itself has a logarithmic response. In Pogson's time this was thought to be true (see Weber-Fechner law), but it is now believed that the response is a power law (see Stevens' power law).
Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range in daylight). The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the light-adapted human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, more or less the same as visual magnitude.
Because cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, because they emit extremely little visible light but are strongest in the infrared.
Measures of magnitude need cautious treatment, and it is extremely important to measure like with like. On early 20th century and older orthochromatic (blue-sensitive) photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant Betelgeuse (an irregular variable, at maximum) are reversed compared to what human eyes perceive, because this archaic film is more sensitive to blue light than it is to red light. Magnitudes obtained from this method are known as photographic magnitudes and are now considered obsolete.
For objects within the Milky Way with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object. For objects at very great distances (far beyond the Milky Way), this relationship must be adjusted for redshifts and for non-Euclidean distance measures due to general relativity.
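Written out (a standard restatement of the inverse-square law, not given explicitly above), for a distance d in parsecs:

\[ m - M = 5 \log_{10}\!\left(\frac{d}{10\ \text{pc}}\right), \]

so multiplying d by 10 adds exactly 5 to the apparent magnitude for a fixed absolute magnitude M.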
For planets and other Solar System bodies, the apparent magnitude is derived from the body's phase curve and its distances from the Sun and the observer.
Table of notable celestial objects
Some of the above magnitudes are only approximate. Telescope sensitivity also depends on observing time, optical bandpass, and interfering light from scattering and airglow.
External links
- "The astronomical magnitude scale". International Comet Quarterly.