Difference Between Absolute and Apparent Magnitude

Astronomical objects have fascinated the human race and captured the imagination of the most brilliant minds on Earth for thousands of years. The night sky was among the first natural wonders to be closely analysed by the human mind. In their investigations, ancient astronomers needed tools for evaluating their observations that are not commonly used on more earthly problems.

One such tool is the concept of magnitude, which the Greek astronomer Hipparchus introduced more than two thousand years ago. His apparent magnitude scale was based on pure observation: he classified stars by how bright they appear in the sky. Modern astronomers use a more mathematical approach, but the concept has remained essentially unchanged over two millennia.

What is Apparent Magnitude?

Apparent magnitude is defined as the brightness of a celestial object as measured by an observer on Earth, in the absence of the atmosphere. The scale is inverted: the lower the brightness, the higher the magnitude, and the higher the brightness, the lower the magnitude. For example, Sirius, the brightest star in the night sky in the visible spectrum, has an apparent magnitude of about −1.4, while Charon, the moon of Pluto, reaches an apparent magnitude of only 15.55 at its brightest.

Apparent magnitude is a measure of the intensity of light received from a specific object in the sky. However, it does not measure the intrinsic brightness of the object: the amount of light received by an observer on Earth depends on both the distance to the object and its actual luminosity.
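The magnitude scale is logarithmic in this received intensity: a difference of 5 magnitudes corresponds to exactly a factor of 100 in flux, so each magnitude step is a factor of 100^(1/5) ≈ 2.512. A minimal sketch in Python (the function name is illustrative, and the Sirius and Charon figures are the rounded values quoted above):

```python
def flux_ratio(m1, m2):
    """Ratio of received fluxes F1/F2 for two apparent magnitudes.

    A difference of 5 magnitudes corresponds to exactly a factor
    of 100 in flux, so each magnitude is a factor of 100 ** (1/5),
    roughly 2.512.
    """
    return 10 ** (-0.4 * (m1 - m2))

# A 5-magnitude difference is exactly a factor of 100:
print(flux_ratio(0.0, 5.0))  # → 100.0

# Sirius (m ≈ -1.4) versus Charon at its brightest (m ≈ 15.55):
# the flux ratio comes out in the millions.
print(flux_ratio(-1.4, 15.55))
```

The negative sign in the exponent encodes the inverted scale: the object with the lower magnitude delivers the greater flux.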

Also, the apparent magnitude of a celestial body may differ depending on the region of the electromagnetic spectrum in which it is observed. The apparent magnitude of an object measured in the infrared band differs from the value measured in visible light. However, the concept is mainly used for observations in the visible region of the spectrum.

What is Absolute Magnitude?

Absolute magnitude is defined as the apparent magnitude a star would have at a distance of 10 parsecs (32.6 light years). It is a measure of the intrinsic brightness of the celestial body.

Comparing the magnitudes of astronomical bodies at a fixed distance allows astronomers to set aside interstellar extinction and the varying distances to the bodies, and to consider only the amount of light actually emitted by each body.
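The conversion between the two magnitudes is the distance-modulus relation, m − M = 5 log₁₀(d / 10 pc). A hedged sketch in Python (the function name and the rounded Sirius figures are illustrative, and extinction is ignored, as in the definition above):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude via the distance modulus,
    M = m - 5 * log10(d / 10 pc), ignoring extinction."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# At exactly 10 parsecs, apparent and absolute magnitude coincide:
print(absolute_magnitude(4.83, 10.0))  # → 4.83

# Sirius: m ≈ -1.46 at d ≈ 2.64 pc gives an absolute magnitude
# of about +1.4 — far less impressive than its apparent brilliance,
# which it owes mostly to being one of our nearest neighbours.
print(absolute_magnitude(-1.46, 2.64))
```

The example makes the point of the definition concrete: once distance is factored out, Sirius is only a moderately luminous star.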

What is the difference between Absolute and Apparent Magnitude?

• Apparent magnitude is the brightness of an astronomical body as seen from Earth, while absolute magnitude is the apparent magnitude the body would have if seen from a distance of 10 parsecs (32.6 light years).

• Absolute magnitude is an intrinsic measure of a body's brightness, but apparent magnitude is not.
