
What is Apparent Magnitude (m)?

When we look at the stars in the sky, some seem very bright while others are just bright enough to be visible.

How bright a star appears to be is known as its apparent magnitude (m).

The scale originally devised by Hipparchus (c. 190 – 120 BC) ran from 1 to 6.

The brightest stars were m = 1 and the faintest, just barely visible with the naked eye, were m = 6. A change of one magnitude corresponded to a change in brightness of roughly a factor of two.

This was hardly very scientific.

In 1856 Pogson made things more formal by defining a magnitude 1 star to be exactly 100 times brighter than a magnitude 6 star.

Each step of one magnitude therefore corresponds to a brightness ratio of the fifth root of 100 ≈ 2.512, known as Pogson's ratio.

So if star A has an apparent magnitude of 5 and star B has an apparent magnitude of 4, then B appears about 2.512 times brighter than A (remember: the lower the magnitude, the brighter the star).
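Pogson's relation is easy to check numerically. This short sketch (the function name is mine, not from the original) computes how many times brighter one star appears than another from their magnitude difference:

```python
def brightness_ratio(m_faint, m_bright):
    """Return how many times brighter the object with magnitude m_bright
    appears compared with the one with magnitude m_faint.

    Pogson's relation: a difference of 5 magnitudes = a factor of 100,
    so ratio = 100 ** (delta_m / 5).
    """
    return 100 ** ((m_faint - m_bright) / 5)


# One magnitude step, e.g. star A (m = 5) vs star B (m = 4):
print(round(brightness_ratio(5, 4), 3))   # → 2.512

# Five magnitude steps recover the defining factor of 100:
print(brightness_ratio(6, 1))             # → 100.0
```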

We also need a reference star, a star of known brightness against which all the others can be compared. Various stars have been used for this, including Polaris and Vega.

Here are some apparent magnitudes.

The Sun: −26.73
The full Moon: −12.6
Venus at maximum brightness: −4.4
Vega: 0
Brightest stars you can see in Middlesbrough: 3
Faintest stars seen with the Hubble Space Telescope: 30
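These values span an enormous range of brightness. Using Pogson's relation, a sketch of the ratio between the brightest and faintest entries in the list:

```python
def brightness_ratio(m_faint, m_bright):
    """How many times brighter the m_bright object appears (Pogson's relation)."""
    return 100 ** ((m_faint - m_bright) / 5)


# The Sun (m = -26.73) versus the faintest stars Hubble can see (m = 30):
# a difference of 56.73 magnitudes, i.e. roughly 5 x 10^22 times brighter.
print(f"{brightness_ratio(30, -26.73):.2e}")
```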