Calculating Magnitude

You might be wondering what exactly the number you read off for a magnitude tells you about the relative brightness of different objects. Magnitudes are built on what’s known as a logarithmic scale, which allows us to compare objects with vastly different brightnesses without using incredibly large numbers. The scale is defined so that a difference of 5 magnitudes corresponds to exactly a factor of 100 in brightness, which means an increase of 1 in magnitude corresponds to a decrease in brightness by a factor of the fifth root of 100, or about 2.5. In other words, an object with a magnitude of 5 is about 2.5 times fainter than an object with a magnitude of 4.
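As a quick sanity check on that arithmetic, here is a minimal Python sketch (the variable name is purely illustrative):

    # A difference of 5 magnitudes is defined as exactly a factor of 100 in
    # brightness, so 1 magnitude corresponds to the fifth root of 100.
    factor_per_magnitude = 100 ** (1 / 5)
    print(factor_per_magnitude)        # about 2.512

    # Going 5 magnitudes at once recovers the full factor of 100.
    print(factor_per_magnitude ** 5)   # 100.0, up to floating-point rounding
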
The physical property that magnitude actually measures is radiant flux, the amount of light energy that arrives at a given area on Earth in a given amount of time, per unit of frequency. This radiant flux, abbreviated F (in units of ergs/s/cm^2/Hz), relates to magnitude, m, in the following way:

m = -2.5 x log10 (F / F_Vega)

The star Vega, in the northern hemisphere constellation Lyra, is used as the standard for the magnitude system, so F_Vega means the amount of light arriving at Earth from Vega in the same amount of time, measured through the same filter. This definition means that Vega’s magnitude is set to zero through all filters.
This does not mean that Vega looks the same through all filters; it just means that astronomers have agreed to use Vega as the zero point for the magnitude scale, much like the freezing point of water is used as the zero point for the Celsius temperature scale.
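As a concrete sketch of this definition in code, a small Python function might look like the following (the function name and the example flux values are purely illustrative, and F and F_Vega are assumed to be measured through the same filter):

    import math

    def apparent_magnitude(flux, flux_vega):
        # m = -2.5 x log10(F / F_Vega), with Vega as the zero point.
        return -2.5 * math.log10(flux / flux_vega)

    # An object delivering one tenth of Vega's flux is 2.5 magnitudes fainter,
    # and one delivering a hundredth of Vega's flux is 5 magnitudes fainter.
    print(apparent_magnitude(0.1, 1.0))    # 2.5
    print(apparent_magnitude(0.01, 1.0))   # 5.0
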
A variation of the equation above can be used to relate the difference in magnitudes between any two objects to their flux ratio, F1/F2:

m1 - m2 = -2.5 x log10 (F1/F2)

The sun, which is about 14 magnitudes brighter than the full moon in apparent magnitude, is actually almost 400,000 times brighter if you compare the intensity of their light directly (this is probably not surprising, since we can safely look at the moon but not at the sun). So now you can appreciate how magnitudes help us compare objects in the sky with extremely different brightnesses without having to use enormous numbers.
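You can verify that figure by turning the equation above around to get the flux ratio from the magnitude difference. The short Python sketch below does exactly that, using commonly quoted approximate apparent magnitudes for the sun and the full moon purely as an illustration:

    def flux_ratio(m1, m2):
        # Invert m1 - m2 = -2.5 x log10(F1/F2) to get F1/F2 = 10^(-0.4 x (m1 - m2)).
        return 10 ** (-0.4 * (m1 - m2))

    # Approximate apparent magnitudes: sun about -26.7, full moon about -12.7,
    # a difference of about 14 magnitudes.
    print(flux_ratio(-26.7, -12.7))   # roughly 398,000, i.e. nearly 400,000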