Apparent magnitude refers to the brightness of stars as seen from Earth. Hipparchus, a Greek astronomer of the second century B.C., devised a scale of stellar magnitudes for comparing the brightness of stars and catalogued about 1,000 stars. He called the brightest stars "first magnitude stars." The faintest stars visible to the naked eye he called "sixth magnitude stars."
The luminosity, or intrinsic brightness, of a star depends upon its temperature and size. Its apparent brightness, or visual magnitude, depends not only upon its luminosity but also upon its distance from Earth. Because some stars are brighter than first magnitude, the scale has been extended to include zero and negative magnitudes. Sirius, the brightest star in the night sky, has been assigned a magnitude of –1.4. A zero-magnitude star is 100 times brighter than a fifth-magnitude star.
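The modern scale is logarithmic: a difference of five magnitudes is defined as a brightness factor of exactly 100. As a minimal sketch of that relation (the function name brightness_ratio is illustrative, not from the text), the magnitude difference between two stars can be converted into a brightness ratio as follows:

    # A difference of 5 magnitudes corresponds to a brightness factor of 100,
    # so the ratio between stars of apparent magnitudes m1 and m2 is 100 ** ((m2 - m1) / 5).
    def brightness_ratio(m1, m2):
        """Return how many times brighter a star of magnitude m1 is than one of m2."""
        return 100 ** ((m2 - m1) / 5)

    print(brightness_ratio(0.0, 5.0))   # 100.0   (zero vs. fifth magnitude)
    print(brightness_ratio(-1.4, 0.0))  # about 3.6 (Sirius vs. a zero-magnitude star)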
The full Moon has a mean apparent magnitude of –12.74 and the Sun has an apparent magnitude of –26.74.
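By the same relation, the Sun outshines the full Moon by 26.74 – 12.74 = 14.0 magnitudes, a brightness factor of 100^(14/5), or roughly 400,000.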
The scale was extended beyond the sixth magnitude to take in the numerous stars seen only with the aid of a telescope. Certain stars in the North Polar Sequence are used as a standard for finding the magnitude of a star. The comparison is made on photographic plates. For greater accuracy, a photoelectric cell is used to measure a star's light, which can then be compared to the light from the standard star.
Absolute magnitude is another scale used in astronomy. It refers to the total amount of light radiated by a star, without reference to the amount received on Earth. Absolute magnitude is the apparent magnitude the star would have if it were a fixed distance (10 parsecs) away. On the absolute scale, Sirius has a magnitude of about +1.4.
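The conversion between the two scales follows from the distance: the absolute magnitude M equals the apparent magnitude m minus 5 log10(d / 10 pc). As a rough check, the sketch below (assuming a distance of about 2.64 parsecs for Sirius, a value not given in the text, and the illustrative function name absolute_magnitude) recovers its absolute magnitude of about +1.4:

    import math

    def absolute_magnitude(apparent_mag, distance_pc):
        """Apparent magnitude rescaled to a standard distance of 10 parsecs."""
        return apparent_mag - 5 * math.log10(distance_pc / 10)

    # Assumed values: Sirius at apparent magnitude -1.46, about 2.64 parsecs away.
    print(round(absolute_magnitude(-1.46, 2.64), 1))  # about 1.4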