One of the main stumbling points for new photographers is the seemingly random series of numbers we have come to know as the f-stop scale or aperture scale. Things start out innocently enough: f/1, f/1.4 (just add 0.4 every time, right?), but they get ugly quickly: f/2, f/2.8, f/4. Why would anyone invent such an arbitrary scale?
To answer, we must go back to the second century BC. It was during this period that a Greek astronomer named Hipparchus developed the first system for organizing stars by their apparent brightness. He ranked stars on a scale from 1 to 6 based on the brightness he observed. Centuries later, when astronomers developed methods to quantify the actual brightness of each star, they noticed something strange. A category one star was not six times brighter than a category six star; it was 100 times brighter. Since five steps separate category one from category six, every step on the apparent brightness scale yielded an actual brightness increase of about 2.5x (the fifth root of 100).
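The arithmetic behind that observation is easy to check. A quick sketch (five steps spanning a 100x range):

```python
# Hipparchus's scale runs from magnitude 1 to magnitude 6,
# so five equal steps must multiply out to a 100x brightness range.
steps = 6 - 1
ratio = 100 ** (1 / steps)  # brightness ratio per step
print(round(ratio, 3))  # ~2.512
```

This per-step ratio of about 2.512 is the same one modern astronomers later formalized for the magnitude scale.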
It turns out that the human eye is not very good at picking out small differences in brightness. To see a difference, we must change the brightness a LOT, to roughly two and a half times its original value. What Hipparchus discovered, by accident, was the logarithmic nature of human perception. Somewhere within us, we are hardwired to perceive a change in level only when it is many times smaller or larger than the previous level. The visual advantage we gain from this is dynamic range. It has been estimated that the human eye can effectively process 10 f/stops of light levels, an extraordinary range that certainly exceeds any film or sensor invented so far. If the human eye could only distinguish small linear increments of brightness, there would be no way to maintain the same wide dynamic range.
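To get a rough sense of what that range means, recall that each f/stop doubles the amount of light, so a 10-stop range is a power of two:

```python
# Each f/stop is a doubling of light, so 10 stops of dynamic range
# corresponds to a contrast ratio of 2**10 between darkest and brightest.
contrast = 2 ** 10
print(contrast)  # 1024, i.e. roughly a 1000:1 range
```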
The logarithmic nature of human perception was known as early as the 1800s and was eventually summarized by German psychologists as the Weber-Fechner law. The law has implications for many different human processes: vision, hearing, and mental processing. Modern psychologists believe that before children are taught the linear number scale (1, 2, 3, …), their natural tendency is to think in terms of a logarithmic scale (2, 4, 8, 16, …). For a particularly mind-blowing description of this, just listen to this Radiolab podcast, which describes an entire tribe in the Amazon that uses a logarithmic number scale for everyday life. Ask them for a number halfway between 1 and 9 and they will say 3.
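That answer is not a mistake: on a logarithmic scale the "midpoint" of two numbers is their geometric mean rather than their arithmetic mean. A one-line check for the 1-to-9 example:

```python
# The logarithmic midpoint of 1 and 9 is the geometric mean:
# 1 -> 3 -> 9 takes two equal 3x steps, just as 1 -> 5 -> 9 takes
# two equal +4 steps on a linear scale.
log_mid = (1 * 9) ** 0.5
print(log_mid)  # 3.0
```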
Which brings us back to f/stops. At the same time psychologists were musing about logarithmic human perception, early photographers were quantifying the optical principles of early cameras. Fairly early on, it was determined that the area of the aperture hole needed to vary by a factor of 2x to yield perceptibly brighter or darker photographs from one f/stop to the next. The figure below shows the progression of aperture areas from largest to smallest. At each step the area is halved, until the smallest aperture is 1/32nd the size of the original. The diameter of each aperture is proportional to the square root of its area, and the f-number is the focal length divided by that diameter. So each halving of area shrinks the diameter by a factor of the square root of 2, which multiplies the f-number by about 1.4, and we arrive at the familiar numbers: 1, 1.4, 2, 2.8, 4, 5.6. Voila!
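The derivation above can be sketched numerically. Assuming a starting aperture of f/1, each halving of area multiplies the f-number by the square root of 2:

```python
# Halve the aperture area repeatedly; the diameter scales as sqrt(area),
# so the f-number (focal length / diameter) grows by sqrt(2) per stop.
f_numbers = [round((2 ** 0.5) ** i, 1) for i in range(6)]
print(f_numbers)  # [1.0, 1.4, 2.0, 2.8, 4.0, 5.7]
```

The values marked on lens barrels are conventional roundings, which is why 5.66 appears as f/5.6 rather than f/5.7.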
The f-stop numbering scheme may seem clumsy and awkward, but it is a necessary consequence of our human biology. Hipparchus would be proud of us.