Why Won’t the ‘72dpi’ Myth Die?
When they were young, my children used to get very excited about Santa Claus and the Easter Bunny. But they saw through those by the time they were six. So why has the myth of saving JPEG files at 72dpi lasted far more than six years?
Decades ago, computer screens commonly had a resolution of about 72ppi (pixels per inch), as opposed to the roughly 200 to 500ppi of the screens we use now. That gave rise to the mistaken belief that JPEG files saved at 72dpi would display better on those screens. Perhaps people believed the pixels in the image would align better with the pixels on the screen if the dpi of the image matched the ppi of the screen.
They don’t.
If you display a 1,000-pixel-wide image on a screen, at full size, it will use 1,000 pixels of width on the screen. This is true whether the image was saved at 1ppi, 72ppi, or a billion ppi.
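If you want to prove this to yourself, here is a minimal sketch using Python and the Pillow imaging library (the test image and file name are made up for illustration). The same image, saved with wildly different dpi values, keeps exactly the same pixel dimensions; only the metadata tag changes.

```python
# A minimal sketch: the dpi value stored in a JPEG is just a metadata tag.
# Saving the same image at 1, 72, or 300 dpi never changes its pixels.
from PIL import Image

# A plain 1,000 x 750 pixel test image, standing in for any photo.
img = Image.new("RGB", (1000, 750), "white")

for dpi in (1, 72, 300):
    img.save("test.jpg", dpi=(dpi, dpi))   # write the dpi tag into the file
    reopened = Image.open("test.jpg")
    # The pixel dimensions are identical every time: (1000, 750).
    print(dpi, reopened.size, reopened.info.get("dpi"))
```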
Images and screens don’t have a dpi; printers do. The image’s pixels-per-inch setting determines how many pixels of the image will be used to fill each inch of the paper. For example: your 1,000-pixel-wide image, printed at 200ppi, will be 5 inches wide.
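The arithmetic behind that is simply division: printed size equals pixel dimensions divided by pixels-per-inch. A quick sketch using the 1,000-pixel example from above (the 300ppi line is only there as an extra illustration):

```python
# Printed width in inches = pixel width / pixels per inch.
def print_width_inches(pixel_width: int, ppi: float) -> float:
    return pixel_width / ppi

print(print_width_inches(1000, 200))  # 5.0 inches, as in the example above
print(print_width_inches(1000, 300))  # ~3.33 inches at a higher ppi
```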
The software that drives the printer might let you specify the dpi. This sets how many dots of ink will be used to print each inch of the image, regardless of the printed size. These days, the dpi of photographic printers is a lot higher than 72; an image printed using only 72 dots of ink for each inch of paper would look pretty bad.
The next time you see a request to submit your images at 72dpi, tell them you don’t believe in Santa Claus.
About the author: David Garnick is the Founder and Editor of Bokeh Bokeh Photo, and directs the San Francisco Bay Month of Photography. He specializes in still life and landscape photography that references visual styles of the past. David collaborates with museums to create compelling imagery for exhibitions. You can find more of his work on his website. This article was also published here.