This article started after I followed an online discussion about whether a 35mm or a 50mm lens on a full-frame camera gives a field of view equivalent to normal human vision. This particular discussion immediately delved into the optical physics of the eye as a camera and lens — an understandable comparison, since the eye consists of a front element (the cornea), an aperture ring (the iris and pupil), a lens, and a sensor (the retina).
Despite all the impressive mathematics thrown back and forth regarding the optical physics of the eyeball, the discussion didn’t quite seem to make sense logically, so I did a lot of reading of my own on the topic.
Ever wonder what the f-number of your eyes is? It can easily be calculated using the human eye’s focal length (~22mm) and physical aperture size. Here’s what Wikipedia has to say:
Computing the f-number of the human eye involves computing the physical aperture and focal length of the eye. The pupil can be as large as 6–7 mm wide open, which translates into the maximum physical aperture.
The f-number of the human eye varies from about f/8.3 in a very brightly lit place to about f/2.1 in the dark. The presented maximum f-number has been questioned, as it seems to only match the focal length that assumes outgoing light rays. According to the incoming rays of light (what we actually see), the focal length of the eye is a bit longer, resulting in a minimum f-number of f/3.2.
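The arithmetic behind these figures is just the standard f-number formula: focal length divided by aperture diameter. Here’s a quick sketch using the approximate values quoted above — note that the eye’s “true” focal length is debated, so the numbers are illustrative:

```python
# f-number is focal length divided by aperture (pupil) diameter: N = f / D.
# The 22mm focal length and pupil sizes below are the rough figures
# quoted above, not precise optical measurements.

def f_number(focal_length_mm, pupil_diameter_mm):
    """Return the f-number for a given focal length and aperture diameter."""
    return focal_length_mm / pupil_diameter_mm

# ~22mm focal length with a fully dilated ~7mm pupil:
print(round(f_number(22, 7), 1))    # -> 3.1, close to the f/3.2 figure above

# A bright-light pupil of roughly 2.7mm lands near f/8:
print(round(f_number(22, 2.7), 1))  # -> 8.1
```

Run it with the shorter (~17mm) focal length some sources use and you land near the f/2.1 figure instead, which is exactly the discrepancy Wikipedia describes.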
The article also notes that the eye cannot be considered an ordinary air-filled camera since it’s filled with light refracting liquid.
You may have heard that digital cameras can be made sensitive to infrared light by removing the IR filter found inside, but did you know that something similar can be done with the human eye? People who have aphakia, or the absence of the lens of the eye, have reported the ability to see ultraviolet wavelengths. Claude Monet was one such person. Carl Zimmer writes,
Late in his life, Claude Monet developed cataracts. As his lenses degraded, they blocked parts of the visible spectrum, and the colors he perceived grew muddy. Monet’s cataracts left him struggling to paint; he complained to friends that he felt as if he saw everything in a fog. After years of failed treatments, he agreed at age 82 to have the lens of his left eye completely removed. Light could now stream through the opening unimpeded. Monet could now see familiar colors again. And he could also see colors he had never seen before. Monet began to see–and to paint–in ultraviolet.
[...] With his lens removed, Monet continued to paint. Flowers remained one of his favorite subjects. Only now the flowers were different. When most people look at water lily flowers, they appear white. After his cataract surgery, Monet’s blue-tuned pigments could grab some of the UV light bouncing off of the petals. He started to paint the flowers a whitish-blue.
The lens on a human eye ordinarily filters out UV rays, so we don’t see many of the things certain animals see. For example, the males and females of some butterfly species look identical to the human eye but very different to UV-sensitive eyes — the males sport bright patterns in order to attract the females!
Contrast detection is one of the two main techniques used in camera autofocus systems. Although focusing speeds continue to improve, the method uses an inefficient “guess and check” approach to figuring out a subject’s distance — it doesn’t initially know whether to move focus backward or forward. UT Austin vision researcher Johannes Burge wondered why the human eye is able to instantly focus without the tedious “focus hunting” done by AF systems. He and his advisor then developed a computer algorithm that’s able to determine the exact amount of focus error by simply examining features in a scene.
His research paper, published earlier this month, offers proof that there is enough information in a static image to calculate whether the focus is too far or too close. Burge has already patented the technology, which he says could allow for cameras to focus in as little as 10 milliseconds.
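To see why contrast detection needs to hunt, here’s a toy sketch of the technique: the camera has no distance information, so it nudges the lens, re-measures contrast, and reverses direction with smaller steps whenever contrast drops. The sharpness function below is a made-up stand-in for a real contrast metric, not anything from Burge’s paper:

```python
# Toy contrast-detection autofocus: "guess and check" hill climbing.
# The camera never knows the focus error directly -- it only sees
# whether contrast got better or worse after each move.

def sharpness(lens_pos, true_focus=37.0):
    """Fake contrast metric: peaks when lens_pos equals true_focus."""
    return 1.0 / (1.0 + (lens_pos - true_focus) ** 2)

def contrast_detect_af(start=0.0, step=4.0, min_step=0.05):
    pos, direction = start, 1.0
    best = sharpness(pos)
    iterations = 0
    while step > min_step:
        candidate = pos + direction * step
        s = sharpness(candidate)
        if s > best:                  # contrast improved: keep moving this way
            pos, best = candidate, s
        else:                         # overshot the peak: reverse, step smaller
            direction *= -1.0
            step /= 2.0
        iterations += 1
    return pos, iterations

pos, iterations = contrast_detect_af()
print(round(pos, 2), iterations)  # converges near 37, but only after many moves
```

The many back-and-forth iterations are the “focus hunting” Burge’s algorithm avoids — his approach estimates both the size and the direction of the focus error from a single image, so in principle the lens can move straight to the right position.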
Here’s a slow motion video showing a closeup look at the human eye, our amazing biological lens (and sensor). You might be surprised at how mechanical its movements are and how fluid the iris is. Another crazy fact is that we’re continually relying on “image stabilization” to see things clearly:
The visual system in the brain is too slow to process information if the images are slipping across the retina at more than a few degrees per second. Thus, for humans to be able to see while moving, the brain must compensate for the motion of the head by turning the eyes. [#]
To see a quick demonstration of this fact, try the following experiment: hold your hand up, about one foot in front of your nose. Keep your head still, and shake your hand from side to side, slowly at first, and then faster and faster. At first you will be able to see your fingers quite clearly. But as the frequency of shaking passes about 1 Hz, the fingers will become a blur. Now, keep your hand still, and shake your head. No matter how fast you shake your head, the image of your fingers remains clear. This demonstrates that the brain can move the eyes opposite to head motion much better than it can follow, or pursue, a hand movement. When your pursuit system fails to keep up with the moving hand, images slip on the retina and you see a blurred hand. [#]
As with cameras, our built-in image stabilization can compensate for “camera shake” (head movement) but not for the motion blur of a moving subject.
As the low-light capabilities of high-end (and even low-end) cameras rapidly improve, it’s easy to marvel at technology and forget how amazing our own eyes are. Here are some mind-boggling facts to consider: did you know that the human eye can detect as few as two photons entering the retina, and that, under ideal conditions, a healthy young adult can see a candle flame from 30 miles away? To see how mind-boggling that is, try using Google Maps to find a location 30 miles away from where you live.
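A back-of-envelope calculation shows why the 30-mile claim is plausible. All the inputs below are rough assumptions of my own (a candle’s visible-photon output and a dark-adapted pupil size), not figures from the sources above:

```python
import math

# Rough assumptions: a candle emits on the order of 1e17 visible photons
# per second, and a dark-adapted pupil is about 7mm across.
photons_per_second = 1e17       # assumed visible-photon output of a candle
distance_m = 30 * 1609.0        # 30 miles in meters (~48 km)
pupil_radius_m = 0.0035         # half of a ~7mm dark-adapted pupil

# The photons spread over a sphere of radius distance_m; the pupil
# intercepts only its tiny share of that sphere's surface area.
sphere_area = 4 * math.pi * distance_m ** 2
pupil_area = math.pi * pupil_radius_m ** 2
photons_into_eye = photons_per_second * pupil_area / sphere_area

print(f"{photons_into_eye:.0f} photons per second")
```

Under these assumptions, on the order of a hundred photons per second still reach the eye from 30 miles away — a tiny trickle, but above the few-photon detection threshold mentioned above.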
There’s an interesting discussion going on over at the DPReview forums regarding how the human eye compares to the technology we have in digital cameras.
Here are some of the findings that were compiled from various sources on the web:
Sensor size: 22mm in diameter
Resolution: 576 megapixels
Sensitivity: ISO 1 – 800
Focal length: 22mm – 35mm
Aperture: f/2.1 – f/8.3
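The eye-popping 576-megapixel figure is usually derived by assuming a roughly 120° × 120° field of view sampled at the eye’s ~0.3 arcminute acuity (the inputs below are those commonly cited values, not measurements from the forum thread):

```python
# Where the 576-megapixel figure comes from: treat the eye's field of
# view as a square grid sampled at its finest resolvable detail.
field_of_view_deg = 120      # assumed field of view per side, in degrees
acuity_arcmin = 0.3          # ~0.3 arcminute resolvable detail

samples_per_side = field_of_view_deg * 60 / acuity_arcmin  # 24,000 "pixels"
megapixels = samples_per_side ** 2 / 1e6

print(megapixels)  # -> 576.0
```

Note that this treats the entire visual field as if it were as sharp as the fovea, so it greatly overstates what the eye resolves in a single glance — only a few degrees at the center actually have that acuity.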
Another interesting idea that came up was the possibility of using the human eye as the lens and sensor for future imaging devices:
Maybe future “cameras” will actually link to your eyes – since the eyeball is such a great lens, who knows? Getting signal from the eye is the trick – would require a surgical implant or a means of reading brainwaves. Maybe that’s 200 years out – similar time [frame] the Mayo clinic is talking about for correcting double/triple vision.
Perhaps in the future we’ll all be documenting our lives at 576 megapixels through our eyes and ears, and storing the photos and videos on petabyte external hard drives at home.
What do you think of this discussion? Is there anything that jumps out at you as being wrong, or do you agree with the comparison for the most part?