Ever wonder what the f-number of your eyes is? It can easily be calculated from the human eye's focal length (~22mm) and physical aperture size. Here's what Wikipedia has to say:
Computing the f-number of the human eye involves computing the physical aperture and focal length of the eye. The pupil can be as large as 6–7 mm wide open, which translates into the maximum physical aperture.
The f-number of the human eye varies from about f/8.3 in a very brightly lit place to about f/2.1 in the dark. The f/2.1 figure (the eye's maximum aperture) has been questioned, as it appears to assume the shorter focal length of outgoing light rays. Based on the incoming rays of light (what we actually see), the focal length of the eye is a bit longer, resulting in a minimum f-number of about f/3.2.
The article also notes that the eye cannot be considered an ordinary air-filled camera, since it's filled with a light-refracting liquid.
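The arithmetic behind those figures is just one division: the f-number N equals the focal length divided by the diameter of the entrance pupil. Here's a minimal Python sketch that reproduces the quoted values; the specific focal lengths (~16.7mm for outgoing rays, ~22mm for incoming) and pupil diameters (~2mm in bright light, ~7–8mm in the dark) are assumptions back-solved from the numbers above:

```python
def f_number(focal_length_mm: float, pupil_diameter_mm: float) -> float:
    """f-number N = focal length / entrance pupil diameter."""
    return focal_length_mm / pupil_diameter_mm

# Illustrative values only, chosen to match the figures quoted above.
print(f"Bright light (2 mm pupil, 16.7 mm focal length): f/{f_number(16.7, 2.0):.1f}")  # ~f/8.3
print(f"Dark (8 mm pupil, 16.7 mm focal length):         f/{f_number(16.7, 8.0):.1f}")  # ~f/2.1
print(f"Dark, incoming rays (7 mm pupil, 22 mm):         f/{f_number(22.0, 7.0):.1f}")  # ~f/3.1, near the quoted f/3.2
```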
Image sensors and the advent of digital imaging have been met with differing reactions from the photographic community. But what a team of doctors at the Oxford Eye Hospital has managed to do with the technology is 100% digital, and 100% amazing. Clinical trial leaders Robert MacLaren and Tim Jackson have helped two blind men partially regain their sight.
After taking a macro photograph of his own eye using a Samsung WB500 compact camera, Jarroseph was startled to find that the photograph showed his own face reflected in his eyeball. Light from his face had bounced off his eyeball and then back into the camera!
You may have heard that digital cameras can be made sensitive to infrared light by removing the IR filter found inside, but did you know that something similar can be done with the human eye? People who have aphakia, the absence of the eye's lens, have reported the ability to see ultraviolet wavelengths. Claude Monet was one such person. Carl Zimmer writes,
Late in his life, Claude Monet developed cataracts. As his lenses degraded, they blocked parts of the visible spectrum, and the colors he perceived grew muddy. Monet’s cataracts left him struggling to paint; he complained to friends that he felt as if he saw everything in a fog. After years of failed treatments, he agreed at age 82 to have the lens of his left eye completely removed. Light could now stream through the opening unimpeded. Monet could now see familiar colors again. And he could also see colors he had never seen before. Monet began to see–and to paint–in ultraviolet.
[…] With his lens removed, Monet continued to paint. Flowers remained one of his favorite subjects. Only now the flowers were different. When most people look at water lily flowers, they appear white. After his cataract surgery, Monet’s blue-tuned pigments could grab some of the UV light bouncing off of the petals. He started to paint the flowers a whitish-blue.
The lens of the human eye ordinarily filters out UV rays, so we don't see many of the things certain animals see. For example, the males and females of some butterfly species look identical to the human eye but very different to UV-sensitive eyes: the males sport bright patterns to attract the females!
Want to see how your eyes stack up against other photographers' when it comes to seeing colors? Try your hand at Color, a simple browser-based game that tests how quickly and accurately you can match colors. It starts with simple matching, but soon moves on to more difficult challenges involving multiple colors. Be sure to leave a comment here reporting the score you get!
Here’s a quick and simple tip for better portraits by Reddit user rmx_:
Everyone has a lazy eye. By that, I mean one eye is always smaller and/or more closed than the other eye. In some people, it is very easy to spot; in others, nearly impossible. The “beautiful people” have more symmetrical faces, but still, one eye will open more than the other. (Denzel Washington has one of the most I have seen […])
[…] here is the tip: get the smaller/lazier eye slightly closer to the camera. Oh, and don't tell the person what you're looking at their eyes for! You'll make them self-conscious. Simply ask them to look at your finger and move their head to follow it, and then guide them left or right as necessary. Chances are, the movement needed will not be so much that you have to adjust your lights.
Here’s a slow motion video showing a closeup look at the human eye, our amazing biological lens (and sensor). You might be surprised at how mechanical its movements are and how fluid the iris is. Another crazy fact is that we’re continually relying on “image stabilization” to see things clearly:
The visual system in the brain is too slow to process information if the images are slipping across the retina at more than a few degrees per second. Thus, for humans to be able to see while moving, the brain must compensate for the motion of the head by turning the eyes. [#]
To see a quick demonstration of this fact, try the following experiment: hold your hand up, about one foot in front of your nose. Keep your head still, and shake your hand from side to side, slowly at first, and then faster and faster. At first you will be able to see your fingers quite clearly. But as the frequency of shaking passes about 1 Hz, the fingers will become a blur. Now, keep your hand still, and shake your head. No matter how fast you shake your head, the image of your fingers remains clear. This demonstrates that the brain can move the eyes opposite to head motion much better than it can follow, or pursue, a hand movement. When your pursuit system fails to keep up with the moving hand, images slip on the retina and you see a blurred hand. [#]
As with cameras, our built-in image stabilization can deal with head shake but not with a moving subject's motion blur.
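To put rough numbers on that 1 Hz threshold, here's a back-of-the-envelope Python sketch; the hand distance (~30 cm, i.e. one foot) and waving amplitude (~10 cm) are assumed values for illustration, not figures from the source:

```python
import math

# Assumed geometry: hand ~30 cm from the eyes, waving sinusoidally
# with a ~10 cm peak displacement at 1 Hz.
distance_cm = 30.0
amplitude_cm = 10.0
frequency_hz = 1.0

# Peak visual angle subtended by the hand's excursion, in degrees.
angle_deg = math.degrees(math.atan(amplitude_cm / distance_cm))

# For sinusoidal motion, peak angular velocity = 2 * pi * f * amplitude.
peak_slip_deg_per_s = 2 * math.pi * frequency_hz * angle_deg

print(f"Peak excursion:    {angle_deg:.0f} degrees")          # ~18 degrees
print(f"Peak retinal slip: {peak_slip_deg_per_s:.0f} deg/s")  # ~116 deg/s
```

Over a hundred degrees per second of retinal slip is far beyond the "few degrees per second" the visual system can process, which is why the fingers blur unless your eyes manage to track them.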
You might want to skip this post if you're squeamish. A filmmaker named Rob Spence has successfully become a cyborg, replacing an eye he lost in a childhood accident with a wireless camera that transmits everything he sees to a computer. Spence believes that technology may soon reach the point where people will be tempted to swap out their body parts for superior prosthetics. No word on when he'll be able to apply Instagram filters to his eye camera photos.
San Franciscan Tanya Vlach lost her left eye in a car accident back in 2005. Dissatisfied with her prosthetic eye, she's trying to raise money to develop an in-eye camera that captures blink-activated still photos and 720p HD video. Her wish list of features includes geotagging, IR/UV capture, facial recognition, and sensor-activated zoom, focus, and on/off. Vlach's Kickstarter project is titled “Grow a new eye” and has a goal of raising $15,000 in funding by August 3rd, 2011.