A couple of years ago we reported on the amazing fact that chickens have image-stabilized heads, and shared some interesting “research” into using chickens as camera stabilizers. It turns out birds aren’t the only creatures with image stabilization systems built into their hardware: cats have it too!
Randall Munroe over at XKCD posted this fascinating comic today that demonstrates some of the peculiarities of human vision. Roll up a piece of paper to set your eyes the correct distance from the screen, and then observe how they perceive things like detail, color, polarization, and more. Click the image above for the large version.
Ever wonder what the f-number of your eyes is? It can easily be calculated from the human eye’s focal length (~22mm) and physical aperture size. Here’s what Wikipedia has to say:
Computing the f-number of the human eye involves computing the physical aperture and focal length of the eye. The pupil can be as large as 6–7 mm wide open, which translates into the maximum physical aperture.
The f-number of the human eye varies from about f/8.3 in a very brightly lit place to about f/2.1 in the dark. The presented maximum f-number has been questioned, as it seems to only match the focal length that assumes outgoing light rays. According to the incoming rays of light (what we actually see), the focal length of the eye is a bit longer, resulting in a minimum f-number of f/3.2.
The article also notes that the eye cannot be treated as an ordinary air-filled camera, since it’s filled with a light-refracting liquid.
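As a quick sanity check on those figures, the simple thin-lens relation N = f/D gets you close (this is only a rough approximation, since, as noted above, the eye isn’t an air-filled camera; the bright-light pupil diameter below is an assumed value):

```python
# Back-of-the-envelope f-number of the eye using the thin-lens
# relation N = f / D (a rough approximation only).

def f_number(focal_length_mm, aperture_mm):
    return focal_length_mm / aperture_mm

# Dark-adapted: ~22 mm incoming focal length, pupil wide open at ~7 mm
print(round(f_number(22, 7), 1))    # → 3.1, close to the quoted f/3.2

# Bright light: pupil constricted to roughly 2.6 mm (assumed value)
print(round(f_number(22, 2.6), 1))  # → 8.5, close to the quoted f/8.3
```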
Here’s a slow motion video showing a closeup look at the human eye, our amazing biological lens (and sensor). You might be surprised at how mechanical its movements are and how fluid the iris is. Another crazy fact is that we’re continually relying on “image stabilization” to see things clearly:
The visual system in the brain is too slow to process information if the images are slipping across the retina at more than a few degrees per second. Thus, for humans to be able to see while moving, the brain must compensate for the motion of the head by turning the eyes. [#]
To see a quick demonstration of this fact, try the following experiment: hold your hand up, about one foot in front of your nose. Keep your head still, and shake your hand from side to side, slowly at first, and then faster and faster. At first you will be able to see your fingers quite clearly. But as the frequency of shaking passes about 1 Hz, the fingers will become a blur. Now, keep your hand still, and shake your head. No matter how fast you shake your head, the image of your fingers remains clear. This demonstrates that the brain can move the eyes opposite to head motion much better than it can follow, or pursue, a hand movement. When your pursuit system fails to keep up with the moving hand, images slip on the retina and you see a blurred hand. [#]
As with cameras, our built-in image stabilization can compensate for head shake, but not for the blur of moving subjects.
We’re now one step closer to being able to take photographs with our minds. Scientists at UC Berkeley have come up with a way to reconstruct what the human brain sees:
[Subjects] watched two separate sets of Hollywood movie trailers
[…] brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.
Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.
Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie. [#]
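The three steps quoted above can be sketched with toy data. Everything here is a stand-in for the real study (a simple linear encoder in place of the fMRI model, random vectors in place of clip features and voxel activity), but the structure of the pipeline is the same: learn a clip-to-activity mapping, predict activity for a library of clips, then merge the best-matching clips.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (all hypothetical): clip "features" and voxel activity
n_train, n_feat, n_vox, n_library = 50, 8, 12, 200
W_true = rng.normal(size=(n_feat, n_vox))

# Training data: brain activity recorded while known clips were viewed
train_clips = rng.normal(size=(n_train, n_feat))
train_activity = train_clips @ W_true

# Step 1: learn the clip -> activity mapping (here, ordinary least squares)
W_hat, *_ = np.linalg.lstsq(train_clips, train_activity, rcond=None)

# Step 2: predict the activity each library clip would most likely evoke
library_clips = rng.normal(size=(n_library, n_feat))
predicted = library_clips @ W_hat

# Step 3: given activity evoked by an unseen clip, merge the k clips
# whose predicted activity best matches the observed activity
true_clip = rng.normal(size=n_feat)
observed = true_clip @ W_true
scores = predicted @ observed            # similarity to observed pattern
k = 10
top_k = np.argsort(scores)[-k:]
reconstruction = library_clips[top_k].mean(axis=0)  # blurry merged result
```

Averaging the top-k clips is what makes the real reconstructions blurry: the output is a blend of similar-looking footage, not the original clip itself.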
Unlike the cat brain research video we shared a while back, the imagery in this project isn’t generated directly from brain signals, but is instead reconstructed by merging YouTube clips similar to what the person is seeing. The researchers are still calling it a “major leap toward reconstructing internal imagery,” though. In the future this technology might be used to record not just our visual memories, but even our dreams!
Here’s a mind-bending video in which someone created the famous checker shadow illusion in real life. The optical illusion takes advantage of the way our brains process lighting and shadows.
As with many so-called illusions, this effect really demonstrates the success rather than the failure of the visual system. The visual system is not very good at being a physical light meter, but that is not its purpose. The important task is to break the image information down into meaningful components, and thereby perceive the nature of the objects in view. [#]
Interesting, huh? Our eyes aren’t very good as light meters, since they’re easily deceived by context.
Update: It looks like the video was taken down by the uploader. Sorry guys.
Color is simply how our brains respond to different wavelengths of light, and wavelengths outside the spectrum of visible light are invisible and colorless to us simply because our eyes can’t detect them. Since colors are created in our brains, what if we all see colors differently from one another? BBC created a fascinating program called “Do You See What I See?” that explores this question, and the findings are pretty startling.
As the low-light capabilities of high-end (and even low-end) cameras rapidly improve, it’s easy to marvel at technology and forget how amazing our own eyes are. Here are some mind-boggling facts to consider: did you know that the human eye can detect as few as two photons entering the retina, and that, under ideal conditions, a healthy young adult can see a candle flame from 30 miles away? To see how remarkable that is, try using Google Maps to find a location 30 miles away from where you live.
According to neuroscientist Bradley Voytek, the reason we don’t use our full sensory potential is that we don’t pay enough attention to our senses. Kinda makes you want to put down your camera and focus on staring at things, huh?