Posts Tagged ‘biology’
Did you know that 90% of the cells in (or on) the human body are bacteria and other microorganisms? Have you ever thought about how many bacteria live on your DSLR camera? Chicago Tribune staff photographer Alex Garcia recently dove into this second question while visiting the Argonne National Laboratory outside Chicago.
This article started after I followed an online discussion about whether a 35mm or a 50mm lens on a full frame camera gives the equivalent field of view to normal human vision. This particular discussion immediately delved into the optical physics of the eye as a camera and lens — an understandable comparison since the eye consists of a front element (the cornea), an aperture ring (the iris and pupil), a lens, and a sensor (the retina).
Despite all the impressive mathematics thrown back and forth regarding the optical physics of the eyeball, the discussion didn’t quite seem to make sense logically, so I did a lot of reading of my own on the topic.
If you think male and female photographers sometimes have very different styles, the reason might go beyond their tastes and approaches to shooting. Men and women see the world differently — literally. A new study by vision researchers has found that the two genders have different ways of collecting visual information.
According to the findings, men are more sensitive to moving objects and to fine detail, while women tend to be sharper at detecting color changes.
Gigapixel images are usually used to capture tiny details in expansive scenes, but scientists in the Netherlands recently created one that shows microscopic details in a tiny subject. Using a technique called virtual nanoscopy (a new relative of microscopy?), the researchers created a massive 281-gigapixel image of a 1.5-millimeter-long zebrafish embryo.
Randall Munroe over at XKCD posted this fascinating comic today that demonstrates some of the peculiarities of human vision. Roll up a piece of paper to set your eyes the correct distance from the screen, and then observe how they perceive things like detail, color, polarization, and more. Click the image above for the large version.
Ever wonder what the f-number of your eyes is? It can easily be calculated using the human eye’s focal length (~22mm) and physical aperture size. Here’s what Wikipedia has to say:
Computing the f-number of the human eye involves computing the physical aperture and focal length of the eye. The pupil can be as large as 6–7 mm wide open, which translates into the maximum physical aperture.
The f-number of the human eye varies from about f/8.3 in a very brightly lit place to about f/2.1 in the dark. The presented maximum f-number has been questioned, as it seems to only match the focal length that assumes outgoing light rays. According to the incoming rays of light (what we actually see), the focal length of the eye is a bit longer, resulting in a minimum f-number of f/3.2.
The article also notes that the eye cannot be treated as an ordinary air-filled camera, since it’s filled with light-refracting liquid.
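The arithmetic behind the quote above is simply focal length divided by aperture diameter. Here’s a minimal sketch, assuming the ~22mm incoming-ray focal length mentioned above and the quoted pupil sizes (the exact millimeter values are illustrative, not measurements):

```python
def f_number(focal_length_mm, aperture_mm):
    """f-number = focal length / physical aperture diameter."""
    return focal_length_mm / aperture_mm

# Fully dilated pupil (~7 mm) with the ~22 mm incoming-ray focal length:
print(f"f/{f_number(22, 7):.1f}")  # ≈ f/3.1, close to the quoted f/3.2

# The quoted f/8.3 for bright light implies a shorter effective focal
# length (~16.6 mm) with a constricted ~2 mm pupil:
print(f"f/{f_number(16.6, 2):.1f}")  # ≈ f/8.3
```

Note that the two quoted f-numbers only come out consistently if you let the effective focal length change between the outgoing-ray and incoming-ray models — which is exactly the discrepancy the Wikipedia passage is pointing at.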
Here’s a slow-motion video showing a close-up look at the human eye, our amazing biological lens (and sensor). You might be surprised at how mechanical its movements are and how fluid the iris is. Another crazy fact is that we’re continually relying on “image stabilization” to see things clearly:
The visual system in the brain is too slow to process information if the images are slipping across the retina at more than a few degrees per second. Thus, for humans to be able to see while moving, the brain must compensate for the motion of the head by turning the eyes. [#]
To see a quick demonstration of this fact, try the following experiment: hold your hand up, about one foot in front of your nose. Keep your head still, and shake your hand from side to side, slowly at first, and then faster and faster. At first you will be able to see your fingers quite clearly. But as the frequency of shaking passes about 1 Hz, the fingers will become a blur. Now, keep your hand still, and shake your head. No matter how fast you shake your head, the image of your fingers remains clear. This demonstrates that the brain can move the eyes opposite to head motion much better than it can follow, or pursue, a hand movement. When your pursuit system fails to keep up with the moving hand, images slip on the retina and you see a blurred hand. [#]
As with cameras, our built-in image stabilization can deal with shake (of the head) but not with the motion blur of a moving subject.
We’re now one step closer to being able to take photographs with our minds. Scientists at UC Berkeley have come up with a way to reconstruct what the human brain sees:
[Subjects] watched two separate sets of Hollywood movie trailers
[...] brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.
Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.
Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie. [#]
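The reconstruction step described above — rank a library of candidate clips by how well their predicted brain activity matches the measured activity, then merge the best matches — can be sketched roughly like this. All of the names and the toy data here are assumptions for illustration; the actual study predicted fMRI responses for 18 million seconds of YouTube video:

```python
import numpy as np

def reconstruct_frame(measured_activity, predicted_activities, clip_frames, k=100):
    """Average the frames of the k clips whose predicted brain
    activity correlates best with the measured activity."""
    # Score each candidate clip by correlating its predicted
    # activity pattern with the measured one.
    scores = np.array([np.corrcoef(measured_activity, p)[0, 1]
                       for p in predicted_activities])
    top_k = np.argsort(scores)[::-1][:k]  # indices of the best-matching clips
    # Merge the best matches into one blurry composite frame.
    return clip_frames[top_k].mean(axis=0)

# Toy example: 500 candidate clips, 50-voxel activity patterns, 8x8 frames.
rng = np.random.default_rng(0)
predicted = rng.normal(size=(500, 50))
frames = rng.uniform(size=(500, 8, 8))
measured = predicted[42] + 0.1 * rng.normal(size=50)  # clip 42 is the "true" one

composite = reconstruct_frame(measured, predicted, frames, k=100)
print(composite.shape)  # (8, 8)
```

Averaging 100 near-matches rather than picking the single best one is what produces the characteristically blurry but continuous output the researchers showed.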
Unlike the cat brain research video we shared a while back, the resulting imagery in this project isn’t directly generated from brain signals, but is instead reconstructed from YouTube clips similar to what the person was watching. They’re still calling it a “major leap toward reconstructing internal imagery,” though. In the future this technology might be used to record not just our visual memories, but even our dreams!