Recently published patent applications filed by Nikon offer a glimpse into what the company may be working on for future DSLRs and mirrorless cameras. The three technologies spotted are: illuminated lens mounts, dual lens contacts, and a hybrid viewfinder.
Panasonic is claiming a major breakthrough in the world of camera sensors, saying that it has doubled the color sensitivity with a new technology called ‘Micro Color Splitters.’
Flickr has quietly rolled out a great incremental update to its photo-sharing service. Individual photo pages now display a number of EXIF details under a new section labeled "Additional Info", found in the column to the right. With a quick glance, you'll be able to see the shutter speed, aperture, ISO, and focal length the photographer used when snapping the photo.
This morning Instagram made a huge splash in the social networking scene by launching its own web profiles for viewing users’ photographs through a web browser. Each profile shares a user’s photographs, profile info, and pretty much everything the mobile view has. The service just became a lot more Facebook-like.
Is “camera toss” photography ready to go from fad to feature? Apparently Nikon thinks so. A recently published patent (No. 2012-189859) shows that the company has been thinking about building specific features into its compact and mirrorless cameras that would assist in using the technique.
It’s been a while since I wrote a history article and two or three people seemed to like them. I’ve pretty much covered the development of early cameras and lenses so it’s time to consider the way we recorded those images so other people could see them. No, I’m not talking about Facebook. I’m talking about film. Actually, I’m talking about even before film, mostly, but I really wanted to work that ‘development of film’ bit into the title. Pretty great, isn’t it? OK, maybe not.
Facial recognition service Face.com has announced a new feature in its API: age detection. After analyzing a photograph of a person’s face, the software returns three values: minimum age, maximum age, and estimated age, along with the confidence level of the guesses. Applications for the new technology include enhanced parental controls and targeted advertising. If you want to test out the service yourself, you can play around with the API here (in the photo above, the correct age is ~47).
Face.com API Sandbox (via Face via Gizmodo via PopPhoto)
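To make the three-value response concrete, here is a small sketch of consuming an age-detection result like the one described above. The JSON field names (`age_min`, `age_max`, `age_est`, `confidence`) are assumptions for illustration, not the actual Face.com schema.

```python
# Hypothetical sketch of reading an age-detection result shaped like the
# article describes: min age, max age, estimated age, plus a confidence
# level. Field names here are placeholders, not Face.com's real schema.

def summarize_age(face):
    """Turn an age-detection result dict into a readable summary."""
    lo = face["age_min"]
    hi = face["age_max"]
    est = face["age_est"]
    conf = face["confidence"]  # confidence level of the guesses
    return f"estimated {est} (range {lo}-{hi}), {conf}% confident"

# Example payload matching the ~47-year-old face mentioned above.
sample = {"age_min": 42, "age_max": 52, "age_est": 47, "confidence": 85}
print(summarize_age(sample))
```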
Samsung has developed what the company claims is the world’s first CMOS sensor that can capture both RGB and range images at the same time. Microsoft’s Kinect has received a good deal of attention as of late for its depth-sensing capabilities, but it uses separate sensors for RGB images and range images. Samsung’s new solution combines both functions into a single image sensor by introducing “z-pixels” alongside the standard red, blue, and green pixels. This allows the sensor to capture 480×360 depth images while 1920×720 photos are being exposed. One of the big trends in the next decade may be depth-aware devices, and this new development certainly goes a long way towards making that a reality.
(via Tech-On! via Gizmodo)
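Since each depth "z-pixel" sits alongside a block of RGB pixels, pairing the two streams means aligning a low-resolution depth map with a higher-resolution color frame. This toy sketch (not Samsung's implementation) shows the simplest alignment, nearest-neighbor upsampling, on tiny stand-in grids:

```python
# Toy sketch of aligning a low-resolution depth map with a higher-resolution
# RGB frame, as a single RGB-Z sensor would pair them. Not Samsung's actual
# pipeline -- just nearest-neighbor upsampling for illustration.

def upsample_nearest(depth, sx, sy):
    """Nearest-neighbor upsample a 2-D grid by integer factors sx, sy."""
    out = []
    for row in depth:
        wide = [v for v in row for _ in range(sx)]  # repeat horizontally
        for _ in range(sy):                         # repeat vertically
            out.append(list(wide))
    return out

# Tiny stand-in grids; the real sensor pairs 480x360 depth with 1920x720
# RGB, i.e. factors of 4 horizontally and 2 vertically.
depth = [[10, 20],
         [30, 40]]
aligned = upsample_nearest(depth, sx=4, sy=2)
# aligned is now 8 wide by 4 tall, matching the RGB grid's scale
```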
Knowing how long to develop film for is easy if you use popular films and developers, but what if you want to use some obscure combination that isn’t well documented? If that’s you, check out the Photocritic Film Development Database. It’s a simple service that outputs development times for 1440 different film/developer combinations. For combinations that aren’t officially published, creator Haje Jan Kamps has come up with a formula that estimates the time — a formula that he says is surprisingly accurate.
Photocritic Film Development Database (via Pixiq)
Update: Digitaltruth also has a massive film development database/chart.
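Kamps's actual formula isn't published here, so the snippet below is only a generic illustration of the idea of estimating an undocumented combination from documented ones: linearly interpolating a development time between two known dilutions. The film data points are made-up placeholders.

```python
# Generic illustration (NOT Kamps's formula) of estimating a development
# time for an undocumented dilution by interpolating between documented
# (dilution, minutes) pairs. Data below is invented for the example.

def interpolate_time(known, dilution):
    """Linearly interpolate a dev time from sorted (dilution, time) pairs."""
    pts = sorted(known)
    for (d0, t0), (d1, t1) in zip(pts, pts[1:]):
        if d0 <= dilution <= d1:
            frac = (dilution - d0) / (d1 - d0)
            return t0 + frac * (t1 - t0)
    raise ValueError("dilution outside the documented range")

# Placeholder data: a 1+25 dilution takes 6 min, a 1+50 dilution 11 min.
known = [(25, 6.0), (50, 11.0)]
estimate = interpolate_time(known, 40)  # estimate for a 1+40 dilution
```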
Contrast detection is one of the two main techniques used in camera autofocus systems. Although focusing speeds continue to improve, the technique relies on an inefficient “guess and check” approach to figuring out a subject’s distance — it doesn’t initially know whether to move focus backward or forward. UT Austin vision researcher Johannes Burge wondered why the human eye is able to instantly focus without the tedious “focus hunting” done by AF systems. He and his advisor then developed a computer algorithm that’s able to determine the exact amount of focus error by simply examining features in a scene.
His research paper, published earlier this month, offers proof that there is enough information in a static image to calculate whether the focus is too far or too close. Burge has already patented the technology, which he says could allow cameras to focus in as little as 10 milliseconds.
(via ScienceNOW via Fast Company)
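The "guess and check" hunting described above can be sketched as a simple hill climb: step the lens, measure sharpness, and reverse with a finer step when sharpness drops. The system cannot tell the defocus direction up front, which is precisely the cost Burge's algorithm avoids. The sharpness model below is a made-up stand-in for a real contrast metric computed from the sensor image.

```python
# Minimal sketch of a contrast-detection autofocus loop: hill-climb on a
# sharpness metric, reversing and halving the step whenever a move makes
# the image less sharp. The sharpness function is a toy stand-in.

def sharpness(pos, peak=37):
    """Toy contrast metric: highest when the lens is at `peak`."""
    return -abs(pos - peak)

def contrast_detect_focus(start=0, step=8):
    pos, direction = start, +1   # direction is an initial guess
    best = sharpness(pos)
    steps = 0
    while step >= 1:
        trial = pos + direction * step
        s = sharpness(trial)
        steps += 1
        if s > best:             # sharper: keep moving this way
            pos, best = trial, s
        else:                    # overshot: reverse, search finer
            direction = -direction
            step //= 2
    return pos, steps

pos, steps = contrast_detect_focus()
# pos ends up near the sharpness peak, but only after repeated hunting
```

Every reversal in the loop is a wasted lens movement; an algorithm that reads the focus error directly from the image, as Burge's does, could jump to the peak in one step.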
Image credit: 2011 12×12 Vancouver Photo Marathon by 12×12 Vancouver Photo Marathon