Contrast detection is one of the two main techniques used in camera autofocus systems. Although focusing speeds continue to improve, the technique relies on an inefficient “guess and check” approach to figuring out a subject’s distance — it doesn’t initially know whether to move focus backward or forward. UT Austin vision researcher Johannes Burge wondered why the human eye is able to focus instantly without the tedious “focus hunting” done by AF systems. He and his advisor then developed a computer algorithm that’s able to determine the exact amount of focus error simply by examining features in a scene.
His research paper, published earlier this month, offers proof that there is enough information in a static image to calculate whether the focus is too far or too close. Burge has already patented the technology, which he says could allow for cameras to focus in as little as 10 milliseconds.
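The “guess and check” behavior described above can be illustrated with a short sketch. This is not Burge’s algorithm — it’s a minimal model of conventional contrast-detection AF, which hill-climbs a sharpness metric and has to reverse direction when its first guess turns out to be wrong. The `render_at` callback and the Laplacian-variance metric are assumptions for illustration.

```python
import numpy as np

def contrast(image):
    """Sharpness metric: variance of the image's Laplacian (higher = sharper)."""
    lap = (np.roll(image, 1, 0) + np.roll(image, -1, 0) +
           np.roll(image, 1, 1) + np.roll(image, -1, 1) - 4 * image)
    return lap.var()

def contrast_detect_af(render_at, position, step=1.0, max_steps=50):
    """Hill-climb the lens position until contrast stops improving.

    `render_at` is a hypothetical callback returning the sensor image at a
    given lens position -- the AF system can only move the lens and re-measure.
    """
    best = contrast(render_at(position))
    direction = 1  # no way to know the correct direction up front
    for _ in range(max_steps):
        candidate = position + direction * step
        score = contrast(render_at(candidate))
        if score > best:
            best, position = score, candidate
        elif direction == 1:
            direction = -1     # initial guess was wrong; reverse and retry
        else:
            step /= 2          # overshot near the peak; refine the search
            if step < 1e-3:
                break
    return position
```

The wasted iterations spent guessing the direction and shrinking the step are exactly what an algorithm that reads focus error directly from the image would eliminate.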
This amazing image might look like a computer generated graphic, but it’s actually a composite photograph by NASA showing India’s population growth over the years. The white areas show the illumination visible in the country prior to 1992, while the blue, green, and red lights indicate new lights that became visible in 1992, 1998, and 2003, respectively. The four photos were tinted and then combined into an image that reveals where new populations are appearing. NASA definitely needs to do one for every country!
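NASA hasn’t published the exact compositing recipe, but the tint-and-combine step described above can be approximated in a few lines. This sketch assumes each era’s lights exist as a grayscale layer and uses a per-pixel maximum so the newest lights render on top — one plausible scheme, not necessarily NASA’s.

```python
import numpy as np

def tint_and_composite(layers, colors):
    """Tint each grayscale lights layer with an RGB color, then take the
    per-pixel maximum so the most recent lights show on top.

    `layers`: list of 2-D arrays with values in [0, 1], oldest first.
    `colors`: matching list of (r, g, b) tuples, e.g. white, blue, green, red.
    """
    tinted = [layer[..., None] * np.asarray(color, dtype=float)
              for layer, color in zip(layers, colors)]
    return np.max(tinted, axis=0)
```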
In the future, after you print the photos from your camera onto paper, you’ll be able to scan them back in and share them on Flickr using your mouse. At CES earlier this year, LG showed off an amazing new mouse that lets you quickly scan images and documents by simply waving the mouse over them. Now it’s available — if you live in the UK, you can buy one from Dabs for £90 (~$150).
Twitter, Google+, and Facebook are one step closer to becoming clones of each other (at least when it comes to photo sharing) — Twitter has rolled out photo galleries that display the 100 most recent images Tweeted by users in chronological order.
The images included in user galleries can come from Twitter, yFrog, TwitPic, Instagram and other image sharing services supported in Twitter’s details pane. [#]
To view a user’s gallery, simply visit their Twitter page and click the thumbnails on the sidebar.
There’s a good chance the digital photos you’ve stored on hard drives and DVDs won’t outlive you, but what if there was a disc that could last forever? M-Disc, short for Millennial Disc, is a new type of disc that doesn’t suffer from natural decay and degradation like existing disc technologies, allowing you to store data safely for somewhere between “1000 years” and “forever”.
Existing disc technologies write data using an organic dye layer that begins to experience “data rot” immediately after it’s written, causing the disc to become unreadable after a certain amount of time. The M-Disc, on the other hand, actually carves your data into “rock-like materials” that are known to last for centuries, meaning there’s no data rot. Apparently NASA uses the discs to store data. Hopefully it becomes available and affordable soon…
Here’s some interesting innovation on the tech-side of photography: on August 24, Sony will be unveiling a new lens adapter called the LA-EA2 that will let customers use large Sony Alpha DSLR lenses on their small NEX mirrorless cameras. Unlike most lens adapters, this one actually does a lot more than adapt lenses — it has its own translucent mirror and phase-detection autofocus sensor to aid the camera in providing snappy autofocus. It’s almost like an accessory that helps turn small NEX bodies into a DSLR-style camera (except there’s still no optical viewfinder).
The photo sharing feature on Twitter that we first reported on a couple months ago is now live for all users. This nudges the service a little closer towards what Facebook and Google+ offer, allowing users to upload and share photos directly through Twitter. Third-party photo-sharing services geared towards Twitter users can’t be too happy about this — the founder of TwitPic turned down a $10 million offer back in 2009, only to have Twitter drink its milkshake a couple years later.
Thought the grain-of-salt-sized camera announced in Germany earlier this year was small? Well, researchers at Cornell have created a camera just 1/100th of a millimeter thick and 1mm on each side that has no lens or moving parts. The Planar Fourier Capture Array (PFCA) is simply a flat piece of doped silicon that costs just a few cents to make. After light information is gathered, some fancy mathematical magic (i.e. the Fourier transform) turns the information into a 20×20 pixel “photo”. The fuzzy photo of the Mona Lisa above was shot using this camera.
Obviously, the camera won’t be very useful for ordinary photography, but it could potentially be extremely useful in science, medicine, and gadgets.
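The “mathematical magic” can be sketched in a few lines. This is a toy model, not the PFCA’s actual signal chain: it assumes each detector’s response corresponds to one 2-D Fourier coefficient of the 20×20 scene, so reconstruction reduces to a single inverse Fourier transform.

```python
import numpy as np

# Hypothetical 20x20 scene standing in for the light hitting the sensor.
scene = np.zeros((20, 20))
scene[5:15, 8:12] = 1.0

# Model each angle-sensitive detector as measuring one spatial-frequency
# component of the incident light: the scene's 2-D Fourier coefficients.
measurements = np.fft.fft2(scene)

# Turning the measurements back into a "photo" is then just an inverse FFT.
recovered = np.fft.ifft2(measurements).real
```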
A company called Lytro has just launched with $50 million in funding and, unlike Color, the technology is pretty mind-blowing. It’s designing a camera that may be the next giant leap in the evolution of photography — a consumer camera that shoots photos that can be refocused at any time. Instead of capturing a single plane of light like traditional cameras do, Lytro’s light-field camera will use a special sensor to capture the color, intensity, and vector direction of the rays of light (data that’s lost with traditional cameras).
[...] the camera captures all the information it possibly can about the field of light in front of it. You then get a digital photo that is adjustable in an almost infinite number of ways. You can focus anywhere in the picture, change the light levels — and presuming you’re using a device with a 3-D ready screen — even create a picture you can tilt and shift in three dimensions. [#]
Try clicking the sample photograph above. You’ll find that you can choose exactly where the focus point in the photo is as you’re viewing it! The company plans to unveil their camera sometime this year, with the goal of having the camera’s price be somewhere between $1 and $10,000… Check out more sample photos here.
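The refocus-after-the-fact trick follows from capturing ray direction, not just intensity. One standard way to exploit that data is “shift-and-add”: treat the light field as many sub-aperture views of the scene, shift each view in proportion to its aperture position, and average. This is a generic light-field sketch under that assumption, not Lytro’s proprietary pipeline.

```python
import numpy as np

def refocus(subaperture_images, offsets, alpha):
    """Synthetic refocus by shift-and-add.

    `subaperture_images`: views of the scene from different points on the
    aperture (the directional ray data a light-field sensor records).
    `offsets`: the (dy, dx) aperture position of each view.
    `alpha` selects the focal plane: each view is shifted in proportion to
    its aperture offset, then all views are averaged. Objects whose
    disparity matches `alpha` align and come out sharp; others blur.
    """
    acc = np.zeros_like(subaperture_images[0], dtype=float)
    for img, (dy, dx) in zip(subaperture_images, offsets):
        acc += np.roll(np.roll(img, int(round(alpha * dy)), axis=0),
                       int(round(alpha * dx)), axis=1)
    return acc / len(subaperture_images)
```

Changing `alpha` after capture is what lets a viewer “choose exactly where the focus point is” while looking at the photo.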
About a year ago, engineer and photo-enthusiast Morten Hjerde began brainstorming ideas for the next generation of photographic lighting after concluding that most of the lights used by photographers these days are simply glorified light bulbs.
Using embedded electronics and microprocessor programming, he set out to explore ways to create a different kind of light. A light that would go where the current lights could not go. Exploring the possibility and feasibility of actual digital light. Light that could be pushed and tweaked like you push and tweak the pixels on your computer screen. [#]
He set up a company called Rift Labs, and decided to open source the design and software involved in creating this digital light source. The video above provides some interesting background on the project.