The camera is one of a smartphone's main selling points these days, and Apple is working hard to keep its iPhone camera ahead of the pack. A newly discovered patent reveals that Apple has designed an innovative sensor system that improves image quality by using three separate sensors and a light-splitting prism.
Digital cameras generally use a Bayer filter to capture color photographs with a single sensor. The filter places a grid of tiny color filters over the pixels so that each one records only the red, green, or blue component of incoming light, and demosaicing algorithms then estimate the full color of every pixel in the final photo. It's a popular sensor design, but a color filter array doesn't use light efficiently: at each pixel, two of the three color components are filtered out and thrown away.
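To illustrate the general principle (this is a simplified sketch, not Apple's actual design), here is a comparison of a simulated RGGB Bayer capture, which keeps one color sample per pixel, with a hypothetical three-sensor capture, which keeps all three:

```python
def bayer_mosaic(rgb):
    # rgb: H x W grid of (r, g, b) tuples.
    # RGGB pattern: each pixel keeps only one of its three color samples;
    # the other two are absorbed by the color filter and lost.
    h, w = len(rgb), len(rgb[0])
    mosaic = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r, g, b = rgb[y][x]
            if y % 2 == 0 and x % 2 == 0:
                mosaic[y][x] = r   # red site
            elif y % 2 == 1 and x % 2 == 1:
                mosaic[y][x] = b   # blue site
            else:
                mosaic[y][x] = g   # green site
    return mosaic

def three_sensor_capture(rgb):
    # With a prism splitting light onto three sensors, every pixel records
    # all three components -- nothing has to be estimated afterwards.
    red   = [[p[0] for p in row] for row in rgb]
    green = [[p[1] for p in row] for row in rgb]
    blue  = [[p[2] for p in row] for row in rgb]
    return red, green, blue

# A tiny 2x2 scene: the Bayer mosaic keeps 4 of the 12 color samples,
# while the three-sensor capture keeps all 12.
scene = [[(10, 20, 30), (40, 50, 60)],
         [(70, 80, 90), (100, 110, 120)]]
print(bayer_mosaic(scene))  # [[10, 50], [80, 120]]
```

In the Bayer case, the missing two-thirds of the color data must be interpolated from neighbors during demosaicing, which is exactly the processing step the patented design would avoid.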
Apple is now trying to keep more of the incoming light by using three separate sensors and no color filter array. Instead, its camera would take incoming light and split it to the three sensors so that they can capture the red, green, and blue components without competing against one another for light.
AppleInsider writes that this system would not require color channel processing or demosaicing, maximizing the resolution of the pixel array.
This newly discovered patent was apparently developed alongside another design we shared last month: a periscope-style camera module that uses a mirror for compact image stabilization.
No word yet on whether Apple actually plans to launch these technologies in a future iPhone (or some other product), but at the very least they show that Apple is working hard to push mobile photography forward.