Caltech Made a Sensor That Could Be the Lensless Camera of the Future

The brainiacs at Caltech have produced something really cool: an imaging chip that forms an image from an array of light sensors… without any lenses. It’s a chip that could point the way to the future of photography.

The camera has an arsenal of light receivers, each of which can individually add a tightly controlled time delay to the light it receives. Adjusting those delays lets the chip “look” in different directions and focus on different parts of the scene in front of it, with no moving parts.
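To get a feel for the idea, here’s a minimal, purely illustrative sketch (not Caltech’s actual design or code) of how a one-dimensional phased array can be “pointed” by delaying each receiver’s signal. The wavelength, element count, and spacing below are made-up example values.

```python
# Illustrative phased-array receive beamforming, not the Caltech chip itself.
# Each element applies a phase delay so that light arriving from a chosen
# angle adds up in phase; summing the delayed signals steers the array's
# look direction electronically.
import numpy as np

wavelength = 1.55e-6      # assumed operating wavelength (metres)
n_elements = 8            # assumed number of receivers in the array
spacing = wavelength / 2  # assumed element pitch (half a wavelength)

def steering_phases(look_angle_deg):
    """Per-element phase delays that make the array 'look' toward look_angle_deg."""
    k = 2 * np.pi / wavelength
    positions = np.arange(n_elements) * spacing
    return k * positions * np.sin(np.radians(look_angle_deg))

def array_response(incoming_angle_deg, look_angle_deg):
    """Relative power received from incoming_angle_deg when steered to look_angle_deg."""
    k = 2 * np.pi / wavelength
    positions = np.arange(n_elements) * spacing
    # Phase of the incoming wavefront at each element, minus the applied steering delay.
    phase = (k * positions * np.sin(np.radians(incoming_angle_deg))
             - steering_phases(look_angle_deg))
    return np.abs(np.sum(np.exp(1j * phase))) ** 2 / n_elements ** 2

# Steer the array 20 degrees off-axis: light from 20 degrees is received strongly,
# while light from 0 degrees is strongly suppressed.
print(array_response(20, 20))  # ~1.0
print(array_response(0, 20))   # much smaller
```

Changing the steering angle only changes the applied delays, which is why the chip can re-point or re-focus instantly without a lens moving.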

“We’ve created a single thin layer of integrated silicon photonics that emulates the lens and sensor of a digital camera, reducing the thickness and cost of digital cameras,” says Ali Hajimiri, who leads the Caltech team behind the camera. “It can mimic a regular lens, but can switch from a fish-eye to a telephoto lens instantaneously—with just a simple adjustment in the way the array receives light.”

“The ability to control all the optical properties of a camera electronically using a paper-thin layer of low-cost silicon photonics without any mechanical movement, lenses, or mirrors, opens a new world of imagers that could look like wallpaper, blinds, or even wearable fabric,” says Hajimiri.

The camera module is what limits the thickness of cell phones, because of the mechanics its lens requires, but with this new technology those lenses could be dispensed with. Bring on the age of paper-thin cell phones.

The researchers at Caltech also think the camera could have implications for space photography. They envisage huge, flat telescopes in space or on the ground capable of imaging our universe.

The actual scene (left) and the image produced by the camera (right).

Currently the camera produces only very low-resolution images, but with the concept proven, the team will work on scaling it up, designing chips with larger receiver arrays for higher resolution and sensitivity.

Others working on lensless cameras include Hitachi, Rice University, and Bell Labs.