Transparent Camera Tech Aims to Revolutionize Eye Tracking

A pair of futuristic glasses with projected information sits on a face.

Society is becoming increasingly reliant on image sensors. However, limitations in these sensors have so far held back certain applications.

Image sensors are, of course, essential components in cameras of all shapes and sizes. They are vital to augmented and virtual reality devices, smart displays, and human-computer interfaces. Many of these applications rely on eye tracking. This is a complicated technology because placing something in front of the eye to track its movement blocks the view. The alternative is mounting the sensors further away. Unfortunately, that makes them less accurate, bulkier, and more power-hungry, limiting their usefulness in many situations.

As Discover Magazine reports, one team has recently developed a potential solution. Gabriel Mercier at The Barcelona Institute of Science and Technology in Spain and colleagues recently published a paper describing semi-transparent photodetectors that make a camera essentially invisible. They tested the new device, and it showed promise in enabling a “new generation of eye tracking devices built into ordinary objects such as spectacle lenses, computer monitors and windows,” according to Discover Magazine.

The device comprises an “8×8 array of semi-transparent photodetectors and electrodes disposed on a fully transparent substrate,” according to Mercier. Each pixel is 60 × 140 microns in size with an optical transparency of 85-95 percent. The pixels are also highly sensitive, “with more than 90% of them showing a noise equivalent irradiance < 10⁻⁴ W/m² for wavelengths of 637 nm,” the paper details. To make these transparent sensors, the team used graphene-based quantum dots. These consist “of a layer of graphene — a 2-dimensional sheet of carbon atoms in ‘chicken wire’ formation — covered with dots of lead sulfide,” Discover Magazine explains. When photons hit them, each dot releases electrons into the graphene, producing a measurable current. Since most light passes through the graphene, and the dots are too small to see, the material is effectively transparent.
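To put those figures in perspective, the quoted pixel dimensions and noise-equivalent irradiance can be combined into the smallest optical power a single pixel can distinguish from noise. The arithmetic below is our own back-of-the-envelope check using the numbers reported above, not a calculation from the paper:

```python
# Figures quoted from the paper: pixel size and noise-equivalent irradiance (NEI)
pixel_width_m = 60e-6     # 60 microns
pixel_height_m = 140e-6   # 140 microns
nei_w_per_m2 = 1e-4       # NEI < 10^-4 W/m^2 at 637 nm

# Pixel area, then the noise-equivalent power falling on one pixel
pixel_area_m2 = pixel_width_m * pixel_height_m        # 8.4e-9 m^2
noise_equiv_power_w = nei_w_per_m2 * pixel_area_m2    # ~8.4e-13 W

print(f"Pixel area: {pixel_area_m2:.2e} m^2")
print(f"Noise-equivalent power per pixel: {noise_equiv_power_w:.2e} W")
```

In other words, each pixel can register on the order of a picowatt of incident light, which is why the array works without blocking or dimming the view.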

These photodetectors feature a large amount of built-in gain, meaning their signals can be read out by electronics placed far away without needing immediate amplification. That is a big step, since current organic light-sensitive conductors must be amplified right at the pixel, and the electronics that do this are not transparent. The new array from Mercier et al. can be printed onto glasses or other devices that sit immediately in front of the eye without blocking the view.

To test their novel device, Mercier and his team projected grayscale patterns onto the photodetectors and compared the output to a conventional image sensor. Eye-tracking sensors require a refresh rate of at least 200 Hz, and the team recorded their device working at more than 400 Hz while producing reliable images. “It is clearly visible that most patterns can be reconstructed by the imaging device,” they say. They also simulated eye tracking by projecting a black dot onto their device and using the output to track it.
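The paper does not publish its tracking code, but localizing a dark dot on a low-resolution frame like this 8×8 array is commonly done with an intensity-weighted centroid. The sketch below is our own illustration of that general idea, not the team's actual method; the function name and synthetic frame are made up for the example:

```python
import numpy as np

def track_dark_dot(frame):
    """Estimate the (row, col) position of a dark dot on a bright
    background using an intensity-weighted centroid of 'darkness'."""
    darkness = frame.max() - frame   # invert so the dark dot carries the weight
    total = darkness.sum()
    if total == 0:
        return None                  # uniform frame: nothing to track
    rows, cols = np.indices(frame.shape)
    r = (rows * darkness).sum() / total
    c = (cols * darkness).sum() / total
    return float(r), float(c)

# Synthetic 8x8 frame: uniform bright field with one dark pixel at (2, 5)
frame = np.ones((8, 8))
frame[2, 5] = 0.0
print(track_dark_dot(frame))  # -> (2.0, 5.0)
```

Because the centroid averages over all pixels, this kind of estimator can resolve dot positions finer than the pixel pitch, which helps explain how even an 8×8 array can be useful for tracking a moving pupil.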

This research has powerful implications. It could improve eye-tracking autofocus abilities, open new opportunities for augmented and virtual reality, and much more.

Plus, eye-tracking is valuable far beyond entertainment. “Eye-tracking has a wide range of uses, such as detecting schizophrenia, measuring the comprehension of texts or driving experience, while allowing for a better understanding of memory and commercial choices,” said Mercier and team. “Eye-tracking provides a human-computer interface that can allow for the touch and gestureless control of automotive infotainment systems — and is also earmarked as a key enabling technology for omnipresent virtual and augmented reality.”

Of course, there’s still much work to be done. However, the technology is promising, and it will be exciting to see its applications in the relatively near future.

Image credits: Header photo licensed via Depositphotos.