Man’s $1,998 Camera Fried by Self-Driving Car Laser

Self-driving cars widely use a technology called lidar (which stands for light detection and ranging) to “see” the world using laser pulses. These lasers are designed to be safe for human eyes, but it seems they may not always be safe for cameras. A man at CES in Las Vegas says that a car-mounted lidar permanently damaged the sensor in his new $1,998 Sony a7R II mirrorless camera.

Ars Technica reports that Ridecell autonomous vehicle engineer Jit Ray Chowdhury had been photographing a self-driving car that was using a lidar system developed by AEye.

He was then horrified to find that all his subsequent photos showed clear sensor damage — there were two bright purple spots with horizontal and vertical lines across the entire frame.

Here are the photos Chowdhury was shooting when the damage occurred:

“I noticed that all my pictures were having that spot,” Chowdhury tells Ars. “I covered up the camera with the lens cap and the spots are there—it’s burned into the sensor.”

Here are some photos Chowdhury shot afterward that clearly show the damaged pixels on the sensor:

AEye CEO Luis Dussan tells Ars that his company’s lidars are completely safe for human eyes, but he did not deny that they are capable of damaging camera sensors.

“Cameras are up to 1000x more sensitive to lasers than eyeballs,” Dussan tells Ars. “Occasionally, this can cause thermal damage to a camera’s focal plane array.”

With self-driving cars (and their lidars) poised to appear on public roads in large numbers, whether the lasers pose any danger to things like cameras is a question that will need answering.

Lasers have long been known to pose a risk to cameras, and there are many documented cases online of sensors being damaged by the types commonly used in concert light shows. Here are two videos showing expensive Canon 5D Mark II and 5D Mark III DSLR cameras being permanently damaged by direct hits:

Different lidar systems feature different designs and lasers, so many or most of them may be completely safe for cameras as well.

“This sensor damage was an effect of a combination of things — intensity, amount of time, spot size, wavelength, pulsing, [etc.],” Chowdhury tells PetaPixel. “I have tested and photographed almost all lidars up close without getting my camera damaged. Also, this may not happen at a distance.”

Chowdhury also notes that AEye’s lidar may not be the only one on the market that poses a risk to camera sensors.

“It is unfortunate that I discovered this with their lidar,” Chowdhury says. “A warning from them when I asked permission from them for photographing their system would have been ideal.”

AEye has since offered to cover the cost of Chowdhury’s fried camera, and the engineer says he will accept the “generous” gesture, as his camera was only a month old and was a big investment.

“I do want all lidar companies to mention in specs how camera safe they are in some units, like say distance vs camera aperture, shutter speed,” Chowdhury says. “Or what kind of filters to use or if they have completely mitigated the problem.”


Update on 1/14/19: This article originally stated that “AEye claims its lidar has more range than competitors’ systems thanks to the use of a powerful laser.” We’ve removed this line after being contacted by AEye.

“This is untrue,” an AEye spokesperson tells PetaPixel. “AEye’s range originates from its use of artificial intelligence to discriminately collect data information that matters to an autonomous vehicle’s path planning system.”


Image credits: Photographs by Jit Ray Chowdhury and used with permission
