Google Explains How its HDR+ with Bracketing Technology Works

Google has shared extensive details on its newest HDR+ with Bracketing technology that it has implemented in its Pixel device cameras to achieve better image quality.

As reported by DPReview, following the most recent version 8.2 update to its Pixel camera app, the company has improved the technology to achieve more natural-looking high dynamic range (HDR) images shot on its Pixel smartphones.

In its Google AI blog, the company explains that the Pixel 5 and Pixel 4a (5G) were the first to receive its HDR+ with Bracketing feature, which “operates ‘under the hood.’” It captures several images with different exposure times “to improve image quality (especially in shadows), resulting in more natural colors, improved details and texture, and reduced noise.”

Image by Google.

Although many smartphones already offer an HDR feature — the process by which several images are taken in quick succession and then combined and rendered together in a way that preserves detail across the tonal range — Google says that approach has limitations. Namely, it can leave noise in the shadows, a problem exacerbated by the physical constraints of the small image sensors in mobile devices.

One way to deal with this issue is exposure bracketing: capturing frames at different exposure times and combining them. However, this can be a time-consuming process. It may also work well for more capable cameras, but on mobile ones, Google says it is difficult because it requires “capturing additional long exposure frames while maintaining the fast, predictable capture experience of the Pixel camera,” and because it is hard to avoid “ghosting artifacts caused by motion between frames” while taking advantage of those long exposure frames.
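The general idea of exposure bracketing can be illustrated with a minimal sketch. This is a hypothetical toy fusion, not Google's algorithm: it takes a short and a long exposure of the same scene and, per pixel, leans on the long frame for clean shadows and on the (brightness-matched) short frame wherever the long frame is close to clipping. The function name, `gain`, and `threshold` are illustrative assumptions.

```python
import numpy as np

def fuse_bracketed(short, long_exp, gain, threshold=0.8):
    """Naive two-frame exposure fusion (illustrative, not Google's method).

    short:    short-exposure image, values in [0, 1]
    long_exp: long-exposure image of the same scene, values in [0, 1]
    gain:     exposure-time ratio (long / short)
    """
    # Bring the short exposure up to the long exposure's brightness.
    short_scaled = np.clip(short * gain, 0.0, 1.0)
    # Where the long exposure nears clipping, trust the short frame;
    # elsewhere prefer the cleaner long exposure (better shadows).
    weight = np.clip((long_exp - threshold) / (1.0 - threshold), 0.0, 1.0)
    return weight * short_scaled + (1.0 - weight) * long_exp
```

A real pipeline must also align the frames and handle motion before blending, which is exactly the hard part Google describes above.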

To overcome these challenges, Google’s original HDR+ system took a different approach: it prioritized highlights and used burst capture to reduce noise in the shadows. This “works well for scenes with moderate dynamic range, but breaks down for HDR scenes” because of two different types of noise that appear when capturing bursts: “shot noise” and “read noise.”

Shot noise “depends only on the total amount of light captured.” If it were the only noise present, burst photography would be just as efficient as taking a single long exposure. However, every time the camera captures a frame, read noise is also introduced, and this type of noise “instead depends on the number of frames taken — that is, with each frame taken, an additional fixed amount of read noise is added.” The combination of the two makes burst photography less efficient at reducing total noise.
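The trade-off described above can be made concrete with a simple noise model. This sketch assumes illustrative numbers (the read-noise figure is an assumed sensor constant, not a Pixel specification): for the same total amount of light, shot noise is identical, but a burst pays the per-frame read-noise penalty once per frame.

```python
import math

def total_noise(signal_electrons, n_frames, read_noise_e=3.0):
    """Total noise for a capture gathering `signal_electrons` of light.

    Shot noise depends only on total light captured: sqrt(signal).
    Read noise is added once per frame, so it grows as sqrt(n_frames).
    Independent noise sources add in quadrature.
    """
    shot = math.sqrt(signal_electrons)
    read = math.sqrt(n_frames) * read_noise_e
    return math.sqrt(shot**2 + read**2)

# Same total light, gathered as one long exposure vs. a 15-frame burst:
single = total_noise(100.0, n_frames=1)   # sqrt(100 + 9)   ≈ 10.44
burst  = total_noise(100.0, n_frames=15)  # sqrt(100 + 135) ≈ 15.33
```

The burst ends up noisier despite collecting the same light, which is why folding a few genuinely long exposures into the burst helps in deep shadows.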

Google addressed this issue by adding bracketing to HDR+, which required redesigning the capture strategy. When bracketing, an additional long exposure frame is captured after the shutter is pressed; it is not displayed in the viewfinder. That is why the company recommends holding the camera still for half a second after pressing the shutter, which helps the long exposure produce a better quality image. Night Sight has also been improved, changing from capturing 15 short exposure frames to 12 short and 3 long exposure frames.

As for the merging process, the technology chooses “one of the short frames as the reference frame to avoid potentially clipped highlights and motion blur,” with all other frames “aligned to this frame before they are merged.” This method has a disadvantage: ghosting artifacts, caused by motion, can occur. To reduce them, Google says it has designed “a new spatial merge algorithm, similar to the one used for Super Res Zoom, that decides per pixel whether image content should be merged or not.”

Left: Ghosting artifacts are visible around the silhouette of a moving person when deghosting is disabled.
Right: Robust merging produces a clean image. Image and caption by Google.
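A per-pixel merge decision of this general flavor can be sketched as follows. This is a hypothetical simplification, not Google's actual algorithm: where an aligned frame disagrees too strongly with the reference (likely motion), the reference pixel is kept to avoid ghosting; elsewhere the frames are averaged to reduce noise. The function name and `threshold` are assumptions for illustration.

```python
import numpy as np

def merge_with_deghosting(reference, aligned, threshold=0.1):
    """Toy per-pixel robust merge of one aligned frame into a reference.

    Pixels that differ from the reference by more than `threshold`
    are treated as motion and left untouched; the rest are averaged.
    """
    diff = np.abs(aligned - reference)
    use_both = diff < threshold          # per-pixel merge mask
    return np.where(use_both, 0.5 * (reference + aligned), reference)
```

Averaging only where frames agree is what trades noise reduction against ghost suppression; a production merge would weigh many frames and work on local patches rather than lone pixels.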

The new HDR+ with Bracketing technology is available on Pixel 4a (5G) and Pixel 5 devices in the default camera app, as well as in the Night Sight and Portrait modes. For the Pixel 4 and Pixel 4a, the Google Camera app supports bracketing in Night Sight mode. Users don’t have to manually enable the feature because “depending on the dynamic range of the scene, and the presence of motion, HDR+ with bracketing chooses the best exposures to maximize image quality.”

To read a detailed description of this technology, you can visit the Google AI blog post. Google has also shared a publicly accessible album containing images captured with HDR+ with Bracketing.


Image credits: Header photo licensed via Depositphotos.
