Google has just released a brand-new, open-source JPEG encoder called Guetzli that can do two very neat things. First, it can shrink JPEG file sizes by about 35% compared with current encoders without a noticeable drop in quality, and second, it can increase the quality of an image without increasing file size at all.
Editor’s Note: The image above is an illustration, not a demo of the encoder.
Guetzli is Swiss German for "cookie," in case you're wondering. Interested photographers can already download it from GitHub and give it a shot.
“Guetzli is a JPEG encoder for digital images and web graphics that can enable faster online experiences by producing smaller JPEG files while still maintaining compatibility with existing browsers, image processing applications and the JPEG standard,” explains Google on their Research blog. That last bit is the important part: compatibility.
Unlike the WebP format (actually the worst) and Google’s fancy RAISR technology, the Guetzli encoder doesn’t create a new file format. Any existing imaging program and browser that can read JPEGs—so … every existing imaging program and browser—can already read the output. Below are a couple of examples posted by Google:
In both of these images, the original is on the left, the output of the popular libjpeg encoder is in the middle, and the output of Google's new Guetzli is on the right. As you can see, there are some artifacts in the Guetzli image (the algorithm can't increase quality and decrease size at the same time), but they're minimal and far less noticeable than those in the libjpeg version.
Google achieved this by training their Guetzli algorithm to "strike a balance" between quality loss and decreased size by "approximat[ing] color perception and visual masking." In plain English (thanks, Ars Technica), that means Google's algorithm is better at figuring out which colors to keep and which to toss during the "quantization" portion of JPEG compression.
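For the curious, here's a rough sketch of what that quantization step looks like. This is a toy illustration of standard JPEG quantization, not Guetzli's actual code; the coefficient values and the "coarse" table below are made up for the example. After the DCT transforms an 8×8 block of pixels into frequency coefficients, each coefficient is divided by a table entry and rounded. Larger divisors discard more detail and zero out more coefficients, which is what makes the file smaller. Guetzli's contribution, roughly, is choosing where that detail can be thrown away without the eye noticing.

```python
def quantize(coeffs, table):
    """Round each DCT coefficient to the nearest multiple of its table entry."""
    return [round(c / q) for c, q in zip(coeffs, table)]

def dequantize(quantized, table):
    """Reverse the division; the rounding loss is what's gone for good."""
    return [v * q for v, q in zip(quantized, table)]

# Hypothetical coefficients for one row of a block: a large low-frequency
# (DC) term first, then progressively smaller high-frequency terms.
coeffs = [240.0, 31.0, -12.0, 8.0, -3.0, 2.0, 1.0, -1.0]

# A gentle table keeps more detail; a coarse one zeroes out more
# high frequencies (both tables here are illustrative, not from the spec).
fine   = [16, 11, 10, 16, 24, 40, 51, 61]
coarse = [32, 44, 40, 64, 96, 160, 204, 244]

print(quantize(coeffs, fine))    # a few small coefficients collapse to zero
print(quantize(coeffs, coarse))  # most high frequencies collapse to zero
```

The long runs of zeros produced by coarser tables are what the later entropy-coding stage compresses so well; the art is in losing only the detail viewers won't miss.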
The tradeoff is speed: Guetzli is noticeably slower than the encoders it outperforms. But Google believes the gains in quality per byte are significant enough to make the tradeoff worth it.
Okay now, everybody say it with me:
(via Ars Technica)