PetaPixel

JPEGmini Magically Makes Your JPEGs Up to 5x Smaller

JPEGmini is a new image compression service that can magically reduce the file size of your JPEG photos by up to 5 times without any visible loss in quality. ICVT, the Israeli company behind the service, explains how the technology works in an interview with Megapixel:

Our technology analyzes each specific photo, and determines the maximum amount of compression that can be applied to the photo without creating any visual artifacts. In this way, the system compresses each photo to the maximum extent possible without hurting the perceived quality of the photo.
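ICVT hasn't published the algorithm, but the behavior it describes can be sketched as a search over JPEG quality settings against a visual-difference threshold. Below is a minimal, hypothetical Python sketch of that idea: the `encode` and `diff_to_original` callables, the quality range, and the threshold are all stand-ins for a real JPEG encoder and a real perceptual metric, not ICVT's actual method:

```python
def find_min_quality(encode, diff_to_original, threshold, lo=40, hi=95):
    """Binary-search the lowest quality setting whose re-encoded output
    stays within a visual-difference threshold of the original.

    encode(q)           -> re-encodes the photo at quality q
    diff_to_original(x) -> difference score vs. the original (lower is better)

    Assumes the difference score shrinks as quality rises; falls back to
    `hi` if no setting in the range passes the threshold.
    """
    best = hi
    while lo <= hi:
        mid = (lo + hi) // 2
        if diff_to_original(encode(mid)) <= threshold:
            best = mid       # acceptable: remember it, then try lower quality
            hi = mid - 1
        else:
            lo = mid + 1     # too lossy: raise quality
    return best

# Toy stand-ins: the "difference" falls linearly as quality rises.
q = find_min_quality(encode=lambda q: q,
                     diff_to_original=lambda q: 100 - q,
                     threshold=25)
print(q)  # 75: the lowest quality whose difference stays within 25
```

With a real encoder and metric plugged in, the same loop needs only about six encode/compare rounds to cover the 40-95 quality range.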

You can test out the technology on your own photos through the service’s website.

To test the service, I uploaded a 1024px wide photo saved at the highest quality (12) in Photoshop. The service was able to reduce the 654KB image to 158KB. In the following side-by-side comparison of 100% crops, can you tell which side shows the original photo and which side shows the compressed one?

The right side is the compressed photo that was made 4.1 times smaller through the service!

Now if only they’d come out with a simple WordPress plugin that automatically compresses every uploaded photo… We’d install it in a heartbeat.

JPEGmini (via Megapixel)


  • Sam Christopher Cornwell

    So the obvious question is will this format be available on Photoshop/Gimp soon?

  • http://www.petapixel.com Michael Zhang

    It’s actually not a new image format — it’s simply a service/algorithm that tries to find the optimal compression for each JPEG photo.

  • http://www.jaygunn.com Jay

    Not entirely sure how this is any different from the various image editors’ versions of “Save for Web”/“Save for Web and Devices”. You get a 2-, 3-, or 4-up comparison with the original image so you can tweak and get the optimal compression with minimal or no visual quality loss.

    And sure, 240×480 pictures side by side are going to look perfect in most cases. The real test is what these pictures look like at full screen, at the very least. 100% isn’t necessary, since the only people looking at a picture at 100% are the artists/designers/photographers producing them.

  • http://www.janluursema.nl Jan

    Well, 5 times is a bit optimistic (ahem), I think.

    I tried it myself and uploaded a file (3628x1666px) saved as jpeg 10 (the website crashed with larger files). It was 1235 kB, and jpegmini made it 614 kB. I could save for web at 70% in Photoshop to make it 1060 kB, or save as jpeg 9 to make it 877 kB. Both those methods usually don’t show any compression artifacts whatsoever.

    But still, shaving off 50% of the file size is pretty nice, especially if you want to use large images on your portfolio website.

    So let’s hope they make a Photoshop plugin!

  • http://blog.wingtangwong.com/ Wing Wong

    Wrote up a process and comparison between using JPEGmini and using some scripted wrappers around ImageMagick’s ‘convert’ (to handle compression at different levels) and ‘compare’ (to determine the level of image deviation from the original). I was able to achieve similar levels to JPEGmini. See the link below, or just click my handle to visit my blog. :)
    http://blog.wingtangwong.com/2011/08/shrinking-down-jpegs-jpegmini.html

    I think the big gain JPEGmini is offering is a simple and elegant interface, i.e. a way to just say: compress this image to a level that doesn’t make it noticeably different. In other words, find the correct quality level to compress down to for each unique image.

    It seems that JPEGmini allows for more deviation with larger images than with smaller images. 
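    In outline, that convert/compare loop might look like the following: a hypothetical Python sketch, assuming ImageMagick’s `convert` and `compare` are on the PATH (`compare -metric RMSE` prints an absolute error followed by a normalized 0-1 value in parentheses on stderr). The 0.02 threshold and the 5-point quality step are illustrative choices, not JPEGmini’s parameters:

```python
import subprocess

def im_encode(src, dst, quality):
    # Re-encode with ImageMagick at the given JPEG quality.
    subprocess.run(["convert", src, "-quality", str(quality), dst], check=True)

def im_rmse(a, b):
    # `compare -metric RMSE a b null:` prints "abs (normalized)" on stderr;
    # the parenthesized value is normalized to the 0..1 range.
    p = subprocess.run(["compare", "-metric", "RMSE", a, b, "null:"],
                       capture_output=True, text=True)
    return float(p.stderr.split("(")[1].split(")")[0])

def shrink(src, dst, threshold=0.02, qualities=range(95, 39, -5),
           encode=im_encode, diff=im_rmse):
    """Step the JPEG quality downward and keep the lowest setting whose
    output stays within `threshold` normalized RMSE of the original."""
    best = None
    for q in qualities:
        encode(src, dst, q)
        if diff(src, dst) > threshold:
            break                # deviated too far: stop stepping down
        best = q
    if best is not None:
        encode(src, dst, best)   # re-emit the last acceptable version
    return best
```

    Because `encode` and `diff` are injectable, the stepping logic can be exercised without ImageMagick installed; swapping in a binary search over quality would also cut the number of re-encodes per image.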

  • http://blog.wingtangwong.com/ Wing Wong

    Using JPEGmini’s method, I knocked a 12MB (6000×4000) image down to 1.6MB, and a 4.6MB (4000×3000) image down to 1.1MB. I was able to achieve the same with the homebrew script method. So it is doable; there is no special magic, though I believe JPEGmini is offering a simpler interface. Going to work on optimizing how the threshold is located for a given image so that it can work as fast as JPEGmini’s method.

  • http://blog.wingtangwong.com/ Wing Wong

    I’ve compared before and after at 100% magnification and using Photoshop difference layering. The differences are extremely minute, i.e. you would need to run auto-level on the resulting difference image to see where the differences are actually happening.

  • http://twitter.com/d7e7r7 David Ritchie

    Is there no way to use this service offline?

    Slow internet in SA makes uploading large images tedious… 

  • http://blog.wingtangwong.com/ Wing Wong

    I think using their service offline requires a license from them… or you could roll your own. I’m creating a small server that uses a similar process to compress my output JPEG images, the way they do, to make serving and copying images to my servers easier.

    Currently working on making the scanning portion more efficient and on creating an easy-to-use API; maybe I’ll even add it to LR3 as a plugin at some point. That would be fun.

  • http://blog.wingtangwong.com/ Wing Wong

    Woot! Got my quality logic down from 5 minutes to 13 seconds for a 24MP JPEG. Working on getting it below 10 seconds, then maybe under 5-8 seconds. That would make it viable as a Lightroom 3 export plugin. Sweet sweet sweet.

  • http://blog.wingtangwong.com/ Wing Wong

    Okay, final post on this thread. Just finished optimizing the script a bit more so that it properly determines the min quality value for each image, and takes into account panorama cases. Average processing time for an image is between 12-15 seconds, depending on the image. 

    Some metrics (optimal compression quality determined per file, original vs. recompressed size):

    File        Quality   Original   Recompressed
    DSC00620    77%       6.2M       988K
    DSC00622    77%       6.2M       1020K
    DSC00624    77%       6.3M       1016K
    DSC00626    77%       6.0M       936K
    DSC00629    77%       5.7M       864K
    DSC00632    77%       6.0M       984K
    DSC00633    75%       5.9M       844K
    DSC00636    77%       6.2M       988K
    DSC00637    77%       6.0M       924K
    DSC00639    59%       4.4M       448K
    DSC00642    65%       4.6M       528K
    DSC00644    59%       4.3M       424K
    DSC00648    77%       5.8M       808K
    DSC00652    75%       6.2M       796K
    DSC00653    68%       5.4M       596K
    DSC00665    81%       5.5M       1.0M

    Summary: 91MB vs. 13MB. Total time taken: about 3:20 to process 16 files, or about 12-13 seconds apiece. Pretty interesting.

    I had done a “let’s go through and compress everything” bit before, but had not thought to determine the compression level dynamically per image to yield good compression without sacrificing visual image quality. Sweet.

    Test-uploaded an image to JPEGmini’s web service, and it looks like the upload, queueing, and processing took around 10-12 seconds, so they have some very efficient algorithms for determining the minimum quality for each image. 12-13 seconds is, in my mind, acceptable for home use in, say, optimally compressing an image export from Adobe Lightroom. With some optimization, I’m sure it can be dropped to around 5 seconds or so.

    http://blog.wingtangwong.com/2011/08/shrinking-down-jpegs-jpegmini.html?showComment=1314662362853#c2694319160562588493

  • http://twitter.com/akirasan akirasan

    Hi, I use jpegoptim on Ubuntu (it’s a very fast and lightweight console command), and I have compared the results of jpegoptim vs. JPEGmini… and honestly, I don’t see the “revolutionary” technique.

    See my test on http://www.akirasan.net/?p=799

  • http://blog.wingtangwong.com/ Wing Wong

    I just tried it out. Awesome. Blows my shell script out of the water in terms of speed/performance! Nice. Yeah, perhaps the only thing that is revolutionary is the business model… :)

  • http://Exposedplanet.com/ ExposedPlanet

    Interesting, but currently it is an image version of the solar-powered flashlight without batteries :)

    My problem (and that of most of the connected world) is limited upload speed, so uploading full-size images to their website is a no-go.
    An offline version would be much better, but apparently the scripts mentioned above do the same.