May 23, 2017 · Clément Hannicq · 3 min read
This article has been translated into Russian here.
As you may already know, the average web page is now heavier than Doom.
One of the reasons for this increase is the weight of images, and the need to support higher resolutions.
Google just published a new JPEG compression algorithm: Guetzli.
The main idea of this algorithm is to keep the details the human eye easily recognizes while skipping those it cannot notice.
I am no specialist, but the intended result is an image whose perceived quality is the same, at a reduced file size.
This is not a new image format, but a new way to compress JPEG images, which means there is no need for a custom image reader: the images are displayed by anything that already renders JPEGs.
On one of my projects, we had an image-heavy homepage: about 30 MB for the homepage alone, 27 MB of which were images.
I decided to give Guetzli a try. To convince our product owner and our designer that the quality loss would be acceptable, I tried the new algorithm on one of the high-res images we were not using (an 8574×5715, 22 MB JPEG).
According to Google (and my experience confirms the figures), Guetzli needs about 300 MB of RAM per megapixel of image, so about 15 GB for the image I had, and I did not have that much memory available at the time: half a dozen Node servers, a couple of Docker containers, Chromium and a couple of Electron instances were taking enough space to leave my machine short of the requirement.
I retried after closing every non-vital process; Guetzli took 12 GB of RAM but succeeded.
Google also states that Guetzli takes about one minute per megapixel to process an image, which matches what I saw (a bit above 40 minutes).
The resulting image weighed under 7 MB (down from 22 MB), and I could not tell by looking at them which was the compressed one (our designer could, but admitted that the difference was “incredibly small”).
6.9M home-guetzli.jpg
22M  home-raw.jpg
That compression used Guetzli’s default quality setting (the scale goes from 84 to 100; to go below 84 you would need to compile a custom build with a lower minimum).
I then decided to try different quality settings on that image, and wrote a very simple script so I would not have to relaunch the process every 40 minutes and could let it run overnight.
The results are below (and it seems that Guetzli’s default quality factor is 95).
6.9M ./home-guetzli.jpg
22M  ./home-raw.jpg
3.0M ./home-raw.jpg.guetzli84.jpg
3.4M ./home-raw.jpg.guetzli87.jpg
4.2M ./home-raw.jpg.guetzli90.jpg
5.5M ./home-raw.jpg.guetzli93.jpg
8.8M ./home-raw.jpg.guetzli96.jpg
18M  ./home-raw.jpg.guetzli99.jpg
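The overnight sweep could be scripted roughly like this; the filenames mirror mine, and `--quality` is Guetzli’s actual flag (the stock build rejects values below 84). The guards are only there so the sketch exits cleanly when Guetzli or the source image is missing.

```shell
#!/bin/sh
# Run one Guetzli pass per quality setting, overnight-friendly.
# Guards: exit cleanly if guetzli or the source image is missing.
command -v guetzli >/dev/null 2>&1 || { echo "guetzli not found"; exit 0; }
[ -f home-raw.jpg ] || { echo "home-raw.jpg not found"; exit 0; }

for q in 84 87 90 93 96 99; do
  # --quality is Guetzli's flag; the stock build rejects values below 84.
  guetzli --quality "$q" home-raw.jpg "home-raw.jpg.guetzli$q.jpg"
done
```

Each pass on a big image takes tens of minutes, so launching the loop before going to bed is the whole trick.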
Both the product owner and the designer agreed to go with the 84 quality factor. I then converted all our assets, and the homepage went from 30 MB to less than 8 MB (3 MB of that being CSS and scripts).
It should be noted that no form of image compression was in place before.
Installing Guetzli on my machine was painless (someone set up an AUR package for it on Arch Linux; thanks a lot to whoever did that), and running it is straightforward, as long as you have enough RAM.
There also seems to be a Homebrew formula for macOS users, but I did not test it.
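For reference, a hedged sketch of the install options; the package and formula names are the ones I came across, so verify them for your system:

```shell
#!/bin/sh
# Pick whichever package manager is available; names are assumptions.
if command -v yay >/dev/null 2>&1; then
  yay -S --noconfirm guetzli          # Arch Linux, AUR package
elif command -v brew >/dev/null 2>&1; then
  brew install guetzli                # macOS, Homebrew formula
else
  echo "no known package manager; build from https://github.com/google/guetzli"
fi
```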
Guetzli requires a lot of RAM and CPU time for huge images (“a lot” being relative: don’t expect to be able to do anything else while it’s running).
If RAM is not your bottleneck, you might even consider running multiple instances of Guetzli in parallel on different images, since (as of this writing) it only uses one core.
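Since each instance is single-threaded, the parallelism can come from the shell; here is one possible sketch using `xargs -P` (the quality value, the concurrency of four, and the output naming scheme are my own choices):

```shell
#!/bin/sh
# Compress every JPEG in the current directory, four Guetzli processes at a time.
command -v guetzli >/dev/null 2>&1 || { echo "guetzli not found"; exit 0; }
# -P 4: at most four concurrent processes; each guetzli run uses one core.
ls ./*.jpg 2>/dev/null | xargs -P 4 -I {} guetzli --quality 84 {} {}.guetzli.jpg
```

Remember the 300 MB-per-megapixel figure: four parallel instances on large images need four times the RAM.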
Being a JPEG encoder, it cannot output PNGs (so no transparency), but it can take PNGs as input and compress them.
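A PNG-to-JPEG conversion is just the normal invocation with a PNG input; `logo.png` is a hypothetical file here, and any transparency is flattened in the output:

```shell
#!/bin/sh
# Guetzli accepts PNG input and always emits JPEG, dropping the alpha channel.
command -v guetzli >/dev/null 2>&1 || { echo "guetzli not found"; exit 0; }
[ -f logo.png ] || { echo "logo.png not found"; exit 0; }
guetzli --quality 84 logo.png logo.jpg
```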
Its efficiency is tied to the initial quality of the picture: I saw the compression ratio go from 7× on the largest image to 2× on small images.
The quality loss was also more visible on those small images.
In a few cases I also witnessed a loss of color saturation (which was deemed acceptable in my case).
Give Guetzli a try: it might give you unacceptable results (especially at low quality settings), but it might also save you a few MB on your website.
Web Developer at Theodo