
Everything You Need To Know About Google’s JPEG Encoder Guetzli

Admin Globaldots
19.05.2017
6 Min read

A website is the digital face of a company, the first thing a visitor sees. And it's a well-known fact that people form their opinions largely based on first impressions. To build a great first impression, online businesses focus on compelling, crisp visual elements: they want the company's face to be as attractive as possible. The fact that images make up an average of 65% of a page's payload only confirms it. The problem, however, is that images are heavy in web-performance terms. Simply put, a page full of them can load significantly slower, which ultimately hurts user experience. No business owner wants their company's face to be the "beautiful but slow" one.


Tweet this: About 65% of a web page payload is made up of images

As software engineer Ronan Cremin noted, today's web pages are the same size as the iconic 90s computer game Doom. Doom's installer weighs in at 2.39MB, while an average web page requires 2.6MB of downloaded data, according to HTTP Archive. In other words, web pages keep getting heavier, which in turn hurts overall page speed.

With internet users getting increasingly impatient, page load times are more critical than ever. Speed is crucial on the internet, which fuels the need for smaller, compressed, high-quality files: the smaller the file, the faster it loads. To help reduce image sizes, Google released its own open-source JPEG encoder, which it claims can shrink images by up to 35% more than any other available method, without any loss of quality.

Introducing Guetzli

The tool is curiously named Guetzli ("cookie" in Swiss German), and Google describes it as:

(…) a JPEG encoder for digital images and web graphics that can enable faster online experiences by producing smaller JPEG files while still maintaining compatibility with existing browsers, image processing applications and the JPEG standard.


Tweet this: Guetzli encoder – smaller JPEGs while maintaining quality and compatibility

In what it achieves, it's similar to Google's Zopfli algorithm, which generates smaller PNG and gzip files without requiring a new format. That stands in contrast to techniques like RNN-based image compression, RAISR, and WebP, which all require client-side changes to deliver their compression gains.

How it Works

The perceived quality of a JPEG is directly linked to the three stages of the compression process it goes through:

  • Color space transformation
  • Discrete cosine transformation
  • Quantization
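To make the quantization stage concrete, here is a minimal pure-Python sketch. The 8-sample block, the 1D DCT (real JPEG uses 8×8 2D blocks), and the example quantization tables are illustrative, not Guetzli's actual tables; the point is only that dividing DCT coefficients by larger values discards more detail, which is exactly the knob Guetzli tunes.

```python
import math

def dct_1d(block):
    # Naive orthonormal DCT-II on a small block of samples
    n = len(block)
    return [sum(block[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i in range(n))
            * (math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n))
            for k in range(n)]

def quantize(coeffs, qtable):
    # The lossy step: larger divisors round more coefficients down to zero,
    # discarding (mostly high-frequency) detail and shrinking the file
    return [round(c / q) for c, q in zip(coeffs, qtable)]

samples = [52, 55, 61, 66, 70, 61, 64, 73]           # one row of pixel values
coarse = quantize(dct_1d(samples), [16, 11, 10, 16, 24, 40, 51, 61])
fine   = quantize(dct_1d(samples), [1] * 8)           # near-lossless divisors
```

Counting the zeros in `coarse` versus `fine` shows how aggressive quantization produces sparser coefficients, which compress better in JPEG's final entropy-coding stage.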

Guetzli is designed to target the last stage, quantization, where the more quality reduction is applied, the smaller the output file becomes. Guetzli's algorithm is built to strike a balance between minimal quality loss and file-size reduction: it employs a complex search algorithm that closes the gap between JPEG's psychovisual model and Guetzli's own. The downside, however, is that this search requires far more time to complete than other methods.
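The search idea can be sketched roughly as follows. This is a deliberately simplified, hypothetical illustration, not Guetzli's actual algorithm (which also tunes quantization tables and individual coefficients); `encode` and `perceptual_distance` here are stand-in stubs for an encoder and a Butteraugli-like metric.

```python
def lowest_acceptable_quality(encode, perceptual_distance, threshold, lo=84, hi=100):
    """Binary-search the lowest quality setting whose perceptual distance
    from the original stays under the given threshold."""
    best = hi
    while lo <= hi:
        mid = (lo + hi) // 2
        if perceptual_distance(encode(mid)) <= threshold:
            best, hi = mid, mid - 1   # acceptable: try pushing quality lower
        else:
            lo = mid + 1              # too lossy: raise the quality floor
    return best
```

Each candidate quality requires a full encode plus a metric evaluation, which is why this style of search is so much slower than a single-pass encoder.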

Butteraugli – The Quality Metric

Encoders are designed to trim away unnecessary file data; the best ones manage to do so without any visible loss of quality. The trouble is that post-encoding visual quality is hard to measure objectively. Research and development around images still largely relies on real-world studies in which people are asked to report what they do or don't notice. Tests of that kind are hard to run consistently, which created the need for an objective, mathematical metric to determine image quality.

For years, the main quality metric was peak signal-to-noise ratio (PSNR), which compares the signal of the original image against the encoder's lossy output. PSNR eventually had to be replaced with newer metrics that could keep up with rapid technological leaps. That's why, in 2004, the Structural SIMilarity (SSIM) index was introduced as a better model. It was built on insights about human perception: for example, the human brain focuses more on structures and patterns, and easily dismisses bright portions of an image.
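PSNR is simple enough to compute in a few lines, which helps explain both its long run as the default metric and its limitations. A minimal sketch over flat lists of 8-bit pixel values:

```python
import math

def psnr(original, distorted, peak=255):
    """Peak signal-to-noise ratio in decibels; higher means closer to the original."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float('inf')  # images are identical
    return 10 * math.log10(peak ** 2 / mse)
```

Its weakness, which SSIM and later Butteraugli set out to address, is that it weighs every pixel error equally, regardless of whether a human viewer would actually notice it.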


Tweet this: The Butteraugli image quality metric is the brain behind Guetzli's algorithm

In 2016, researchers at Google developed a new image quality metric: Butteraugli. This particular metric takes image modeling to a whole new level, following complicated biological processes in the human body (mainly the activity of cells in the retina).

Butteraugli is still new and relatively untested as a metric, yet it's the brain that runs Guetzli's algorithm. The team behind Guetzli explained that Butteraugli performs best at high quality levels, but is CPU- and RAM-heavy once encoding starts. They also noted the tool still has room for improvement.

Comparison

Eric Portis from Cloudinary ran a few interesting comparison tests to check Guetzli's capabilities against other encoders. Alongside Guetzli, he encoded JPEGs with mozjpeg (Mozilla's encoder), libjpeg (the Independent JPEG Group's encoder) and q_auto (Cloudinary). Since Guetzli doesn't allow encoding at quality levels below 84, all the images were rendered at quality 84. The results:

  1. libjpeg generated the heaviest high-quality images
  2. Guetzli's output was, on average, smaller than libjpeg's
  3. mozjpeg was faster and produced even smaller files
  4. q_auto produced the slimmest images

Curiously, Guetzli-encoded JPEGs were slightly bigger than mozjpeg and q_auto JPEGs, yet when measured with DSSIM they appeared to be of lower quality. When the quality metric was switched to Butteraugli, however, Guetzli showed by far the highest encoded image quality. Tests performed by Google's own researchers showed similar results. These results end up backing Google's claim that "Guetzli allows for 35% better compression", but only when the metric of Google's own design is applied.


Tweet this: Guetzli allows 35% better compression but only when the Butteraugli metric is applied

Here’s what Portis also found about Guetzli:

  • Images are blockier than other encoders' output, but that's usually hard to notice without a loupe
  • Images are a bit crisper, preserving more fine detail and sharper textures
  • Images are much better at keeping ringing artifacts in check, which is observable to unassisted eyes too
  • It's great for JPEGs that suffer from typical ringing artifacts
  • It's not the best choice when looking for the smallest possible JPEGs

Check out more about Eric Portis’ comparison test in his original article. Also, Brian Rinaldi wrote an interesting article comparing Guetzli and Photoshop compressed images.

Conclusion

Google did a great job providing yet another powerful open-source tool. Guetzli is an encoder built for the future that has yet to show its full applicability. However, the smart approach it employs results in extremely long computation times, as it's quite heavy on CPU and RAM (Google says it requires about one minute of CPU time per megapixel). The Butteraugli metric has yet to prove itself fully fit for its intended purpose, but for now it seems to do the job. In its current version, Guetzli is ideal for preserving great quality in compressed images; if the only parameter you care about is file size, there are still better (and faster) alternatives.

The key takeaway: the truth is far more sophisticated than the simple "35% better" headline. Guetzli is an ingenious tool that offers great possibilities yet to be fully realized and, more importantly, new ideas about what can be done with image compression. If your site feels heavy or bloated and could run faster, don't hesitate to ask for help. Our experts here at GlobalDots can help you with any web performance and security related issues.
