As expected from Google, the Mountain View-based tech giant is further investing in technology that helps make better use of resources on the web. The company has today introduced its latest contribution to the open-source community, called 'Guetzli', an encoder that produces high-quality JPEG images that are 35 percent smaller in file size. This is a significant improvement over current JPEG compression techniques.
Google is devoting time to further reducing the size of existing JPEG image files, with the primary aim of making web pages load faster. Its research found that over half of the image requests on the internet are JPEGs. There is no debating the fact that images form a majority of the size of most web pages, so shrinking these files will result in an improved browsing experience. The popularity and ubiquity of JPEG is also why Google chose to experiment with the existing format instead of building a new one from scratch.
Coming back to Guetzli, meaning "cookie" in Swiss German, this compression technique is quite similar to the Zopfli algorithm, which shrinks PNG and gzip files without introducing a new format. RNN-based image compression methods, on the other hand, require changes to be rolled out across both clients and the wider ecosystem before they can deliver benefits at internet scale.
As for the technology driving this open-source algorithm, Google says that the quality of a JPEG image is directly tied to its multi-stage compression process. Guetzli targets the quantization stage, where file size is reduced by trading away some visual quality. The algorithm models human color perception and visual masking to strike a balance between minimal quality loss and file size.
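The quality-for-size trade-off made at the quantization stage is easy to see with any conventional JPEG encoder. The sketch below is not Guetzli itself; it uses Pillow (which wraps a libjpeg-style encoder) purely to illustrate how a lower "quality" setting, i.e. coarser quantization, yields a smaller file:

```python
from io import BytesIO
from PIL import Image

# Build a small, detail-rich test image so quantization has something to discard.
img = Image.new("RGB", (256, 256))
px = img.load()
for y in range(256):
    for x in range(256):
        px[x, y] = (x ^ y, (x * 3) % 256, (y * 5) % 256)

def jpeg_size(image, quality):
    """Encode to JPEG in memory at the given quality and return the byte count."""
    buf = BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return buf.tell()

hi = jpeg_size(img, 95)  # fine quantization: larger file
lo = jpeg_size(img, 50)  # coarse quantization: smaller file, more quality loss
print(hi, lo)
```

Guetzli's contribution is not this knob itself but a search over quantization choices guided by a psychovisual model, so that the size reduction costs as little perceived quality as possible.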
Diving deeper into the technical details in the blog post, Google research engineers Robert Obryk and Jyrki Alakuijala explain:
Guetzli specifically targets the quantization stage in which the more visual quality loss is introduced, the smaller the resulting file. Guetzli strikes a balance between minimal loss and file size by employing a search algorithm that tries to overcome the difference between the psychovisual modeling of JPEG’s format, and Guetzli’s psychovisual model.
They go on to note that the only trade-off is that the compression process takes significantly longer to produce the smaller images. Other encoders such as libjpeg can produce files of similar or even larger size than Guetzli's, but the blog post confirms that human raters consistently preferred the images produced by Guetzli. The tech giant now hopes that its new open-source compression algorithm will help webmasters reduce page load times and bandwidth costs.