Google open-sources Zopfli compression algorithm to speed up Web downloads

Compression is about 100 times slower than with conventional methods, but the output is about 5 percent smaller, Google said

Google is open-sourcing a new general-purpose data compression library called Zopfli that can be used to speed up Web downloads.

The Zopfli Compression Algorithm, which takes its name from a Swiss bread recipe, is an implementation of the Deflate compression algorithm that produces smaller output than previous techniques, wrote Lode Vandevenne, a software engineer with Google's Compression Team, on the Google Open Source Blog on Thursday.


"The smaller compressed size allows for better space utilization, faster data transmission, and lower Web page load latencies. Furthermore, the smaller compressed size has additional benefits in mobile use, such as lower data transfer fees and reduced battery use," Vandevenne wrote.

The more exhaustive compression techniques Zopfli uses achieve higher data density but also make compression much slower. Decompression speed is unaffected, though, Vandevenne wrote.

Zopfli is a compression-only library; existing software can be used to decompress the data, he said. Zopfli is compatible with Zip, PNG, gzip and HTTP requests, among others, Vandevenne added.
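
For illustration, here is a minimal sketch of driving the library through its C API, based on the zopfli.h header shipped in the project's source. The input buffer, include path and omitted error handling are simplified placeholders, not a definitive usage pattern.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "zopfli/zopfli.h"  /* header location depends on how Zopfli is installed */

int main(void) {
    /* Placeholder input; in practice this would be a static Web asset. */
    const char* text = "Static content that is compressed once and served many times.";
    const unsigned char* in = (const unsigned char*)text;
    size_t insize = strlen(text);

    ZopfliOptions options;
    ZopfliInitOptions(&options);  /* library defaults, e.g. 15 optimization iterations */

    unsigned char* out = NULL;    /* output buffer is allocated by the library */
    size_t outsize = 0;

    /* ZOPFLI_FORMAT_GZIP wraps the Deflate stream in a gzip container;
     * ZOPFLI_FORMAT_ZLIB and ZOPFLI_FORMAT_DEFLATE are the other formats. */
    ZopfliCompress(&options, ZOPFLI_FORMAT_GZIP, in, insize, &out, &outsize);

    printf("compressed %zu bytes into %zu bytes\n", insize, outsize);
    free(out);
    return 0;
}

Because the result is a standard gzip stream, gunzip or any zlib-based decoder can decompress it at normal speed.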

Zopfli's output is generally 3 percent to 8 percent smaller compared to zlib, another compression library based on the Deflate compression algorithm, according to Vandevenne. "We believe that Zopfli represents the state of the art in Deflate-compatible compression," he said.

"This compressor takes more time (~100x slower), but compresses around 5 percent better than zlib and better than any other zlib-compatible compressor we have found," Google said on Zopfli's Google Code page. The code is available under Apache License 2.0.

The new compression library, however, requires 2 to 3 orders of magnitude more CPU time than zlib at maximum quality. It is therefore best suited for applications where data is compressed once and sent over the network many times, such as static Web content, Vandevenne said.
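
That one-time cost can also be tuned. As a sketch, assuming the same zopfli.h API as above (the helper function name here is illustrative), the options struct's numiterations field, which defaults to 15, can be raised when compression happens once, offline:

#include <stddef.h>
#include "zopfli/zopfli.h"

/* Illustrative helper: spend extra CPU up front on an asset that will be
 * downloaded many times. numiterations controls how many optimization
 * passes Zopfli runs; higher values cost more CPU for small extra gains. */
static void compress_static_asset(const unsigned char* in, size_t insize,
                                  unsigned char** out, size_t* outsize) {
    ZopfliOptions options;
    ZopfliInitOptions(&options);
    options.numiterations = 1000;  /* far above the default of 15; acceptable offline */
    ZopfliCompress(&options, ZOPFLI_FORMAT_GZIP, in, insize, out, outsize);
}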

Vandevenne and his colleague Jyrki Alakuijala, a Google software engineer who also worked on the project, recommend in their research paper that Zopfli be used "for compression of static content and other content where data transfer or storage costs are more significant than the increase in CPU time."

"By open sourcing Zopfli, thus allowing webmasters to better optimize the size of frequently accessed static content, we hope to make the Internet a bit faster for all of us," Vandevenne said.

Correction: This story as originally posted understated the additional CPU time required by the Zopfli compression algorithm compared to zlib. The article has been amended.

Copyright © 2013 IDG Communications, Inc.