Mozilla today announced the release of mozjpeg version 2.0. The JPEG encoder is now capable of reducing the size of both baseline and progressive JPEGs by 5 percent on average, compared with images produced by libjpeg-turbo, the standard JPEG library on which mozjpeg is based.

Many images will see “significantly larger reductions,” Mozilla Chief Technology Officer Andreas Gal told TNW. When asked for a more specific number, he said that “up to 15 percent is not unusual.”

The advances in the 2.0 release are possible thanks to trellis quantization, which improves compression for both baseline and progressive JPEGs without sacrificing compatibility (mozjpeg 1.0 only enhanced compression for progressive JPEGs). Other improvements include:

  • The cjpeg utility now supports JPEG input in order to simplify re-compression workflows.
  • New options to specifically tune for PSNR, PSNR-HVS-M, SSIM, and MS-SSIM metrics.
  • A single DC scan is now generated by default in order to be compatible with decoders that can’t handle arbitrary DC scans.
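mozjpeg's trellis quantization is considerably more sophisticated than can be shown here, but the core idea, choosing quantized DCT coefficient levels that jointly minimize distortion plus a λ-weighted bit cost via dynamic programming, can be sketched in a toy form. Everything below (the function name, the candidate set, the rate model) is illustrative and is not mozjpeg's actual code:

```python
def trellis_quantize(coeffs, q, lam):
    """Toy rate-distortion-optimized quantizer for one run of DCT coefficients.

    For each coefficient we consider a few candidate quantized levels and run
    dynamic programming over the "current run of zeros" state, minimizing
    distortion + lam * rate. The rate model is a crude stand-in for JPEG's
    Huffman (run, size) codes, not mozjpeg's real cost tables.
    """
    best = {0: (0.0, [])}  # run-of-zeros state -> (cost so far, chosen levels)
    for c in coeffs:
        base = int(round(c / q))
        # Candidate levels: plain rounding, one step toward zero, and zero.
        toward_zero = base - 1 if base > 0 else (base + 1 if base < 0 else 0)
        candidates = {base, toward_zero, 0}
        new = {}
        for run, (cost, levels) in best.items():
            for lv in candidates:
                dist = (c - lv * q) ** 2
                if lv == 0:
                    # Zeros extend the run; their bit cost is paid when it ends.
                    nrun, ncost = run + 1, cost + dist
                else:
                    # Crude (run, size) rate estimate in bits.
                    rate = 4 + 0.5 * run + abs(lv).bit_length()
                    nrun, ncost = 0, cost + dist + lam * rate
                if nrun not in new or ncost < new[nrun][0]:
                    new[nrun] = (ncost, levels + [lv])
        best = new
    return min(best.values(), key=lambda t: t[0])[1]
```

With `lam=0` this degenerates to plain rounding; raising `lam` zeroes out coefficients whose bit cost outweighs the distortion they would remove, which is how trellis quantization buys extra compression at a given visual quality.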
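In practice, the re-compression workflow and metric tuning above are driven through mozjpeg's `cjpeg` command line. A hedged sketch (the `-tune-ms-ssim` family of flags is assumed from the 2.0 release notes; check `cjpeg -help` in your build for the exact spelling):

```shell
# Re-compress an existing JPEG directly; as of mozjpeg 2.0, cjpeg
# accepts JPEG input, so no intermediate decode step is needed.
cjpeg -quality 75 -outfile photo_small.jpg photo.jpg

# Tune the encoder's internal decisions for a specific quality metric.
cjpeg -quality 75 -tune-ms-ssim -outfile out.jpg in.jpg
```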

Mozilla today also revealed that Facebook is testing mozjpeg 2.0 to see whether it can be used to improve the compression of images on Facebook.com. The company has also donated $60,000 to support ongoing development of the technology.

“Facebook supports the work Mozilla has done in building a JPEG encoder that can create smaller JPEGs without compromising the visual quality of photos,” Stacy Kerkela, software engineering manager at Facebook, said in a statement. “We look forward to seeing the potential benefits mozjpeg 2.0 might bring in optimizing images and creating an improved experience for people to share and connect on Facebook.”

Mozilla told us that Facebook got in touch a few months ago, as did other companies that it is not yet ready to name. We asked Facebook how it might apply the JPEG encoder, but a company spokesperson said she could not provide details about potential deployment until testing is complete.

Facebook could use the encoder on photos that users have already uploaded to the site, or it could apply it dynamically to images that are accessed frequently, such as profile pictures or link thumbnails. Either way, given how image-heavy Facebook is, the potential savings in bandwidth and page load time are considerable.

Facebook’s donation will be used in part to develop mozjpeg 3.0. Gal told TNW the next release will mainly focus on lossless, unmodified images. He believes there is still room for improvement, and that compression can be pushed further by “a few more percent.”

Mozilla is aiming to create a production-quality JPEG encoder that improves compression while maintaining compatibility with the vast majority of deployed decoders, with the end goal of reducing page load times. To do so, the company is willing to investigate the limits of JPEG rather than take the route Google did, which was to create a new image format (WebP) and push for its adoption. By maintaining backwards compatibility, mozjpeg allows all browsers to benefit from the improvements without having to adopt new image formats.

So far, the image-format studies Mozilla has conducted have proved inconclusive, but mozjpeg has breathed new life into JPEG, and the work has clearly piqued the interest of at least one large tech company.

mozjpeg (GitHub)

Top Image Credit: Andreas Krappweis