QARV: Quantization-Aware ResNet VAE for Lossy Image Compression.

Zhihao Duan, Ming Lu, Jack Ma, Yuning Huang, Zhan Ma, Fengqing Zhu

Abstract

This paper addresses lossy image compression, a fundamental problem in image processing and information theory that underlies many real-world applications. We begin by reviewing the framework of variational autoencoders (VAEs), a powerful class of generative probabilistic models with a deep connection to lossy compression. Building on VAEs, we develop a new scheme for lossy image compression, which we name the quantization-aware ResNet VAE (QARV). Our method combines a hierarchical VAE architecture with test-time quantization and quantization-aware training, without which efficient entropy coding would not be possible. In addition, we design QARV's network architecture specifically for fast decoding and propose an adaptive normalization operation for variable-rate compression. Extensive experiments show that QARV achieves variable-rate compression, high-speed decoding, and better rate-distortion performance than existing baseline methods.
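
For intuition, the two mechanisms the abstract highlights can be sketched in a few lines of PyTorch: a differentiable surrogate for quantization during training with hard rounding at test time, and a normalization layer conditioned on a rate-control embedding. This is a minimal illustrative sketch, not the paper's implementation; the names (_RoundSTE, quantize_latent, AdaptiveNorm, rate_embed) and the choice of additive-noise training with a straight-through estimator at inference are assumptions.

    import torch
    import torch.nn as nn

    class _RoundSTE(torch.autograd.Function):
        # Hard rounding with a straight-through gradient estimator.
        @staticmethod
        def forward(ctx, x):
            return torch.round(x)

        @staticmethod
        def backward(ctx, grad_out):
            return grad_out

    def quantize_latent(z, training):
        if training:
            # Quantization-aware training: additive uniform noise in
            # [-0.5, 0.5) is a standard differentiable proxy for rounding.
            return z + torch.empty_like(z).uniform_(-0.5, 0.5)
        # Test time: round to integers so the latents can be entropy-coded.
        return _RoundSTE.apply(z)

    class AdaptiveNorm(nn.Module):
        # Hypothetical adaptive normalization: channel-wise scale and shift
        # predicted from a rate-conditioning embedding, so one network can
        # operate at multiple rate-distortion trade-offs.
        def __init__(self, num_channels, embed_dim=64):
            super().__init__()
            self.to_scale = nn.Linear(embed_dim, num_channels)
            self.to_shift = nn.Linear(embed_dim, num_channels)

        def forward(self, x, rate_embed):
            # x: (N, C, H, W); rate_embed: (embed_dim,) shared by the batch.
            scale = self.to_scale(rate_embed).view(1, -1, 1, 1)
            shift = self.to_shift(rate_embed).view(1, -1, 1, 1)
            return x * (1.0 + scale) + shift

In a full hierarchical VAE, one might apply quantize_latent to the latents at each level and insert AdaptiveNorm layers throughout the decoder, so that a single set of weights can serve all target bitrates.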


Grants

  1. R01 CA277839/NCI NIH HHS

