Abstract
Learned image compression has exhibited promising compression performance, but supporting variable bitrates over a wide range remains a challenge. State-of-the-art variable rate methods sacrifice model performance and require numerous additional parameters. In this paper, we present a Quantization-error-aware Variable Rate Framework (QVRF) that utilizes a univariate quantization regulator *a* to achieve wide-range variable rates within a single model. Specifically, QVRF defines a quantization regulator vector coupled with predefined Lagrange multipliers to control the quantization error of all latent representations for discrete variable rates. Additionally, a reparameterization method makes QVRF compatible with the round quantizer. Exhaustive experiments demonstrate that existing fixed-rate VAE-based methods equipped with QVRF can achieve wide-range continuous variable rates within a single model without significant performance degradation. Furthermore, QVRF outperforms contemporary variable-rate methods in rate-distortion performance with minimal additional parameters.
URL
https://arxiv.org/abs/2303.05744