Abstract
Diffusion models represent the cutting edge in image generation, but their high memory and computational demands hinder deployment on resource-constrained devices. Post-Training Quantization (PTQ) offers a promising solution by reducing the bitwidth of matrix operations. However, standard PTQ methods struggle with outliers, and achieving higher compression often requires transforming model weights and activations before quantization. In this work, we propose HadaNorm, a novel linear transformation that extends existing approaches and effectively mitigates outliers by normalizing activation feature channels before applying Hadamard transformations, enabling more aggressive activation quantization. We demonstrate that HadaNorm consistently reduces quantization error across the various components of transformer blocks, achieving superior efficiency-performance trade-offs compared to state-of-the-art methods.
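The abstract describes a two-step transform: equalize activation channel magnitudes, then apply a Hadamard rotation before quantization. Below is a minimal sketch of that idea, not the authors' implementation: the per-channel standard deviation as the normalization statistic, the symmetric uniform quantizer, and the function names (`quantize`, `hadanorm_like`) are all illustrative assumptions.

```python
# Minimal sketch of a HadaNorm-like transform: normalize feature channels,
# then rotate with an orthonormal Hadamard matrix so outlier channels are
# flattened before low-bit quantization. Illustrative assumptions throughout.
import numpy as np
from scipy.linalg import hadamard

def quantize(x, bits=4):
    """Symmetric uniform quantization with a single global scale (assumed)."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max() / qmax
    return np.round(x / scale).clip(-qmax, qmax) * scale

def hadanorm_like(x, eps=1e-6):
    """Normalize channels, then apply a Hadamard rotation.

    x: activations of shape (tokens, channels); channels must be a power of 2.
    Returns the transformed activations plus the rotation and per-channel
    scales needed to invert the transform (in practice these could be folded
    into adjacent layer weights).
    """
    scales = x.std(axis=0) + eps          # per-channel statistic (assumed)
    x_norm = x / scales                   # equalize channel magnitudes
    n = x.shape[1]
    H = hadamard(n) / np.sqrt(n)          # orthonormal Hadamard matrix
    return x_norm @ H, H, scales          # rotation spreads residual outliers

rng = np.random.default_rng(0)
x = rng.standard_normal((128, 64))
x[:, 3] *= 50.0                           # inject an outlier channel

x_t, H, scales = hadanorm_like(x)
x_hat = (quantize(x_t) @ H.T) * scales    # quantize, then undo the transform
err_plain = np.abs(quantize(x) - x).mean()
err_hn = np.abs(x_hat - x).mean()
print(f"mean quantization error: plain={err_plain:.4f}, hadanorm-like={err_hn:.4f}")
```

With one outlier channel dominating the global quantization scale, the plain quantizer wastes most of its levels; normalizing first and then rotating yields a markedly lower reconstruction error in this toy setting.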
URL
https://arxiv.org/abs/2506.09932