Abstract
Recently, transformer-based methods have made impressive progress in single-image super-resolution (SR). However, these methods are difficult to apply to lightweight SR (LSR) due to the challenge of balancing model performance and complexity. In this paper, we propose an efficient striped window transformer (ESWT). ESWT consists of efficient transformation layers (ETLs), which allow a clean structure and avoid redundant operations. Moreover, we design a striped window mechanism that lets ESWT model long-term dependencies more efficiently. To further exploit the potential of the transformer, we propose a novel flexible window training strategy, which improves the performance of ESWT without any additional cost. Extensive experiments show that the proposed method outperforms state-of-the-art transformer-based LSR methods with fewer parameters, faster inference, fewer FLOPs, and less memory consumption, achieving a better trade-off between model performance and complexity.
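The key idea named in the abstract is to replace square attention windows with striped (non-square) ones, so each token attends over a longer extent of the image along one axis at the same window-attention cost. The paper's actual implementation is not reproduced here; below is a minimal single-head sketch of striped-window self-attention in PyTorch, where the function name, the `stripe_h`/`stripe_w` parameters, and the q = k = v simplification are our own illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def striped_window_attention(x, stripe_h, stripe_w):
    """Single-head self-attention inside non-overlapping striped windows.

    Illustration only: a real layer (and ESWT itself) would use learned
    qkv/output projections, multiple heads, and alternating stripe
    orientations. x: (B, H, W, C) with H % stripe_h == 0, W % stripe_w == 0.
    """
    B, H, W, C = x.shape
    # Partition into (stripe_h x stripe_w) windows; a stripe is simply a
    # non-square window, e.g. (4, 16) to reach farther along the width.
    x = x.view(B, H // stripe_h, stripe_h, W // stripe_w, stripe_w, C)
    windows = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, stripe_h * stripe_w, C)
    # Scaled dot-product attention within each stripe (q = k = v for brevity).
    attn = F.softmax(windows @ windows.transpose(-2, -1) / C ** 0.5, dim=-1)
    out = attn @ windows
    # Reverse the window partition back to (B, H, W, C).
    out = out.view(B, H // stripe_h, W // stripe_w, stripe_h, stripe_w, C)
    return out.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)

# Example: 4x16 stripes capture long-range context along the width;
# swapping to 16x4 in the next layer would cover the height instead.
feat = torch.randn(1, 64, 64, 32)
y = striped_window_attention(feat, stripe_h=4, stripe_w=16)
print(y.shape)  # torch.Size([1, 64, 64, 32])
```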
URL
https://arxiv.org/abs/2301.09869