Abstract
This paper presents a novel method that simultaneously learns the number of filters and the network features over multiple epochs. We propose a novel pruning loss that explicitly forces the optimizer to focus on promising candidate filters while suppressing the contributions of less relevant ones. In addition, we propose to enforce diversity among filters; this diversity-based regularization term improves the trade-off between model size and accuracy. It turns out that the interplay between architecture and feature optimization improves the final compressed models, and the proposed method compares favorably to existing methods, in terms of both model size and accuracy, across a wide range of applications including image classification, image compression, and audio classification.
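The abstract does not give the exact form of the diversity-based regularizer, but a minimal sketch of one common choice, penalizing pairwise cosine similarity among flattened filters (all names and the specific penalty here are assumptions, not the paper's definition), might look like:

```python
import numpy as np

def diversity_penalty(filters: np.ndarray) -> float:
    """Hypothetical diversity regularizer: sum of squared pairwise
    cosine similarities between flattened filters (off-diagonal terms).
    Lower values mean more mutually dissimilar filters."""
    flat = filters.reshape(filters.shape[0], -1)          # one row per filter
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)             # unit-normalize rows
    gram = unit @ unit.T                                  # pairwise cosine similarities
    off_diag = gram - np.eye(gram.shape[0])               # drop self-similarity
    return float(np.sum(off_diag ** 2))

# Four identical 3x3 filters are maximally redundant; four orthogonal
# filters (rows of an identity matrix) incur no penalty.
redundant = np.ones((4, 3, 3))
orthogonal = np.eye(4).reshape(4, 2, 2)
print(diversity_penalty(redundant), diversity_penalty(orthogonal))
```

Minimizing such a term alongside the pruning loss would push surviving filters toward capturing distinct features rather than duplicating each other.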
URL
https://arxiv.org/abs/1906.04505