Abstract
We propose SING (StabIlized and Normalized Gradient), a plug-and-play technique that improves the stability and generalization of the Adam(W) optimizer. SING is straightforward to implement and has minimal computational overhead: it requires only a layer-wise standardization of the gradients fed to Adam(W) and introduces no additional hyper-parameters. We demonstrate the effectiveness and practicality of the proposed approach by showing improved results on a wide range of architectures and problems (such as image classification, depth estimation, and natural language processing), and in combination with other optimizers. We provide a theoretical analysis of the convergence of the method, and we show that by virtue of the standardization, SING can escape local minima narrower than a threshold that is inversely proportional to the network's depth.
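As a rough illustration of the mechanism described above, the following is a minimal PyTorch sketch of a layer-wise gradient standardization applied before an AdamW step. The helper name sing_standardize_, the zero-mean/unit-variance normalization per parameter tensor, and the epsilon value are assumptions made for this sketch based on the abstract's wording, not the paper's exact algorithm.

```python
import torch

def sing_standardize_(model, eps=1e-8):
    # Hypothetical helper: standardize each parameter tensor's gradient
    # (zero mean, unit standard deviation) in place before the AdamW update.
    # The exact normalization used by SING may differ from this sketch.
    for p in model.parameters():
        if p.grad is None:
            continue
        g = p.grad
        g.sub_(g.mean()).div_(g.std() + eps)

# Usage inside an ordinary training loop:
model = torch.nn.Linear(16, 4)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 16), torch.randn(8, 4)
loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
sing_standardize_(model)  # layer-wise standardization of the gradients
opt.step()
opt.zero_grad()
```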
URL
https://arxiv.org/abs/2305.15997