Abstract
In this study, we investigate the continuous-time dynamics of Recurrent Neural Networks (RNNs), focusing on systems with nonlinear activation functions. The objective of this work is to identify conditions under which RNNs exhibit perpetual oscillatory behavior without converging to static fixed points. We establish that skew-symmetric weight matrices are fundamental to enabling stable limit cycles in both linear and nonlinear configurations. We further demonstrate that hyperbolic-tangent-like activation functions (odd, bounded, and continuous) preserve these oscillatory dynamics by ensuring the existence of motion invariants in state space. Numerical simulations show that nonlinear activation functions not only maintain limit cycles but also improve the numerical stability of the integration process, mitigating the instabilities commonly associated with the forward Euler method. These results highlight practical considerations for designing neural architectures that capture complex temporal dependencies, i.e., strategies for enhancing memorization capabilities in recurrent models.
URL
https://arxiv.org/abs/2504.13951