Abstract
Neural ODEs (NODEs) are continuous-time neural networks (NNs) that can process data without the constraint of fixed time intervals. They have advantages in learning and understanding the evolution of complex real-world dynamics. Much previous work has focused on NODEs in concise forms, yet numerous physical systems that take seemingly straightforward forms in fact belong to more complex quasi-classes, calling for a class of general NODEs with high scalability and flexibility to model such systems. This, however, may result in intricate nonlinear properties. In this paper, we introduce ControlSynth Neural ODEs (CSODEs). We show that despite their highly nonlinear nature, convergence can be guaranteed via tractable linear inequalities. In the composition of CSODEs, we introduce an extra control term for learning to capture dynamics at different scales simultaneously, which can be particularly useful for systems formulated as partial differential equations. Finally, we compare several representative NNs with CSODEs on important physical dynamics under the inductive biases of CSODEs, and show that CSODEs achieve better learning and predictive performance in these settings.
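The abstract describes a continuous-time vector field augmented with an extra control term. The sketch below illustrates that general idea only: a base nonlinear dynamics plus an additive control branch, integrated with a classical Runge-Kutta step. The parameterization (`W`, `V`, `U`, the `tanh` nonlinearity) and the specific form of the control term are assumptions for illustration, not the CSODE architecture defined in the paper.

```python
import numpy as np

def make_csode_sketch(rng, dim=4, ctrl_dim=2):
    # Hypothetical parameters; the actual CSODE parameterization is in the paper.
    W = rng.normal(scale=0.5, size=(dim, dim))       # base dynamics weights
    V = rng.normal(scale=0.5, size=(dim, dim))       # control-branch state weights
    U = rng.normal(scale=0.5, size=(dim, ctrl_dim))  # control input weights

    def vector_field(x, u):
        # Base nonlinear dynamics plus an additive control term --
        # a hedged reading of the abstract's "extra control term".
        return np.tanh(W @ x) + np.tanh(V @ x + U @ u)

    return vector_field

def rk4_step(f, x, u, dt):
    # One classical 4th-order Runge-Kutta step, with the control input
    # held constant over the step.
    k1 = f(x, u)
    k2 = f(x + 0.5 * dt * k1, u)
    k3 = f(x + 0.5 * dt * k2, u)
    k4 = f(x + dt * k3, u)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(0)
f = make_csode_sketch(rng)
x = rng.normal(size=4)   # initial state
u = np.zeros(2)          # zero control input for this demo
for _ in range(100):
    x = rk4_step(f, x, u, dt=0.05)
```

Because both branches use bounded nonlinearities, the state derivative is bounded, so the trajectory stays finite; the paper's actual convergence guarantees rest on tractable linear inequalities rather than this boundedness argument.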
URL
https://arxiv.org/abs/2411.02292