Abstract
Continuous-time models such as Neural ODEs and Neural Flows have shown promising results in analyzing irregularly sampled time series, which are frequently encountered in electronic health records. Built on these models, time series are typically processed by a hybrid of an initial value problem (IVP) solver and a recurrent neural network within a variational autoencoder architecture. Solving IVPs sequentially makes such models computationally inefficient. In this paper, we propose to model time series purely with continuous processes whose state evolution can be approximated directly by IVPs. This eliminates the need for recurrent computation and enables multiple states to evolve in parallel. We further fuse the encoder and decoder into a single IVP solver by exploiting its invertibility, which leads to fewer parameters and faster convergence. Experiments on three real-world datasets show that the proposed approach achieves comparable extrapolation and classification performance while gaining more than an order of magnitude speedup over other continuous-time counterparts.
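The two ideas in the abstract, evolving many states in parallel through an IVP solver and reusing the same solver as both encoder and decoder via its invertibility, can be illustrated with a minimal sketch. The fixed-step Euler solver and toy linear vector field below are hypothetical stand-ins for the paper's learned neural dynamics; they only show how batched, non-recurrent state evolution and backward-in-time decoding fit together.

```python
import numpy as np

def euler_solve(f, z0, t0, t1, steps=100):
    """Fixed-step Euler IVP solver: evolve states z0 from t0 to t1.

    z0 has shape (batch, dim), so many latent states evolve in
    parallel with no recurrent dependence between them.
    """
    z = np.asarray(z0, dtype=float)
    h = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        z = z + h * f(t, z)
        t += h
    return z

# Toy linear dynamics standing in for a learned vector field
# (hypothetical; the paper's model is a trained neural network).
A = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation field
f = lambda t, z: z @ A.T

# Three independent states evolved in one batched call.
z0 = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
z1 = euler_solve(f, z0, 0.0, 1.0)

# Invertibility: integrating the same field backward in time
# approximately recovers the initial states, which is what lets a
# single IVP solver serve as both encoder and decoder.
z0_rec = euler_solve(f, z1, 1.0, 0.0)
print(np.max(np.abs(z0_rec - z0)))  # small reconstruction error
```

The reconstruction error shrinks as the step count grows (Euler is first-order); a production model would use a higher-order or flow-based solver, but the parallel-batch and forward/backward structure is the same.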
URL
https://arxiv.org/abs/2305.06741