Abstract
Transformer-based models have shown strong performance in time-series forecasting by leveraging self-attention to model long-range temporal dependencies. However, their effectiveness depends critically on the quality and structure of input representations derived from raw multivariate time-series data. This work proposes a two-stage forecasting framework that explicitly separates local temporal representation learning from global dependency modelling. In the first stage, a convolutional neural network (CNN) operates on fixed-length temporal patches to extract short-range temporal dynamics and non-linear feature interactions, producing compact patch-level token embeddings. Token-level self-attention is subsequently applied during representation learning to refine these embeddings by enabling interactions across temporal patches. In the second stage, a Transformer encoder processes the resulting token sequence to model inter-patch temporal dependencies and generate per-patch forecasts. Experiments conducted on synthetic multivariate time-series data with controlled static and dynamic factors demonstrate that the proposed patch-based tokenization strategy achieves competitive forecasting performance compared to convolutional and patch-based Transformer baselines. The results highlight the importance of structured temporal representations and show that decoupling local temporal encoding from global attention-based modelling yields more effective and stable time-series forecasting.
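The pipeline described above can be illustrated with a minimal NumPy sketch. All dimensions, weight shapes, and layer choices here are hypothetical (the paper does not specify them): the CNN stage is reduced to a single dense map over each flattened patch, the token-level self-attention is a single unparameterized attention pass with a residual connection, and the stage-two Transformer encoder is abbreviated to its per-patch forecast head. The sketch only demonstrates the data flow and tensor shapes, not the trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): T timesteps, C channels,
# patch length P, token embedding dim D.
T, C, P, D = 96, 4, 8, 16
x = rng.standard_normal((T, C))               # one multivariate series

# Stage 1a: split the series into fixed-length patches -> (N, P, C).
patches = x.reshape(T // P, P, C)

# Stage 1b: local encoder, sketched as one dense map over the flattened
# patch (a stand-in for the CNN extracting short-range dynamics).
W_enc = rng.standard_normal((P * C, D)) * 0.1
tokens = np.maximum(patches.reshape(-1, P * C) @ W_enc, 0.0)  # ReLU, (N, D)

# Stage 1c: token-level self-attention so patch embeddings interact.
def self_attention(h):
    scores = h @ h.T / np.sqrt(h.shape[1])            # (N, N)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                 # row-wise softmax
    return w @ h

tokens = tokens + self_attention(tokens)              # residual refinement

# Stage 2: a Transformer encoder would process `tokens`; here only the
# per-patch linear forecast head is sketched, predicting the next P
# values of each channel for every patch.
W_head = rng.standard_normal((D, P * C)) * 0.1
forecast = (tokens @ W_head).reshape(-1, P, C)        # (N, P, C)

print(forecast.shape)
```

With these toy sizes the series yields 12 patches, so the output is one forecast patch of shape `(P, C)` per input patch. The key structural point survives the simplification: local encoding happens strictly within each patch, while cross-patch interaction is deferred to the attention stage.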
URL
https://arxiv.org/abs/2601.12467