Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (22): 170-182. DOI: 10.3778/j.issn.1002-8331.2408-0176

• Pattern Recognition and Artificial Intelligence •


Long-Term Time Series Prediction Model Based on Deep Learning and Temporal Decomposition

SONG Xiaobao, DENG Liwei, WANG Hao, ZHANG Yao’an, CHEN Zuosheng, HE Yuxin, CAO Wenming   

  1. Guangdong Multimedia Information Service Engineering Technology Research Center, Shenzhen University, Shenzhen, Guangdong 518060, China
    2.State Key Laboratory of Radio Frequency Heterogeneous Integration, Shenzhen University, Shenzhen, Guangdong 518060, China
    3.Guangdong Key Laboratory of Intelligent Information Processing, Shenzhen, Guangdong 518060, China
    4.College of Urban Transportation and Logistics, Shenzhen Technology University, Shenzhen, Guangdong 518118, China
  • Online:2025-11-15 Published:2025-11-14



Abstract: In recent years, a number of studies have introduced the Transformer and its variants into general time series prediction tasks and achieved notable performance gains. However, for long-term time series prediction such models still face several challenges, including high computational cost, limited ability to mine long-term dependencies within the data, and restricted receptive fields. To address these issues, a new long-term time series prediction model named DeepTD-LSP (deep temporal decomposition long-term series prediction) is proposed, built on the ideas of the attention mechanism and temporal decomposition. The model adopts an encoder-decoder structure overall. The encoder consists of multiple stacked encoding layers, each composed of a frequency decomposition module, a feedforward module, and a seasonal module. The frequency decomposition module separates the input sequence into trend information, seasonal information, and noise. Among these, the seasonal information is the most complex and the hardest to extract, so the encoder in this model focuses primarily on mining and encoding the seasonal information. The decoder consists of multiple stacked decoding layers, each composed of a frequency decomposition module, a feedforward module, a seasonal module, a seasonal attention module, and a trend attention module; it is mainly responsible for extracting, fusing, and predicting from the various types of input information. The model is tested on seven real-world datasets, and the experimental results demonstrate that DeepTD-LSP outperforms other long-term time series prediction models.
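The abstract does not specify how the frequency decomposition module is implemented; the sketch below is only a minimal illustration of the general idea of splitting a series into trend, seasonal, and noise components via its Fourier spectrum. The function name `frequency_decompose` and the parameters `trend_cutoff` and `n_seasonal` are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def frequency_decompose(x, trend_cutoff=2, n_seasonal=3):
    """Split a 1-D series into trend, seasonal, and noise components
    by partitioning its Fourier spectrum (illustrative sketch only)."""
    n = len(x)
    spec = np.fft.rfft(x)

    # Trend: keep only the lowest-frequency bins (including the mean).
    trend_spec = np.zeros_like(spec)
    trend_spec[:trend_cutoff] = spec[:trend_cutoff]
    trend = np.fft.irfft(trend_spec, n)

    # Seasonal: the n_seasonal largest-amplitude remaining bins.
    rest = spec.copy()
    rest[:trend_cutoff] = 0
    top = np.argsort(np.abs(rest))[-n_seasonal:]
    seasonal_spec = np.zeros_like(spec)
    seasonal_spec[top] = rest[top]
    seasonal = np.fft.irfft(seasonal_spec, n)

    # Noise: the residual after removing trend and seasonality.
    noise = x - trend - seasonal
    return trend, seasonal, noise
```

By construction the three components sum back to the original series; a learned model such as DeepTD-LSP would instead process the seasonal component (the hardest of the three, per the abstract) with dedicated encoder layers.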

Key words: time series prediction, deep learning, time series decomposition, attention mechanism