Computer Engineering and Applications ›› 2023, Vol. 59 ›› Issue (11): 57-62. DOI: 10.3778/j.issn.1002-8331.2211-0168

• Theory, Research and Development •

T-Transformer Model for Predicting Tensor Time Series

LI Wen, CHEN Jiawei, LIU Ruixue, HOU Yuguo, DU Shouguo   

  1. School of Statistics and Information, Shanghai University of International Business and Economics, Shanghai 201620, China
    2. Shanghai Municipal Big Data Center, Shanghai 200072, China
  • Online:2023-06-01 Published:2023-06-01

Abstract: A tensor time series captures the co-evolution of multiple time series over time, and predicting tensor time series has become an important problem. To address the high dimensionality of tensor time series, this paper proposes T-Transformer, a Transformer-based prediction method that integrates tensor operations and the Transformer into a unified framework. The method represents a higher-order time series as a tensor, converts the tensor time series into vectors through tensor slicing and vectorization, encodes these vectors and feeds them into a Transformer model, and finally obtains the predicted values of the tensor time series. Experiments on three public datasets show that the proposed method achieves better prediction results than the baseline methods.
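To make the slice-vectorize-then-Transformer pipeline described above concrete, the following is a minimal sketch assuming PyTorch; the class name TTransformer and parameters such as slice_shape, d_model and nhead are illustrative assumptions, not the authors' implementation or hyperparameters.

import math
import torch
import torch.nn as nn

class TTransformer(nn.Module):
    """Vectorizes each time step's tensor slice, embeds the vectors, and runs a
    Transformer encoder over the time dimension to predict the next slice.
    Illustrative sketch only, not the authors' released code."""

    def __init__(self, slice_shape=(4, 5), d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.slice_dim = math.prod(slice_shape)             # length of one vectorized slice
        self.embed = nn.Linear(self.slice_dim, d_model)     # encode each vectorized slice
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, self.slice_dim)      # map back to a slice-sized vector

    def forward(self, x):
        # x: (batch, time, *slice_shape), i.e. a tensor time series
        b, t = x.shape[0], x.shape[1]
        v = x.reshape(b, t, -1)              # slice-wise vectorization of the tensor
        h = self.encoder(self.embed(v))      # self-attention over the time steps
        return self.head(h[:, -1])           # predicted next slice, in vectorized form

# Usage on random data: a batch of 8 series, 12 time steps, each step a 4x5 slice.
model = TTransformer(slice_shape=(4, 5))
x = torch.randn(8, 12, 4, 5)
y_hat = model(x)                             # shape (8, 20): the vectorized next slice

In this reading, the reshape call plays the role of tensor vectorization, while the attention mechanism models temporal dependence; training would compare y_hat with the vectorized ground-truth slice at the next time step.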

Key words: tensor time series, tensor vectorization, attention mechanism, Transformer model, time series prediction