[1] WU H X, XU J H, WANG J M, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting[C]//Advances in Neural Information Processing Systems 34, 2021: 22419-22430.
[2] 刘颖, 李阳光, 瞿树晖, 等. 知识嵌入式图神经网络在风机多元状态预测中的应用[J]. 中国科学: 信息科学, 2022, 52(10): 1870-1882.
LIU Y, LI Y G, QU S H, et al. Application of knowledge-embedded graph neural network for multivariate state prediction of wind turbines[J]. Scientia Sinica: Informationis, 2022, 52(10): 1870-1882.
[3] 王兵, 吴思琪, 方宇. 三支残差修正的燃气负荷预测[J]. 计算机工程与应用, 2022, 58(22): 291-296.
WANG B, WU S Q, FANG Y. Gas load forecasting with three-way residual correction[J]. Computer Engineering and Applications, 2022, 58(22): 291-296.
[4] 卢雨田, 王小艺, 王立, 等. 工业大气污染物浓度的复合自回归网络预测[J]. 计算机工程与应用, 2019, 55(18): 223-228.
LU Y T, WANG X Y, WANG L, et al. Forecasting method for industrial exhaust gas based on compound nonlinear autoregressive neural network[J]. Computer Engineering and Applications, 2019, 55(18): 223-228.
[5] LAI G K, CHANG W C, YANG Y M, et al. Modeling long- and short-term temporal patterns with deep neural networks[C]//Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. New York: ACM, 2018: 95-104.
[6] 戴宇睿, 安俊秀, 陶全桧. 融合双通路注意力与VT-LSTM的金融时序预测[J]. 计算机工程与应用, 2023, 59(12): 157-165.
DAI Y R, AN J X, TAO Q H. Financial time-series prediction by fusing dual-pathway attention with VT-LSTM[J]. Computer Engineering and Applications, 2023, 59(12): 157-165.
[7] 詹熙, 潘志松, 黎维, 等. 基于多尺度特征和注意力的金融时序预测方法[J]. 计算机工程与应用, 2022, 58(19): 107-115.
ZHAN X, PAN Z S, LI W, et al. Financial time series forecasting method based on multi-scale features and attention mechanism[J]. Computer Engineering and Applications, 2022, 58(19): 107-115.
[8] SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: probabilistic forecasting with autoregressive recurrent networks[J]. International Journal of Forecasting, 2020, 36(3): 1181-1191.
[9] 孙翊文, 王宇璐, 傅昆, 等. 交互门控循环单元及其在到达时间估计中的应用[J]. 中国科学: 信息科学, 2021, 51(5): 822-833.
SUN Y W, WANG Y L, FU K, et al. Interactive gated recurrent unit and its application for estimated time of arrival[J]. Scientia Sinica: Informationis, 2021, 51(5): 822-833.
[10] 冒泽慧, 顾彧行, 姜斌, 等. 基于改进LSTM的高速列车牵引系统微小渐变故障诊断[J]. 中国科学: 信息科学, 2021, 51(6): 997-1012.
MAO Z H, GU X G, JIANG B, et al. Incipient fault diagnosis for high-speed train traction systems via improved LSTM[J]. Scientia Sinica: Informationis, 2021, 51(6): 997-1012.
[11] PASCANU R, MIKOLOV T, BENGIO Y. On the difficulty of training recurrent neural networks[C]//Proceedings of the 30th International Conference on Machine Learning, 2013: 1310-1318.
[12] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems 30, 2017: 5998-6008.
[13] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[14] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: transformers for image recognition at scale[J]. arXiv:2010.11929, 2020.
[15] 蔡美玲, 汪家喜, 刘金平, 等. 基于Transformer GAN架构的多变量时间序列异常检测[J]. 中国科学: 信息科学, 2023, 53(5): 972-992.
CAI M L, WANG J X, LIU J P, et al. Transformer-GAN architecture for anomaly detection in multivariate time series[J]. Scientia Sinica: Informationis, 2023, 53(5): 972-992.
[16] LI S Y, JIN X Y, XUAN Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting[C]//Advances in Neural Information Processing Systems 32, 2019.
[17] ZHOU H Y, ZHANG S H, PENG J Q, et al. Informer: beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the 35th AAAI Conference on Artificial Intelligence, 2021: 11106-11115.
[18] CLEVELAND R B, CLEVELAND W S, MCRAE J E, et al. STL: a seasonal-trend decomposition procedure based on loess[J]. Journal of Official Statistics, 1990, 6(1): 3-73.
[19] HYNDMAN R J, ATHANASOPOULOS G. Forecasting: principles and practice[M]. [S.l.]: OTexts, 2014.
[20] TAYLOR S J, LETHAM B. Forecasting at scale[J]. The American Statistician, 2018, 72(1): 37-45.
[21] ORESHKIN B N, CARPOV D, CHAPADOS N, et al. N-BEATS: neural basis expansion analysis for interpretable time series forecasting[J]. arXiv:1905.10437, 2019.
[22] ZHOU T, MA Z Q, WEN Q S, et al. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting[C]//Proceedings of the 39th International Conference on Machine Learning, 2022.
[23] ZHANG X, JIN X, GOPALSWAMY K, et al. First de-trend then attend: rethinking attention for time-series forecasting[J]. arXiv:2212.08151, 2022.
[24] NIE Y Q, NGUYEN N H, SINTHONG P, et al. A time series is worth 64 words: long-term forecasting with transformers[J]. arXiv:2211.14730, 2022.
[25] MANN M E, LEES J M. Robust estimation of background noise and signal detection in climatic time series[J]. Climatic Change, 1996, 33(3): 409-445.
[26] PENG X G, LIN Y X, CAO Q, et al. Traffic anomaly detection in intelligent transport applications with time series data using informer[C]//Proceedings of the 2022 IEEE 25th International Conference on Intelligent Transportation Systems. Piscataway: IEEE, 2022: 3309-3314.
[27] MA H F, LENG S Y, CHEN L N. Data-based prediction and causality inference of nonlinear dynamics[J]. Science China Mathematics, 2018, 61(3): 403-420.
[28] KITAEV N, KAISER Ł, LEVSKAYA A. Reformer: the efficient transformer[J]. arXiv:2001.04451, 2020.
[29] LIU Y, HU T G, ZHANG H R, et al. iTransformer: inverted transformers are effective for time series forecasting[J]. arXiv:2310.06625, 2023.
[30] WOO G, LIU C H, SAHOO D, et al. ETSformer: exponential smoothing transformers for time-series forecasting[J]. arXiv:2202.01381, 2022.
[31] WU H X, HU T G, LIU Y, et al. TimesNet: temporal 2D-variation modeling for general time series analysis[J]. arXiv:2210.02186, 2022.
[32] WANG S Y, WU H X, SHI X M, et al. TimeMixer: decomposable multiscale mixing for time series forecasting[J]. arXiv:2405.14616, 2024.
[33] XU Z J, ZENG A L, XU Q. FITS: modeling time series with 10k parameters[J]. arXiv:2307.03756, 2023.
[34] YUN C, BHOJANAPALLI S, RAWAT A S, et al. Are transformers universal approximators of sequence-to-sequence functions?[J]. arXiv:1912.10077, 2019.