Computer Engineering and Applications ›› 2024, Vol. 60 ›› Issue (4): 192-199. DOI: 10.3778/j.issn.1002-8331.2210-0050

• Pattern Recognition and Artificial Intelligence •


Stock Prediction Method Combining Graph Convolution and Convolution Self-Attention

TIAN Hongli, CUI Yao, YAN Huiqiang   

  1. School of Artificial Intelligence and Data Science, Hebei University of Technology, Tianjin 300401, China
    2. School of Economics and Management, Hebei University of Technology, Tianjin 300401, China
  • Online: 2024-02-15  Published: 2024-02-15



Abstract: With the continuous development of China's stock market, the trend of a stock is often affected by the development of the upstream and downstream industries of its enterprise. To address the shortcoming that mainstream stock prediction models ignore the correlations between stocks, a stock trend prediction model fusing graph convolution and multi-head convolutional self-attention is proposed. Firstly, the relationship matrix of multiple associated stocks is computed using the cross-correlation coefficient, and a graph convolutional network combined with the relationship matrix is used to extract features of the associated stocks. Secondly, multi-head convolutional self-attention is used to extract temporal features. Finally, the polynomial expansion framework for classification loss functions is used to optimize the loss function, and trend prediction is performed. Experimental results show that the proposed model outperforms the gated recurrent unit, the temporal convolutional network, and other models in terms of accuracy, precision, recall, and F1 score.
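The first two steps of the pipeline described above (a correlation-based relationship matrix over associated stocks, followed by graph convolution) can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: a plain Pearson correlation on first-differenced prices stands in for the detrended cross-correlation coefficient, and the function names, the thresholding rule, and all parameters are assumptions for the example.

```python
import numpy as np

def correlation_adjacency(prices, threshold=0.5):
    """Build a stock relationship matrix from price series.

    prices: array of shape (num_stocks, T) with closing prices.
    Stand-in for the paper's (detrended) cross-correlation: plain
    Pearson correlation on first-differenced series, thresholded
    into a binary adjacency matrix with self-loops.
    """
    returns = np.diff(prices, axis=1)          # (num_stocks, T-1)
    corr = np.corrcoef(returns)                # (num_stocks, num_stocks)
    adj = (np.abs(corr) >= threshold).astype(float)
    np.fill_diagonal(adj, 1.0)                 # self-loops
    return adj

def gcn_layer(X, A, W):
    """One graph-convolution step over the relationship matrix.

    Symmetrically normalizes A (A_hat = D^{-1/2} A D^{-1/2}),
    then applies ReLU(A_hat @ X @ W).
    X: node features (num_stocks, F), W: weights (F, F_out).
    """
    d = A.sum(axis=1)                          # degrees (>= 1 via self-loops)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A @ D_inv_sqrt
    return np.maximum(A_hat @ X @ W, 0.0)
```

A usage example: given 5 stocks with 40 days of prices, `correlation_adjacency` yields a symmetric 5x5 relationship matrix, and `gcn_layer` mixes each stock's features with those of its correlated neighbors before the temporal (self-attention) stage.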

Key words: stock trend forecasting, convolutional self-attention, detrended cross-correlation coefficient