Computer Engineering and Applications ›› 2023, Vol. 59 ›› Issue (10): 134-141. DOI: 10.3778/j.issn.1002-8331.2112-0585

• Pattern Recognition and Artificial Intelligence •

Dual-Channel Sentiment Analysis and Application Based on Gated Attention

WEI Long, HU Jianpeng, ZHANG Geng   

  1. School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China
  • Online: 2023-05-15   Published: 2023-05-15

Abstract: Traditional deep learning-based text sentiment classification models often extract features incompletely and cannot distinguish between the senses of a polysemous word. To address these problems, a dual-channel sentiment classification model based on gated attention, named BGA-DNet, is proposed. The model first encodes the text with the pre-trained BERT model and then extracts text features through a dual-channel network: channel one uses TextCNN to extract local features, and channel two uses BiLSTM-Attention to extract global features. A gated attention unit is introduced to filter out useless attention information and, combined with the idea of residual networks, ensures that the dual-channel output retains the original encoding information even when the network has learned to saturation. BGA-DNet is evaluated on two public datasets of hotel reviews and restaurant reviews and compared with recent sentiment classification methods, achieving the best results with accuracies of 94.09% and 91.82%, respectively. Finally, BGA-DNet is applied to a real-world task of evaluating students' experiment reports, where it also achieves the highest accuracy and F1 score among the compared methods.

Key words: gated attention, dual-channel, sentiment classification, BERT, BiLSTM-Attention
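
To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of a BGA-DNet-style dual-channel classifier. The module names (GatedAttentionUnit, BGADNet), layer sizes, convolution kernel sizes, the specific sigmoid-gate formulation, the choice of bert-base-chinese, and the use of the [CLS] encoding as the residual input are illustrative assumptions; the paper's exact implementation may differ.

# Minimal sketch of a BGA-DNet-style dual-channel sentiment classifier.
# Module names, layer sizes, and the gating formula are assumptions for
# illustration only; they are not taken from the paper.
import torch
import torch.nn as nn
from transformers import BertModel


class GatedAttentionUnit(nn.Module):
    """Filters attention output with a learned sigmoid gate and adds a
    residual connection so the original encoding is preserved."""

    def __init__(self, hidden: int):
        super().__init__()
        self.gate = nn.Linear(hidden * 2, hidden)

    def forward(self, features: torch.Tensor, residual: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([features, residual], dim=-1)))
        return g * features + residual  # gated features plus residual encoding


class BGADNet(nn.Module):
    def __init__(self, hidden: int = 768, num_classes: int = 2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")  # assumed checkpoint
        # Channel 1: TextCNN for local n-gram features
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, hidden, kernel_size=k, padding=k // 2) for k in (2, 3, 4)]
        )
        self.cnn_proj = nn.Linear(hidden * 3, hidden)
        # Channel 2: BiLSTM with additive attention for global context
        self.bilstm = nn.LSTM(hidden, hidden // 2, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(hidden, 1)
        self.gate = GatedAttentionUnit(hidden)
        self.classifier = nn.Linear(hidden * 2, num_classes)

    def forward(self, input_ids, attention_mask):
        enc = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        # Channel 1: multi-size convolutions over the sequence, max-pooled over length
        x = enc.transpose(1, 2)  # (batch, hidden, seq_len)
        local = torch.cat([conv(x).max(dim=-1).values for conv in self.convs], dim=-1)
        local = self.cnn_proj(local)
        # Channel 2: BiLSTM states pooled by attention weights
        h, _ = self.bilstm(enc)
        w = torch.softmax(self.attn(h).squeeze(-1), dim=-1)
        global_feat = torch.bmm(w.unsqueeze(1), h).squeeze(1)
        # Gated attention filters noisy attention output; the [CLS] encoding
        # serves here as the residual carrying the original information.
        global_feat = self.gate(global_feat, enc[:, 0])
        return self.classifier(torch.cat([local, global_feat], dim=-1))

In this sketch the sigmoid gate decides, per dimension, how much of the attention-pooled global feature to keep, while the residual term guarantees that the original BERT encoding is never fully overwritten, which is one plausible reading of combining a gated attention unit with the residual-network idea.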