Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (24): 161-167. DOI: 10.3778/j.issn.1002-8331.2007-0100

• Pattern Recognition and Artificial Intelligence •


Chinese Implicit Sentiment Analysis Based on Graph Attention Neural Network

YANG Shanliang, CHANG Zheng   

  1. College of Computer Science and Technology, Shandong University of Technology, Zibo, Shandong 255000, China
  • Online: 2021-12-15  Published: 2021-12-13


Abstract:

Sentiment analysis is one of the important tasks in the field of Natural Language Processing, and it comprises explicit sentiment analysis and implicit sentiment analysis. Implicit sentiment analysis poses the greater challenge because sentences contain no explicit sentiment words and the emotional expression is more euphemistic. An implicit sentiment analysis model based on a graph attention convolutional neural network, ISA-GACNN (Implicit Sentiment Analysis based on Graph Attention Convolutional Neural Network), is proposed. It constructs a heterogeneous graph of texts and words, uses graph convolution to propagate semantic information, and uses an attention mechanism to compute each word's contribution to the sentiment expressed by the text. To address the problem of multiple attention heads preserving repeated information, an attention orthogonal constraint is applied so that different heads store different sentiment information; in view of the uneven distribution of sentiment information, an attention score constraint is proposed to make the model focus on a small number of important words. The performance of the proposed model is verified on implicit sentiment analysis evaluation datasets: its F value reaches 91.7%, far higher than the benchmark models in the literature. The attention mechanism is further analyzed, verifying the effectiveness of the orthogonal constraint and the score constraint.
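As a rough illustration only (not the paper's actual implementation), the two attention constraints described in the abstract can be sketched as regularization penalties over the multi-head attention weight matrix. The function names, the Frobenius-norm formulation of the orthogonal constraint, and the top-k formulation of the score constraint are all assumptions for the sketch:

```python
import numpy as np

def orthogonal_penalty(A):
    """Orthogonal constraint sketch: penalize overlap between attention
    heads so different heads attend to different words.

    A: (H, N) attention weights, one row per head over N words.
    Returns the squared Frobenius norm of (A_hat A_hat^T - I), where
    A_hat has unit-norm rows; zero when heads are mutually orthogonal.
    """
    A_hat = A / np.linalg.norm(A, axis=1, keepdims=True)  # unit rows
    G = A_hat @ A_hat.T                                   # head-similarity matrix
    return float(np.linalg.norm(G - np.eye(A.shape[0])) ** 2)

def score_penalty(A, k=3):
    """Score constraint sketch: encourage each head to concentrate its
    attention mass on a few important words by penalizing the mass
    falling outside each head's top-k positions.
    (The paper's exact score constraint is an assumption here.)
    """
    topk_mass = np.sort(A, axis=1)[:, -k:].sum()
    return float(A.sum() - topk_mass)
```

In a training loop these penalties would be scaled by hyperparameters and added to the classification loss; two identical heads yield a large orthogonal penalty, while disjoint heads yield zero.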

Key words: implicit sentiment analysis, attention mechanism, graph neural network, orthogonal constraint