Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (11): 171-177.DOI: 10.3778/j.issn.1002-8331.2011-0418

• Pattern Recognition and Artificial Intelligence •

Graph Attention Matrix Completion Based on Context of Knowledge Graph

SUN Wei, CHEN Pinghua   

  1. School of Computer, Guangdong University of Technology, Guangzhou 510006, China
  • Online: 2022-06-01  Published: 2022-06-01

Graph Attention Matrix Completion Based on Context of Knowledge Graph

SUN Wei, CHEN Pinghua

  1. School of Computer, Guangdong University of Technology, Guangzhou 510006, China

Abstract: In the process of extracting user and item information, graph convolutional encoders share weights and cannot distinguish the relative importance of neighbors; moreover, when a knowledge graph (KG) is used as auxiliary information, graph neural network methods cannot explicitly capture the non-local context of the KG (the most relevant set of higher-order neighbors). To address these problems, this paper proposes a graph attention encoder framework based on bi-directional interactive graph message passing, which explicitly exploits both the local context (the set of first-order neighbors) and the non-local context of the KG. First, the embedding vectors of users and items are obtained by the graph attention encoder. Second, considering users' personalized preferences over entities, the algorithm captures the local context of the KG through a user-specific graph attention mechanism. In addition, random walk sampling is used to extract the non-local context of each entity, and a recurrent neural network models the dependencies between an entity and its non-local-context entities. Finally, the links in the bipartite graph are reconstructed by a bilinear decoder. Experimental results on real-world datasets verify the superiority of this model over existing methods.
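Two components of the described pipeline, the user-specific graph attention over an item's local (one-hop) KG context and the bilinear decoder that scores user-item links, can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the toy graph, embedding dimension, and function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# Toy KG: item entity 0 has one-hop (local-context) neighbors 1..3
entity_emb = rng.normal(size=(5, d))
user_emb = rng.normal(size=d)
neighbors = [1, 2, 3]

def user_specific_attention(user, center, neigh_ids, emb):
    """Weight local-context neighbors by their relevance to this user
    (softmax over user-neighbor dot products), then aggregate."""
    scores = np.array([user @ emb[n] for n in neigh_ids])
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return emb[center] + (weights[:, None] * emb[neigh_ids]).sum(axis=0)

def bilinear_score(u, v, W):
    """Bilinear decoder: score the link between a user and an item
    representation with one learnable matrix per rating class."""
    return float(u @ W @ v)

item_repr = user_specific_attention(user_emb, 0, neighbors, entity_emb)
W = rng.normal(size=(d, d))
print(bilinear_score(user_emb, item_repr, W))
```

Because the attention weights depend on the user embedding, the same item aggregates its KG neighbors differently for different users, which is what distinguishes this from a weight-shared graph convolution.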

Key words: matrix completion, graph attention network, knowledge graph, context

Abstract: Graph convolutional encoders share weights when extracting user and item information and cannot distinguish the importance of different neighbors; furthermore, when a knowledge graph serves as auxiliary information, graph-neural-network-based methods cannot explicitly capture the non-local context of the knowledge graph (the most relevant set of higher-order neighbors). To address these problems, a graph attention encoder framework based on bi-directional interactive graph message passing is proposed, which explicitly exploits the local context (the set of first-order neighbors) and the non-local context of the knowledge graph. The embedding vectors of users and items are obtained by the graph attention encoder; considering users' personalized preferences over entities, the local context of the knowledge graph is captured through a user-specific graph attention mechanism; random walk sampling is used to extract the non-local context of each entity, and a recurrent neural network models the dependencies between an entity and its non-local-context entities; finally, the links in the bipartite graph are reconstructed by a bilinear decoder. Compared with existing methods, experimental results on real-world datasets verify the superiority of the proposed model.
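The non-local-context step, sampling a random walk over the KG and modeling the dependencies along it with a recurrent network, can be sketched as follows. The adjacency structure, walk length, and plain tanh RNN here are illustrative assumptions standing in for the paper's actual sampler and recurrent model.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8  # embedding dimension (illustrative)

# Toy KG adjacency: entity id -> list of neighbor ids
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
entity_emb = rng.normal(size=(5, d))

def sample_walk(start, length):
    """Random walk from an entity; the visited entities form its
    non-local (higher-order) context sequence."""
    path = [start]
    for _ in range(length):
        path.append(int(rng.choice(adj[path[-1]])))
    return path

def rnn_encode(seq_ids, emb, Wh, Wx):
    """Plain tanh RNN over the walk, modeling dependencies between the
    entity and its non-local-context entities; returns the final state."""
    h = np.zeros(emb.shape[1])
    for i in seq_ids:
        h = np.tanh(Wh @ h + Wx @ emb[i])
    return h

walk = sample_walk(0, length=4)
Wh = rng.normal(size=(d, d)) * 0.1
Wx = rng.normal(size=(d, d)) * 0.1
ctx = rnn_encode(walk, entity_emb, Wh, Wx)
print(walk, ctx.shape)
```

The final hidden state summarizes the sampled higher-order neighborhood and can be fused with the local-context representation before decoding.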

Key words: matrix completion, graph attention network, knowledge graph, context