Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (17): 200-208. DOI: 10.3778/j.issn.1002-8331.2407-0500

• Pattern Recognition and Artificial Intelligence •


Knowledge Graph Completion Based on Comparison of Graph Structures and Text Semantics

ZHAI Sheping, YANG Qing, HUANG Yan   

  1. School of Computer Science and Technology, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
    2. Shaanxi Key Laboratory of Network Data Analysis and Intelligent Processing, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
  • Online:2025-09-01 Published:2025-09-01



Abstract: Current knowledge graph completion methods mainly use graph neural networks to model graph data, neglecting the importance of textual information in knowledge integration. Moreover, existing models fail to fully exploit the influence of fine-grained triple relations on entity embedding updates. To address these problems, this paper proposes SCLKGC, a knowledge graph completion method based on contrastive learning over graph structure and textual information. In the graph encoder, a type-specific attention mechanism assigns each entity’s neighbors weights that match their importance, enabling more accurate aggregation of neighborhood information. Hierarchical neighborhood-based and semantic-structural contrastive learning are then applied to learn entity representations in the knowledge graph more effectively. The proposed method is evaluated on two real-world datasets. Compared with the best-performing baseline, SCLKGC improves mean reciprocal rank (MRR), Hits@1, Hits@3, and Hits@10 by 0.008, 0.016, 0.011, and 0.013, respectively, on FB15k-237, and by 0.006, 0.008, 0.006, and 0.012, respectively, on WN18RR, validating the effectiveness of the proposed method.
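The abstract names two ingredients without giving formulas: attention-weighted aggregation of an entity's neighborhood, and a contrastive objective that aligns different views of the same entity. As a rough illustration only, the sketch below pairs a dot-product attention aggregator with an InfoNCE-style contrastive loss (one common form of contrastive learning; SCLKGC's exact attention scoring and objective are not specified here, and all names, shapes, and projection matrices are assumptions):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_neighborhood(h_entity, h_neighbors, W_q, W_k):
    """Attention-weighted neighborhood aggregation (illustrative).

    Each neighbor gets a weight proportional to its dot-product
    relevance to the central entity; the result is the weighted
    sum of neighbor embeddings.
    """
    q = W_q @ h_entity                      # query from the central entity
    k = h_neighbors @ W_k.T                 # keys from the neighbors
    scores = k @ q / np.sqrt(q.shape[0])    # scaled relevance scores
    alpha = softmax(scores)                 # importance weights, sum to 1
    return alpha @ h_neighbors, alpha

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE-style contrastive loss: pull the positive view of an
    entity toward its anchor, push negative samples away."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    pos = np.exp(cos(anchor, positive) / tau)
    neg = sum(np.exp(cos(anchor, n) / tau) for n in negatives)
    return -np.log(pos / (pos + neg))

rng = np.random.default_rng(0)
d = 8                                        # embedding dimension (toy size)
h_e = rng.normal(size=d)                     # structural embedding of an entity
neigh = rng.normal(size=(5, d))              # embeddings of 5 neighbors
W_q, W_k = rng.normal(size=(d, d)), rng.normal(size=(d, d))
agg, alpha = aggregate_neighborhood(h_e, neigh, W_q, W_k)

# Treat a (simulated) text-view embedding as the positive pair,
# random vectors as negatives.
h_text = h_e + 0.05 * rng.normal(size=d)
loss = info_nce(h_e, h_text, [rng.normal(size=d) for _ in range(4)])
```

The aggregation step returns both the pooled vector and the attention weights, mirroring the idea that neighbors contribute in proportion to their importance; the contrastive step treats the structural and textual embeddings of the same entity as a positive pair.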

Key words: knowledge graph completion, contrastive learning, text description, attention mechanism