Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (23): 171-177. DOI: 10.3778/j.issn.1002-8331.2103-0478

• Pattern Recognition and Artificial Intelligence •

Bidirectional Attention Question Answering Model Combining Knowledge Representation Learning

LU Qi, PAN Zhisong, XIE Jun   

  1. College of Command & Control Engineering, Army Engineering University of PLA, Nanjing 210000, China
  • Online: 2021-12-01  Published: 2021-12-02

Abstract:

Question Answering over Knowledge Graph (KGQA) is one of the research hotspots in natural language processing and has received extensive attention in recent years. KGQA faces challenges such as multi-hop questions, which require reasoning over multiple triples, and incomplete knowledge graphs. To address these problems, KR-BAT, a model that combines knowledge representation learning with a bidirectional attention mechanism, is proposed. Knowledge representation learning is introduced to improve global modeling ability and to cope with incomplete knowledge graphs; the bidirectional attention model captures the rich interaction between candidate answers and the question, and gives the answer after analysis and reasoning. Experiments are conducted on the MetaQA dataset against baseline models such as VRN, KV-MemNN, and GraftNet. The results show that KR-BAT achieves highly competitive performance on the complete knowledge graph and substantially outperforms the baselines on the incomplete knowledge graph.
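To make the two ideas in the abstract concrete, the sketch below (not the authors' code) shows how pretrained knowledge-graph embeddings of candidate answer entities and contextual question-token vectors could interact through a bidirectional attention layer before scoring. All module names, tensor shapes, and the bilinear similarity are illustrative assumptions, not the published KR-BAT implementation.

```python
# Minimal PyTorch sketch of bidirectional attention between a question and
# candidate-answer KG embeddings. Shapes and layers are assumptions.
import torch
import torch.nn as nn


class BiAttentionScorer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # similarity between one question token and one candidate entity
        self.sim = nn.Bilinear(dim, dim, 1)

    def forward(self, q_tokens, cand_emb):
        # q_tokens: (B, L, D) contextual question-token vectors
        # cand_emb: (B, N, D) KG embeddings of candidate answer entities
        B, L, D = q_tokens.shape
        N = cand_emb.size(1)

        # pairwise similarity matrix S[b, i, j] between token i and candidate j
        q = q_tokens.unsqueeze(2).expand(B, L, N, D)
        c = cand_emb.unsqueeze(1).expand(B, L, N, D)
        S = self.sim(q.reshape(-1, D), c.reshape(-1, D)).view(B, L, N)

        # question-to-candidate attention: each token reads from the candidates
        q2c = torch.softmax(S, dim=2) @ cand_emb              # (B, L, D)
        # candidate-to-question attention: each candidate reads from the tokens
        c2q = torch.softmax(S.transpose(1, 2), dim=2) @ q_tokens  # (B, N, D)

        # score each candidate by fusing its KG embedding with what it read
        # from the question; a real model would add deeper fusion layers here
        scores = (cand_emb * c2q).sum(-1)                      # (B, N)
        return scores, q2c
```

Because the candidate representations come from knowledge-graph embeddings learned over the whole graph, a candidate can still receive a high score even when the triple linking it to the question entity is missing, which is how embedding-based KGQA models typically mitigate graph incompleteness.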

Key words: knowledge graph, intelligent question answering, knowledge representation, attention
