Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (10): 146-153.DOI: 10.3778/j.issn.1002-8331.2007-0074


Graph-Based Hierarchical Attention Networks for Fact Verification

XIE Yifei, LU Qi, LIU Xin, HU Yahao, PAN Zhisong, CHEN Hao   

  1. Command and Control Engineering College, PLA Army Engineering University, Nanjing 210001, China
  • Online: 2021-05-15  Published: 2021-05-10





Fact Verification (FV) requires retrieving evidence from a large-scale corpus to verify a given claim. Existing methods simply concatenate the retrieved evidence and compare the cosine similarity between claim and evidence embeddings, ignoring both the relations between long-distance pieces of evidence and the semantic similarity at different levels, which are crucial for reasoning and verification. This paper presents a new model, Graph-aware Hierarchical Attention Networks (GHAN). The model first retrieves evidence from Wikipedia with BERT (Bidirectional Encoder Representations from Transformers), then uses a convolutional neural network to extract N-gram features of different lengths; a similarity translation matrix is composed of these features of different granularities. Furthermore, the evidence is organized into an information-fusion graph so that both token-level and sentence-level semantic information are taken into account, and a kernel-based attention mechanism performs information propagation and evidence reasoning. Experimental results on the large-scale benchmark dataset FEVER demonstrate that GHAN outperforms other published BERT-based models.
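The similarity translation matrix and kernel attention described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names are hypothetical, token embeddings are random stand-ins for BERT/CNN features, and the Gaussian kernel pooling follows the common KNRM-style formulation that kernel-based attention mechanisms of this kind build on.

```python
import numpy as np

def cosine_translation_matrix(claim, evidence):
    """Token-level cosine-similarity ("translation") matrix M,
    where M[i, j] = cos(claim token i, evidence token j)."""
    c = claim / np.linalg.norm(claim, axis=1, keepdims=True)
    e = evidence / np.linalg.norm(evidence, axis=1, keepdims=True)
    return c @ e.T

def kernel_pooling(M, mus, sigma=0.1):
    """KNRM-style Gaussian kernel pooling over a similarity matrix.
    Each kernel with mean mu soft-counts similarity entries near mu,
    yielding one pooled feature per kernel."""
    feats = []
    for mu in mus:
        k = np.exp(-(M - mu) ** 2 / (2 * sigma ** 2))  # soft match scores
        soft_tf = k.sum(axis=1)                        # per claim token
        feats.append(np.log1p(soft_tf).sum())          # pool over tokens
    return np.array(feats)

# Random embeddings stand in for BERT/CNN N-gram features.
rng = np.random.default_rng(0)
claim = rng.normal(size=(5, 16))     # 5 claim tokens, dim 16
evidence = rng.normal(size=(8, 16))  # 8 evidence tokens, dim 16

M = cosine_translation_matrix(claim, evidence)
phi = kernel_pooling(M, mus=[-0.5, 0.0, 0.5, 1.0])
print(M.shape, phi.shape)  # (5, 8) (4,)
```

In the full model, one such matrix would be built per feature granularity (unigram, bigram, ...), and the pooled kernel features would weight message passing over the evidence graph.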

Key words: fact verification on text, graph attention network, kernel function, N-gram, convolutional neural network
