Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (8): 132-137. DOI: 10.3778/j.issn.1002-8331.1801-0065

• Pattern Recognition and Artificial Intelligence •

Chinese Summarization Research on Combination of Local Attention and Convolutional Neural Network

ZHOU Caidong1, ZENG Biqing1,2, WANG Shengyu1, SHANG Qi1   

  1. School of Computer, South China Normal University, Guangzhou 510631, China
  2. School of Software, South China Normal University, Foshan, Guangdong 528225, China
  • Online: 2019-04-15  Published: 2019-04-15

Abstract: Deep learning has been widely applied to English text summarization, but it is rarely used for Chinese text summarization. In addition, the dominant model in text summarization is the encoder-decoder model, which feeds the raw text into the encoder and makes no use of the text's higher-level features; the encoded information is therefore insufficient, and the generated summaries suffer from repeated words, disordered word order, and other problems. This paper therefore proposes an encoder-decoder model with high-level feature extraction capability that combines local attention with a convolutional neural network. The model extracts high-level features of the text by combining a local attention mechanism with a convolutional neural network, uses these features as the encoder input, and then generates the summary with a decoder based on a global attention mechanism. Experiments on a Chinese text dataset show that the model produces better summaries than the compared models.
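
The abstract describes the architecture only at a high level; the paper's implementation details (the exact local-attention scoring, window size, and convolution/decoder configurations) are not given here. The following is a minimal PyTorch sketch of the general idea only: a convolutional encoder over locally attended word embeddings, followed by a GRU decoder with global (Luong-style dot) attention. All module names, dimensions, and the window size are assumptions chosen for illustration, not the authors' code.

# Illustrative sketch only (not the authors' released code): a convolutional
# encoder over locally attended word embeddings, plus a GRU decoder with
# global (Luong-style dot) attention. All names, window sizes and dimensions
# are assumptions made for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvLocalAttentionEncoder(nn.Module):
    """Re-weights each word with attention over a small local window, then
    applies a 1-D convolution to obtain higher-level source features."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, window=2):
        super().__init__()
        self.window = window
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.conv = nn.Conv1d(emb_dim, hidden_dim, kernel_size=3, padding=1)

    def local_attention(self, emb):
        # emb: (B, L, d); pad the sequence so every position has a full window
        w = self.window
        padded = F.pad(emb, (0, 0, w, w))
        windows = padded.unfold(1, 2 * w + 1, 1).transpose(2, 3)   # (B, L, 2w+1, d)
        centre = emb.unsqueeze(2)                                  # (B, L, 1, d)
        scores = (windows * centre).sum(-1) / emb.size(-1) ** 0.5  # (B, L, 2w+1)
        alpha = torch.softmax(scores, dim=-1).unsqueeze(-1)
        return (alpha * windows).sum(2)                            # (B, L, d)

    def forward(self, src):                                        # src: (B, L)
        ctx = self.local_attention(self.embed(src))
        feats = torch.relu(self.conv(ctx.transpose(1, 2)))         # (B, hidden, L)
        return feats.transpose(1, 2)                               # (B, L, hidden)


class GlobalAttentionDecoder(nn.Module):
    """GRU decoder that attends over all encoder positions at every step."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, tgt, enc_feats, hidden=None):
        dec_out, hidden = self.gru(self.embed(tgt), hidden)        # (B, T, hidden)
        scores = torch.bmm(dec_out, enc_feats.transpose(1, 2))     # (B, T, L)
        alpha = torch.softmax(scores, dim=-1)
        context = torch.bmm(alpha, enc_feats)                      # (B, T, hidden)
        return self.out(torch.cat([dec_out, context], dim=-1)), hidden


if __name__ == "__main__":
    vocab = 5000
    encoder, decoder = ConvLocalAttentionEncoder(vocab), GlobalAttentionDecoder(vocab)
    src = torch.randint(1, vocab, (2, 40))       # two toy source texts
    tgt = torch.randint(1, vocab, (2, 12))       # teacher-forced summary prefixes
    logits, _ = decoder(tgt, encoder(src))
    print(logits.shape)                          # torch.Size([2, 12, 5000])

In this sketch the local attention is a scaled dot product over a fixed window around each word and the global attention is a plain dot-product score; the paper's actual scoring functions and training setup may differ.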

Key words: text summarization, neural network, attention mechanism