Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (12): 155-160.DOI: 10.3778/j.issn.1002-8331.2003-0276

Multi-head Attention Pooling-Based RCNN Model for Text Classification

ZHAI Yiming, WANG Binjun, ZHOU Zhining, TONG Xin   

  1. College of Police Information Engineering and Cyber Security, People’s Public Security University of China, Beijing 100038, China
  Online: 2021-06-15    Published: 2021-06-10

Abstract:

The max pooling strategy adopted in the pooling layer of the classic Recurrent Convolutional Neural Network (RCNN) is overly simplistic: it retains only the most prominent feature and discards all others, which degrades classification accuracy. To address this, a Multi-Head Attention Pooling-based Recurrent Convolutional Neural Network (MHAP-RCNN) is proposed. Multi-head attention pooling fully accounts for the contribution of every feature to classification and is dynamically optimized during training, effectively alleviating the single-feature limitation of max pooling. Experiments on three public text classification datasets show that the proposed model achieves better classification performance than the classic RCNN and other baseline models.
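To illustrate the contrast the abstract draws, the following is a minimal numerical sketch of multi-head attention pooling versus max pooling. It is not the authors' implementation: the per-head learnable query vectors `W` and the concatenation of head outputs are assumptions about one common form of this mechanism.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def max_pool(H):
    # Classic max pooling: keep only the strongest value per feature
    # dimension, discarding every other position's contribution.
    return H.max(axis=0)

def multi_head_attention_pool(H, W):
    """H: (seq_len, d) feature matrix from the recurrent-convolutional layer.
    W: (num_heads, d) learnable query vectors, one per head (an assumption).
    Each head weights all positions by softmax scores, so every feature
    contributes to the pooled result; W is trained with the rest of the model.
    """
    scores = W @ H.T                  # (num_heads, seq_len) relevance scores
    A = softmax(scores, axis=-1)      # attention weights sum to 1 per head
    pooled = A @ H                    # (num_heads, d) weighted feature sums
    return pooled.reshape(-1)         # concatenate the heads

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))       # 5 positions, 4 features
W = rng.standard_normal((2, 4))       # 2 attention heads
print(max_pool(H).shape)                       # (4,)
print(multi_head_attention_pool(H, W).shape)   # (8,)
```

Unlike `max_pool`, whose output gradient flows only through the argmax positions, the attention-pooled vector receives gradient at every position, which is what lets the pooling weights be optimized during training as the abstract describes.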

Key words: text classification, recurrent convolutional neural network, pooling, max pooling, multi-head attention pooling
