Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (12): 155-160.DOI: 10.3778/j.issn.1002-8331.2003-0276


Multi-head Attention Pooling-Based RCNN Model for Text Classification

ZHAI Yiming, WANG Binjun, ZHOU Zhining, TONG Xin   

  1. College of Police Information Engineering and Cyber Security, People’s Public Security University of China, Beijing 100038, China
  Online: 2021-06-15    Published: 2021-06-10

The max pooling strategy adopted in the pooling layer of the classic Recurrent Convolutional Neural Network (RCNN) is relatively simplistic: it discards every feature except the most prominent one, which hurts classification accuracy. Therefore, a Multi-Head Attention Pooling-based Recurrent Convolutional Neural Network (MHAP-RCNN) is proposed. Multi-head attention pooling weighs the contribution of every feature to classification and is dynamically optimized during training, effectively alleviating the information loss of max pooling. Experiments on three public text classification datasets show that the proposed model achieves better classification performance than the classic RCNN and other baseline models.
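To make the contrast with max pooling concrete, the following is a minimal NumPy sketch of multi-head attention pooling over a sequence of feature vectors. The per-head learned query vectors `W` and the head-splitting scheme here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention_pool(H, W, num_heads=4):
    """Pool a feature sequence H (seq_len, d_model) into one vector.

    Unlike max pooling, which keeps only the single strongest activation,
    every time step contributes according to a learned attention weight.
    W is an assumed (num_heads, d_head) array of per-head query vectors.
    """
    seq_len, d_model = H.shape
    d_head = d_model // num_heads
    # Split features into heads: (num_heads, seq_len, d_head)
    H_heads = H.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    pooled = []
    for i in range(num_heads):
        scores = H_heads[i] @ W[i]          # one score per time step
        alpha = softmax(scores)             # attention over time steps
        pooled.append(alpha @ H_heads[i])   # weighted sum: (d_head,)
    return np.concatenate(pooled)           # (d_model,)
```

In a full MHAP-RCNN, `W` would be trained jointly with the recurrent and convolutional layers, so the pooling itself adapts to the classification task rather than being fixed like max pooling.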

Key words: text classification, recurrent convolutional neural network, pooling, max pooling, multi-head attention pooling
