%0 Journal Article %A YANG Xingrui %A ZHAO Shouwei %A ZHANG Ruxue %A YANG Xingjun %A TAO Yehui %T BiLSTM_CNN Classification Model Based on Self-Attention and Residual Network %D 2022 %R 10.3778/j.issn.1002-8331.2104-0258 %J Computer Engineering and Applications %P 172-180 %V 58 %N 3 %X Bi-directional long short-term memory (BiLSTM) and convolutional neural network (CNN) models struggle to extract sufficient text information in multi-classification tasks. A BiLSTM_CNN compound model based on the self-attention mechanism and residual network (ResNet) is proposed. The self-attention mechanism assigns weights to the information produced by the convolution operation. The pooled feature information is processed by layer normalization and then connected to the residual network, so that the model can learn residual information and further improve its classification performance. During computation, the Mish nonlinear activation function is applied, which is smoother than the common ReLU function. Compared with common deep learning models, the proposed method outperforms existing mainstream models on the accuracy and F1 evaluation indicators, providing new research ideas for text classification problems. %U http://cea.ceaj.org/EN/10.3778/j.issn.1002-8331.2104-0258
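The abstract contrasts the Mish activation with ReLU. As a minimal sketch of that point only (the function names `mish`, `softplus`, and `relu` are illustrative helpers, not code from the paper), Mish is defined as x * tanh(softplus(x)); unlike ReLU, it is smooth everywhere and lets small negative values pass through:

```python
import math

def softplus(x: float) -> float:
    # Numerically stable softplus: ln(1 + e^x).
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x: float) -> float:
    # Mish(x) = x * tanh(softplus(x)): smooth and non-monotonic.
    return x * math.tanh(softplus(x))

def relu(x: float) -> float:
    # ReLU(x) = max(0, x): non-smooth at 0, clips all negatives.
    return max(0.0, x)
```

For large positive inputs Mish approaches the identity like ReLU, but for small negative inputs it returns a small nonzero value (e.g. mish(-0.5) ≈ -0.22 while relu(-0.5) = 0), which is the smoothness property the abstract refers to.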