%0 Journal Article %A TANG Zhuang %A WANG Zhishu %A ZHOU Ai %A FENG Meishan %A QU Wen %A LU Mingyu %T Transformer-Capsule Integrated Model for Text Classification %D 2020 %R 10.3778/j.issn.1002-8331.1909-0273 %J Computer Engineering and Applications %P 151-156 %V 56 %N 24 %X

To address the problem that shallow single-model text classification algorithms cannot adequately extract the multi-level features of a text sequence, this paper proposes a transformer-capsule integrated model, which uses a capsule network and a transformer to extract the local phrase features and the global semantic features of the text, respectively. Through integration, the multi-level features of the text sequence are captured more comprehensively. In addition, this paper proposes a dynamic routing algorithm based on an attention mechanism to suppress the interference of noisy capsules in traditional dynamic routing: the mechanism assigns smaller weights to noisy capsules, reducing the interfering information transmitted to subsequent capsules, and experiments show that it effectively improves classification performance. Four single-label text classification datasets and the multi-label Reuters-21578 dataset are selected for experiments, and good results are obtained. The F1 value on Reuters-21578 reaches 89.4%, an improvement of 3.6% over the Capsule-B model.
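The attention-weighted routing idea can be sketched as a variant of the standard dynamic routing between capsule layers. The sketch below is a minimal NumPy illustration, not the paper's exact formulation: in particular, the attention score `a` (a softmax over each input capsule's mean prediction norm, so weak, noise-like capsules receive small weight) is an assumed stand-in for whatever scoring function the paper learns.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing nonlinearity: preserves direction, maps the norm into [0, 1).
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def softmax(x, axis=-1):
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def attention_routing(u_hat, num_iters=3):
    """Dynamic routing with an attention-style down-weighting of noisy
    input capsules (a sketch, not the paper's implementation).
    u_hat: prediction vectors of shape (num_in, num_out, dim).
    Returns output capsules of shape (num_out, dim)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))  # routing logits, as in standard routing
    # Hypothetical attention score per input capsule: capsules whose
    # predictions have small average norm are treated as noise.
    a = softmax(np.linalg.norm(u_hat, axis=-1).mean(axis=1))  # (num_in,)
    for _ in range(num_iters):
        c = softmax(b, axis=1) * a[:, None]        # attention-scaled couplings
        s = np.einsum("io,iod->od", c, u_hat)      # weighted sum per output
        v = squash(s)                              # (num_out, dim)
        b = b + np.einsum("iod,od->io", u_hat, v)  # agreement update
    return v

rng = np.random.default_rng(0)
v = attention_routing(rng.normal(size=(8, 4, 16)))
print(v.shape)  # (4, 16)
```

Because `a` multiplies the coupling coefficients at every iteration, a noisy capsule's contribution to the weighted sum stays small regardless of how its routing logits evolve, which is the suppression effect the abstract describes.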

%U http://cea.ceaj.org/EN/10.3778/j.issn.1002-8331.1909-0273