Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (4): 114-119.DOI: 10.3778/j.issn.1002-8331.1912-0057


Natural Language Processing Model Based on One-Dimensional Dilated Convolution and Attention Mechanism

LIAO Wenxiong, ZENG Bi, XU Yayun   

  1. School of Computers, Guangdong University of Technology, Guangzhou 510006, China
  • Online:2021-02-15 Published:2021-02-06





Natural language processing, a branch of artificial intelligence, has a wide range of applications in daily life. The adoption and continual evolution of recurrent neural networks brought a great leap to the field, and recurrent neural networks quickly became its mainstream algorithms; however, they suffer from complex structure and long training times. This paper proposes a natural language processing model based on one-dimensional dilated convolution and the attention mechanism. First, one-dimensional dilated convolution extracts deep features from the linguistic text; the attention mechanism then assigns weights to these deep features, integrating the features across time steps. Experimental results show that the model trains in only about 30% of the time required by a recurrent neural network while achieving comparable performance, verifying the effectiveness of the proposed model.
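The two stages described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all function and variable names here are hypothetical, and the scoring vector of the attention step is a stand-in for a learned parameter.

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """Valid one-dimensional dilated convolution (hypothetical sketch).
    x: (seq_len, in_ch) token embeddings; w: (kernel, in_ch, out_ch) filters.
    A dilation of d reads kernel taps spaced d positions apart, widening
    the receptive field without adding parameters."""
    k = w.shape[0]
    span = (k - 1) * dilation + 1            # receptive field of one output step
    out_len = x.shape[0] - span + 1
    out = np.empty((out_len, w.shape[2]))
    for t in range(out_len):
        taps = x[t : t + span : dilation]    # k taps, `dilation` apart
        out[t] = np.einsum('ki,kio->o', taps, w)
    return out

def attention_pool(h, v):
    """Weight time steps by softmax(h @ v) and sum them into one vector.
    h: (steps, dim) deep features; v: (dim,) scoring vector (learned in
    a real model, supplied directly here)."""
    scores = h @ v
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ h, weights
```

In a trained model the convolution filters and the scoring vector are learned jointly, and several dilated layers with growing dilation rates would typically be stacked before the attention pooling.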

Key words: dilated convolution, Attention mechanism, natural language processing


