Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (4): 114-119.DOI: 10.3778/j.issn.1002-8331.1912-0057
LIAO Wenxiong, ZENG Bi, XU Yayun
Abstract:
Natural language processing (NLP), as a branch of artificial intelligence, has a wide range of applications in daily life. With the adoption of recurrent neural networks (RNNs) in NLP and their continuous evolution and iteration, the field has made a great leap forward, and RNNs quickly became the mainstream algorithms for NLP tasks. However, they suffer from complex structure and long training time. This paper proposes an NLP model based on one-dimensional dilated convolution and an attention mechanism. First, one-dimensional dilated convolution extracts deep features from the text; an attention mechanism then assigns weights to these deep features to integrate the features across time steps. Experimental results show that the model requires only about 30% of the training time of a recurrent neural network while achieving comparable performance, which verifies the effectiveness of the proposed model.
Key words: dilated convolution, Attention mechanism, natural language processing
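The pipeline the abstract describes, stacked one-dimensional dilated convolutions followed by attention-weighted pooling over time steps, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, dimensions, dilation rates, and random weights are all assumptions made for demonstration only.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid (no-padding) 1-D dilated convolution over a sequence.

    x: (seq_len, in_dim) token embeddings
    w: (kernel, in_dim, out_dim) filter weights
    Returns features of shape (seq_len - (kernel-1)*dilation, out_dim).
    """
    k = w.shape[0]
    span = (k - 1) * dilation          # receptive-field span of the kernel
    out_len = x.shape[0] - span
    out = np.zeros((out_len, w.shape[2]))
    for t in range(out_len):
        for i in range(k):
            # each kernel tap reads an input `dilation` steps apart
            out[t] += x[t + i * dilation] @ w[i]
    return np.maximum(out, 0.0)        # ReLU nonlinearity

def attention_pool(h, v):
    """Softmax-weighted sum of time-step features.

    h: (T, d) per-time-step features, v: (d,) attention query vector.
    """
    scores = h @ v
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()           # softmax over time steps
    return weights @ h                 # (d,) integrated representation

rng = np.random.default_rng(0)
x = rng.standard_normal((20, 8))                 # 20 tokens, 8-dim embeddings
w1 = 0.1 * rng.standard_normal((3, 8, 8))
w2 = 0.1 * rng.standard_normal((3, 8, 8))
v = rng.standard_normal(8)

h1 = dilated_conv1d(x, w1, dilation=1)           # (18, 8)
h2 = dilated_conv1d(h1, w2, dilation=2)          # (14, 8): wider receptive field
doc_vec = attention_pool(h2, v)                  # (8,) sentence representation
```

Increasing the dilation rate layer by layer widens the receptive field without adding parameters, which is one reason such convolutional stacks train faster than recurrent networks of similar capacity.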
LIAO Wenxiong, ZENG Bi, XU Yayun. Natural Language Processing Model Based on One-Dimensional Dilated Convolution and Attention Mechanism[J]. Computer Engineering and Applications, 2021, 57(4): 114-119.
URL: http://cea.ceaj.org/EN/10.3778/j.issn.1002-8331.1912-0057
http://cea.ceaj.org/EN/Y2021/V57/I4/114