Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (12): 145-148. DOI: 10.3778/j.issn.1002-8331.1803-0231


Spoken Language Understanding Method Based on Recurrent Neural Network with Persistent Memory

XU Yingying, HUANG Hao   

  1. School of Information Science and Engineering, Xinjiang University, Urumqi 830046, China
  Online: 2019-06-15    Published: 2019-06-13


Abstract: Recurrent Neural Networks (RNN) have increasingly shown their advantages in the Spoken Language Understanding (SLU) task. However, because of the vanishing and exploding gradient problems, the storage capacity of a simple recurrent neural network is limited. An RNN that uses an external memory is proposed to improve its memory capacity. Experiments are carried out on the ATIS data set and compared against other publicly reported models. The results show that, on the spoken language understanding task, the RNN with external memory achieves clear improvements in accuracy, recall and F1-score, outperforming the traditional recurrent neural network and its variant structures.

Key words: Spoken Language Understanding (SLU), Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM) network, Neural Turing Machine
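
The abstract describes augmenting a recurrent network with an external memory, in the spirit of the Neural Turing Machine listed among the keywords. The sketch below is a minimal, illustrative memory-augmented RNN cell with content-based (cosine-similarity) addressing and an erase/add write; it is not the authors' exact architecture, and the layer names, single read/write head, and dimensions are assumptions made for the example.

# Minimal sketch of an RNN cell with an external memory matrix (NTM-style).
# The controller reads from and writes to memory via content-based addressing.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAugmentedRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, mem_slots=32, mem_width=20):
        super().__init__()
        # Controller sees the current input concatenated with the previous read vector.
        self.controller = nn.RNNCell(input_size + mem_width, hidden_size)
        self.key_layer = nn.Linear(hidden_size, mem_width)    # content key for addressing
        self.erase_layer = nn.Linear(hidden_size, mem_width)  # erase vector
        self.add_layer = nn.Linear(hidden_size, mem_width)    # add (write) vector
        self.mem_slots, self.mem_width = mem_slots, mem_width

    def forward(self, x, h, memory, read_prev):
        h = self.controller(torch.cat([x, read_prev], dim=-1), h)

        # Content-based addressing: cosine similarity between the key and each memory row.
        key = self.key_layer(h)                                        # (B, W)
        sim = F.cosine_similarity(memory, key.unsqueeze(1), dim=-1)    # (B, N)
        weights = F.softmax(sim, dim=-1)                               # (B, N)

        # Read: attention-weighted sum over memory rows.
        read = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)      # (B, W)

        # Write: erase then add, weighted by the same attention.
        erase = torch.sigmoid(self.erase_layer(h)).unsqueeze(1)        # (B, 1, W)
        add = torch.tanh(self.add_layer(h)).unsqueeze(1)               # (B, 1, W)
        w = weights.unsqueeze(-1)                                      # (B, N, 1)
        memory = memory * (1 - w * erase) + w * add

        return h, memory, read

if __name__ == "__main__":
    # One illustrative step over a random input (hypothetical sizes).
    batch, input_size, hidden_size = 2, 16, 64
    cell = MemoryAugmentedRNNCell(input_size, hidden_size)
    x = torch.randn(batch, input_size)
    h = torch.zeros(batch, hidden_size)
    memory = 0.01 * torch.randn(batch, cell.mem_slots, cell.mem_width)
    read = torch.zeros(batch, cell.mem_width)
    h, memory, read = cell(x, h, memory, read)
    print(h.shape, memory.shape, read.shape)

For slot filling on ATIS, such a cell would be unrolled over the word sequence and each hidden state fed to a per-token tag classifier; the external memory is what lets information persist beyond the controller's own hidden state.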
