Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (18): 155-160.DOI: 10.3778/j.issn.1002-8331.1808-0256


Modified Recurrent Neural Networks in Spoken Language Understanding

ZHANG Jingjing, HUANG Hao, HU Ying, WUSHOUR Silamu   

  1. College of Information Science and Engineering, Xinjiang University, Urumqi 830046, China
  • Online: 2019-09-15  Published: 2019-09-11


Abstract: Improving Spoken Language Understanding (SLU) plays an important role in spoken dialogue systems. Recurrent neural networks and their variants are widely used to improve SLU performance. A modified recurrent neural network (M-RNN) algorithm is proposed to enhance SLU; it adds memory of longer-range historical information while using fewer parameters. This method efficiently extracts feature information, which not only improves precision and the F1 score but also shortens training time. Experimental results on the ATIS corpus verify the effectiveness and reliability of the proposed algorithm.
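The abstract describes an RNN cell augmented with a memory of longer historical information but does not give the M-RNN equations. A minimal sketch of the general idea, assuming a plain tanh RNN cell extended with a running summary of past hidden states (the weight names W, U, V and the decay factor alpha are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def m_rnn_step(x_t, h_prev, s_prev, W, U, V, alpha=0.5):
    """One step of a history-augmented RNN cell (illustrative sketch).

    h_t depends on the input, the previous hidden state, and a running
    summary s of all past hidden states, giving access to longer history
    without gate parameters as in LSTM/GRU.
    """
    h_t = np.tanh(W @ x_t + U @ h_prev + V @ s_prev)  # current hidden state
    s_t = alpha * s_prev + (1.0 - alpha) * h_t        # exponential history summary
    return h_t, s_t

# Run the cell over a short random input sequence.
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
W = 0.1 * rng.normal(size=(d_h, d_in))
U = 0.1 * rng.normal(size=(d_h, d_h))
V = 0.1 * rng.normal(size=(d_h, d_h))

h = np.zeros(d_h)
s = np.zeros(d_h)
for t in range(5):
    x = rng.normal(size=d_in)
    h, s = m_rnn_step(x, h, s, W, U, V)
print(h.shape, s.shape)
```

Note that, compared with an LSTM or GRU cell of the same hidden size, a cell of this shape carries only one extra weight matrix and no gate parameters, consistent with the abstract's claim of fewer parameters.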

Key words: Recurrent Neural Network(RNN), Long Short-Term Memory(LSTM), Gated Recurrent Unit(GRU), Spoken Language Understanding(SLU), Modified Recurrent Neural Networks(M-RNN)

Abstract (Chinese): Improving spoken language understanding performance plays an important role in research on spoken dialogue systems. To improve SLU performance, recurrent neural networks (RNN) and their variants (LSTM, GRU) are applied. On this basis, a modified recurrent neural network (Modified-RNN) method is proposed. By additionally storing historical state information, the method retains longer-range information with fewer parameters, and extracting features from this richer information makes the acquired information more effective, improving the precision and F1 score of spoken language understanding and shortening experiment time. Experimental results on the Air Travel Information System (ATIS) corpus verify the effectiveness and reliability of the algorithm.
