Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (19): 160-165.DOI: 10.3778/j.issn.1002-8331.1806-0310


Recurrent Neural Network for Chinese Word Segmentation with Peephole-Connections

SUN Baoshan, LI Wei   

  1. College of Computer Science and Software, Tianjin Polytechnic University, Tianjin 300387, China
  • Online: 2019-10-01 Published: 2019-09-30




Abstract: The Long Short-Term Memory (LSTM) network can capture potential long-distance dependencies and has been widely applied to Chinese word segmentation models. To further improve segmentation performance, and to address cases where the memory cell, owing to its structure, wrongly forgets key information while processing a sequence, the segmentation model is rebuilt with peephole connections. To optimize long-distance dependencies, gradient truncation and guided-information-flow regularization are applied. Segmentation experiments with several network structures on currently popular datasets show that, in the resulting bidirectional recurrent network, memory cells with peephole connections capture the contextual features of the character to be classified more effectively than the original memory cells, alleviating the partial information loss of the LSTM, strengthening the network's memory capacity, and improving the model's segmentation performance.
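The core modification described in the abstract is the peephole connection: each gate additionally sees the cell state, so the cell is less likely to wrongly forget key information. A minimal sketch of one such cell step follows; the parameter names, initialization, and diagonal (element-wise) peephole weights are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, p):
    """One step of an LSTM cell with peephole connections.

    Unlike the original LSTM, the input and forget gates also peek at
    the previous cell state c_prev, and the output gate peeks at the
    new cell state c, giving the gates direct access to the memory they
    are guarding. Parameter dict layout is an assumption for this sketch.
    """
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h_prev + p["pi"] * c_prev + p["bi"])
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h_prev + p["pf"] * c_prev + p["bf"])
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h_prev + p["bg"])
    c = f * c_prev + i * g                       # updated cell state
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h_prev + p["po"] * c + p["bo"])
    h = o * np.tanh(c)                           # hidden state / output
    return h, c

def init_params(n_in, n_hid, seed=0):
    """Random small-weight initialization; peephole weights are
    element-wise vectors, the common diagonal formulation."""
    rng = np.random.default_rng(seed)
    p = {}
    for g in "ifgo":
        p[f"W{g}"] = rng.standard_normal((n_hid, n_in)) * 0.1
        p[f"U{g}"] = rng.standard_normal((n_hid, n_hid)) * 0.1
        p[f"b{g}"] = np.zeros(n_hid)
    for g in "ifo":  # cell candidate g has no peephole
        p[f"p{g}"] = rng.standard_normal(n_hid) * 0.1
    return p
```

For segmentation, such a cell would be run over character embeddings in both directions, with the per-character hidden states fed to a classifier over the usual B/M/E/S tags; that tagging layer is omitted here.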

Key words: Long Short-Term Memory(LSTM), sequence labeling, peephole connection, long-distance dependence, gradient truncation, Chinese word segmentation
