Computer Engineering and Applications ›› 2013, Vol. 49 ›› Issue (22): 59-62.


Buffer delay algorithm based on node network depth

ZHANG Min, LI Demin, ZOU Jian, JIN Kang   

  1. Department of Information Science and Technology, Donghua University, Shanghai 201620, China
  • Online:2013-11-15 Published:2013-11-15


Abstract: In a ZigBee-based audio guide system, voice packets arrive out of order at the tourist's receiving end. To eliminate the resulting jitter, a buffer delay algorithm based on the network depth of the tourist node is proposed. The E-model for speech quality prediction is used to express the subjective Mean Opinion Score (MOS) as an objective predicted value, and the optimal buffer delay is the one that maximizes the computed MOS. Simplification of the algorithm and software simulation show that the algorithm performs well in the audio guide system.
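The MOS-maximization step the abstract describes can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the R-factor-to-MOS mapping and the simplified one-way-delay impairment `Id` follow ITU-T G.107, while the exponential late-packet-loss model `Ppl(d)` and the parameter values (`network_delay_ms`, `jitter_tau_ms`, `bpl`, `r0`) are assumptions standing in for the paper's depth-based model, which the abstract does not detail.

```python
import math

def r_to_mos(r):
    """Map an E-model R-factor to an estimated MOS (ITU-T G.107 mapping)."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

def delay_impairment(d_ms):
    """Simplified one-way-delay impairment Id from ITU-T G.107."""
    return 0.024 * d_ms + (0.11 * (d_ms - 177.3) if d_ms > 177.3 else 0.0)

def best_buffer_delay(network_delay_ms, jitter_tau_ms, bpl=4.3, r0=93.2):
    """Sweep candidate buffer delays and return the (delay_ms, MOS) pair
    that maximizes MOS.

    Assumption (not from the paper): late-packet loss decays exponentially
    with buffer size, Ppl(d) = 100 * exp(-d / jitter_tau_ms) percent.
    """
    best = (0, 0.0)
    for d in range(0, 301, 5):
        ppl = 100.0 * math.exp(-d / jitter_tau_ms)   # percent late loss
        ie_eff = 95.0 * ppl / (ppl + bpl)            # effective loss impairment
        r = r0 - delay_impairment(network_delay_ms + d) - ie_eff
        mos = r_to_mos(r)
        if mos > best[1]:
            best = (d, mos)
    return best

d_opt, mos_opt = best_buffer_delay(network_delay_ms=50, jitter_tau_ms=20)
print(d_opt, round(mos_opt, 2))
```

A larger buffer reduces late-packet loss but adds delay impairment, so the MOS curve has an interior maximum; the sweep simply picks the delay at that peak, mirroring "the corresponding optimal delay is obtained by calculating the highest MOS value."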

Key words: ZigBee, E-model, delay, jitter
