Computer Engineering and Applications ›› 2012, Vol. 48 ›› Issue (10): 68-74.

• Network, Communication and Security •

Optimization of HTTP cache server behavior in live HTTP streaming systems

LI Yunfei1, XIE Weikai1, LU Chenping1, ZHANG Zhiqiang1, SHEN Ruimin2   

  1. E-Learning Lab, Shanghai Jiaotong University, Shanghai 200030, China
  2. School of Electronic Information and Electrical Engineering, Shanghai Jiaotong University, Shanghai 200030, China
  • Online: 2012-04-01  Published: 2012-04-11

Abstract: The HTTP cache server plays a key role in increasing the scalability of HTTP streaming systems. However, mainstream cache servers such as Nginx, Squid and Varnish all behave poorly while a cached resource is being updated: when they are used in a live HTTP streaming system, they periodically forward a large number of client requests to the origin server, which constrains the scalability of the system. This paper proposes an optimized cache server behavior for the cache update period: the cache server forwards only one client request to the origin server and refuses all other requests for that resource until the update completes. The optimization strategy is implemented on Nginx, the most widely used of these servers. Experimental results show that the optimization significantly improves the scalability of the system.
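
The optimized behavior can be sketched in a few lines of code. The following is a minimal, illustrative Python caching proxy, not the paper's Nginx implementation: it assumes an origin server at 127.0.0.1:8080, a listening port of 8081, and a fixed 2-second freshness lifetime for live segments, and it rejects concurrent requests with HTTP 503 while a single elected request refreshes the resource (the abstract only states that such requests are refused, without specifying how).

# Sketch of the proposed cache-server behavior (not the paper's Nginx module):
# while a cached resource is being refreshed, only one client request is
# forwarded to the origin server; concurrent requests for the same resource
# are refused instead of being passed upstream. Origin URL, port and TTL
# below are illustrative assumptions.
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

ORIGIN = "http://127.0.0.1:8080"   # assumed origin server address
TTL = 2.0                          # assumed freshness lifetime of a live segment

_lock = threading.Lock()
_cache = {}        # path -> (body, fetched_at)
_updating = set()  # paths currently being refreshed by one elected request


class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        now = time.time()
        with _lock:
            entry = _cache.get(self.path)
            if entry is not None and now - entry[1] < TTL:
                body, role = entry[0], "HIT"
            elif self.path in _updating:
                # Another request is already refreshing this resource:
                # refuse it instead of forwarding a duplicate to the origin.
                body, role = None, "REJECT"
            else:
                _updating.add(self.path)
                body, role = None, "FETCH"

        if role == "HIT":
            self._reply(200, body)
            return
        if role == "REJECT":
            self._reply(503, b"cache update in progress\n")
            return

        # Exactly one request per resource reaches this point per update cycle.
        try:
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                body = resp.read()
        except OSError:
            self._reply(502, b"origin fetch failed\n")
            return
        finally:
            with _lock:
                _updating.discard(self.path)
        with _lock:
            _cache[self.path] = (body, time.time())
        self._reply(200, body)

    def _reply(self, status, body):
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    ThreadingHTTPServer(("", 8081), CachingProxy).serve_forever()

For comparison, stock Nginx provides the proxy_cache_lock and proxy_cache_use_stale directives, which also reduce the number of requests forwarded to the origin while a cache entry is being filled or refreshed; unlike the behavior described above, however, they make the remaining clients wait or serve them stale content rather than refusing them.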

Key words: HTTP Streaming, cache server, cache updating