Computer Engineering and Applications ›› 2010, Vol. 46 ›› Issue (14): 166-168.DOI: 10.3778/j.issn.1002-8331.2010.14.048

• Graphics, Image, Pattern Recognition •

HVS and crossed masking based color image coding algorithm

WANG Xiang-yang1,2,LI Ling1   

  1.School of Computer and Information Technology, Liaoning Normal University, Dalian, Liaoning 116029, China
    2.National Laboratory on Machine Perception,Peking University,Beijing 100871,China
  • Received:2009-09-11 Revised:2009-11-16 Online:2010-05-11 Published:2010-05-11
  • Contact: WANG Xiang-yang


Abstract: Based on the SPIHT coding scheme, a new color image coding algorithm using asymmetric compression and crossed masking in the DWT domain is proposed. It exploits a Human Visual System (HVS) model and the eye's differing sensitivity to the components of a color image, namely chrominance and luminance. First, the digital color image is converted from RGB space to YCbCr space, and a discrete wavelet transform is applied to each component of the YCbCr space. Second, in accordance with the eye's sensitivity to luminance, the wavelet coefficients of the luminance component are weighted using the crossed-masking model. The components are then coded by combining the idea of asymmetric compression with SPIHT. Experimental results show that, at the same bit rate, the proposed technique improves the quality of the reconstructed image over SPIHT in both PSNR and perceptual quality.
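The front end of the pipeline described above can be sketched as follows. This is a minimal illustration only, assuming the standard BT.601 full-range RGB-to-YCbCr conversion and a one-level Haar DWT; the subband weights are made-up placeholder values, and the paper's actual crossed-masking model, asymmetric compression, and SPIHT coder are not reproduced here.

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 full-range RGB -> YCbCr for one pixel (components in 0..255)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def haar_dwt2(img):
    """One-level 2-D Haar DWT (averaging normalization) of a square
    image with even side length, given as a list of lists of floats.
    Returns the (LL, LH, HL, HH) subbands."""
    h, w = len(img), len(img[0])
    # Transform each row into low-pass | high-pass halves.
    rows = []
    for row in img:
        lo = [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(w // 2)]
        hi = [(row[2 * i] - row[2 * i + 1]) / 2 for i in range(w // 2)]
        rows.append(lo + hi)
    # Transform each column the same way.
    out = [[0.0] * w for _ in range(h)]
    for j in range(w):
        col = [rows[i][j] for i in range(h)]
        lo = [(col[2 * i] + col[2 * i + 1]) / 2 for i in range(h // 2)]
        hi = [(col[2 * i] - col[2 * i + 1]) / 2 for i in range(h // 2)]
        merged = lo + hi
        for i in range(h):
            out[i][j] = merged[i]
    h2, w2 = h // 2, w // 2
    LL = [r[:w2] for r in out[:h2]]
    LH = [r[w2:] for r in out[:h2]]
    HL = [r[:w2] for r in out[h2:]]
    HH = [r[w2:] for r in out[h2:]]
    return LL, LH, HL, HH

# Hypothetical perceptual weights (NOT the paper's crossed-masking values):
# the idea is that less visually sensitive high-frequency subbands of the
# luminance component are down-weighted before coding.
WEIGHTS = {"LL": 1.0, "LH": 0.8, "HL": 0.8, "HH": 0.6}

def weight_subband(band, w):
    """Scale every coefficient of a subband by the perceptual weight w."""
    return [[w * v for v in row] for row in band]
```

For example, a flat 4x4 luminance block concentrates all its energy in the LL subband, with the detail subbands zero, which is what makes weighting the detail subbands cheap in perceptual terms.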

Key words: color image coding, Set Partitioning In Hierarchical Trees(SPIHT), Human Visual System(HVS), crossed masking, weighting

