Computer Engineering and Applications ›› 2019, Vol. 55 ›› Issue (20): 139-144. DOI: 10.3778/j.issn.1002-8331.1806-0259

• Graphics and Image Processing •


Inception Model of Deep Convolutional Neural Network for Image Denoising

LI Min, ZHANG Guohao, ZENG Jianwei, YANG Xiaofeng, HU Xiaomin   

  1. School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China
    2. School of Computers, Guangdong University of Technology, Guangzhou 510006, China
  • Online:2019-10-15 Published:2019-10-14



Abstract: To remove noise from images more effectively, a deep Convolutional Neural Network (CNN) combined with the Inception model is proposed for image denoising, taking the whole image as both input and output. Features of the original image and the noise are densely extracted at multiple spatial scales through the Inception structure, and several tuning strategies enhance the overall learning ability of the network. The Rectified Linear Unit (ReLU) activation function is used to avoid the vanishing-gradient problem; Batch Normalization (BN) is added to accelerate training; and skip connections enable Residual Learning (RL), which boosts denoising performance. Experimental results at three Gaussian noise levels on the public BSDS300 dataset show that, compared with other image denoising methods, the proposed model reduces computational complexity and improves convergence speed while producing better visual results, with an average Peak Signal-to-Noise Ratio (PSNR) improvement of about 1.28 dB.
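The reported gain is measured in PSNR, which is derived from the mean squared error between the clean reference image and the denoised estimate. A minimal pure-Python sketch of this metric follows; the function name, flat-list image representation, and the 8-bit `max_val` default are illustrative assumptions, not details taken from the paper.

```python
import math

def psnr(reference, denoised, max_val=255.0):
    """Peak Signal-to-Noise Ratio (in dB) between two equally sized
    images, each given as a flat sequence of pixel values.

    PSNR = 10 * log10(max_val^2 / MSE), where MSE is the mean squared
    error between the reference and the denoised estimate.
    """
    if len(reference) != len(denoised):
        raise ValueError("images must have the same number of pixels")
    mse = sum((r - d) ** 2 for r, d in zip(reference, denoised)) / len(reference)
    if mse == 0:
        # Identical images: error is zero, so PSNR is unbounded.
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)
```

For example, a uniform error of 10 gray levels on an 8-bit image gives an MSE of 100 and hence a PSNR of about 28.13 dB; a 1.28 dB improvement, as reported, corresponds to a noticeably lower residual error.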

Key words: image denoising, deep convolutional neural network, Inception model, batch normalization, residual learning