Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (12): 193-200.DOI: 10.3778/j.issn.1002-8331.2003-0376


Image Super Resolution Reconstruction Based on Residual Convolution Attention Network

CHEN Guihui, CHEN Wu, LI Zhongbing, YI Xin, LIU Huikang, HAN Chunyang   

  1. School of Electrical Information, Southwest Petroleum University, Chengdu 610500, China
  • Online: 2021-06-15   Published: 2021-06-10

Abstract:

To address the problems of limited feature extraction, low information utilization, and equal treatment of high- and low-frequency information channels in very deep neural networks for image super-resolution reconstruction, an image super-resolution reconstruction algorithm based on a residual convolutional attention network is proposed. A multi-scale residual attention block is constructed so that the network extracts as much multi-scale feature information as possible, and a channel attention mechanism is introduced to strengthen the representation of high-frequency information channels. A convolutional attention block is introduced into the feature extraction structure to reduce the loss of high-frequency image detail. In the reconstruction layer, a global long-skip connection is introduced to further enrich the information flow of the reconstructed high-resolution image. Experimental results show that the PSNR and SSIM of the proposed algorithm on Set5 and other benchmark datasets are clearly higher than those of other methods based on deep convolutional neural networks, verifying the effectiveness and advancement of the proposed method.

Key words: image super-resolution reconstruction, feature extraction, multi-scale residual attention block, convolution attention block
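
The abstract describes the channel attention mechanism and residual structure only at a high level. The PyTorch sketch below is purely illustrative: it shows a generic residual block with squeeze-and-excitation style channel attention, which is one common way such a "residual attention block" can be realized. The channel count, reduction ratio, and layer layout are assumptions for illustration, not the authors' exact architecture.

```python
# Illustrative sketch only: a residual block with channel attention,
# approximating the kind of residual attention block the abstract
# describes. All hyperparameters here are assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Reweights feature channels so informative (high-frequency)
    channels are emphasized over less informative ones."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # global spatial pooling
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(self.pool(x))                    # per-channel weights in (0, 1)
        return x * w


class ResidualAttentionBlock(nn.Module):
    """Convolutions followed by channel attention, wrapped in a local
    skip connection so low-frequency content passes through directly."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            ChannelAttention(channels),
        )

    def forward(self, x):
        return x + self.body(x)                      # residual (skip) connection


if __name__ == "__main__":
    block = ResidualAttentionBlock(64)
    print(block(torch.randn(1, 64, 32, 32)).shape)   # torch.Size([1, 64, 32, 32])
```

In such designs, stacking several of these blocks and adding a global long-skip connection from the shallow features to the reconstruction layer is a typical way to realize the information-flow enrichment the abstract mentions.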
