Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (20): 197-205. DOI: 10.3778/j.issn.1002-8331.2203-0266

• Graphics and Image Processing •

Image Super-Resolution with Light-Weighted Pyramid Pooling-Based Attention Network

FANG Jinsheng, ZHU Gupei   

1. School of Computer Science and Engineering, Minnan Normal University, Zhangzhou, Fujian 363000, China
  2. Fujian Province Key Laboratory of Data Science and Intelligence Application (Minnan Normal University), Zhangzhou, Fujian 363000, China
• Online: 2022-10-15  Published: 2022-10-15

Abstract: In deep learning-based image super-resolution reconstruction, most current algorithms improve performance by enlarging the network, which increases the demand for computing resources. To address this problem, a light-weighted pyramid pooling-based attention network (LiPAN) is proposed, composed of attention-enhanced information distillation blocks, a multi-level pyramid pooling structure, and a backward attention fusion module. The attention mechanism ensures that the network extracts important features, the pyramid pooling structure captures more contextual information for more accurate reconstruction, and the distillation structure effectively improves performance while reducing the number of parameters. In quantitative evaluations at scale factors of 2, 3, and 4 on four public datasets (Set5, Set14, BSD100, and Urban100), the proposed LiPAN achieves superior PSNR and SSIM values compared with state-of-the-art lightweight models. These results show that LiPAN delivers better super-resolution reconstruction performance with a parameter count comparable to that of current mainstream lightweight networks.
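
The abstract does not spell out the internals of the pyramid pooling-based attention. As a rough, hypothetical illustration only (the class name, pooling sizes, and channel-reduction factor are assumptions rather than the authors' implementation), a PyTorch module of this kind could pool the feature map at several scales, project and upsample each pooled map, and fuse them into an attention map that reweights the input features:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidPoolingAttention(nn.Module):
    """Hypothetical sketch of a pyramid pooling-based attention block."""

    def __init__(self, channels, pool_sizes=(1, 2, 4, 8)):
        super().__init__()
        # One branch per pyramid level: adaptive average pooling followed by
        # a 1x1 projection that shrinks the channel count.
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(size),
                nn.Conv2d(channels, channels // 4, kernel_size=1, bias=False),
            )
            for size in pool_sizes
        ])
        self.fuse = nn.Conv2d(channels // 4 * len(pool_sizes), channels, kernel_size=1)

    def forward(self, x):
        h, w = x.shape[-2:]
        # Pool at multiple scales, project, then upsample back to the input size.
        pyramid = [
            F.interpolate(stage(x), size=(h, w), mode="bilinear", align_corners=False)
            for stage in self.stages
        ]
        # Fuse the multi-scale context into an attention map and reweight the input.
        attention = torch.sigmoid(self.fuse(torch.cat(pyramid, dim=1)))
        return x * attention


if __name__ == "__main__":
    feats = torch.randn(1, 64, 48, 48)                    # assumed feature-map size
    print(PyramidPoolingAttention(64)(feats).shape)       # torch.Size([1, 64, 48, 48])
```

The multi-scale pooling is what gives the attention map access to the wider context the abstract refers to: coarse levels summarize large regions, while fine levels preserve local detail.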

Key words: super-resolution reconstruction, attention mechanism, distillation network, light-weighted, pyramid pooling
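
The information distillation structure is likewise only named, not detailed. The sketch below is a speculative, IMDN-style channel-splitting block (layer counts, split ratio, and activation are assumptions, not the paper's exact design); it keeps a small "distilled" slice of the channels at each step and refines only the remainder, which is where the parameter savings of distillation come from:

```python
import torch
import torch.nn as nn


class InformationDistillationBlock(nn.Module):
    """Hypothetical IMDN-style distillation block (not the paper's exact design)."""

    def __init__(self, channels, distill_ratio=0.25):
        super().__init__()
        self.d = int(channels * distill_ratio)   # channels retained at each step
        self.r = channels - self.d               # channels passed on for refinement
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(self.r, channels, 3, padding=1)
        self.conv3 = nn.Conv2d(self.r, self.d, 3, padding=1)
        self.fuse = nn.Conv2d(self.d * 3, channels, 1)
        self.act = nn.LeakyReLU(0.05, inplace=True)

    def forward(self, x):
        # Step 1: refine, then split off a distilled slice.
        d1, r1 = torch.split(self.act(self.conv1(x)), [self.d, self.r], dim=1)
        # Step 2: only the remainder is processed further.
        d2, r2 = torch.split(self.act(self.conv2(r1)), [self.d, self.r], dim=1)
        # Step 3: final refinement produces the last distilled slice.
        d3 = self.act(self.conv3(r2))
        # Concatenate the distilled slices, compress, and add the residual.
        return self.fuse(torch.cat([d1, d2, d3], dim=1)) + x
```

Because the deeper convolutions operate on only a fraction of the channels, a block like this uses fewer parameters than a plain stack of full-width convolutions, which matches the abstract's claim that distillation improves performance while keeping the model lightweight.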
