Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (9): 151-160.DOI: 10.3778/j.issn.1002-8331.2011-0476

• Pattern Recognition and Artificial Intelligence •

Application of Parallel Attention Mechanism in Image Semantic Segmentation

ZHANG Han, ZHANG Dexiang, CHEN Peng, ZHANG Jun, WANG Bing   

  1. School of Electrical Engineering and Automation, Anhui University, Hefei 230601, China
  2. National Engineering Research Center for Agro-Ecological Big Data Analysis & Application, Internet Academy, Anhui University, Hefei 230601, China
  3. School of Electrical and Information Engineering, Anhui University of Technology, Ma’anshan, Anhui 201804, China

  • Online: 2022-05-01  Published: 2022-05-01


Abstract: Integrating attention mechanisms into convolutional neural networks has increasingly become an important way to strengthen feature learning for semantic segmentation. This paper proposes a convolutional neural network that combines local attention and global attention. Features extracted from the input image by the backbone network are fed in parallel to the local attention and global attention modules. The local attention module uses an encoder-decoder structure to fuse local features at multiple scales, while the global attention module captures global information based on the correlation between each pixel and all pixels on the feature map. Fusing the two attention modules not only reduces the loss of local information but also captures global information with long-distance dependencies, effectively improving the feature extraction capability. A data-dependent upsampling method replaces bilinear interpolation to restore the feature map to the input size, improving the segmentation results. The Dice Loss function is adopted, with weight coefficients added before each class loss to address sample imbalance, which further improves the segmentation results. The method achieves Mean IoU scores of 96.39%, 93.44%, and 96.28% on the pill contamination dataset, the pill crack dataset, and the corridor dataset, respectively.

Key words: local attention, global attention, data-dependent upsampling, sample imbalance
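Two of the components named in the abstract can be illustrated concretely: global attention that weights each pixel by its correlation with every pixel on the feature map, and a Dice loss with per-class weight coefficients for sample imbalance. The sketch below is a minimal NumPy illustration of these general techniques, not the paper's actual implementation; all function names and shapes are assumptions for illustration.

```python
import numpy as np

def global_attention(feat):
    """Sketch of pixel-wise global attention on a (C, H, W) feature map.

    Each output pixel is a weighted sum of all pixel features, with
    weights given by a softmax over pairwise feature similarities.
    """
    C, H, W = feat.shape
    x = feat.reshape(C, H * W)                    # flatten spatial dims: (C, N)
    energy = x.T @ x                              # (N, N) pairwise similarity
    energy -= energy.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(energy)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over all pixels
    out = x @ attn.T                              # aggregate features by attention
    return out.reshape(C, H, W)

def weighted_dice_loss(pred, target, class_weights, eps=1e-6):
    """Dice loss with a weight coefficient per class (sample-imbalance fix).

    pred, target: (K, N) arrays of per-class probabilities / one-hot labels.
    """
    inter = (pred * target).sum(axis=1)
    denom = pred.sum(axis=1) + target.sum(axis=1)
    dice = (2.0 * inter + eps) / (denom + eps)    # per-class Dice coefficient
    per_class_loss = 1.0 - dice
    # Rare classes can be given larger weights to counter imbalance.
    return float((class_weights * per_class_loss).sum() / class_weights.sum())
```

A perfect prediction drives the weighted Dice loss to zero regardless of the weights, while errors on heavily weighted (rare) classes are penalized more, which matches the abstract's motivation for adding weight coefficients before the class losses.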