Computer Engineering and Applications ›› 2022, Vol. 58 ›› Issue (1): 182-189.DOI: 10.3778/j.issn.1002-8331.2106-0174

• Pattern Recognition and Artificial Intelligence •

Facial Expression Recognition Based on Multi-scale Feature Attention Mechanism

ZHANG Peng, KONG Weiwei, TENG Jinbao   

  1. Xi’an University of Posts and Telecommunications, Xi’an 710121, China
  2. Shaanxi Provincial Key Laboratory of Network Data Analysis and Intelligent Processing, Xi’an 710121, China
  • Online: 2022-01-01  Published: 2022-01-06


Abstract: To address the weakly targeted feature extraction and low recognition accuracy of traditional convolutional neural networks in facial expression recognition, a facial expression recognition method based on a multi-scale feature attention mechanism is proposed. First, two convolutional layers extract shallow feature information. Second, dilated convolutions are added in parallel to the Inception structure to extract multi-scale feature information, and a channel attention mechanism is then introduced to improve the model's ability to represent important features. Finally, the resulting features are fed into a Softmax layer for classification. In simulation experiments on the public FER2013 and CK+ datasets, the proposed model achieves recognition accuracies of 68.8% and 96.04%, respectively, demonstrating better recognition performance than many classic algorithms.

Key words: convolutional neural network, facial expression recognition, dilated convolution, channel attention mechanism
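The pipeline described in the abstract — shallow convolutional layers, an Inception-style block with parallel dilated convolutions for multi-scale features, channel attention, then Softmax classification — can be sketched as below. This is a minimal illustrative reconstruction in PyTorch, not the authors' implementation: all layer widths, dilation rates, the squeeze-and-excitation-style attention, and the 48×48 single-channel input (typical of FER2013) are assumptions.

```python
# Hypothetical sketch of the architecture described in the abstract.
# All hyperparameters (channel counts, dilation rates, reduction ratio)
# are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn


class MultiScaleAttentionBlock(nn.Module):
    def __init__(self, in_ch, branch_ch=16, reduction=4):
        super().__init__()
        # Parallel 3x3 branches with increasing dilation rates enlarge the
        # receptive field at multiple scales without downsampling.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, branch_ch, 3, padding=d, dilation=d)
            for d in (1, 2, 4)
        ])
        out_ch = branch_ch * 3
        # Channel attention: global average pool -> bottleneck -> sigmoid,
        # producing one weight per channel (squeeze-and-excitation style).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch // reduction, 1), nn.ReLU(),
            nn.Conv2d(out_ch // reduction, out_ch, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        feats = torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)
        return feats * self.attn(feats)  # reweight channels by importance


class ExpressionNet(nn.Module):
    def __init__(self, num_classes=7):  # 7 basic expression categories
        super().__init__()
        # Two plain convolutional layers extract shallow features.
        self.shallow = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.block = MultiScaleAttentionBlock(32)
        # Pool, flatten, and map to class logits; Softmax is applied by the
        # cross-entropy loss during training.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(48, num_classes),
        )

    def forward(self, x):
        return self.head(self.block(self.shallow(x)))


# FER2013 images are 48x48 grayscale; batch of 2 for a shape check.
logits = ExpressionNet()(torch.randn(2, 1, 48, 48))
print(logits.shape)  # torch.Size([2, 7])
```

Dilation with matching padding (`padding=d, dilation=d` for a 3×3 kernel) keeps the spatial size unchanged, so the three branch outputs can be concatenated along the channel axis before attention reweighting.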
