Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (6): 295-303. DOI: 10.3778/j.issn.1002-8331.2311-0170

• Graphics and Image Processing •


Adaptive Separation of Knowledge Distillation for Remote Sensing Object Detection

YANG Xiaoyu, GU Jinguang   

  1. School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China
  2. Hubei Key Laboratory of Intelligent Information Processing and Real-Time Industrial System, Wuhan 430065, China
  • Online:2025-03-15 Published:2025-03-14



Abstract: In recent years, deep models have achieved great success in large-scale applications, but their computational complexity and storage requirements make them difficult to deploy on resource-limited devices. Knowledge distillation (KD) is a method for compressing models; however, existing methods do not take the characteristics of remote sensing datasets into account. Specifically, because remote sensing images have complex backgrounds and small target objects, directly applying existing knowledge distillation methods introduces a large amount of noise, which degrades training performance. Therefore, the adaptive separation of knowledge distillation (ASKD) method is proposed. ASKD allows the student model to automatically select multi-scale core features to reduce noise, and at the same time effectively suppresses background interference by separating global and local features. ASKD achieves excellent performance with both single-stage and two-stage detectors on the LEVIR and SSDD datasets. For example, with a ResNet-18-based Faster R-CNN, ASKD achieves 59.2% mAP on SSDD, 2.0 percentage points higher than the baseline model and even better than the teacher model.
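
To make the feature-distillation idea concrete, the Python (PyTorch-style) sketch below shows one way a distillation loss can separate local (foreground-masked) terms from global context and weight multiple feature scales adaptively. It is only an illustrative sketch under stated assumptions: the class name SeparatedFeatureDistillLoss, the mask-based separation, and the learnable per-scale weights are hypothetical and do not reproduce the authors' actual ASKD formulation; in practice a channel-adaptation layer is also needed when student and teacher feature widths differ.

# Illustrative sketch only: separates "local" (foreground-masked) and "global"
# distillation terms and learns per-scale weights. Not the authors' ASKD code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeparatedFeatureDistillLoss(nn.Module):
    def __init__(self, num_scales=4):
        super().__init__()
        # One learnable weight per feature-pyramid scale, so the student can
        # emphasise the scales that carry useful (e.g. small-object) information.
        self.scale_logits = nn.Parameter(torch.zeros(num_scales))

    def forward(self, student_feats, teacher_feats, fg_masks):
        # student_feats / teacher_feats: lists of [B, C, H, W] tensors, one per scale
        # fg_masks: list of [B, 1, H, W] binary masks marking object regions (assumed given)
        weights = torch.softmax(self.scale_logits, dim=0)
        loss = student_feats[0].new_zeros(())
        for w, fs, ft, m in zip(weights, student_feats, teacher_feats, fg_masks):
            ft = ft.detach()                                   # no gradient into the teacher
            local_term = F.mse_loss(fs * m, ft * m)            # foreground (local) features only
            global_term = F.mse_loss(fs.mean(dim=(2, 3)),      # globally pooled context
                                     ft.mean(dim=(2, 3)))
            loss = loss + w * (local_term + global_term)
        return loss

In a detector, such a term would typically be added to the standard detection losses while training the student, with fg_masks derived from ground-truth boxes projected onto each pyramid level (again an assumption made for this sketch).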

Key words: knowledge distillation, remote sensing, lightweight object detection