Computer Engineering and Applications ›› 2025, Vol. 61 ›› Issue (17): 282-291. DOI: 10.3778/j.issn.1002-8331.2404-0128

• Graphics and Image Processing •


Attention Tracking Algorithm for Efficient Tracking Heads

YANG Xiaoqiang, HU Hao   

  1. School of Computer Science and Technology, Xi’an University of Science and Technology, Xi’an 710000, China
  • Online: 2025-09-01  Published: 2025-09-01


Abstract: To improve tracking accuracy and running speed, an improved attention-based tracking algorithm (MFATrack) is proposed. To suppress the interference that complex backgrounds cause during tracking, a dynamic tracking module (MFAM) combining depth-wise separable convolution, efficient channel attention (ECA), and a low-pass filter is used to strengthen the network's ability to extract discriminative features of the target. A tracking head network built on MFAM is designed to reduce information loss in the deep layers, improve stability, and raise the network's running speed. In the loss function, the classification and regression losses are combined: the classification loss incorporates an aware IoU term, while the regression loss uses the generalized IoU (GIoU) loss, so that the network focuses more on the tracked target during training. Experimental results show that, compared with the baseline algorithm, accuracy on the GOT-10k dataset improves by 4.1 percentage points, the center-error (precision) score on OTB improves by 1.9 percentage points, and the tracking success rate on UAV123 improves by 3.6 percentage points.
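The MFAM described above combines depth-wise separable convolution with efficient channel attention (ECA). As a rough illustration of the general ECA idea only (not the paper's implementation; the kernel size and the uniform, unlearned kernel weights below are placeholder assumptions), channel attention over a feature map can be sketched as:

```python
import numpy as np

def eca(features, k=3):
    """Apply an ECA-style channel attention gate to a (C, H, W) feature map."""
    c = features.shape[0]
    # Global average pooling -> one descriptor per channel
    y = features.mean(axis=(1, 2))                     # shape (C,)
    # 1-D convolution of size k across neighboring channels (zero padding);
    # real ECA learns these weights, here they are uniform placeholders
    pad = k // 2
    yp = np.pad(y, pad)
    w = np.ones(k) / k
    att = np.array([np.dot(yp[i:i + k], w) for i in range(c)])
    att = 1.0 / (1.0 + np.exp(-att))                   # sigmoid gate in (0, 1)
    # Reweight every channel of the input by its attention value
    return features * att[:, None, None]
```

Because the gate is computed by a 1-D convolution over the pooled channel descriptors rather than fully connected layers, the parameter count stays small, which fits the abstract's emphasis on running speed.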

Key words: target tracking, aware IoU, attention mechanism, low-pass filter
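For readers unfamiliar with the generalized IoU term used in the regression loss, here is a minimal self-contained sketch; the [x1, y1, x2, y2] box format is an assumption for illustration, not taken from the paper:

```python
# Minimal sketch of the generalized IoU (GIoU) regression loss; boxes are
# assumed to be axis-aligned rectangles given as [x1, y1, x2, y2].

def giou_loss(pred, gt):
    """Return 1 - GIoU for two boxes; 0 for identical boxes, up to 2 when disjoint."""
    # Intersection rectangle
    ix1, iy1 = max(pred[0], gt[0]), max(pred[1], gt[1])
    ix2, iy2 = min(pred[2], gt[2]), min(pred[3], gt[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_p = (pred[2] - pred[0]) * (pred[3] - pred[1])
    area_g = (gt[2] - gt[0]) * (gt[3] - gt[1])
    union = area_p + area_g - inter
    iou = inter / union

    # Smallest enclosing box C penalizes predictions far from the target,
    # giving a useful gradient even when the boxes do not overlap
    cx1, cy1 = min(pred[0], gt[0]), min(pred[1], gt[1])
    cx2, cy2 = max(pred[2], gt[2]), max(pred[3], gt[3])
    area_c = (cx2 - cx1) * (cy2 - cy1)

    giou = iou - (area_c - union) / area_c
    return 1.0 - giou

print(giou_loss([0, 0, 2, 2], [0, 0, 2, 2]))  # → 0.0
```

Unlike plain 1 - IoU, this loss is still informative for non-overlapping boxes, which is why GIoU-style terms are common in tracking and detection heads.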