[1] SZEGEDY C, ZAREMBA W, SUTSKEVER I, et al. Intriguing properties of neural networks[J]. arXiv:1312.6199, 2013.
[2] THYS S, VAN RANST W, GOEDEMÉ T. Fooling automated surveillance cameras: adversarial patches to attack person detection[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.
[3] ATHALYE A, ENGSTROM L, ILYAS A, et al. Synthesizing robust adversarial examples[C]//Proceedings of the International Conference on Machine Learning, 2018: 284-293.
[4] LU M, LI Q, CHEN L, et al. Scale-adaptive adversarial patch attack for remote sensing image aircraft detection[J]. Remote Sensing, 2021, 13(20): 4078.
[5] DEN HOLLANDER R, ADHIKARI A, TOLIOS I, et al. Adversarial patch camouflage against aerial detection[C]//Proceedings of the IEEE Conference on Artificial Intelligence and Machine Learning in Defense Applications, 2020: 77-86.
[6] DU A, CHEN B, CHIN T J, et al. Physical adversarial attacks on an aerial imagery object detector[C]//Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2022: 1796-1806.
[7] CHOW K H, LIU L, LOPER M, et al. Adversarial objectness gradient attacks in real-time object detection systems[C]//Proceedings of the 2020 Second IEEE International Conference on Trust, Privacy and Security in Intelligent Systems and Applications (TPS-ISA), 2020: 263-272.
[8] 王志波,王雪,马菁菁,等. 面向计算机视觉系统的对抗样本攻击综述[J]. 计算机学报, 2023, 46(2): 436-468.
WANG Z B, WANG X, MA J J, et al. Survey on adversarial example attack for computer vision systems[J]. Chinese Journal of Computers, 2023, 46(2): 436-468.
[9] LU J, SIBAI H, FABRY E. Adversarial examples that fool detectors[J]. arXiv:1712.02494, 2017.
[10] XIE C, WANG J, ZHANG Z, et al. Adversarial examples for semantic segmentation and object detection[C]//Proceedings of the IEEE International Conference on Computer Vision, 2017: 1369-1378.
[11] REN S, HE K, GIRSHICK R, et al. Faster R-CNN: towards real-time object detection with region proposal networks[C]//Advances in Neural Information Processing Systems, 2015.
[12] LI Y, TIAN D, CHANG M C, et al. Robust adversarial perturbation on deep proposal-based models[J]. arXiv:1809.05962, 2018.
[13] WEI X, LIANG S, CHEN N, et al. Transferable adversarial attacks for image and video object detection[J]. arXiv:1811.12641, 2018.
[14] MADRY A, MAKELOV A, SCHMIDT L, et al. Towards deep learning models resistant to adversarial attacks[J]. arXiv:1706.06083, 2017.
[15] LIU X, YANG H, LIU Z, et al. DPatch: an adversarial patch attack on object detectors[J]. arXiv:1806.02299, 2018.
[16] 王烨奎,曹铁勇,郑云飞,等. 基于特征图关注区域的目标检测对抗攻击方法[J]. 计算机工程与应用, 2023, 59(2):261-270.
WANG Y K, CAO T Y, ZHENG Y F, et al. Adversarial attacks for object detection based on region of interest of feature maps[J]. Computer Engineering and Applications, 2023, 59(2): 261-270.
[17] SHARIF M, BHAGAVATULA S, BAUER L, et al. Accessorize to a crime: real and stealthy attacks on state-of-the-art face recognition[C]//Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, 2016: 1528-1540.
[18] WANG Y, LV H, KUANG X, et al. Towards a physical-world adversarial patch for blinding object detection models[J]. Information Sciences, 2021, 556: 459-471.
[19] REDMON J, FARHADI A. YOLO9000: better, faster, stronger[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017: 7263-7271.
[20] REDMON J, FARHADI A. YOLOv3: an incremental improvement[J]. arXiv:1804.02767, 2018.
[21] HOORY S, SHAPIRA T, SHABTAI A, et al. Dynamic adversarial patch for evading object detection models[J]. arXiv:2010.13070, 2020.
[22] HU Y C T, KUNG B H, TAN D S, et al. Naturalistic physical adversarial patch for object detectors[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021: 7848-7857.
[23] TAN J, JI N, XIE H, et al. Legitimate adversarial patches: evading human eyes and detection models in the physical world[C]//Proceedings of the 29th ACM International Conference on Multimedia, 2021: 5307-5315.
[24] WANG D, JIANG T, SUN J, et al. FCA: learning a 3D full-coverage vehicle camouflage for multi-view physical adversarial attack[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2022: 2414-2422.
[25] HU Z, HUANG S, ZHU X, et al. Adversarial texture for fooling person detectors in the physical world[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022: 13307-13316.
[26] ZOLFI A, KRAVCHIK M, ELOVICI Y, et al. The translucent patch: a physical and universal attack on object detectors[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021: 15232-15241.
[27] ZHANG Y C, ZHANG Y, QI J H, et al. Adversarial patch attack on multi-scale object detection for UAV remote sensing images[J]. Remote Sensing, 2022, 14(21): 5298.
[28] SAVA P A, SCHULZE J P, SPERL P, et al. Assessing the impact of transformations on physical adversarial attacks[C]//Proceedings of the 15th ACM Workshop on Artificial Intelligence and Security, 2022: 79-90.
[29] BUSLAEV A, IGLOVIKOV V I, KHVEDCHENYA E, et al. Albumentations: fast and flexible image augmentations[J]. Information, 2020, 11(2): 125.
[30] ZHU P, WEN L, DU D, et al. Detection and tracking meet drones challenge[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 44(11): 7380-7399.