[1] IBRAHIM I A, RAWINDRAN H, ALAM M M, et al. Mitigating persistent organic pollutants from marine plastics through enhanced recycling: a review[J]. Environmental Research, 2024, 240: 117533.
[2] ROCHMAN C M, BROWNE M A, HALPERN B S, et al. Classify plastic waste as hazardous[J]. Nature, 2013, 494(7436): 169-171.
[3] MADRICARDO F, GHEZZO M, NESTO N, et al. How to deal with seafloor marine litter: an overview of the state-of-the-art and future perspectives[J]. Frontiers in Marine Science, 2020, 7: 505134.
[4] STEELE C L W, MILLER M R. Temporal trends in anthropogenic marine macro-debris and micro-debris accumulation on the California Channel Islands[J]. Frontiers in Marine Science, 2022, 9: 905969.
[5] MITCHELL K, LIMA A T, VAN CAPPELLEN P. Selenium in buoyant marine debris biofilm[J]. Marine Pollution Bulletin, 2019, 149: 110562.
[6] KAMAEV A N, SUKHENKO V A, KARMANOV D A. Constructing and visualizing three-dimensional sea bottom models to test AUV machine vision systems[J]. Programming and Computer Software, 2017, 43(3): 184-195.
[7] TURNER J A, BABCOCK R C, HOVEY R, et al. AUV-based classification of benthic communities of the Ningaloo shelf and mesophotic areas[J]. Coral Reefs, 2018, 37(3): 763-778.
[8] ZHENG Q H, TIAN X Y, YU Z G, et al. MobileRaT: a lightweight radio transformer method for automatic modulation classification in drone communication systems[J]. Drones, 2023, 7(10): 596.
[9] ZHENG Q H, SAPONARA S, TIAN X Y, et al. A real-time constellation image classification method of wireless communication signals based on the lightweight network MobileViT[J]. Cognitive Neurodynamics, 2024, 18(2): 659-671.
[10] ZOCCO F, LIN T C, HUANG C I, et al. Towards more efficient EfficientDets and real-time marine debris detection[J]. IEEE Robotics and Automation Letters, 2023, 8(4): 2134-2141.
[11] ZHOU W, ZHENG F J, YIN G, et al. YOLOTrashCan: a deep learning marine debris detection network[J]. IEEE Transactions on Instrumentation and Measurement, 2022, 72: 5002012.
[12] BOCHKOVSKIY A, WANG C Y, LIAO H Y. YOLOv4: optimal speed and accuracy of object detection[J]. arXiv:2004.10934, 2020.
[13] KÖHLER M, EISENBACH M, GROSS H M. Few-shot object detection: a comprehensive survey[J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(9): 11958-11978.
[14] SÁNCHEZ-FERRER A, VALERO-MAS J J, GALLEGO A J, et al. An experimental study on marine debris location and recognition using object detection[J]. Pattern Recognition Letters, 2023, 168: 154-161.
[15] ZHU S H, ZHANG K. Few-shot object detection via data augmentation and distribution calibration[J]. Machine Vision and Applications, 2023, 35(1): 11.
[16] WANG B, HUA L J, MEI H, et al. Monitoring marine pollution for carbon neutrality through a deep learning method with multi-source data fusion[J]. Frontiers in Ecology and Evolution, 2023, 11: 1257542.
[17] YAN X P, CHEN Z L, XU A N, et al. Meta R-CNN: towards general solver for instance-level low-shot learning[C]//Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision. Piscataway: IEEE, 2019: 9576-9585.
[18] FAN Q, ZHUO W, TANG C K, et al. Few-shot object detection with attention-RPN and multi-relation detector[C]//Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2020: 4012-4021.
[19] HAN G X, HUANG S Y, MA J W, et al. Meta Faster R-CNN: towards accurate few-shot object detection with attentive feature alignment[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2022: 780-789.
[20] KINGMA D P, WELLING M. Auto-encoding variational Bayes[J]. arXiv:1312.6114, 2013.
[21] HAN J M, REN Y Q, DING J, et al. Few-shot object detection via variational feature aggregation[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2023: 755-763.
[22] WANG X, HUANG T E, DARRELL T, et al. Frustratingly simple few-shot object detection[C]//Proceedings of the International Conference on Machine Learning, 2020.
[23] SUN B, LI B H, CAI S C, et al. FSCE: few-shot object detection via contrastive proposal encoding[C]//Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2021: 7348-7358.
[24] REN S Q, HE K M, GIRSHICK R, et al. Faster R-CNN: towards real-time object detection with region proposal networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 39(6): 1137-1149.
[25] QIAO L M, ZHAO Y X, LI Z Y, et al. DeFRCN: decoupled faster R-CNN for few-shot object detection[C]//Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision. Piscataway: IEEE, 2021: 8661-8670.
[26] SONG Y S, WANG T, CAI P Y, et al. A comprehensive survey of few-shot learning: evolution, applications, challenges, and opportunities[J]. ACM Computing Surveys, 2023, 55(13s): 1-40.
[27] YAN D T, HUANG J T, SUN H, et al. Few-shot object detection with weight imprinting[J]. Cognitive Computation, 2023, 15(5): 1725-1735.
[28] LI L J, YAO X W, WANG X, et al. Robust few-shot aerial image object detection via unbiased proposals filtration[J]. IEEE Transactions on Geoscience and Remote Sensing, 2023, 61: 5617011.
[29] WANG H, TIAN S Z, FU Y, et al. Feature augmentation based on information fusion rectification for few-shot image classification[J]. Scientific Reports, 2023, 13: 3607.
[30] YAO J, SHI T Y, CHE X P, et al. DA-FSOD: a novel data augmentation scheme for few-shot object detection[J]. IEEE Access, 2023, 11: 92100-92110.
[31] HONG J, FULTON M, SATTAR J. TrashCan: a semantically-segmented dataset towards visual detection of marine debris[J]. arXiv:2007.08097, 2020.
[32] FU C P, LIU R S, FAN X, et al. Rethinking general underwater object detection: datasets, challenges, and solutions[J]. Neurocomputing, 2023, 517: 243-256.
[33] EVERINGHAM M, VAN GOOL L, WILLIAMS C K I, et al. The PASCAL visual object classes (VOC) challenge[J]. International Journal of Computer Vision, 2010, 88(2): 303-338.
[34] XIAO Y, LEPETIT V, MARLET R. Few-shot object detection and viewpoint estimation for objects in the wild[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(3): 3090-3106.