[1] OTTER D W, MEDINA J R, KALITA J K. A survey of the usages of deep learning for natural language processing[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(2): 604-624.
[2] XU S, WANG J, SHOU W, et al. Computer vision techniques in construction: a critical review[J]. Archives of Computational Methods in Engineering, 2021, 28(5): 3383-3397.
[3] PARISI G I, KEMKER R, PART J L, et al. Continual lifelong learning with neural networks: a review[J]. Neural Networks, 2019, 113: 54-71.
[4] ELWELL R, POLIKAR R. Incremental learning of concept drift in nonstationary environments[J]. IEEE Transactions on Neural Networks, 2011, 22(10): 1517-1531.
[5] YU L, TWARDOWSKI B, LIU X, et al. Semantic drift compensation for class-incremental learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 6980-6989.
[6] 韩亚楠, 刘建伟, 罗雄麟. 连续学习研究进展[J]. 计算机研究与发展, 2022, 59(6): 1213-1239.
HAN Y N, LIU J W, LUO X L. Research progress of continual learning[J]. Journal of Computer Research and Development, 2022, 59(6): 1213-1239.
[7] 徐岩柏, 景运革. 多源数据矩阵增量约简算法[J]. 计算机工程与应用, 2022, 58(3): 195-200.
XU Y B, JING Y G. Matrix-based incremental reduction approach of multi-resource data[J]. Computer Engineering and Applications, 2022, 58(3): 195-200.
[8] TAO X, HONG X, CHANG X, et al. Few-shot class-incremental learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020: 12180-12189.
[9] LI H, BARNAGHI P, ENSHAEIFAR S, et al. Continual learning using Bayesian neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(9): 4243-4252.
[10] MAI Z, LI R, JEONG J, et al. Online continual learning in image classification: an empirical survey[J]. Neurocomputing, 2022, 469: 28-51.
[11] KIRKPATRICK J, PASCANU R, RABINOWITZ N, et al. Overcoming catastrophic forgetting in neural networks[J]. Proceedings of the National Academy of Sciences, 2017, 114(13): 3521-3526.
[12] LI Z, HOIEM D. Learning without forgetting[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 40(12): 2935-2947.
[13] REBUFFI S A, KOLESNIKOV A, SPERL G, et al. iCaRL: incremental classifier and representation learning[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017: 2001-2010.
[14] 莫建文, 陈瑶嘉. 基于分类特征约束变分伪样本生成器的类增量学习[J]. 控制与决策, 2021, 36(10): 2475-2482.
MO J W, CHEN Y J. Class incremental learning based on variational pseudo-sample generator with classification feature constraints[J]. Control and Decision, 2021, 36(10): 2475-2482.
[15] ZHU F, ZHANG X Y, WANG C, et al. Prototype augmentation and self-supervision for incremental learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021: 5871-5880.
[16] 吴楚, 王士同. 任务相似度引导的渐进深度神经网络及其学习[J]. 计算机科学与探索, 2023, 17(5): 1126-1138.
WU Q, WANG S T. Task-similarity guided progressive deep neural network and its learning[J]. Journal of Frontiers of Computer Science and Technology, 2023, 17(5): 1126-1138.
[17] SCHWARZ J, CZARNECKI W, LUKETINA J, et al. Progress & compress: a scalable framework for continual learning[C]//International Conference on Machine Learning, 2018: 4528-4537.
[18] HE J, ZHU F. Online continual learning for visual food classification[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021: 2337-2346.
[19] HOU S, PAN X, LOY C C, et al. Learning a unified classifier incrementally via rebalancing[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019: 831-839.
[20] DHAR P, SINGH R V, PENG K C, et al. Learning without memorizing[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019: 5138-5146.
[21] KOLOURI S, KETZ N A, SOLTOGGIO A, et al. Sliced Cramér synaptic consolidation for preserving deeply learned representations[C]//International Conference on Learning Representations, 2020: 5348-5358.
[22] TERCAN H, DEIBERT P, MEISEN T. Continual learning of neural networks for quality prediction in production using memory aware synapses and weight transfer[J]. Journal of Intelligent Manufacturing, 2022, 33(1): 283-292.
[23] JOAN S, DíDAC S, MIRON M, et al. Overcoming catastrophic forgetting with hard attention to the task[C]//Proceedings of the 36th International Conference on Machine Learning. New York: ACM, 2019: 4555-4564. |