[1] LIU B. Lifelong machine learning: a paradigm for continuous learning[J]. Frontiers of Computer Science, 2017, 11: 359-361.
[2] ZENG G X, CHEN Y, CUI B, et al. Continual learning of context-dependent processing in neural networks[J]. Nature Machine Intelligence, 2019, 1(8): 364-372.
[3] NIU S C, WU J X, ZHANG Y F, et al. Disturbance-immune weight sharing for neural architecture search[J]. Neural Networks, 2021, 144: 553-564.
[4] CHEN S L, WU J J, LU Q H, et al. Cross-scene loop-closure detection with continual learning for visual simultaneous localization and mapping[J]. International Journal of Advanced Robotic Systems, 2021, 18(5): 17298814211050560.
[5] LI X, WANG W. GopGAN: gradients orthogonal projection generative adversarial network with continual learning[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(1): 215-227.
[6] HUA J Q, LI Y G, MOU W P, et al. An accurate cutting tool wear prediction method under different cutting conditions based on continual learning[J]. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 2022, 236(1/2): 123-131.
[7] CORY C, BENAVIDES-PRADO D, KOH Y S. Continual correction of errors using smart memory replay[C]//Proceedings of the International Joint Conference on Neural Networks. Piscataway: IEEE, 2021: 1-8.
[8] LIN G L, CHU H L, LAI H J. Towards better plasticity-stability trade-off in incremental learning: a simple linear connector[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2022: 89-98.
[9] WANG S P, LI X R, SUN J, et al. Training networks in null space of feature covariance for continual learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2021: 184-193.
[10] LI Z H, HOIEM D. Learning without forgetting[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(12): 2935-2947.
[11] REBUFFI S A, KOLESNIKOV A, SPERL G, et al. iCaRL: incremental classifier and representation learning[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017: 2001-2010.
[12] ROBINS A. Catastrophic forgetting, rehearsal and pseudorehearsal[J]. Connection Science, 1995, 7(2): 123-146.
[13] ATKINSON C, MCCANE B, SZYMANSKI L, et al. Pseudo-recursal: solving the catastrophic forgetting problem in deep neural networks[J]. arXiv:1802.03875, 2018.
[14] SHIN H, LEE J K, KIM J, et al. Continual learning with deep generative replay[J]. arXiv:1705.08690, 2017.
[15] MALLYA A, LAZEBNIK S. PackNet: adding multiple tasks to a single network by iterative pruning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2018: 7765-7773.
[16] SERRÀ J, SURÍS D, MIRON M, et al. Overcoming catastrophic forgetting with hard attention to the task[J]. arXiv:1801.01423, 2018.
[17] KIRKPATRICK J, PASCANU R, RABINOWITZ N, et al. Overcoming catastrophic forgetting in neural networks[J]. arXiv:1612.00796, 2016.
[18] LEE S W, KIM J H, JUN J, et al. Overcoming catastrophic forgetting by incremental moment matching[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates, 2017: 4655-4665.
[19] ALJUNDI R, BABILONI F, ELHOSEINY M, et al. Memory aware synapses: learning what not to forget[C]//Proceedings of the 15th European Conference on Computer Vision. Cham: Springer, 2018: 139-154.
[20] ZHANG B, GUO Y, LI Y, et al. Memory recall: a simple neural network training framework against catastrophic forgetting[J]. IEEE Transactions on Neural Networks and Learning Systems, 2022, 33(5): 2010-2022.
[21] HE X, JAEGER H. Overcoming catastrophic interference using conceptor-aided backpropagation[J]. arXiv:1707.04853, 2017.
[22] HU W, LIN Z, LIU B, et al. Overcoming catastrophic forgetting via model adaptation[C]//Proceedings of the International Conference on Learning Representations, 2019.
[23] ZHOU D W, WANG Q W, QI Z H, et al. Deep class-incremental learning: a survey[J]. arXiv:2302.03648, 2023.