[1] LI Q B, WEN Z Y, WU Z M, et al. A survey on federated learning systems: vision, hype and reality for data privacy and protection[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(4): 3347-3366.
[2] BERGHEL H. Equifax and the latest round of identity theft roulette[J]. Computer, 2017, 50(12): 72-76.
[3] WANG C H, WANG S S, ZHAO C, et al. The secondary isolated data island: isolated data island caused by blockchain in federated learning[C]//Proceedings of the 2023 IEEE International Conference on Bioinformatics and Biomedicine. Piscataway: IEEE, 2023: 4159-4166.
[4] MCMAHAN H B, MOORE E, RAMAGE D, et al. Communication-efficient learning of deep networks from decentralized data[J]. arXiv:1602.05629, 2016.
[5] FAN T, KANG Y, MA G Q, et al. FATE-LLM: a industrial grade federated learning framework for large language models[J]. arXiv:2310.10049, 2023.
[6] IMTEAJ A, THAKKER U, WANG S Q, et al. A survey on federated learning for resource-constrained IoT devices[J]. IEEE Internet of Things Journal, 2022, 9(1): 1-24.
[7] LEE N, AJANTHAN T, TORR P H S, et al. Understanding the effects of data parallelism and sparsity on neural network training[J]. arXiv:2003.11316, 2020.
[8] KARIMIREDDY S P, KALE S, MOHRI M, et al. SCAFFOLD: stochastic controlled averaging for federated learning[C]//Proceedings of the International Conference on Machine Learning, 2020: 5132-5143.
[9] LI Q B, DIAO Y Q, CHEN Q, et al. Federated learning on non-IID data silos: an experimental study[C]//Proceedings of the 2022 IEEE 38th International Conference on Data Engineering. Piscataway: IEEE, 2022: 965-978.
[10] LIU B Y, LV N Y, GUO Y C, et al. Recent advances on federated learning: a systematic survey[J]. Neurocomputing, 2024, 597: 128019.
[11] LI T, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450.
[12] WANG J Y, LIU Q H, LIANG H, et al. Tackling the objective inconsistency problem in heterogeneous federated optimization[C]//Advances in Neural Information Processing Systems, 2020: 7611-7623.
[13] ZHANG J Q, HUA Y, WANG H, et al. GPFL: simultaneously learning global and personalized feature information for personalized federated learning[C]//Proceedings of the 2023 IEEE/CVF International Conference on Computer Vision. Piscataway: IEEE, 2023: 5018-5028.
[14] RABBANI T, BORNSTEIN M, HUANG F R, et al. Large-scale distributed learning via private on-device locality-sensitive hashing[C]//Proceedings of the 37th International Conference on Neural Information Processing Systems. New York: ACM, 2023: 16153-16171.
[15] TAN Y, LONG G D, LIU L, et al. FedProto: federated prototype learning across heterogeneous clients[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2022, 36(8): 8432-8440.
[16] CHEN C, ZHANG J, TUNG A K H, et al. Robust federated recommendation system[J]. arXiv:2006.08259, 2020.
[17] SMITH V, CHIANG C K, SANJABI M, et al. Federated multi-task learning[C]//Advances in Neural Information Processing Systems, 2017: 4427-4437.
[18] ABDULRAHMAN S, TOUT H, OULD-SLIMANE H, et al. A survey on federated learning: the journey from centralized to distributed on-site learning and beyond[J]. IEEE Internet of Things Journal, 2021, 8(7): 5476-5497.
[19] BHAGOJI A, CHAKRABORTY S, MITTAL P, et al. Analyzing federated learning through an adversarial lens[C]//Proceedings of the International Conference on Machine Learning, 2019: 634-643.
[20] MOSHAWRAB M, ADDA M, BOUZOUANE A, et al. Reviewing federated learning aggregation algorithms; strategies, contributions, limitations and future perspectives[J]. Electronics, 2023, 12(10): 2287.
[21] VINYALS O, BLUNDELL C, LILLICRAP T, et al. Matching networks for one shot learning[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM, 2016: 3637-3645.
[22] KOCH G, ZEMEL R, SALAKHUTDINOV R. Siamese neural networks for one-shot image recognition[C]//Proceedings of the 32nd International Conference on Machine Learning, 2015.
[23] SNELL J, SWERSKY K, ZEMEL R S. Prototypical networks for few-shot learning[C]//Advances in Neural Information Processing Systems, 2017: 4080-4090.
[24] LIU L, HAMILTON W, LONG G D, et al. A universal representation transformer layer for few-shot image classification[J]. arXiv:2006.11702, 2020.
[25] LI J N, ZHOU P, XIONG C M, et al. Prototypical contrastive learning of unsupervised representations[J]. arXiv:2005.04966, 2020.
[26] XUE Y H, JOSHI S, GAN E, et al. Which features are learnt by contrastive learning? On the role of simplicity bias in class collapse and feature suppression[C]//Proceedings of the International Conference on Machine Learning, 2023: 38938-38970.
[27] LI Z W, ZHAO L, ZHANG Z Z, et al. Steering prototypes with prompt-tuning for rehearsal-free continual learning[C]//Proceedings of the 2024 IEEE/CVF Winter Conference on Applications of Computer Vision. Piscataway: IEEE, 2024: 2511-2521.
[28] HAO F S, HE F X, LIU L, et al. Class-aware patch embedding adaptation for few-shot image classification[C]//Proceedings of the 2023 IEEE/CVF International Conference on Computer Vision. Piscataway: IEEE, 2023: 18859-18869.
[29] GUO C, MOUSAVI A, WU X, et al. Breaking the glass ceiling for embedding-based classifiers for large output spaces[C]//Advances in Neural Information Processing Systems, 2019: 4943-4953.
[30] YU F X, RAWAT A S, MENON A K, et al. Federated learning with only positive labels[C]//Proceedings of the 37th International Conference on Machine Learning. New York: ACM, 2020: 10946-10956.
[31] ZHANG T. Statistical behavior and consistency of classification methods based on convex risk minimization[J]. The Annals of Statistics, 2004, 32(1): 56-85.
[32] CEVHER V, BECKER S, SCHMIDT M. Convex optimization for big data: scalable, randomized, and parallel algorithms for big data analytics[J]. IEEE Signal Processing Magazine, 2014, 31(5): 32-43.
[33] ZHU L, LIU Z, HAN S. Deep leakage from gradients[C]//Advances in Neural Information Processing Systems, 2019.
[34] GAMBA M, AZIZPOUR H, BJÖRKMAN M. On the Lipschitz constant of deep networks and double descent[J]. arXiv:2301.12309, 2023.
[35] 王鑫, 郭雅婷, 杨波. 有限域(F_{2^8})^8上基于循环异或移位结构的次优扩散层研究[J]. 陕西科技大学学报, 2023, 41(4): 188-194.
WANG X, GUO Y T, YANG B. Study on suboptimal diffusion layer based on rotational-XOR shifted structure over the finite field (F_{2^8})^8[J]. Journal of Shaanxi University of Science & Technology, 2023, 41(4): 188-194.
[36] 王鑫, 韩志宇, 王新梅, 等. 一种基于多变量公钥密码体制的改进签名模型[J]. 陕西科技大学学报, 2020, 38(5): 157-164.
WANG X, HAN Z Y, WANG X M, et al. An improved signature model based on multivariate polynomial public key cryptosystem[J]. Journal of Shaanxi University of Science & Technology, 2020, 38(5): 157-164.
[37] KRIZHEVSKY A, HINTON G. Learning multiple layers of features from tiny images[R]. Toronto: University of Toronto, 2009.