[1] SUN C, LI H, LI X, et al. Convergence of recommender systems and edge computing: a comprehensive survey[J]. IEEE Access, 2020, 8: 47118-47132.
[2] NAWAZ A, AHMED S, KHATTAK H A, et al. Latest advances in Internet of Things and big data with requirements and taxonomy[C]//Proceedings of the 7th International Conference on Information Technology Trends, Abu Dhabi, 2020: 13-19.
[3] SHI Y, YANG K, JIANG T, et al. Communication-efficient edge AI: algorithms and systems[J]. IEEE Communications Surveys & Tutorials, 2020, 22(4): 2167-2191.
[4] 孙兵, 刘艳, 王田, 等. 移动边缘网络中联邦学习效率优化综述[J]. 计算机研究与发展, 2022, 59(7): 1439-1469.
SUN B, LIU Y, WANG T, et al. Survey on optimization of federated learning efficiency in mobile edge networks[J]. Journal of Computer Research and Development, 2022, 59(7): 1439-1469.
[5] LIM W Y B, LUONG N C, HOANG D T, et al. Federated learning in mobile edge networks: a comprehensive survey[J]. IEEE Communications Surveys & Tutorials, 2020, 22(3): 2031-2063.
[6] DOMINGO-FERRER J, BLANCO-JUSTICIA A, MANJÓN J, et al. Secure and privacy-preserving federated learning via co-utility[J]. IEEE Internet of Things Journal, 2022, 9(5): 3988-4000.
[7] LIU Y, YUAN X, XIONG Z, et al. Federated learning for 6G communications: challenges, methods, and future directions[J]. China Communications, 2020, 17(9): 105-118.
[8] TAO Y, ZHOU J. Straggler remission for federated learning via decentralized redundant Cayley tree[C]//Proceedings of the 12th IEEE Latin-American Conference on Communications, 2020: 1-6.
[9] DI LORENZO P, BATTILORO C, MERLUZZI M, et al. Dynamic resource optimization for adaptive federated learning at the wireless network edge[C]//Proceedings of the 2021 IEEE International Conference on Acoustics, Speech and Signal Processing, Toronto, 2021: 4910-4914.
[10] KHAN L U, MAJEED U, HONG C S. Federated learning for cellular networks: joint user association and resource allocation[C]//Proceedings of the 21st Asia-Pacific Network Operations and Management Symposium, Tainan, China, 2020: 405-408.
[11] SERVETNYK M, FUNG C C, HAN Z. Unsupervised federated learning for unbalanced data[C]//Proceedings of the 2020 IEEE Global Communications Conference, Taipei, China, 2020: 1-6.
[12] WANG L P, WANG W, LI B. CMFL: mitigating communication overhead for federated learning[C]//Proceedings of the 39th International Conference on Distributed Computing Systems, Dallas, 2019: 954-964.
[13] DU Y, YANG S, HUANG K. High-dimensional stochastic gradient quantization for communication-efficient edge learning[J]. IEEE Transactions on Signal Processing, 2020, 68: 2128-2142.
[14] CALDAS S, DUDDU S M K, WU P, et al. LEAF: a benchmark for federated settings[J]. arXiv:1812.01097, 2018.
[15] SANTRA S, HSIEH J W, LIN C F. Gradient descent effects on differential neural architecture search: a survey[J]. IEEE Access, 2021, 9: 89602-89618.
[16] PU S, OLSHEVSKY A, PASCHALIDIS I C. A sharp estimate on the transient time of distributed stochastic gradient descent[J]. IEEE Transactions on Automatic Control, 2022, 67(11): 5900-5915.
[17] KHAN M W, ZEESHAN M, USMAN M. Traffic scheduling optimization in cognitive radio based smart grid network using mini-batch gradient descent method[C]//Proceedings of the 14th Iberian Conference on Information Systems and Technologies, Barcelona, 2019: 1-5.
[18] SHI L, SHEN Y D. Diversifying convex transductive experimental design for active learning[C]//Proceedings of the 25th International Joint Conference on Artificial Intelligence, New York, 2016: 1997-2003.
[19] YOU X, WANG R, TAO D. Diverse expected gradient active learning for relative attributes[J]. IEEE Transactions on Image Processing, 2014, 23: 3203-3217.
[20] TAIK A, MLIKA Z, CHERKAOUI S. Data-aware device scheduling for federated edge learning[J]. IEEE Transactions on Cognitive Communications and Networking, 2022, 8(1): 408-421.
[21] YANG Y, MA Z, NIE F, et al. Multi-class active learning by uncertainty sampling with diversity maximization[J]. International Journal of Computer Vision, 2015, 113(2): 113-127.
[22] POURKAMALI-ANARAKI F, BENNETTE W D. Adaptive data compression for classification problems[J]. IEEE Access, 2021, 9: 157654-157669.
[23] MECATI M, VETRO A, TORCHIANO M. Detecting discrimination risk in automated decision-making systems with balance measures on input data[C]//Proceedings of the 2021 IEEE International Conference on Big Data, 2021: 4287-4296.
[24] JIANG J, SHANG P, ZHANG Z, et al. Permutation entropy analysis based on Gini-Simpson index for financial time series[J]. Physica A: Statistical Mechanics and Its Applications, 2017, 486: 273-283.
[25] KHORASANI S M, HODTANI G A, KAKHKI M M. Investigation and comparison of ECG signal sparsity and features variations due to pre-processing steps[J]. Biomedical Signal Processing and Control, 2019, 49: 87-95.
[26] LIU D, ZHU G, ZHANG J, et al. Data-importance aware user scheduling for communication-efficient edge machine learning[J]. IEEE Transactions on Cognitive Communications and Networking, 2021, 7(1): 265-278.
[27] GARG A, GUPTA D, SAXENA S, et al. Validation of random dataset using an efficient CNN model trained on MNIST handwritten dataset[C]//Proceedings of the 6th International Conference on Signal Processing and Integrated Networks, 2019: 602-606.
[28] XIAO H, RASUL K, VOLLGRAF R. Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms[J]. arXiv:1708.07747, 2017.