[1] KOLECK T A, DREISBACH C, BOURNE P E, et al. Natural language processing of symptoms documented in free-text narratives of electronic health records: a systematic review[J]. Journal of the American Medical Informatics Association, 2019, 26(4): 364-379.
[2] CAI J, LUO J, WANG S, et al. Feature selection in machine learning: a new perspective[J]. Neurocomputing, 2018, 300: 70-79.
[3] 王振飞, 袁佩瑶, 曹中亚, 等. 面向高维不平衡数据的特征选择算法[J]. 小型微型计算机系统, 2024, 45(8): 1839-1846.
WANG Z F, YUAN P Y, CAO Z Y, et al. Feature selection algorithm for high dimensional unbalanced data[J]. Journal of Chinese Mini-Micro Computer Systems, 2024, 45(8): 1839-1846.
[4] ZEBARI R, ABDULAZEEZ A, ZEEBAREE D, et al. A comprehensive review of dimensionality reduction techniques for feature selection and feature extraction[J]. Journal of Applied Science and Technology Trends, 2020, 1(1): 56-70.
[5] ZHANG Y. Safety management of civil engineering construction based on artificial intelligence and machine vision technology[J]. Advances in Civil Engineering, 2021, 2021: 1-14.
[6] SAYED G I, HASSANIEN A E, AZAR A T. Feature selection via a novel chaotic crow search algorithm[J]. Neural Computing and Applications, 2019, 31: 171-188.
[7] KAMALOV F. Orthogonal variance decomposition-based feature selection[J]. Expert Systems with Applications, 2021, 182: 115191.
[8] 孙林, 张起峰, 徐久成. 基于互信息的Fisher Score多标记特征选择[J]. 南京大学学报 (自然科学), 2023, 59(1): 55-66.
SUN L, ZHANG Q F, XU J C. Multilabel feature selection based on Fisher Score with mutual information[J]. Journal of Nanjing University (Natural Sciences), 2023, 59(1): 55-66.
[9] PALMA-MENDOZA R J, RODRIGUEZ D, DE-MARCOS L. Distributed ReliefF-based feature selection in Spark[J]. Knowledge and Information Systems, 2018, 57: 1-20.
[10] LI Y, SUN Z, LIU X, et al. Feature selection based on a large-scale many-objective evolutionary algorithm[J]. Computational Intelligence and Neuroscience, 2021, 2021: 1-11.
[11] SANTUCCI V, BAIOLETTI M, MILANI A. An algebraic framework for swarm and evolutionary algorithms in combinatorial optimization[J]. Swarm and Evolutionary Computation, 2020, 55: 100673.
[12] ZHANG Y, WANG S, PHILLIPS P, et al. Binary PSO with mutation operator for feature selection using decision tree applied to spam detection[J]. Knowledge-Based Systems, 2014, 64: 22-31.
[13] XUE X, YAO M, WU Z. A novel ensemble-based wrapper method for feature selection using extreme learning machine and genetic algorithm[J]. Knowledge and Information Systems, 2018, 57: 389-412.
[14] TAN P, WANG X, WANG Y. Dimensionality reduction in evolutionary algorithms-based feature selection for motor imagery brain-computer interface[J]. Swarm and Evolutionary Computation, 2020, 52: 100597.
[15] AHADZADEH B, ABDAR M, SAFARA F, et al. SFE: a simple, fast and efficient feature selection algorithm for high-dimensional data[J]. IEEE Transactions on Evolutionary Computation, 2023, 27(6): 1896-1911.
[16] HANCOCK J T, WANG H, KHOSHGOFTAAR T M, et al. Data reduction techniques for highly imbalanced medicare big data[J]. Journal of Big Data, 2024, 11(1): 8.
[17] CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16: 321-357.
[18] DAI Q, LIU J, LIU Y. Multi-granularity relabeled under-sampling algorithm for imbalanced data[J]. Applied Soft Computing, 2022, 124: 109083.
[19] TELIKANI A, RUDBARDEH N E, SOLEYMANPOUR S, et al. A cost-sensitive machine learning model with multitask learning for intrusion detection in IoT[J]. IEEE Transactions on Industrial Informatics, 2024, 20(3): 3880-3890.
[20] ABDULRAUF SHARIFAI G, ZAINOL Z. Feature selection for high-dimensional and imbalanced biomedical data based on robust correlation based redundancy and binary grasshopper optimization algorithm[J]. Genes, 2020, 11(7): 717.
[21] XU Z, SHEN D, NIE T, et al. A cluster-based oversampling algorithm combining SMOTE and k-means for imbalanced medical data[J]. Information Sciences, 2021, 572: 574-589.
[22] REVATHI M, RAMYACHITRA D. A modified Borderline-SMOTE with noise reduction in imbalanced datasets[J]. Wireless Personal Communications, 2021, 121(3): 1659-1680.
[23] CINAR A C. A comprehensive comparison of accuracy-based fitness functions of metaheuristics for feature selection[J]. Soft Computing, 2023, 27(13): 8931-8958.
[24] DU L, XU Y, ZHU H. Feature selection for multi-class imbalanced data sets based on genetic algorithm[J]. Annals of Data Science, 2015, 2: 293-300.
[25] 苏璇, 王远军. 面向高维不平衡医学数据的特征选择算法[J]. 小型微型计算机系统, 2024, 45(2): 309-318.
SU X, WANG Y J. Feature selection algorithm for high-dimensional unbalanced medical data[J]. Journal of Chinese Mini-Micro Computer Systems, 2024, 45(2): 309-318.
[26] 于涛, 高岳林. 融入小生境和混合变异策略的鲸鱼优化算法[J]. 计算机工程与应用, 2024, 60(10): 88-104.
YU T, GAO Y L. Whale optimization algorithm integrating niche and hybrid mutation strategy[J]. Computer Engineering and Applications, 2024, 60(10): 88-104.
[27] SUN H, DU J, JIN C, et al. Global source optimization based on adaptive nonlinear particle swarm optimization algorithm for inverse lithography[J]. IEEE Photonics Journal, 2021, 13(4): 1-7.
[28] 陈博文, 邹海. 总结性自适应变异的粒子群算法[J]. 计算机工程与应用, 2022, 58(8): 67-75.
CHEN B W, ZOU H. Self-conclusion and self-adaptive variation particle swarm optimization[J]. Computer Engineering and Applications, 2022, 58(8): 67-75.
[29] SALESI S, COSMA G, MAVROVOUNIOTIS M. TAGA: Tabu asexual genetic algorithm embedded in a filter/filter feature selection approach for high-dimensional data[J]. Information Sciences, 2021, 565: 105-127.
[30] GAN M, ZHANG L. Iteratively local Fisher score for feature selection[J]. Applied Intelligence, 2021, 51: 6167-6181.