Computer Engineering and Applications ›› 2024, Vol. 60 ›› Issue (19): 1-17. DOI: 10.3778/j.issn.1002-8331.2403-0142
• Research Hotspots and Reviews •
SU Yilei, LI Weijun, LIU Xueyang, DING Jianping, LIU Shixia, LI Haonan, LI Guanfeng
Online: 2024-10-01
Published: 2024-09-30
SU Yilei, LI Weijun, LIU Xueyang, DING Jianping, LIU Shixia, LI Haonan, LI Guanfeng. Review of Text Classification Methods Based on Graph Neural Networks[J]. Computer Engineering and Applications, 2024, 60(19): 1-17.
URL: http://cea.ceaj.org/EN/10.3778/j.issn.1002-8331.2403-0142
[1] SONG Jianping, WANG Yi, SUN Kaiwei, LIU Qilie. Short Text Classification Combined with Hyperbolic Graph Attention Networks and Labels[J]. Computer Engineering and Applications, 2024, 60(9): 188-195.
[2] YANG Wentao, LEI Yuqi, LI Xingyue, ZHENG Tiancheng. Chinese Long Text Classification Model Based on BERT Fused Chinese Input Methods and BLCG[J]. Computer Engineering and Applications, 2024, 60(9): 196-202.
[3] JIANG Jielin, ZHU Yongwei, XU Xiaolong, CUI Yan, ZHAO Yingnan. Chinese Short Text Classification with Hybrid Features and Multi-Head Attention[J]. Computer Engineering and Applications, 2024, 60(9): 237-243.
[4] HU Zhiqiang, LI Pengjun, WANG Jinlong, XIONG Xiaoyun. Research on Policy Tools Classification Based on ChatGPT Augmentation and Supervised Contrastive Learning[J]. Computer Engineering and Applications, 2024, 60(7): 292-305.
[5] ZHENG Xiaoli, WANG Wei, DU Yuxuan, ZHANG Chuang. Demand Aware Attention Graph Neural Network for Session-Base Recommendation[J]. Computer Engineering and Applications, 2024, 60(7): 128-140.
[6] CHEN Zhaohong, HONG Zhiyong, YU Wenhua, ZHANG Xin. Extreme Multi-Label Text Classification Based on Balance Function[J]. Computer Engineering and Applications, 2024, 60(4): 163-172.
[7] WANG Nan, TAN Shuru, XIE Xiaolan, LI Hairong. Pre-Training Model of Public Opinion Event Vector[J]. Computer Engineering and Applications, 2024, 60(18): 189-197.
[8] ZHANG Qintong, WANG Yuchao, WANG Hexi, WANG Junxin, CHEN Hai. Comprehensive Review of Large Language Model Fine-Tuning[J]. Computer Engineering and Applications, 2024, 60(17): 17-33.
[9] LI Jiandong, FU Jia, LI Jiaqi. Multi-Label Text Classification Combining Bidirectional Attention and Contrast Enhancement Mechanism[J]. Computer Engineering and Applications, 2024, 60(16): 105-115.
[10] YANG Chunxia, HUANG Yukun, YAN Han, WU Yalei. Multi-Label Text Classification Model Integrating GAT and Head-Tail Label[J]. Computer Engineering and Applications, 2024, 60(15): 150-160.
[11] DONG Xiaohui, GUO Tingfu, ZHU Haijiang, DANG Xiaochao, LI Fenfang. Construction and Application of Fault Knowledge Graph for Mine Hoist[J]. Computer Engineering and Applications, 2024, 60(14): 348-356.
[12] LI Yi, GENG Chaoyang, YANG Dan. Fin-BERT-Based Event Extraction Method for Chinese Financial Domain[J]. Computer Engineering and Applications, 2024, 60(14): 123-132.
[13] YU Fengrui. Survey on Automated Recognition and Extraction of TTPs[J]. Computer Engineering and Applications, 2024, 60(13): 1-22.
[14] GU Xunxun, LIU Jianping, XING Jialu, REN Haiyu. Text Classification: Comprehensive Review of Prompt Learning Methods[J]. Computer Engineering and Applications, 2024, 60(11): 50-61.
[15] CAO Yukun, WEI Ziyue, TANG Yijia, JIN Chengkun, LI Yunfeng. Hierarchical Label Text Classification Method with Deep Label Assisted Classification Task[J]. Computer Engineering and Applications, 2024, 60(10): 105-112.