[1] DAMASHEK M. Gauging similarity with n-grams: language-independent categorization of text[J]. Science, 1995, 267(5199): 843-848.
[2] BRONSTEIN M M, BRUNA J, LECUN Y, et al. Geometric deep learning: going beyond Euclidean data[J]. IEEE Signal Processing Magazine, 2017, 34(4): 18-42.
[3] 王强, 江昊, 羿舒文, 等. 复杂网络的双曲空间表征学习方法[J]. 软件学报, 2021, 32(1): 93-117.
WANG Q, JIANG H, YI S W, et al. Hyperbolic representation learning for complex networks[J]. Journal of Software, 2021, 32(1): 93-117.
[4] LUO Y, UZUNER O, SZOLOVITS P. Bridging semantics and syntax with graph algorithms: state-of-the-art of extracting biomedical relations[J]. Briefings in Bioinformatics, 2017, 18(1): 160-178.
[5] ZENG J C, LI J, SONG Y, et al. Topic memory networks for short text classification[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018: 3120-3131.
[6] 杨朝强, 邵党国, 杨志豪, 等. 多特征融合的中文短文本分类模型[J]. 小型微型计算机系统, 2020, 41(7): 1421-1426.
YANG Z Q, SHAO D G, YANG Z H, et al. Chinese short text classification model with multi-feature fusion[J]. Journal of Chinese Computer Systems, 2020, 41(7): 1421-1426.
[7] 郑诚, 董春阳, 黄夏炎, 等. 基于BTM图卷积网络的短文本分类方法[J]. 计算机工程与应用, 2021, 57(4): 155-160.
ZHENG C, DONG C Y, HUANG X Y, et al. Short text classification method based on BTM graph convolutional network[J]. Computer Engineering and Applications, 2021, 57(4): 155-160.
[8] 唐加山, 段丹丹. 文本分类中基于CHI和PCA混合特征的降维方法[J]. 重庆邮电大学学报 (自然科学版), 2022, 34(1): 164-171.
TANG J S, DUAN D D. Research on dimension reduction method based on mixed features of CHI and PCA in text classification[J]. Journal of Chongqing University of Posts and Telecommunications (Natural Science), 2022, 34(1): 164-171.
[9] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems, 2013.
[10] TANG J, QU M, MEI Q. PTE: predictive text embedding through large-scale heterogeneous text networks[C]//Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2015: 1165-1174.
[11] WU Z, PAN S, CHEN F, et al. A comprehensive survey on graph neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 32(1): 4-24.
[12] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[J]. arXiv:1609.02907, 2016.
[13] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[J]. arXiv:1710.10903, 2017.
[14] LI Y, TARLOW D, BROCKSCHMIDT M, et al. Gated graph sequence neural networks[J]. arXiv:1511.05493, 2015.
[15] RAHIMI A, COHN T, BALDWIN T. Semi-supervised user geolocation via graph convolutional networks[J]. arXiv:1804.08049, 2018.
[16] YAO L, MAO C, LUO Y. Graph convolutional networks for text classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019: 7370-7377.
[17] HUANG L, MA D, LI S, et al. Text level graph neural network for text classification[J]. arXiv:1910.02356, 2019.
[18] NICKEL M, KIELA D. Poincaré embeddings for learning hierarchical representations[C]//Advances in Neural Information Processing Systems, 2017: 6338-6347.
[19] NICKEL M, KIELA D. Learning continuous hierarchies in the Lorentz model of hyperbolic geometry[C]//International Conference on Machine Learning, 2018: 3779-3788.
[20] ZHANG C, GAO J. Hype-han: hyperbolic hierarchical attention network for semantic embedding[C]//Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, 2021: 3990-3996.
[21] CHEN B, HUANG X, XIAO L, et al. Hyperbolic interaction model for hierarchical multi-label classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020: 7496-7503.
[22] 周羿吉. 基于双曲空间的自然语言评价模型研究[D]. 成都: 电子科技大学, 2021.
ZHOU Y J. Research of natural language evaluation model based on hyperbolic space[D]. Chengdu: University of Electronic Science and Technology of China, 2021.
[23] JOULIN A, GRAVE E, BOJANOWSKI P, et al. Bag of tricks for efficient text classification[J]. arXiv:1607.01759, 2016.
[24] SHEN D, WANG G, WANG W, et al. Baseline needs more love: on simple word-embedding-based models and associated pooling mechanisms[J]. arXiv:1805.09843, 2018.
[25] DING K, WANG J, LI J, et al. Be more with less: hypergraph attention networks for inductive text classification[J]. arXiv:2011.00387, 2020.