[1] LI B H, XIANG Y X, FENG D, et al. Short text classification model combining knowledge aware and dual attention[J]. Journal of Software, 2022, 33(10): 3565-3581.
[2] GAN Y T, AN J Y, XU X. Survey of short text classification methods based on deep learning[J]. Computer Engineering and Applications, 2023, 59(4): 43-53.
[3] ZHENG C, CHEN J, DONG C Y. Deep neural network combined with graph convolution for text classification[J]. Computer Engineering and Applications, 2022, 58(7): 206-212.
[4] CONNEAU A, SCHWENK H, BARRAULT L, et al. Very deep convolutional networks for text classification[J]. arXiv:1606.01781, 2016.
[5] TAI K S, SOCHER R, MANNING C D. Improved semantic representations from tree-structured long short-term memory networks[J]. arXiv:1503.00075, 2015.
[6] ZENG D, LIU K, LAI S, et al. Relation classification via convolutional deep neural network[C]//Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers, 2014: 2335-2344.
[7] BOLLEGALA D, ATANASOV V, MAEHARA T, et al. ClassiNet: predicting missing features for short-text classification[J]. ACM Transactions on Knowledge Discovery from Data (TKDD), 2018, 12(5): 1-29.
[8] LEE J, CHO K, HOFMANN T. Fully character-level neural machine translation without explicit segmentation[J]. Transactions of the Association for Computational Linguistics, 2017, 5: 365-378.
[9] LAI Y, FENG Y, YU X, et al. Lattice CNNs for matching based Chinese question answering[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019: 6634-6641.
[10] XIAO L, CHEN B L, HUANG X, et al. Multi-label text classification method based on label semantic information[J]. Journal of Software, 2020, 31(4): 1079-1089.
[11] TAO H, TONG S, ZHAO H, et al. A radical-aware attention-based model for Chinese text classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019: 5125-5132.
[12] HAO M, XU B, LIANG J Y, et al. Chinese short text classification with mutual-attention convolutional neural networks[J]. ACM Transactions on Asian and Low-Resource Language Information Processing (TALLIP), 2020, 19(5): 1-13.
[13] YU C T, SALTON G. Precision weighting—an effective automatic indexing method[J]. Journal of the ACM (JACM), 1976, 23(1): 76-88.
[14] BIJALWAN V, KUMAR V, KUMARI P, et al. KNN based machine learning approach for text and document mining[J]. International Journal of Database Theory and Application, 2014, 7(1): 61-70.
[15] GOUDJIL M, KOUDIL M, BEDDA M, et al. A novel active learning method using SVM for text classification[J]. International Journal of Automation and Computing, 2018, 15(3): 290-298.
[16] KIM Y. Convolutional neural networks for sentence classification[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, 2014: 1746-1751.
[17] LIU J, CHANG W C, WU Y, et al. Deep learning for extreme multi-label text classification[C]//Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2017: 115-124.
[18] JOHNSON R, ZHANG T. Deep pyramid convolutional neural networks for text categorization[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017: 562-570.
[19] SHEN T, ZHOU T, LONG G, et al. DiSAN: directional self-attention network for RNN/CNN-free language understanding[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2018: 5446-5455.
[20] JANG B, KIM M, HARERIMANA G, et al. Bi-LSTM model to increase accuracy in text classification: combining Word2vec CNN and attention mechanism[J]. Applied Sciences, 2020, 10(17): 5841.
[21] ZHANG Y, ZHENG J, JIANG Y, et al. A text sentiment classification modeling method based on coordinated CNN-LSTM-attention model[J]. Chinese Journal of Electronics, 2019, 28(1): 120-126.
[22] LIU Z, HUANG H, LU C, et al. Multichannel CNN with attention for text classification[J]. arXiv:2006.16174, 2020.
[23] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[J]. arXiv:1301.3781, 2013.
[24] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems, 2013: 3111-3119.
[25] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[26] ZHANG Z, HAN X, LIU Z, et al. ERNIE: enhanced language representation with informative entities[J]. arXiv:1905.07129, 2019.
[27] SUN Y, WANG S, LI Y, et al. ERNIE 2.0: a continual pre-training framework for language understanding[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020: 8968-8975.
[28] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems, 2017: 5998-6008.
[29] LI Y, ZHANG Y, ZHAO Z, et al. CSL: a large-scale Chinese scientific literature dataset[J]. arXiv:2209.05034, 2022.
[30] LIU P, QIU X, HUANG X. Recurrent neural network for text classification with multi-task learning[J]. arXiv:1605.05101, 2016.
[31] LAI S, XU L, LIU K, et al. Recurrent convolutional neural networks for text classification[C]//Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, 2015: 2267-2273.
[32] SUN Y, WANG S, FENG S, et al. ERNIE 3.0: large-scale knowledge enhanced pre-training for language understanding and generation[J]. arXiv:2107.02137, 2021.