[1] ZHU Y, LU X, HONG J, et al. Joint dynamic topic model for recognition of lead-lag relationship in two text corpora[J]. Data Mining and Knowledge Discovery, 2022, 36(6): 2272-2298.
[2] YANG Z, YANG D, DYER C, et al. Hierarchical attention networks for document classification[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2016: 1480-1489.
[3] KHOT T, CLARK P, GUERQUIN M, et al. QASC: a dataset for question answering via sentence composition[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020: 8082-8090.
[4] SAXENA A, TRIPATHI A, TALUKDAR P. Improving multi-hop question answering over knowledge graphs using knowledge base embeddings[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020: 4498-4507.
[5] PETRESCU A, TRUICĂ C O, APOSTOL E S, et al. EDSA-ensemble: an event detection sentiment analysis ensemble architecture[J]. arXiv:2301.12805, 2023.
[6] JOHN-OTUMU A M, RAHMAN M M, NWOKONKWO O C, et al. AI-based techniques for online social media network sentiment analysis: a methodical review[J]. International Journal of Computer and Information Engineering, 2022, 16(12): 555-560.
[7] XIAO L, HUANG X, CHEN B, et al. Label-specific document representation for multi-label text classification[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019: 466-475.
[8] ZHANG X, ZHANG Q W, YAN Z, et al. Enhancing label correlation feedback in multi-label text classification via multi-task learning[J]. arXiv:2106.03103, 2021.
[9] DUAN L, YOU Q, WU X, et al. Multilabel text classification algorithm based on fusion of two-stream transformer[J]. Electronics, 2022, 11(14): 2138-2149.
[10] CHEN Y. Convolutional neural network for sentence classification[D]. Waterloo: University of Waterloo, 2015.
[11] 滕金保, 孔韦韦, 田乔鑫, 等. 基于LSTM-Attention与CNN混合模型的文本分类方法[J]. 计算机工程与应用, 2021, 57(14): 126-133.
TENG J B, KONG W W, TIAN Q X, et al. Text classification method based on LSTM-Attention and CNN hybrid model[J]. Computer Engineering and Applications, 2021, 57(14): 126-133.
[12] JOULIN A, GRAVE E, BOJANOWSKI P, et al. Bag of tricks for efficient text classification[J]. arXiv:1607.01759, 2016.
[13] YOU R, ZHANG Z, WANG Z, et al. AttentionXML: label tree-based attention-aware deep model for high-performance extreme multi-label text classification[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019.
[14] DU C, CHEN Z, FENG F, et al. Explicit interaction model towards text classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019: 6359-6366.
[15] YADAV R K, JIAO L, GRANMO O C, et al. Enhancing interpretable clauses semantically using pretrained word representation[J]. arXiv:2104.06901, 2021.
[16] ZENG F, CHEN N, YANG D, et al. Simplified-boosting ensemble convolutional network for text classification[J]. Neural Processing Letters, 2022, 54(6): 4971-4986.
[17] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), 2019: 4171-4186.
[18] LIU Y, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[J]. arXiv:1907.11692, 2019.
[19] 黄友文, 魏国庆, 胡燕芳. DistillBIGRU: 基于知识蒸馏的文本分类模型[J]. 中文信息学报, 2022, 36(4): 81-89.
HUANG Y W, WEI G Q, HU Y F. DistillBIGRU: text classification model based on knowledge distillation[J]. Journal of Chinese Information Processing, 2022, 36(4): 81-89.
[20] LIN Y, MENG Y, SUN X, et al. BERTGCN: transductive text classification by combining GCN and BERT[J]. arXiv:2105.05727, 2021.
[21] HU S, DING N, WANG H, et al. Knowledgeable prompt-tuning: incorporating knowledge into prompt verbalizer for text classification[J]. arXiv:2108.02035, 2021.
[22] 孙坤, 秦博文, 桑基韬, 等. 基于共享语义空间的多标签文本分类[J]. 计算机工程与应用, 2023, 59(12): 100-105.
SUN K, QIN B W, SANG J T, et al. Multi-label text classification based on shared semantic space[J]. Computer Engineering and Applications, 2023, 59(12): 100-105.
[23] 林呈宇, 王雷, 薛聪. 标签语义增强的弱监督文本分类模型[J]. 计算机应用, 2023, 43(2): 335-342.
LIN C Y, WANG L, XUE C. Weakly-supervised text classification with label semantic enhancement[J]. Journal of Computer Applications, 2023, 43(2): 335-342.
[24] LIU P, QIU X, HUANG X. Recurrent neural network for text classification with multi-task learning[C]//Proceedings of the International Joint Conference on Artificial Intelligence, New York, USA, 2016: 2873-2879.
[25] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 5998-6008.
[26] 张文轩, 殷雁君, 智敏. 基于概率分布的图卷积文本分类模型[J]. 中文信息学报, 2022, 36(4): 100-110.
ZHANG W X, YIN Y J, ZHI M. Probability distribution based graph convolution network for text classification[J]. Journal of Chinese Information Processing, 2022, 36(4): 100-110.