[1] YANG Z, YANG D, DYER C, et al. Hierarchical attention networks for document classification[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2016: 1480-1489.
[2] AL-SABAHI K, ZUPING Z, NADHER M. A hierarchical structured self-attentive model for extractive document summarization (HSSAS)[J]. IEEE Access, 2018, 6: 24205-24212.
[3] CAMBRIA E, OLSHER D, RAJAGOPAL D. SenticNet 3: a common and common-sense knowledge base for cognition-driven sentiment analysis[C]//Proceedings of the 28th AAAI Conference on Artificial Intelligence, 2014.
[4] 刘鑫, 梅红岩, 王嘉豪, 等. 图神经网络推荐方法研究[J]. 计算机工程与应用, 2022, 58(10): 41-49.
LIU X, MEI H Y, WANG J H, et al. Research on graph neural network recommendation method[J]. Computer Engineering and Applications, 2022, 58(10): 41-49.
[5] AGRAWAL R, GUPTA A, PRABHU Y, et al. Multi-label learning with millions of labels: recommending advertiser bid phrases for web pages[C]//Proceedings of the 22nd International Conference on World Wide Web, 2013: 13-24.
[6] PARTALAS I, KOSMOPOULOS A, BASKIOTIS N, et al. LSHTC: a benchmark for large-scale text classification[J]. arXiv:1503.08581, 2015.
[7] 王岳, 李雅文, 李昂. 科技资源文本层次多标签分类方法[J]. 计算机工程与应用, 2023, 59(13): 92-98.
WANG Y, LI Y W, LI A. Academic resource text hierarchical multi-label classification[J]. Computer Engineering and Applications, 2023, 59(13): 92-98.
[8] LOZA MENCÍA E, FÜRNKRANZ J. Efficient pairwise multilabel classification for large-scale problems in the legal domain[C]//Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Berlin, Heidelberg: Springer, 2008: 50-65.
[9] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[10] ZUBIAGA A. Enhancing navigation on Wikipedia with social tags[J]. arXiv:1202.5469, 2012.
[11] BABBAR R, SCHÖLKOPF B. DiSMEC: distributed sparse machines for extreme multi-label classification[C]//Proceedings of the 10th ACM International Conference on Web Search and Data Mining, 2017: 721-729.
[12] BABBAR R, SCHÖLKOPF B. Data scarcity, robustness and extreme multi-label classification[J]. Machine Learning, 2019, 108(8): 1329-1351.
[13] YEN I E H, HUANG X, RAVIKUMAR P, et al. PD-sparse: a primal and dual sparse approach to extreme multiclass and multilabel classification[C]//Proceedings of the 33rd International Conference on Machine Learning, 2016: 3069-3077.
[14] PRABHU Y, KAG A, HARSOLA S, et al. Parabel: partitioned label trees for extreme classification with application to dynamic search advertising[C]//Proceedings of the 2018 World Wide Web Conference, 2018: 993-1002.
[15] BHATIA K, JAIN H, KAR P, et al. Sparse local embeddings for extreme multi-label classification[C]//Advances in Neural Information Processing Systems 28, 2015.
[16] TAGAMI Y. AnnexML: approximate nearest neighbor search for extreme multi-label classification[C]//Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2017: 455-464.
[17] LIU J, CHANG W C, WU Y, et al. Deep learning for extreme multi-label text classification[C]//Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2017: 115-124.
[18] YOU R, ZHANG Z, WANG Z, et al. AttentionXML: label tree-based attention-aware deep model for high-performance extreme multi-label text classification[C]//Advances in Neural Information Processing Systems 32, 2019.
[19] CHANG W C, YU H F, ZHONG K, et al. Taming pretrained transformers for extreme multi-label text classification[C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2020: 3163-3171.
[20] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Advances in Neural Information Processing Systems 30, 2017.
[21] 黄伟, 刘贵全. MSML-BERT模型的层级多标签文本分类方法研究[J]. 计算机工程与应用, 2022, 58(15): 191-201.
HUANG W, LIU G Q. Study on hierarchical multi-label text classification method of MSML-BERT model[J]. Computer Engineering and Applications, 2022, 58(15): 191-201.
[22] CHARTE F, RIVERA A J, DEL JESUS M J, et al. Addressing imbalance in multilabel classification: measures and random resampling algorithms[J]. Neurocomputing, 2015, 163: 3-16.
[23] YANG Y, LIU X. A re-examination of text categorization methods[C]//Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 1999: 42-49.
[24] LIN T Y, GOYAL P, GIRSHICK R, et al. Focal loss for dense object detection[C]//Proceedings of the 2017 IEEE International Conference on Computer Vision, 2017: 2980-2988.
[25] CUI Y, JIA M, LIN T Y, et al. Class-balanced loss based on effective number of samples[C]//Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019: 9268-9277.
[26] WU T, HUANG Q, LIU Z, et al. Distribution-balanced loss for multi-label classification in long-tailed datasets[C]//Proceedings of the 16th European Conference on Computer Vision. Cham: Springer, 2020: 162-178.
[27] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[J]. arXiv:1301.3781, 2013.
[28] PENNINGTON J, SOCHER R, MANNING C D. GloVe: global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, 2014: 1532-1543.
[29] WYDMUCH M, JASINSKA K, KUZNETSOV M, et al. A no-regret generalization of hierarchical softmax to extreme multi-label classification[C]//Advances in Neural Information Processing Systems 31, 2018.
[30] KHANDAGALE S, XIAO H, BABBAR R. Bonsai: diverse and shallow trees for extreme multi-label classification[J]. Machine Learning, 2020, 109(11): 2099-2119.
[31] JIANG T, WANG D, SUN L, et al. LightXML: transformer with dynamic negative sampling for high-performance extreme multi-label text classification[C]//Proceedings of the 35th AAAI Conference on Artificial Intelligence, 2021: 7987-7994.
[32] ZHANG R, WANG Y S, YANG Y, et al. Long-tailed extreme multi-label text classification with generated pseudo label descriptions[J]. arXiv:2204.00958, 2022.
[33] WANG Q, SHU H, ZHU J. GUDN: a novel guide network for extreme multi-label text classification[J]. arXiv:2201.11582, 2022.