[1] XU K, YANG Z G, KANG P P, et al. Document-level attention-based BiLSTM-CRF incorporating disease dictionary for disease named entity recognition[J]. Computers in Biology and Medicine, 2019, 108: 122-132.
[2] SOUZA F, NOGUEIRA R, LOTUFO R D. Portuguese named entity recognition using BERT-CRF[J]. arXiv:1909.10649, 2019.
[3] ZHANG Y, YANG J. Chinese NER using lattice LSTM[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2018: 1554-1564.
[4] HUANG Z, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[EB/OL]. [2021-04-05]. https://arxiv.org/pdf/1508.01991.pdf.
[5] CAO C P, GUAN P J. Clinical text named entity recognition based on E-CNN and BLSTM-CRF[J]. Application Research of Computers, 2019, 36(12): 3748-3751.
[6] WU H, LV L, YU B H. Chinese named entity recognition based on transfer learning and BiLSTM-CRF[J]. Journal of Chinese Computer Systems, 2019, 40(6): 1142-1147.
[7] LI Y, DU G D, XIANG Y, et al. Towards Chinese clinical named entity recognition by dynamic embedding using domain-specific knowledge[J]. Journal of Biomedical Informatics, 2020, 106: 103435.
[8] XU Y X, HUANG H Y, FENG C, et al. A supervised multi-head self-attention network for nested named entity recognition[C]//Proceedings of the 35th AAAI Conference on Artificial Intelligence, the 33rd Conference on Innovative Applications of Artificial Intelligence, the 11th Symposium on Educational Advances in Artificial Intelligence, Feb 2-9, 2021. Menlo Park: AAAI, 2021: 14185-14193.
[9] LUO X, XIA X Y, AN Y, et al. Chinese clinical named entity recognition combining multi-head self-attention mechanism and BiLSTM-CRF[J]. Journal of Hunan University (Natural Sciences), 2021, 48(4): 45-55.
[10] ZHANG S H, DU S D, JIA Z, et al. Medical entity relation extraction based on deep neural network and self-attention mechanism[J]. Computer Science, 2021, 48(10): 77-84.
[11] CHEN X, ZHANG N, LI L, et al. LightNER: a lightweight generative framework with prompt-guided attention for low-resource NER[J]. arXiv:2109.00720, 2021.
[12] HOU X D, TENG F, ZHANG Y. Medical named entity recognition model based on deep auto-encoding[J]. Journal of Computer Applications, 2022, 42(9): 2686-2692.
[13] GONG D W, ZHANG Y K, GUO Y N, et al. Named entity recognition of Chinese electronic medical records based on multi-feature embedding and attention mechanism[J]. Chinese Journal of Engineering, 2021, 43(9): 1190-1196.
[14] WEN S, ZENG B, LIAO W. Named entity recognition for instructions of Chinese medicine based on pre-trained language model[C]//Proceedings of the 2021 3rd International Conference on Natural Language Processing (ICNLP). Piscataway: IEEE, 2021: 139-144.
[15] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2019: 4171-4186.
[16] LEE J, YOON W, KIM S, et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining[J]. Bioinformatics, 2020, 36(4): 1234-1240.
[17] HUANG M G, LIU J L, LIU C. Research on Chinese multi-relation extraction method based on BERT[J]. Computer Engineering and Applications, 2021, 57(21): 234-240.
[18] NASEEM U, KHUSHI M, REDDY V, et al. BioALBERT: a simple and effective pre-trained language model for biomedical named entity recognition[C]//Proceedings of the 2021 International Joint Conference on Neural Networks, Shenzhen, Jul 18-22, 2021. Piscataway: IEEE, 2021: 1-7.
[19] GAN Z, LI Z, ZHANG B, et al. Enhance both text and label: combination strategies for improving the generalization ability of medical entity extraction[C]//China Conference on Knowledge Graph and Semantic Computing. Singapore: Springer, 2021: 92-101.
[20] RASMY L, XIANG Y, XIE Z, et al. Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction[J]. NPJ Digital Medicine, 2021, 4(1): 1-13.
[21] MA R, ZHOU X, GUI T, et al. Template-free prompt tuning for few-shot NER[J]. arXiv:2109.13532, 2021.
[22] CHEN X, XU L, LIU Z, et al. Joint learning of character and word embeddings[C]//Proceedings of the 24th International Joint Conference on Artificial Intelligence, 2015: 1236-1242.
[23] LIU Y Y, ZHONG Z Q, CHE C, et al. Recommendations with residual connections and negative sampling based on knowledge graphs[J]. Knowledge-Based Systems, 2022, 258.
[24] CUI Y M, CHE W X, LIU T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[25] STRUBELL E, VERGA P, BELANGER D, et al. Fast and accurate entity recognition with iterated dilated convolutions[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Sep 9-11, 2017. Stroudsburg: ACL, 2017: 2670-2680.