[1] LI J, SUN A, HAN J, et al. A survey on deep learning for named entity recognition[J]. IEEE Transactions on Knowledge and Data Engineering, 2022, 34(1): 50-70.
[2] LI X, MENG Y, SUN X, et al. Is word segmentation necessary for deep learning of Chinese representations?[J]. arXiv:1905.05526, 2019.
[3] LIU Z, ZHU C, ZHAO T. Chinese named entity recognition with a sequence labeling approach: based on characters, or based on words?[C]//Proceedings of the 6th International Conference on Intelligent Computing. Berlin, Heidelberg: Springer, 2010: 634-640.
[4] ZHANG Y, YANG J. Chinese NER using Lattice LSTM[J]. arXiv:1805.02023, 2018.
[5] LI X, YAN H, QIU X, et al. FLAT: Chinese NER using flat-lattice transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020: 6836-6842.
[6] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 6000-6010.
[7] LV H, DING Y. ALFLAT: Chinese NER using ALBERT, flat-lattice transformer, word segmentation and entity dictionary[C]//Proceedings of the 2nd EAI International Conference on Applied Cryptography in Computer and Communications. Cham: Springer, 2022: 216-227.
[8] JIN Z, HE X, WU X, et al. A hybrid Transformer approach for Chinese NER with features augmentation[J]. Expert Systems with Applications, 2022, 209: 118385.
[9] WU S, SONG X, FENG Z. MECT: multi-metadata embedding based cross transformer for Chinese named entity recognition[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021: 1529-1539.
[10] CAO P, CHEN Y, LIU K, et al. Adversarial transfer learning for Chinese named entity recognition with self-attention mechanism[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018: 182-192.
[11] LI Y, DU G D, XIANG Y, et al. Towards Chinese clinical named entity recognition by dynamic embedding using domain-specific knowledge[J]. Journal of Biomedical Informatics, 2020, 106.
[12] LIU W, ZHOU P, ZHAO Z, et al. K-BERT: enabling language representation with knowledge graph[C]//Proceedings of the 34th AAAI Conference on Artificial Intelligence, 2020: 2901-2908.
[13] GRISHMAN R, SUNDHEIM B M. Message understanding conference-6: a brief history[C]//Proceedings of the 16th International Conference on Computational Linguistics, 1996.
[14] 杨锦锋, 于秋滨, 关毅, 等. 电子病历命名实体识别和实体关系抽取研究综述[J]. 自动化学报, 2014, 40(8): 1537-1562.
YANG J F, YU Q B, GUAN Y, et al. An overview of research on electronic medical record-oriented named entity recognition and entity relation extraction[J]. Acta Automatica Sinica, 2014, 40(8): 1537-1562.
[15] FRIEDMAN C, ALDERSON P O, AUSTIN J H M, et al. A general natural-language text processor for clinical radiology[J]. Journal of the American Medical Informatics Association, 1994, 1(2): 161-174.
[16] BERGER A L, DELLA PIETRA V J, DELLA PIETRA S A. A maximum entropy approach to natural language processing[J]. Computational Linguistics, 1996, 22(1): 39-71.
[17] HU W, TIAN G, KANG Y, et al. Dual sticky hierarchical dirichlet process hidden Markov model and its application to natural language description of motions[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(10): 2355-2373.
[18] CHEN P H, LIN C J, SCHÖLKOPF B. A tutorial on ν-support vector machines[J]. Applied Stochastic Models in Business and Industry, 2005, 21(2): 111-136.
[19] HUANG Z, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[J]. arXiv:1508.01991, 2015.
[20] AKBIK A, BLYTHE D, VOLLGRAF R. Contextual string embeddings for sequence labeling[C]//Proceedings of the 27th International Conference on Computational Linguistics, 2018: 1638-1649.
[21] WANG X, JIANG Y, BACH N, et al. Improving named entity recognition by external context retrieving and cooperative learning[J]. arXiv:2105.03654, 2021.
[22] LIU L, SHANG J, XU F, et al. Empower sequence labeling with task-aware neural language model[C]//Proceedings of the 32nd AAAI Conference on Artificial Intelligence, 2018.
[23] ZHU Y, WANG G, KARLSSON B F. CAN-NER: convolutional attention network for Chinese named entity recognition[J]. arXiv:1904.02141, 2019.
[24] LI D, YAN L, YANG J, et al. Dependency syntax guided BERT-BiLSTM-GAM-CRF for Chinese NER[J]. Expert Systems with Applications, 2022, 196: 116682.
[25] DENG Z, TAO Y, LAN R, et al. KCR-FLAT: a Chinese-named entity recognition model with enhanced semantic information[J]. Sensors, 2023, 23(4): 1771.
[26] SHEN Y L, TAN Z Q, WU S H, et al. PromptNER: prompt locating and typing for named entity recognition[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics, 2023: 12492-12507.
[27] JI S, PAN S, CAMBRIA E, et al. A survey on knowledge graphs: representation, acquisition, and applications[J]. IEEE Transactions on Neural Networks and Learning Systems, 2022, 33(2): 494-514.
[28] ZHANG Z, HAN X, LIU Z, et al. ERNIE: enhanced language representation with informative entities[J]. arXiv:1905.07129, 2019.
[29] BOSSELUT A, RASHKIN H, SAP M, et al. COMET: commonsense transformers for automatic knowledge graph construction[J]. arXiv:1906.05317, 2019.
[30] WREN S. The cognitive foundations of learning to read[R]. Austin: Southwest Educational Development Laboratory, 2001.
[31] YAN H, DENG B, LI X, et al. TENER: adapting transformer encoder for named entity recognition[J]. arXiv:1911.04474, 2019.
[32] DAI Z, YANG Z, YANG Y, et al. Transformer-XL: attentive language models beyond a fixed-length context[J]. arXiv:1901.02860, 2019.
[33] PENG N, DREDZE M. Named entity recognition for Chinese social media with jointly trained embeddings[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 2015: 548-554.
[34] CUI Y, CHE W, LIU T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[35] LIU W, FU X, ZHANG Y, et al. Lexicon enhanced Chinese sequence labeling using BERT adapter[J]. arXiv:2105.07148, 2021.
[36] 吴炳潮, 邓成龙, 关贝, 等. 动态迁移实体块信息的跨领域中文实体识别模型[J]. 软件学报, 2022, 33(10): 3776-3792.
WU B C, DENG C L, GUAN B, et al. Dynamically transfer entity span information for cross-domain Chinese named entity recognition[J]. Journal of Software, 2022, 33(10): 3776-3792.