[1] LI W Y, GE X L, LIU S, et al. Opportunities and challenges of traditional Chinese medicine doctors in the era of artificial intelligence[J]. Frontiers in Medicine, 2024, 10: 1336175.
[2] AMIRI Z, HEIDARI A, DARBANDI M, et al. The personal health applications of machine learning techniques in the Internet of behaviors[J]. Sustainability, 2023, 15(16): 12406.
[3] 曾江楠, 叶廷林, 李敏芳, 等. 新型冠状病毒感染后咳嗽中医证型和体质分布规律[J]. 辽宁中医杂志, 2025, 52(2): 1-4.
ZENG J N, YE T L, LI M F, et al. TCM syndrome types and constitution distribution of cough after coronavirus disease 2019[J]. Liaoning Journal of Traditional Chinese Medicine, 2025, 52(2): 1-4.
[4] CHEN Z, ZHANG D, LIU C X, et al. Traditional Chinese medicine diagnostic prediction model for holistic syndrome differentiation based on deep learning[J]. Integrative Medicine Research, 2024, 13(1): 101019.
[5] LI X C, CHEN K, YANG J X, et al. TLDA: a transfer learning based dual-augmentation strategy for traditional Chinese medicine syndrome differentiation in rare disease[J]. Computers in Biology and Medicine, 2024, 169: 107808.
[6] REN M C, HUANG H Y, ZHOU Y X, et al. TCM-SD: a benchmark for probing syndrome differentiation via natural language processing[C]//Proceedings of the 21st Chinese National Conference on Computational Linguistics. Cham: Springer International Publishing, 2022: 247-263.
[7] RICHTER T, NESTLER-PARR S, BABELA R, et al. Rare disease terminology and definitions: a systematic global review: report of the ISPOR rare disease special interest group[J]. Value in Health, 2015, 18(6): 906-914.
[8] ZHANG C B, JIANG P T, HOU Q B, et al. Delving deep into label smoothing[J]. IEEE Transactions on Image Processing, 2021, 30: 5984-5996.
[9] HU C Y, ZHANG S Y, GU T Y, et al. Multi-task joint learning model for Chinese word segmentation and syndrome differentiation in traditional Chinese medicine[J]. International Journal of Environmental Research and Public Health, 2022, 19(9): 5601.
[10] ZHANG H, ZHANG J J, NI W D, et al. Transformer- and generative adversarial network-based inpatient traditional Chinese medicine prescription recommendation: development study[J]. JMIR Medical Informatics, 2022, 10(5): e35239.
[11] LIU Z Q, HE H Y, YAN S X, et al. End-to-end models to imitate traditional Chinese medicine syndrome differentiation in lung cancer diagnosis: model development and validation[J]. JMIR Medical Informatics, 2020, 8(6): e17821.
[12] MIN B N, ROSS H, SULEM E, et al. Recent advances in natural language processing via large pre-trained language models: a survey[J]. ACM Computing Surveys, 2024, 56(2): 1-40.
[13] WANG H F, LI J W, WU H, et al. Pre-trained language models and their applications[J]. Engineering, 2023, 25: 51-65.
[14] YAO L, JIN Z, MAO C S, et al. Traditional Chinese medicine clinical records classification with BERT and domain specific corpora[J]. Journal of the American Medical Informatics Association, 2019, 26(12): 1632-1636.
[15] ZHANG X, CHEN Z K, GAO J, et al. A two-stage deep transfer learning model and its application for medical image processing in traditional Chinese medicine[J]. Knowledge-Based Systems, 2022, 239: 108060.
[16] LI J Y, TANG T Y, ZHAO W X, et al. Pre-trained language models for text generation: a survey[J]. ACM Computing Surveys, 2024, 56(9): 1-39.
[17] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2019: 4171-4186.
[18] SUNG F, YANG Y X, ZHANG L, et al. Learning to compare: relation network for few-shot learning[C]//Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2018: 1199-1208.
[19] 苏翀, 任曈, 王国品, 等. 利用决策树建立慢性阻塞性肺病中医诊断模型[J]. 计算机工程与应用, 2019, 55(3): 225-230.
SU C, REN T, WANG G P, et al. Using K-L divergence based decision tree to build traditional Chinese medicine diagnosis model on COPD[J]. Computer Engineering and Applications, 2019, 55(3): 225-230.
[20] LAN Z Z, CHEN M D, GOODMAN S, et al. ALBERT: a lite BERT for self-supervised learning of language representations[J]. arXiv:1909.11942, 2019.
[21] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: a robustly optimized BERT pretraining approach[J]. arXiv:1907.11692, 2019.