[1] YU M, YIN W, HASAN K S, et al. Improved neural relation detection for knowledge base question answering[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2017: 571-581.
[2] YOUNG T, CAMBRIA E, CHATURVEDI I, et al. Augmenting end-to-end dialogue systems with commonsense knowledge[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2018.
[3] ZHANG Y, ZHONG V, CHEN D, et al. Position-aware attention and supervised data improve slot filling[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, 2017: 35-45.
[4] ZENG D, LIU K, LAI S, et al. Relation classification via convolutional deep neural network[C]//Proceedings of the 25th International Conference on Computational Linguistics: Technical Papers, 2014: 2335-2344.
[5] NGUYEN T H, GRISHMAN R. Relation extraction: perspective from convolutional neural networks[C]//Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, 2015: 39-48.
[6] SHEN Y, HUANG X J. Attention-based convolutional neural network for semantic relation extraction[C]//Proceedings of the 26th International Conference on Computational Linguistics: Technical Papers, 2016: 2526-2536.
[7] MIWA M, SASAKI Y. Modeling joint entity and relation extraction with table representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014: 1858-1869.
[8] WANG J, LU W. Two are better than one: joint entity and relation extraction with table-sequence encoders[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1706-1721.
[9] LI J, FEI H, LIU J, et al. Unified named entity recognition as word-word relation classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2022: 10965-10973.
[10] KAMBHATLA N. Combining lexical, syntactic, and semantic features with maximum entropy models for information extraction[C]//Proceedings of the ACL Interactive Poster and Demonstration Sessions, 2004: 178-181.
[11] SONG L, ZHANG Y, WANG Z, et al. N-ary relation extraction using graph-state LSTM[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018: 2226-2235.
[12] ZHANG Y, QI P, MANNING C D. Graph convolution over pruned dependency trees improves relation extraction[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018: 2205-2215.
[13] GUO Z, ZHANG Y, LU W. Attention guided graph convolutional networks for relation extraction[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019: 241-251.
[14] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long and Short Papers), 2019: 4171-4186.
[15] CHEN Y, WANG K, YANG W, et al. A multi-channel deep neural network for relation extraction[J]. IEEE Access, 2020, 8: 13195-13203.
[16] YAN X, DUAN Y X, ZHANG Z H. Entity relationship extraction fusing self-attention mechanism and CNN[J]. Computer Engineering & Science, 2020, 42(11): 2059-2066. (in Chinese)
[17] WANG L, CAO Z, DE MELO G, et al. Relation classification via multi-level attention CNNs[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2016: 1298-1307.
[18] ZHANG D, WANG D. Relation classification via recurrent neural network[J]. arXiv:1508.01006, 2015.
[19] XIAO M, LIU C. Semantic relation classification via hierarchical recurrent neural network with attention[C]//Proceedings of the 26th International Conference on Computational Linguistics: Technical Papers, 2016: 1254-1263.
[20] ZHANG S, ZHENG D, HU X, et al. Bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 29th Pacific Asia Conference on Language, Information and Computation, 2015: 73-78.
[21] SUN K, ZHANG R, MAO Y, et al. Relation extraction with convolutional network over learnable syntax-transport graph[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020: 8928-8935.
[22] TIAN Y, CHEN G, SONG Y, et al. Dependency-driven relation extraction with attentive graph convolutional networks[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021: 4458-4471.
[23] ZHOU L, WANG T, QU H, et al. A weighted GCN with logical adjacency matrix for relation extraction[C]//Proceedings of the 24th European Conference on Artificial Intelligence, 2020: 2314-2321.
[24] REN F, ZHANG L, YIN S, et al. A novel global feature-oriented relational triple extraction model based on table filling[C]//Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021: 2646-2656.
[25] MA Y, HIRAOKA T, OKAZAKI N. Joint entity and relation extraction based on table labeling using convolutional neural networks[C]//Proceedings of the 6th Workshop on Structured Prediction for NLP, 2022: 11-21.
[26] MA Y, HIRAOKA T, OKAZAKI N. Named entity recognition and relation extraction using enhanced table filling by contextualized representations[J]. Journal of Natural Language Processing, 2022, 29(1): 187-223.
[27] WU S, HE Y. Enriching pre-trained language model with entity information for relation classification[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management, 2019: 2361-2364.
[28] XU K, FENG Y, HUANG S, et al. Semantic relation classification via convolutional neural networks with simple negative sampling[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, 2015: 536-540.
[29] XUE F, SUN A, ZHANG H, et al. GDPNet: refining latent multi-view graph for relation extraction[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2021: 14194-14202.
[30] CHO C, CHOI Y S. Dependency tree positional encoding method for relation extraction[C]//Proceedings of the 36th Annual ACM Symposium on Applied Computing, 2021: 1012-1020.
[31] JOSHI M, CHEN D, LIU Y, et al. SpanBERT: improving pre-training by representing and predicting spans[J]. Transactions of the Association for Computational Linguistics, 2020, 8: 64-77.