[1] ZHOU G D, SU J, ZHANG J, et al. Exploring various knowledge in relation extraction[C]//Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics, 2005: 427-434.
[2] CHAN Y S, ROTH D. Exploiting syntactico-semantic structures for relation extraction[C]//Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, 2011: 551-560.
[3] ZELENKO D, AONE C, RICHARDELLA A. Kernel methods for relation extraction[J]. Journal of Machine Learning Research, 2003, 3: 1083-1106.
[4] BOWMAN S R, POTTS C, MANNING C D. Recursive neural networks for learning logical semantics[J]. arXiv:1406.1827, 2014.
[5] KATIYAR A, CARDIE C. Going out on a limb: joint extraction of entity mentions and relations without dependency trees[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017: 917-928.
[6] HONG Y, LIU Y X, YANG S Z, et al. Improving graph convolutional networks based on relation-aware attention for end-to-end relation extraction[J]. IEEE Access, 2020, 8: 51315-51323.
[7] YU B W, ZHANG Z Y, SHU X B, et al. Joint extraction of entities and relations based on a novel decomposition strategy[C]//Proceedings of the 24th European Conference on Artificial Intelligence, 2020: 2282-2289.
[8] ZHENG H Y, WEN R, CHEN X, et al. PRGC: potential relation and global correspondence based joint relational triple extraction[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021: 6225-6235.
[9] ZENG X R, ZENG D J, HE S Z, et al. Extracting relational facts by an end-to-end neural model with copy mechanism[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2018: 506-514.
[10] FU T J, LI P H, MA W Y. GraphRel: modeling text as relational graphs for joint entity and relation extraction[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019: 1409-1418.
[11] WEI Z P, SU J L, WANG Y, et al. A novel cascade binary tagging framework for relational triple extraction[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020: 1476-1488.
[12] SHEN Y L, MA X Y, TANG Y C, et al. A trigger-sense memory flow framework for joint entity and relation extraction[C]//Proceedings of the Web Conference, 2021: 1704-1715.
[13] WANG L Y, XIONG C, DENG N. A research on overlapping relationship extraction based on multi-objective dependency[C]//Proceedings of the 15th International Conference on Computer Science & Education, 2020: 618-622.
[14] WANG Y C, YU B W, ZHANG Y Y, et al. TPLinker: single-stage joint extraction of entities and relations through token pair linking[C]//Proceedings of the 28th International Conference on Computational Linguistics, 2020: 1572-1582.
[15] ZHENG S C, WANG F, BAO H Y, et al. Joint extraction of entities and relations based on a novel tagging scheme[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017: 1227-1236.
[16] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016: 1105-1116.
[17] ZHAO K, XU H, CHENG Y, et al. Representation iterative fusion based on heterogeneous graph neural network for joint entity and relation extraction[J]. Knowledge-Based Systems, 2021, 219: 106888.
[18] ZHAO F B, JIANG Z R, KANG Y Y, et al. Adjacency list oriented relational fact extraction via adaptive multi-task learning[J]. arXiv:2106.01559, 2021.
[19] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
[20] ZHENG H L, FU J L, ZHA Z J, et al. Learning deep bilinear transformation for fine-grained image representation[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019: 4277-4286.
[21] VINYALS O, FORTUNATO M, JAITLY N. Pointer networks[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems, 2015: 2692-2700.
[22] RIEDEL S, YAO L, MCCALLUM A. Modeling relations and their mentions without labeled text[C]//Proceedings of the 2010 European Conference on Machine Learning and Knowledge Discovery in Databases, 2010: 148-163.
[23] GARDENT C, SHIMORINA A, NARAYAN S, et al. Creating training corpora for NLG micro-planners[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017: 179-188.
[24] ZENG X R, HE S Z, ZENG D J, et al. Learning the extraction order of multiple relational facts in a sentence with reinforcement learning[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019: 367-377.
[25] 郝小芳, 张超群, 李晓翔, 等. 融合交互注意力网络的实体和关系联合抽取模型[J]. 计算机工程与应用, 2024, 60(8): 156-164.
HAO X F, ZHANG C Q, LI X X, et al. Joint entity relation extraction model based on interactive attention[J]. Computer Engineering and Applications, 2024, 60(8): 156-164.
[26] XU B F, WANG Q, LYU Y J, et al. EmRel: joint representation of entities and embedded relations for multi-triple extraction[C]//Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 659-665.
[27] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[C]//Proceedings of the International Conference on Learning Representations, 2017.
[28] SHANG Y M, HUANG H Y, MAO X L. OneRel: joint entity and relation extraction with one module in one step[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2022: 11285-11293.
[29] HSU I H, HUANG K H, BOSCHEE E, et al. DEGREE: a data-efficient generation-based event extraction model[C]//Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 1890-1908.