Computer Engineering and Applications ›› 2021, Vol. 57 ›› Issue (21): 234-240.DOI: 10.3778/j.issn.1002-8331.2011-0199


Research on Improved BERT’s Chinese Multi-relation Extraction Method

HUANG Meigen, LIU Jiale, LIU Chuan   

  1. College of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
  • Online: 2021-11-01    Published: 2021-11-04



Few studies address extracting multiple triples from a single sentence when constructing knowledge triples, and most existing work targets English text. To address this, a BERT-based Chinese multi-relation extraction model, BCMRE, is proposed. It consists of two task models connected in series: relation classification and element extraction. BCMRE first predicts the relations a sentence may express through the relation classification task, fuses each predicted relation's encoding into the word vectors, and copies one instance of the sentence per relation; each instance then enters the element extraction task, which predicts the triple via named entity recognition. BCMRE adds different pre-models tailored to the characteristics of the two tasks, designs word vectors to compensate for BERT's weaknesses in handling Chinese characters, and designs different loss functions to improve the model. BERT's multi-head self-attention mechanism is used to fully extract features and complete the extraction of triples. BCMRE is compared experimentally with other models and with variants using different pre-models, and achieves relatively good results under the F1 metric, demonstrating that the model can effectively improve the extraction of multi-relation triples.
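To make the serial two-stage design concrete, the sketch below mimics BCMRE's control flow: a relation classification stage predicts candidate relations, the sentence is copied once per predicted relation, and each copy is passed to an element extraction stage that emits a triple. All function names are hypothetical, and toy keyword/lexicon lookups stand in for the paper's actual BERT-based classifiers; this is only an illustration of the pipeline shape, not the model itself.

```python
# Hypothetical sketch of BCMRE's serial pipeline. The real model uses
# BERT encoders; keyword matching and a fixed entity lexicon are toy
# stand-ins used purely to show the two-stage, copy-per-relation flow.

def classify_relations(sentence, relation_keywords):
    """Stage 1 (relation classification): predict which relations
    the sentence may express."""
    return [rel for rel, kw in relation_keywords.items() if kw in sentence]

def extract_elements(sentence, relation, entity_lexicon):
    """Stage 2 (element extraction): for one predicted relation,
    recognize entities (NER stand-in) and emit (head, relation, tail)."""
    entities = [e for e in entity_lexicon if e in sentence]
    if len(entities) >= 2:
        return [(entities[0], relation, entities[1])]
    return []

def bcmre_pipeline(sentence, relation_keywords, entity_lexicon):
    """Serial pipeline: one copy of the sentence per predicted relation,
    each copy fed into element extraction."""
    triples = []
    for relation in classify_relations(sentence, relation_keywords):
        triples.extend(extract_elements(sentence, relation, entity_lexicon))
    return triples

# Toy demo: two relations are predicted, so the sentence is processed twice.
relations = {"works_at": "works at", "born_in": "was born in"}
lexicon = ["Alice", "Acme Corp"]
sentence = "Alice was born in Shanghai and works at Acme Corp."
print(bcmre_pipeline(sentence, relations, lexicon))
```

In the paper's actual model, the predicted relation is fused into the word vectors of each sentence copy before element extraction, so the NER stage can tag entities conditioned on that specific relation.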

Key words: named entity recognition (NER), relation extraction, pre-model, classification, serial task, Bidirectional Encoder Representations from Transformers (BERT) model
