[1] BORDES A, CHOPRA S, WESTON J. Question answering with subgraph embeddings[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha: Association for Computational Linguistics, 2014: 615-620.
[2] XU J, LEI Z, WANG H, et al. Discovering dialog structure graph for coherent dialog generation[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021: 1726-1739.
[3] ZENG C, LI S, LI Q, et al. A survey on machine reading comprehension—tasks, evaluation metrics and benchmark datasets[J]. Applied Sciences, 2020, 10(21): 7640.
[4] LIN Y, LIU Z, SUN M. Knowledge representation learning with entities, attributes and relations[C]//Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI), 2016: 2866-2872.
[5] BORDES A, USUNIER N, GARCIA-DURAN A, et al. Translating embeddings for modeling multi-relational data[C]//Advances in Neural Information Processing Systems, 2013: 2787-2795.
[6] ZHANG Z, ZHUANG F, QU M, et al. Knowledge graph embedding with hierarchical relation structure[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018: 3198-3207.
[7] LIU X, TAN H, CHEN Q, et al. RAGAT: relation aware graph attention network for knowledge graph completion[J]. IEEE Access, 2021, 9: 20840-20849.
[8] CHEN Z, VILLAR S, CHEN L, et al. On the equivalence between graph isomorphism testing and function approximation with GNNs[C]//Advances in Neural Information Processing Systems, 2019.
[9] TIAN A, ZHANG C, RANG M, et al. RA-GCN: relational aggregation graph convolutional network for knowledge graph completion[C]//Proceedings of the 2020 12th International Conference on Machine Learning and Computing, 2020: 580-586.
[10] SHANG C, TANG Y, HUANG J, et al. End-to-end structure-aware convolutional networks for knowledge base completion[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019: 3060-3067.
[11] VASHISHTH S, SANYAL S, NITIN V, et al. Composition-based multi-relational graph convolutional networks[J]. arXiv:1911.03082, 2019.
[12] YANG H, LIU J. Knowledge graph representation learning as groupoid: unifying TransE, RotatE, QuatE, ComplEx[C]//Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021: 2311-2320.
[13] YANG B, YIH W, HE X, et al. Embedding entities and relations for learning and inference in knowledge bases[J]. arXiv:1412.6575, 2014.
[14] NGUYEN D Q, NGUYEN T D, NGUYEN D Q, et al. A novel embedding model for knowledge base completion based on convolutional neural network[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans: Association for Computational Linguistics, 2018: 327-333.
[15] BELLOMARINI L, FAYZRAKHMANOV R R, GOTTLOB G, et al. Data science with Vadalog: knowledge graphs with machine learning and reasoning in practice[J]. Future Generation Computer Systems, 2022, 129: 407-422.
[16] DAI Y, WANG S, CHEN X, et al. Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings[J]. Knowledge-Based Systems, 2020, 190: 105165.
[17] SCHLICHTKRULL M, KIPF T N, BLOEM P, et al. Modeling relational data with graph convolutional networks[C]//European Semantic Web Conference. Cham: Springer, 2018: 593-607.
[18] HAMILTON W, YING Z, LESKOVEC J. Inductive representation learning on large graphs[C]//Advances in Neural Information Processing Systems, 2017.
[19] WU F, SOUZA A, ZHANG T, et al. Simplifying graph convolutional networks[C]//International Conference on Machine Learning, 2019: 6861-6871.
[20] VELICKOVIC P, CUCURULL G, CASANOVA A, et al. Graph attention networks[J]. arXiv:1710.10903, 2017.
[21] WANG X, JI H, SHI C, et al. Heterogeneous graph attention network[C]//The World Wide Web Conference, 2019: 2022-2032.
[22] WU J, SHI W, CAO X, et al. DisenKGAT: knowledge graph embedding with disentangled graph attention network[C]//Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021: 2140-2149.
[23] BISWAS S S. Potential use of Chat GPT in global warming[J]. Annals of Biomedical Engineering, 2023, 51(6): 1126-1127.
[24] SUN F, LIU J, WU J, et al. BERT4Rec: sequential recommendation with bidirectional encoder representations from transformer[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management, 2019: 1441-1450.
[25] LI Z, ZHAO Y, ZHANG Y, et al. Multi-relational graph attention networks for knowledge graph completion[J]. Knowledge-Based Systems, 2022, 251: 109262.
[26] IOFFE S, SZEGEDY C. Batch normalization: accelerating deep network training by reducing internal covariate shift[C]//International Conference on Machine Learning, 2015: 448-456.
[27] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. The Journal of Machine Learning Research, 2014, 15(1): 1929-1958.
[28] KINGMA D, BA J. Adam: a method for stochastic optimization[J]. arXiv:1412.6980, 2014.
[29] BOLLACKER K, EVANS C, PARITOSH P, et al. Freebase: a collaboratively created graph database for structuring human knowledge[C]//Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, 2008: 1247-1250.
[30] MILLER G A. WordNet: a lexical database for English[J]. Communications of the ACM, 1995, 38(11): 39-41.
[31] JIANG D, WANG R, YANG J, et al. Kernel multi-attention neural network for knowledge graph embedding[J]. Knowledge-Based Systems, 2021, 227: 107188.
[32] VASHISHTH S, SANYAL S, NITIN V, et al. InteractE: improving convolution-based knowledge graph embeddings by increasing feature interactions[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2020: 3009-3016.
[33] QU M, TANG J. Probabilistic logic neural networks for reasoning[C]//Advances in Neural Information Processing Systems, 2019.