About This Document
- sl:arxiv_author : Liang Yao
- sl:arxiv_author : Chengsheng Mao
- sl:arxiv_author : Yuan Luo
- sl:arxiv_firstAuthor : Liang Yao
- sl:arxiv_num : 1909.03193
- sl:arxiv_published : 2019-09-07T06:09:25Z
- sl:arxiv_summary : Knowledge graphs are important resources for many artificial intelligence
tasks but often suffer from incompleteness. In this work, we propose to use
pre-trained language models for knowledge graph completion. We treat triples in
knowledge graphs as textual sequences and propose a novel framework named
Knowledge Graph Bidirectional Encoder Representations from Transformer
(KG-BERT) to model these triples. Our method takes entity and relation
descriptions of a triple as input and computes the scoring function of the triple
with the KG-BERT language model. Experimental results on multiple benchmark
knowledge graphs show that our method can achieve state-of-the-art performance
in triple classification, link prediction and relation prediction tasks.@en
- sl:arxiv_title : KG-BERT: BERT for Knowledge Graph Completion@en
- sl:arxiv_updated : 2019-09-11T06:03:30Z
- sl:bookmarkOf : https://arxiv.org/abs/1909.03193
- sl:creationDate : 2020-03-22
- sl:creationTime : 2020-03-22T18:56:43Z
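The abstract above describes KG-BERT's core move: serialize a triple's head, relation, and tail descriptions into one BERT input sequence and let a fine-tuned classifier's output serve as the triple's plausibility score. Below is a minimal sketch of that scoring step, assuming the HuggingFace `transformers` and `torch` packages. It is illustrative, not code from the paper: the paper fine-tunes BERT on labeled triples before scoring (here an off-the-shelf `bert-base-uncased` head stands in), the paper uses three [SEP]-separated segments with alternating segment embeddings (here relation and tail share BERT's second segment), and `score_triple` is a hypothetical name.

```python
# Minimal sketch of KG-BERT-style triple scoring (assumes the HuggingFace
# `transformers` and `torch` packages). The paper fine-tunes BERT on labeled
# triples first; an off-the-shelf checkpoint stands in for that model here.
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # plausible vs. implausible triple
)
model.eval()

def score_triple(head_desc: str, relation: str, tail_desc: str) -> float:
    """Pack the triple's texts as
    [CLS] head [SEP] relation [SEP] tail [SEP]
    and return the positive-class probability as its plausibility score."""
    # BERT's sentence-pair interface exposes two segments, so relation and
    # tail share the second segment, separated by an explicit [SEP] token.
    inputs = tokenizer(
        head_desc,
        f"{relation} {tokenizer.sep_token} {tail_desc}",
        return_tensors="pt",
        truncation=True,
        max_length=128,
    )
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, 2)
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Example triple; the entity descriptions stand in for the textual
# annotations a knowledge graph would attach to its entities.
print(score_triple(
    "Steven Spielberg, American film director and producer",
    "director of",
    "Jurassic Park, 1993 American science fiction film",
))
```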