About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Daniel Daza
- sl:arxiv_num : 2010.03496
- sl:arxiv_published : 2020-10-07T16:04:06Z
- sl:arxiv_summary : We present a method for learning representations of entities that uses a
Transformer-based architecture as an entity encoder, and link prediction
training on a knowledge graph with textual entity descriptions. We demonstrate
that our approach can be applied effectively for link prediction in different
inductive settings involving entities not seen during training, outperforming
related state-of-the-art methods (22% MRR improvement on average). We provide
evidence that the learned representations transfer to other tasks that do not
require fine-tuning the entity encoder. In an entity classification task we
obtain an average improvement of 16% accuracy compared with baselines that also
employ pre-trained models. For an information retrieval task, significant
improvements of up to 8.8% in NDCG@10 were obtained for natural language
queries.@en
- sl:arxiv_title : Inductive Entity Representations from Text via Link Prediction@en
- sl:arxiv_updated : 2020-10-07T16:04:06Z
- sl:bookmarkOf : https://arxiv.org/abs/2010.03496
- sl:creationDate : 2020-11-03
- sl:creationTime : 2020-11-03T16:38:59Z
- sl:relatedDoc :
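
The abstract describes the recipe at a high level: a Transformer encodes each entity's textual description into a vector, and those vectors are trained with a link prediction objective over knowledge graph triples. The sketch below illustrates that recipe under my own assumptions, not the authors' released code: the encoder name (`bert-base-uncased`), the TransE-style score, the margin loss, and the example descriptions and relation ids are all illustrative choices rather than details taken from the paper.

```python
# A minimal sketch of "Transformer entity encoder + link prediction training".
# Assumptions (mine): a BERT encoder from Hugging Face Transformers, the [CLS]
# vector as the entity representation, and a TransE-style translational score.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

class TextEntityEncoder(nn.Module):
    """Encodes an entity's textual description into a fixed-size vector."""

    def __init__(self, model_name="bert-base-uncased", dim=128, num_relations=11):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(model_name)
        self.encoder = AutoModel.from_pretrained(model_name)
        self.proj = nn.Linear(self.encoder.config.hidden_size, dim)
        # Relations have no text, so they stay as ordinary lookup embeddings.
        self.rel_emb = nn.Embedding(num_relations, dim)

    def encode(self, descriptions):
        batch = self.tokenizer(descriptions, padding=True, truncation=True,
                               max_length=64, return_tensors="pt")
        cls = self.encoder(**batch).last_hidden_state[:, 0]  # [CLS] token vector
        return self.proj(cls)

    def score(self, h, rel_ids, t):
        # TransE-style plausibility: -||h + r - t||_1 (higher = more plausible).
        return -torch.norm(h + self.rel_emb(rel_ids) - t, p=1, dim=-1)

model = TextEntityEncoder()
h = model.encode(["Berlin is the capital and largest city of Germany."])
t_pos = model.encode(["Germany is a country in Central Europe."])
t_neg = model.encode(["The Nile is a river in northeastern Africa."])
rel = torch.tensor([0])  # hypothetical id for a "capital of"-like relation

# Margin ranking loss: the true triple should outscore the corrupted one.
loss = F.margin_ranking_loss(model.score(h, rel, t_pos),
                             model.score(h, rel, t_neg),
                             target=torch.ones(1), margin=1.0)
loss.backward()
```

This setup is what makes the inductive evaluation in the abstract possible: an entity unseen during training gets a representation simply by running its description through the trained encoder, with no entity-specific embedding table required.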