About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Zhiqing Sun
- sl:arxiv_num : 1902.10197
- sl:arxiv_published : 2019-02-26T20:15:09Z
- sl:arxiv_summary : We study the problem of learning representations of entities and relations in
knowledge graphs for predicting missing links. The success of such a task
heavily relies on the ability of modeling and inferring the patterns of (or
between) the relations. In this paper, we present a new approach for knowledge
graph embedding called RotatE, which is able to model and infer various
relation patterns including: symmetry/antisymmetry, inversion, and composition.
Specifically, the RotatE model defines each relation as a rotation from the
source entity to the target entity in the complex vector space. In addition, we
propose a novel self-adversarial negative sampling technique for efficiently
and effectively training the RotatE model. Experimental results on multiple
benchmark knowledge graphs show that the proposed RotatE model is not only
scalable, but also able to infer and model various relation patterns and
significantly outperform existing state-of-the-art models for link prediction.@en
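The two ideas in the abstract, relations as element-wise rotations in complex space and self-adversarial weighting of negative samples, can be sketched in a few lines. This is a toy NumPy illustration only: the function names, the `alpha` temperature parameter, and the random setup are assumptions, not the paper's reference implementation.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE plausibility score: negative distance between the rotated
    head h * r and the tail t in complex space (higher = more plausible)."""
    r = np.exp(1j * r_phase)            # relation = element-wise rotation, |r_i| = 1
    return -np.linalg.norm(h * r - t)

def self_adversarial_weights(neg_scores, alpha=1.0):
    """Softmax over negative-sample scores: harder (higher-scoring)
    negatives receive larger training weight (hypothetical helper)."""
    z = alpha * neg_scores
    z = z - z.max()                     # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

rng = np.random.default_rng(0)
k = 8
h = rng.normal(size=k) + 1j * rng.normal(size=k)   # head entity embedding
theta = rng.uniform(0.0, 2.0 * np.pi, size=k)      # relation phases
t = h * np.exp(1j * theta)                         # tail = exact rotation of head
print(rotate_score(h, theta, t))                   # near 0 for an exact rotation
```

Because each rotation has unit modulus, composing two relations is just adding their phase vectors, which is what lets the model capture composition, inversion (negated phases), and symmetry (phases of 0 or pi) patterns mentioned in the abstract.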
- sl:arxiv_title : RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space@en
- sl:arxiv_updated : 2019-02-26T20:15:09Z
- sl:bookmarkOf : https://arxiv.org/abs/1902.10197
- sl:creationDate : 2020-03-03
- sl:creationTime : 2020-03-03T13:27:48Z