About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Quan Wang
- sl:arxiv_num : 1911.02168
- sl:arxiv_published : 2019-11-06T02:27:39Z
- sl:arxiv_summary : Knowledge graph embedding, which projects symbolic entities and relations
into continuous vector spaces, is gaining increasing attention. Previous
methods allow a single static embedding for each entity or relation, ignoring
their intrinsic contextual nature, i.e., entities and relations may appear in
different graph contexts, and accordingly, exhibit different properties. This
work presents Contextualized Knowledge Graph Embedding (CoKE), a novel paradigm
that takes into account such contextual nature, and learns dynamic, flexible,
and fully contextualized entity and relation embeddings. Two types of graph
contexts are studied: edges and paths, both formulated as sequences of entities
and relations. CoKE takes a sequence as input and uses a Transformer encoder to
obtain contextualized representations. These representations are hence
naturally adaptive to the input, capturing contextual meanings of entities and
relations therein. Evaluation on a wide variety of public benchmarks verifies
the superiority of CoKE in link prediction and path query answering. It
performs consistently better than, or at least as well as, current
state-of-the-art methods in almost every case, in particular offering an absolute
improvement of 21.0% in H@10 on path query answering. Our code is available at
\url{https://github.com/PaddlePaddle/Research/tree/master/KG/CoKE}.
- sl:arxiv_title : CoKE: Contextualized Knowledge Graph Embedding
- sl:arxiv_updated : 2020-04-04T07:22:20Z
- sl:bookmarkOf : https://arxiv.org/abs/1911.02168
- sl:creationDate : 2020-03-22
- sl:creationTime : 2020-03-22T17:34:10Z
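The summary notes that CoKE formulates both graph contexts, edges and paths, as sequences of entities and relations before feeding them to a Transformer encoder. A minimal sketch of that input formulation is below; the names (`MASK`, `encode_triple`, `encode_path`, `to_ids`) are illustrative assumptions, not identifiers from the paper's released code:

```python
# Hedged sketch of CoKE-style input formulation (illustrative names only).
# An edge (h, r, t) and a path query (s, r1, ..., rk, ?) both become flat
# token sequences; the entity to predict is replaced by a [MASK] token,
# which a Transformer encoder would later fill in.

MASK = "[MASK]"

def encode_triple(head, relation, tail=None):
    """Edge context: [h, r, t]; for link prediction the target is masked."""
    return [head, relation, tail if tail is not None else MASK]

def encode_path(source, relations):
    """Path context: [s, r1, ..., rk, MASK] for a path query answer."""
    return [source, *relations, MASK]

def to_ids(tokens, vocab):
    """Map tokens to integer ids, growing the vocabulary on the fly."""
    return [vocab.setdefault(t, len(vocab)) for t in tokens]

if __name__ == "__main__":
    vocab = {}
    triple_seq = encode_triple("Paris", "capital_of")
    path_seq = encode_path("Paris", ["capital_of", "continent"])
    print(to_ids(triple_seq, vocab))  # e.g. [0, 1, 2]
    print(to_ids(path_seq, vocab))    # e.g. [0, 1, 3, 2]
```

In the paper's framing, the same entity token (here "Paris") receives a different contextualized embedding depending on the surrounding sequence, which is what distinguishes CoKE from static-embedding methods.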