About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Sanxing Chen
- sl:arxiv_num : 2008.12813
- sl:arxiv_published : 2020-08-28T18:58:15Z
- sl:arxiv_summary : This paper examines the challenging problem of learning representations of
entities and relations in a complex multi-relational knowledge graph. We
propose HittER, a Hierarchical Transformer model to jointly learn
Entity-relation composition and Relational contextualization based on a source
entity's neighborhood. Our proposed model consists of two different Transformer
blocks: the bottom block extracts features of each entity-relation pair in the
local neighborhood of the source entity and the top block aggregates the
relational information from outputs of the bottom block. We further design a
masked entity prediction task to balance information from the relational
context and the source entity itself. Experimental results show that HittER
achieves new state-of-the-art results on multiple link prediction datasets. We
additionally propose a simple approach to integrate HittER into BERT and
demonstrate its effectiveness on two Freebase factoid question answering
datasets.
- sl:arxiv_title : HittER: Hierarchical Transformers for Knowledge Graph Embeddings
- sl:arxiv_updated : 2021-10-06T04:52:07Z
- sl:bookmarkOf : https://arxiv.org/abs/2008.12813
- sl:creationDate : 2022-06-30
- sl:creationTime : 2022-06-30T18:33:10Z
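The abstract describes a two-level design: a bottom Transformer block encodes each (neighbor entity, relation) pair in the source entity's neighborhood, and a top block aggregates the resulting pair features into a relational context. Below is a minimal NumPy sketch of that hierarchical idea only, not the authors' implementation; the single-head attention, mean pooling, and all dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention over rows of X.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))
    return A @ V

rng = np.random.default_rng(0)
d = 8  # embedding dimension (hypothetical)

# A source entity with 3 (neighbor entity, relation) pairs in its neighborhood.
pairs = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(3)]
Wq, Wk, Wv = (0.1 * rng.normal(size=(d, d)) for _ in range(3))

# Bottom block: encode each entity-relation pair independently.
pair_feats = []
for ent, rel in pairs:
    X = np.stack([ent, rel])           # two "tokens" per pair
    H = self_attention(X, Wq, Wk, Wv)
    pair_feats.append(H.mean(axis=0))  # pool to one feature vector per pair

# Top block: aggregate relational context across the pair features.
C = np.stack(pair_feats)               # (num_pairs, d)
context = self_attention(C, Wq, Wk, Wv).mean(axis=0)
print(context.shape)
```

The sketch only shows the feature flow; HittER additionally uses a masked entity prediction objective to balance the relational context against the source entity's own embedding, which is omitted here.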