About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Livio Baldini Soares
- sl:arxiv_num : 1906.03158
- sl:arxiv_published : 2019-06-07T15:26:50Z
- sl:arxiv_summary : General purpose relation extractors, which can model arbitrary relations, are
a core aspiration in information extraction. Efforts have been made to build
general purpose extractors that represent relations with their surface forms,
or which jointly embed surface forms with relations from an existing knowledge
graph. However, both of these approaches are limited in their ability to
generalize. In this paper, we build on extensions of Harris' distributional
hypothesis to relations, as well as recent advances in learning text
representations (specifically, BERT), to build task agnostic relation
representations solely from entity-linked text. We show that these
representations significantly outperform previous work on exemplar based
relation extraction (FewRel) even without using any of that task's training
data. We also show that models initialized with our task agnostic
representations, and then tuned on supervised relation extraction datasets,
significantly outperform the previous methods on SemEval 2010 Task 8, KBP37,
and TACRED.@en
- sl:arxiv_title : Matching the Blanks: Distributional Similarity for Relation Learning@en
- sl:arxiv_updated : 2019-06-07T15:26:50Z
- sl:bookmarkOf : https://arxiv.org/abs/1906.03158
- sl:creationDate : 2021-05-13
- sl:creationTime : 2021-05-13T00:39:03Z