About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Shuang Chen
- sl:arxiv_num : 2001.01447
- sl:arxiv_published : 2020-01-06T09:18:29Z
- sl:arxiv_summary : Existing state-of-the-art neural entity linking models employ an
attention-based bag-of-words context model and pre-trained entity embeddings
bootstrapped from word embeddings to assess topic-level context compatibility.
However, the latent entity type information in the immediate context of the
mention is neglected, which often causes these models to link mentions to
incorrect entities of incorrect types. To tackle this problem, we propose to
inject latent entity type information into the entity embeddings based on
pre-trained BERT. In addition, we integrate a BERT-based entity similarity
score into the local context model of a state-of-the-art model to better
capture latent entity type information. Our model significantly outperforms
state-of-the-art entity linking models on the standard benchmark (AIDA-CoNLL).
Detailed experimental analysis demonstrates that our model corrects most of
the type errors produced by the direct baseline.@en
- sl:arxiv_title : Improving Entity Linking by Modeling Latent Entity Type Information@en
- sl:arxiv_updated : 2020-01-06T09:18:29Z
- sl:bookmarkOf : https://arxiv.org/abs/2001.01447
- sl:creationDate : 2020-01-09
- sl:creationTime : 2020-01-09T02:37:01Z
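The entity similarity score mentioned in the summary amounts to ranking candidate entities by how close their embeddings are to a representation of the mention's context. A minimal, hypothetical Python sketch of that idea, with toy vectors standing in for the paper's BERT-derived embeddings (all names and values here are illustrative assumptions, not the authors' code):

```python
# Sketch: rank candidate entities by cosine similarity between a
# mention-context vector and each candidate entity vector. In the paper
# these vectors come from pre-trained BERT; toy vectors stand in here.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def score_candidates(context_vec, candidates):
    """Return (entity, vector) pairs sorted by similarity, best first."""
    return sorted(candidates.items(),
                  key=lambda kv: cosine(context_vec, kv[1]),
                  reverse=True)

# Toy example: two candidates for the mention "England"; the context
# vector is closer to the football-team embedding than the country's.
context = [0.9, 0.1, 0.2]
entities = {
    "England": [0.2, 0.9, 0.1],
    "England_national_football_team": [0.8, 0.2, 0.3],
}
best_entity, _ = score_candidates(context, entities)[0]
```

In the full model this similarity would be one feature combined with the existing local context score, rather than the sole ranking signal.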