About This Document
- sl:arxiv_author : Ikuya Yamada, Koki Washio, Hiroyuki Shindo, Yuji Matsumoto
- sl:arxiv_firstAuthor : Ikuya Yamada
- sl:arxiv_num : 1909.00426
- sl:arxiv_published : 2019-09-01T16:29:53Z
- sl:arxiv_summary : We propose a global entity disambiguation (ED) model based on BERT. To capture global contextual information for ED, our model treats not only words but also entities as input tokens, and solves the task by sequentially resolving mentions to their referent entities and using resolved entities as inputs at each step. We train the model using a large entity-annotated corpus obtained from Wikipedia. We achieve new state-of-the-art results on five standard ED datasets: AIDA-CoNLL, MSNBC, AQUAINT, ACE2004, and WNED-WIKI. The source code and model checkpoint are available at https://github.com/studio-ousia/luke.
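
The sequential resolution loop sketched in the abstract can be illustrated with a short example. The sketch below is a minimal, hypothetical illustration, not the actual LUKE API: the names `Mention`, `score_candidates`, and `disambiguate` are invented here, and the stand-in scorer replaces the BERT model that would condition on the word tokens plus the entity tokens resolved so far.

```python
from dataclasses import dataclass

@dataclass
class Mention:
    surface: str
    candidates: list  # candidate entity titles from a KB dictionary

def score_candidates(words, resolved, mention):
    # Stand-in scorer: the real model would run BERT over the word
    # tokens plus the entities resolved so far and return one
    # confidence per candidate. Here we simply prefer earlier candidates.
    return [(1.0 / (i + 1), c) for i, c in enumerate(mention.candidates)]

def disambiguate(words, mentions):
    resolved = {}                 # mention surface -> chosen entity
    unresolved = list(mentions)
    while unresolved:
        # Score every unresolved mention, conditioning on the entities
        # resolved so far (the "global" signal the abstract describes),
        # then commit the single most confident decision.
        mention, score, entity = max(
            ((m,) + max(score_candidates(words, resolved, m)) for m in unresolved),
            key=lambda t: t[1],
        )
        resolved[mention.surface] = entity  # now usable as an input token
        unresolved.remove(mention)
    return resolved

# Toy usage: mentions are committed greedily, most confident first.
doc = "Paris is the capital of France".split()
mentions = [Mention("Paris", ["Paris", "Paris, Texas"]),
            Mention("France", ["France"])]
print(disambiguate(doc, mentions))  # {'Paris': 'Paris', 'France': 'France'}
```

The greedy, confidence-ordered loop is the point of the sketch: each committed entity enlarges the context available to later, harder decisions.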
- sl:arxiv_title : Global Entity Disambiguation with BERT
- sl:arxiv_updated : 2022-04-12T05:37:42Z
- sl:bookmarkOf : https://arxiv.org/abs/1909.00426
- sl:creationDate : 2022-04-18
- sl:creationTime : 2022-04-18T19:49:22Z