Parents:
Contextualised word-representations
Replacement of the vectorial representation of words with a matrix representation, where each word's representation includes information about its context.

Embedding words through a language model
Language-model-based encoders

> The key idea is to train a contextual encoder with a language-model objective on a large unannotated text corpus: during training, part of the text is masked, and the goal is to encode the remaining context and predict the missing part. ([source](/doc/?uri=https%3A%2F%2Farxiv.org%2Fabs%2F1902.11269))
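The masking step of this objective can be sketched as follows. This is a toy illustration, not any particular library's API: the `mask_tokens` helper and the `[MASK]` placeholder are hypothetical names, and a real encoder would operate on subword IDs rather than word strings.

```python
import random

MASK = "[MASK]"  # hypothetical placeholder token

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly replace a fraction of tokens with [MASK].
    The training objective is to predict the original token at
    each masked position from the surrounding (unmasked) context."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK)
            targets[i] = tok  # position -> original token to predict
        else:
            masked.append(tok)
    return masked, targets

tokens = "the cat sat on the mat".split()
masked, targets = mask_tokens(tokens, mask_prob=0.3)
```

The encoder sees only `masked`; the loss is computed against `targets`, so the model must infer each missing word from its context.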