ELMo
> Deep contextualized word representations: each word is assigned a representation that is a function of the entire sentence to which it belongs. The embeddings are computed from the internal states of a two-layer bidirectional language model, hence the name "ELMo": Embeddings from Language Models. [Github](https://github.com/allenai/bilm-tf)
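A minimal sketch of obtaining ELMo embeddings, assuming the AllenNLP `Elmo` module is used on top of a pretrained biLM (the page itself only links the bilm-tf training repo); the `OPTIONS_FILE` and `WEIGHT_FILE` paths below are placeholders for the released options/weights files.

```python
# Minimal sketch: contextual ELMo embeddings via AllenNLP's Elmo module.
# The options/weights paths are placeholders for the pretrained biLM files
# released alongside the bilm-tf repository.
from allennlp.modules.elmo import Elmo, batch_to_ids

OPTIONS_FILE = "elmo_options.json"   # biLM architecture config (placeholder path)
WEIGHT_FILE = "elmo_weights.hdf5"    # pretrained biLM weights (placeholder path)

# num_output_representations=1 returns a single learned mixture of the
# biLM layers; downstream tasks learn the mixing weights during training.
elmo = Elmo(OPTIONS_FILE, WEIGHT_FILE, num_output_representations=1, dropout=0.0)

# Sentences must be pre-tokenized; batch_to_ids maps tokens to character ids.
sentences = [["The", "bank", "of", "the", "river", "."],
             ["She", "works", "at", "a", "bank", "."]]
character_ids = batch_to_ids(sentences)

output = elmo(character_ids)
embeddings = output["elmo_representations"][0]  # (batch, max_len, dim)

# The two occurrences of "bank" get different vectors because each embedding
# is a function of the whole sentence, not just the word type.
print(embeddings.shape)
```

Because the representation is built from the biLM's internal states rather than a static lookup table, the same surface word receives different vectors in different sentences, which is what the "contextualized" in the summary above refers to.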