BPEmb: Subword Embeddings
BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia.
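BPE builds its subword vocabulary by repeatedly merging the most frequent adjacent symbol pair in a corpus. As a rough illustration of that merge loop (a toy sketch on a made-up two-word corpus, not the bpemb package or its training code):

```python
from collections import Counter

def most_frequent_pair(vocab):
    # Count adjacent symbol pairs, weighted by word frequency.
    pairs = Counter()
    for seq, freq in vocab.items():
        for a, b in zip(seq, seq[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(vocab, pair):
    # Replace every occurrence of `pair` with the concatenated symbol.
    merged = {}
    for seq, freq in vocab.items():
        out, i = [], 0
        while i < len(seq):
            if i < len(seq) - 1 and (seq[i], seq[i + 1]) == pair:
                out.append(seq[i] + seq[i + 1])
                i += 2
            else:
                out.append(seq[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

# Toy corpus: word (split into characters) -> frequency.
vocab = {tuple("lower"): 5, tuple("lowest"): 2}
for _ in range(3):  # learn three merges
    vocab = merge_pair(vocab, most_frequent_pair(vocab))
```

After a few merges, frequent character sequences such as "low" fuse into single subword symbols; BPEmb applies this idea to Wikipedia text and then trains embeddings for the resulting subwords.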