[1805.04032] From Word to Sense Embeddings: A Survey on Vector Representations of Meaning (2018)(About) Survey focused on semantic representation of meaning (methods that try to directly model individual meanings of words).
Problem with word embeddings: the meaning conflation deficiency (a word is represented by a single vector that conflates all its possible meanings). Can be addressed by methods that directly model unambiguous lexical meaning.
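A toy sketch of the conflation deficiency (all vectors and sense labels below are made up for illustration, not real embeddings): a single word vector for an ambiguous word like "bank" ends up between its sense vectors, so it matches neither context well.

```python
import numpy as np

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical sense vectors for "bank" (toy 2-d numbers).
bank_finance = np.array([1.0, 0.0])   # money sense
bank_river   = np.array([0.0, 1.0])   # riverside sense

# A standard word embedding conflates both senses into one vector,
# roughly an average of the sense vectors weighted by corpus frequency.
bank_word = (bank_finance + bank_river) / 2

money_context = np.array([0.9, 0.1])
river_context = np.array([0.1, 0.9])

# The conflated vector is moderately similar to both contexts,
# while a dedicated sense vector fits its own context much better.
print(cos(bank_word, money_context))     # in-between similarity
print(cos(bank_finance, money_context))  # high: right sense, right context
print(cos(bank_finance, river_context))  # low: right sense, wrong context
```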
Two main branches of sense representation: unsupervised (senses induced from corpora) and knowledge-based (senses grounded in lexical resources such as WordNet or BabelNet).
The Current Best of Universal Word Embeddings and Sentence Embeddings (2018)(About) Word embeddings SOTA: [ELMo](/tag/elmo)
Sentence embeddings: unsupervised representation learning of sentences had been the norm for quite some time, with simple baselines like averaging word embeddings, but several novel unsupervised and supervised approaches, as well as multi-task learning schemes, emerged in late 2017 and early 2018.
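The averaging baseline mentioned above can be sketched in a few lines (the tiny embedding table is invented for the example; real systems would load pretrained vectors such as GloVe or word2vec):

```python
import numpy as np

# Toy word-embedding lookup table (hypothetical 3-d vectors).
EMB = {
    "the": np.array([0.1, 0.3, 0.0]),
    "cat": np.array([0.9, 0.1, 0.2]),
    "sat": np.array([0.2, 0.8, 0.1]),
}

def sentence_embedding(sentence, emb=EMB):
    """Average-of-word-vectors baseline: the sentence vector is the
    mean of the embeddings of its in-vocabulary words."""
    vecs = [emb[w] for w in sentence.lower().split() if w in emb]
    if not vecs:  # no known words: return a zero vector of the right size
        return np.zeros_like(next(iter(emb.values())))
    return np.mean(vecs, axis=0)

print(sentence_embedding("the cat sat"))  # mean of the three word vectors
```

Strong supervised and multi-task methods are compared against exactly this kind of baseline.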