Sanjeev Arora on "A theoretical approach to semantic representations" - YouTube (2016). Why do low-dimensional word vectors exist?
> a text corpus is imagined as being generated by a random walk in a latent variable space, and word production is via a log-linear distribution. This model is shown to imply several empirically discovered word-embedding methods such as word2vec, GloVe, and PMI
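As a rough illustration of the generative model described above, here is a minimal numpy sketch (a hypothetical re-implementation; the dimensions, step size, and isotropic word vectors are my assumptions, not parameters from the paper):

```python
import numpy as np

# Sketch of the random-walk generative model: a discourse vector c_t
# drifts slowly on the unit sphere, and at each step a word w is emitted
# with log-linear probability p(w | c_t) proportional to exp(<c_t, v_w>).
rng = np.random.default_rng(0)
d, vocab_size, n_steps = 50, 1000, 200

# Word vectors, assumed roughly isotropic (an assumption of this sketch).
V = rng.normal(size=(vocab_size, d)) / np.sqrt(d)

c = rng.normal(size=d)
c /= np.linalg.norm(c)

corpus = []
for _ in range(n_steps):
    logits = V @ c                      # <c_t, v_w> for every word w
    p = np.exp(logits - logits.max())   # log-linear word production
    p /= p.sum()
    corpus.append(rng.choice(vocab_size, p=p))
    c += 0.05 * rng.normal(size=d)      # slow random walk in latent space
    c /= np.linalg.norm(c)

print(corpus[:20])  # token ids of the generated text
```

Under this model, the inner product ⟨v_w, v_w'⟩ comes out approximately proportional to PMI(w, w'), which is the sense in which word2vec-, GloVe-, and PMI-based methods fall out as special cases.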
Representations for Language: From Word Embeddings to Sentence Meanings (2017) - YouTube. [Slides](/doc/?uri=https%3A%2F%2Fnlp.stanford.edu%2Fmanning%2Ftalks%2FSimons-Institute-Manning-2017.pdf)
**What's special about human language? The only hope for explainable intelligence.**
Symbols are not just an invention of logic / classical AI.
Meaning: a solution via distributional-similarity-based representations, one of the most successful ideas of modern NLP (see the sketch after the quote below).
> You shall know a word by the company it keeps (J. R. Firth, 1957)
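A toy illustration of that slogan (an entirely made-up corpus with whole-sentence context windows, just for intuition): represent each word by its co-occurrence counts and compare words by cosine similarity.

```python
import numpy as np
from itertools import combinations

# Words that keep similar "company" (context words) end up with similar
# co-occurrence vectors, hence high cosine similarity.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
M = np.zeros((len(vocab), len(vocab)))

for sent in corpus:
    for w1, w2 in combinations(sent, 2):  # whole sentence as context window
        M[idx[w1], idx[w2]] += 1
        M[idx[w2], idx[w1]] += 1

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(M[idx["cat"]], M[idx["dog"]]))     # ~0.90: similar company
print(cosine(M[idx["cat"]], M[idx["chased"]]))  # ~0.80: less similar
```

Real systems refine the counts (PMI weighting, dimensionality reduction, or word2vec-style training), but the underlying signal is the same.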
The BiLSTM hegemony
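For concreteness, a minimal PyTorch sketch of the kind of BiLSTM encoder that dominated NLP around the time of this talk (sizes and the max-pooling readout are illustrative choices of mine, not taken from the talk):

```python
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Embed tokens, run an LSTM in both directions, pool into one vector."""
    def __init__(self, vocab_size=10_000, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)

    def forward(self, token_ids):              # (batch, seq_len)
        h, _ = self.lstm(self.emb(token_ids))  # (batch, seq_len, 2*hidden)
        return h.max(dim=1).values             # max-pool over time

enc = BiLSTMEncoder()
tokens = torch.randint(0, 10_000, (2, 7))  # two dummy 7-token sentences
print(enc(tokens).shape)                   # torch.Size([2, 256])
```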
Neural bag-of-words
> "Surprisingly effective for many tasks :-(" [cf "DAN", Deep Averaging Network, Iyyver et al.](/doc/?uri=http%3A%2F%2Fwww.cs.cornell.edu%2Fcourses%2Fcs5740%2F2016sp%2Fresources%2Fdans.pdf)
Christopher Manning - "Building Neural Network Models That Can Reason" (TCSDLS 2017-2018) - YouTube. Goal: to enhance DL systems with reasoning capabilities from the ground up:
- allowing them to perform transparent multi-step reasoning processes
- while retaining end-to-end differentiability and scalability to real-world problems
> I get the feeling that if we're going to make further progress in AI, we actually have to get back to some of these problems of knowledge representation [and] reasoning
- From ML to machine reasoning
- The CLEVR task
- Memory-Attention-Composition Networks
What is reasoning? (Bottou 2011)
- manipulating previously acquired knowledge in order to answer a question
- not necessarily achieved by making logical inference (e.g., algebraic manipulation of matrices)
- composition rules -> combining existing operations to address new tasks (see the sketch below)
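A toy, non-neural illustration of that last point (my own sketch, not from the talk): previously acquired operations are composed into a program that answers a new question none of them handles alone, which is also the intuition behind CLEVR-style module networks.

```python
# Reusable "skills", each trivial on its own.
def filter_color(objects, color):
    return [o for o in objects if o["color"] == color]

def filter_shape(objects, shape):
    return [o for o in objects if o["shape"] == shape]

def count(objects):
    return len(objects)

scene = [
    {"color": "red", "shape": "cube"},
    {"color": "red", "shape": "sphere"},
    {"color": "blue", "shape": "cube"},
]

# "How many red cubes are there?" -> compose existing operations.
program = lambda s: count(filter_shape(filter_color(s, "red"), "cube"))
print(program(scene))  # 1
```

In the neural setting the talk targets, each operation becomes a differentiable module, so the composed program retains end-to-end differentiability.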