4th Workshop on Representation Learning for NLP
Talks:
- Language emergence as representation learning (Marco Baroni) > Language emergence among deep neural network agents that have to jointly solve a task. Recent findings suggest that the language-like code developed by such agents both differs from and resembles natural language in interesting ways. For example, the emergent code does not naturally represent general concepts, but rather very specific invariances in the perceptual input.
- Representations shaped by dialogue interaction (Raquel Fernández) > When we use language to communicate with each other in conversation, we build an internal representation of our evolving common ground. Traditionally, dialogue systems capture this with an explicit dialogue state defined a priori. Can we develop dialogue agents that learn their own (joint) representations?
- Knowledgeable and Adversarially-Robust Representation Learning (Mohit Bansal)
- Modeling Output Spaces in Continuous-Output Language Generation (Yulia Tsvetkov)
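The emergent-communication setup in Baroni's talk (agents that must jointly solve a task and invent a code to do so) can be illustrated with a minimal Lewis signaling game. The sketch below uses tabular policies and simple reward reinforcement; it is a toy illustration under assumed parameters (3 objects, 3 messages), not the architecture or training method from the talk, which uses deep neural network agents.

```python
import random

random.seed(0)

N = 3  # number of objects, messages, and guesses (toy assumption)

# Tabular "policies": weights[state][choice], reinforced on success.
sender = [[1.0] * N for _ in range(N)]    # object -> message
receiver = [[1.0] * N for _ in range(N)]  # message -> guessed object

def sample(weights):
    """Sample an index proportionally to its weight."""
    r = random.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

# Joint task: receiver must identify the object the sender saw.
for _ in range(20000):
    obj = random.randrange(N)
    msg = sample(sender[obj])
    guess = sample(receiver[msg])
    if guess == obj:  # shared reward reinforces both choices
        sender[obj][msg] += 1.0
        receiver[msg][guess] += 1.0

def argmax(ws):
    return max(range(len(ws)), key=lambda i: ws[i])

# Greedy communication accuracy over all objects.
accuracy = sum(
    argmax(receiver[argmax(sender[obj])]) == obj for obj in range(N)
) / N
print(accuracy)
```

When a signaling system emerges, each object gets a dedicated message; the point the abstract makes is that such codes tend to latch onto specific input invariances rather than general concepts.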