Reasoning With Neural Tensor Networks for Knowledge Base Completion (2013)(About) **Predicting the likely truth of additional facts based on existing facts in the knowledge base.**
> we introduce an expressive neural
tensor network suitable for reasoning over relationships between two entities.
Most similar work: [Bordes et al.](http://127.0.0.1:8080/semanlink/doc/2019/08/learning_structured_embeddings_) (2011)
1. a new neural tensor network (NTN) suitable for reasoning over relationships between two entities. It generalizes several previous neural network models and provides a more powerful way to model relational information than a standard neural network layer.
2. a new way to represent entities in knowledge bases, as the average of their constituent word vectors, allowing the sharing of statistical strength between the words describing each entity (e.g., Bank of China and China).
3. incorporation of word vectors trained on large unlabeled text corpora.
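The NTN scores a triplet (e1, R, e2) with a relation-specific bilinear tensor combined with a standard layer: g(e1, R, e2) = uᵀ tanh(e1ᵀ W^[1:k] e2 + V[e1; e2] + b). A minimal NumPy sketch of that scoring function, with random parameters standing in for learned ones and entity vectors built as word-vector averages (point 2 above); shapes and names here are illustrative, not the paper's code:

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """NTN score g(e1, R, e2) for a single relation R.

    e1, e2 : (d,)      entity vectors
    W      : (d, d, k) bilinear tensor, one d x d slice per output unit
    V      : (k, 2d)   standard-layer weights
    b      : (k,)      bias
    u      : (k,)      output weights
    """
    bilinear = np.einsum('i,ijk,j->k', e1, W, e2)      # e1^T W^[1:k] e2
    standard = V @ np.concatenate([e1, e2]) + b        # V [e1; e2] + b
    return u @ np.tanh(bilinear + standard)

rng = np.random.default_rng(0)
d, k = 4, 3
# entities as averages of their word vectors, so "Bank of China"
# shares statistical strength with "China"
bank, of, china = (rng.normal(size=d) for _ in range(3))
e1 = (bank + of + china) / 3     # "Bank of China"
e2 = china                       # "China"
score = ntn_score(e1, e2,
                  rng.normal(size=(d, d, k)),
                  rng.normal(size=(k, 2 * d)),
                  rng.normal(size=k),
                  rng.normal(size=k))
```

In training, the score of a true triplet is pushed above that of corrupted triplets (one entity replaced at random) via a margin ranking loss.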
> We learn to modify word representations
via grounding in world knowledge. This essentially allows us to analyze word embeddings and
query them for specific relations. Furthermore, the resulting vectors could be used in other tasks
such as named entity recognition or relation classification in natural language
Learned in translation: contextualized word vectors (Salesforce Research)(About) **We teach a neural network how to understand words in context by first teaching it how to translate English to German.**
> Models that use pretrained word vectors must learn how to use them. Our work picks up where word vectors left off by looking to improve over randomly initialized methods for contextualizing word vectors through training on an intermediate task.
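The pipeline: run pretrained word vectors (e.g., GloVe) through the encoder of a trained English-to-German translation model, then feed the concatenation [GloVe; CoVe] to the downstream task. A toy sketch of that data flow, where `glove` and `mt_encoder` are hypothetical placeholders for the pretrained embedding lookup and the MT encoder (in the paper, a biLSTM):

```python
import numpy as np

def glove(tokens, d=300, seed=0):
    # placeholder for a pretrained GloVe lookup (random vectors here)
    rng = np.random.default_rng(seed)
    return rng.normal(size=(len(tokens), d))

def mt_encoder(vectors):
    # placeholder for the encoder of an English->German translation
    # model; CoVe uses its hidden states as context vectors
    d = vectors.shape[1]
    return np.tanh(vectors @ (0.01 * np.ones((d, d))))

tokens = ["the", "bank", "approved", "the", "loan"]
g = glove(tokens)                           # (5, 300) word vectors
cove = mt_encoder(g)                        # (5, 300) context vectors
inputs = np.concatenate([g, cove], axis=1)  # (5, 600) [GloVe; CoVe]
```

The downstream model then consumes `inputs` instead of word vectors alone, so each token's representation already reflects its sentence context.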