Reasoning With Neural Tensor Networks for Knowledge Base Completion (2013)(About) **Predicting the likely truth of additional facts based on existing facts in the knowledge base.**
> we introduce an expressive neural
tensor network suitable for reasoning over relationships between two entities.
Most similar work: [Bordes et al.](http://127.0.0.1:8080/semanlink/doc/2019/08/learning_structured_embeddings_) (2011)
1. A new neural tensor network (NTN) suitable for reasoning over relationships between two entities. It generalizes several previous neural network models and provides a more powerful way to model relational information than a standard neural network layer.
2. A new way to represent entities in knowledge bases: as the average of their constituent word vectors, allowing statistical strength to be shared between the words describing each entity (e.g., Bank of China and China).
3. Incorporation of word vectors trained on large unlabeled text corpora.
> We learn to modify word representations
via grounding in world knowledge. This essentially allows us to analyze word embeddings and
query them for specific relations. Furthermore, the resulting vectors could be used in other tasks
such as named entity recognition or relation classification in natural language
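The two ideas above can be sketched together: entities built as averages of word vectors, scored by the NTN's bilinear-tensor function g(e1, R, e2) = uᵀ tanh(e1ᵀ W^[1:k] e2 + V[e1; e2] + b). This is a minimal toy sketch with randomly initialised parameters (dimensions and word vectors are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 2  # embedding dimension and number of tensor slices (illustrative)

# Hypothetical word vectors; an entity is the average of its words' vectors,
# so "Bank of China" and "China" share strength through the word "china".
word_vecs = {w: rng.normal(size=d) for w in ["bank", "of", "china"]}

def entity_vec(words):
    """Entity representation = mean of its constituting word vectors."""
    return np.mean([word_vecs[w] for w in words], axis=0)

# Per-relation NTN parameters (random here; learned in the real model):
W = rng.normal(size=(k, d, d))   # tensor slices for the bilinear term
V = rng.normal(size=(k, 2 * d))  # standard neural-layer weights
b = rng.normal(size=k)           # bias
u = rng.normal(size=k)           # output weights

def ntn_score(e1, e2):
    """g(e1, R, e2) = u^T tanh(e1^T W^[1:k] e2 + V [e1; e2] + b)."""
    bilinear = np.array([e1 @ W[i] @ e2 for i in range(k)])
    return u @ np.tanh(bilinear + V @ np.concatenate([e1, e2]) + b)

score = ntn_score(entity_vec(["bank", "of", "china"]), entity_vec(["china"]))
```

The bilinear tensor term is what lets each relation relate the two entity vectors multiplicatively, rather than only through their concatenation as in a standard layer.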
A2N: Attending to Neighbors for Knowledge Graph Inference - ACL 2019(About) > State-of-the-art models for knowledge graph completion aim at learning a fixed embedding representation of entities in a multi-relational graph which can generalize to infer unseen entity relationships at test time. This can be sub-optimal as it requires memorizing and generalizing to all possible entity relationships using these fixed representations. We thus propose a novel **attention-based method to learn query-dependent representation of entities** which adaptively combines the relevant graph neighborhood of an entity leading to more accurate KG completion.
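The query-dependent representation described above can be illustrated with a bare-bones attention sketch (vectors and the dot-product scoring are illustrative assumptions, not the paper's exact parameterization):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # embedding dimension (illustrative)

# Hypothetical neighborhood of a target entity: each (relation, entity)
# neighbor is encoded here as a single vector for simplicity.
neighbors = rng.normal(size=(3, d))
query_relation = rng.normal(size=d)

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def query_dependent_rep(neighbors, query):
    """Attend over the neighborhood: weights depend on the query relation,
    so the entity's representation adapts to the relation being predicted
    instead of being one fixed embedding for all queries."""
    weights = softmax(neighbors @ query)  # relevance of each neighbor
    return weights @ neighbors            # weighted combination

rep = query_dependent_rep(neighbors, query_relation)
```

A different query relation yields different attention weights, hence a different representation of the same entity.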
Romain Vial (Hyperlex) at Paris NLP meetup, slides(About) > Hyperlex is a contract analytics and management solution powered by artificial intelligence. Hyperlex helps companies manage and make the most of their contract portfolio by identifying relevant information and data to manage key contractual commitments.
Knowledge Graph and Text Jointly Embedding (2014)(About) A method for jointly embedding a knowledge graph and a text corpus so that entities and words/phrases are represented in the same vector space.
Shows promising improvements in fact-prediction accuracy compared to embedding the knowledge graph and the text separately (in particular, it enables the prediction of facts involving entities outside the knowledge graph).
[cited by J. Moreno](/doc/?uri=https%3A%2F%2Fhal.archives-ouvertes.fr%2Fhal-01626196%2Fdocument)
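A rough sketch of why a shared space helps: with word and entity vectors tied by an alignment term, a TransE-style score can mix a word (for an out-of-KG entity) with in-KG entities. All vectors and the specific losses below are toy assumptions, not the paper's exact models:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4  # embedding dimension (illustrative)

# Toy vectors living in one shared space for words and entities.
word_vec = {"paris": rng.normal(size=d), "france": rng.normal(size=d)}
entity_vec = {"Paris": rng.normal(size=d), "France": rng.normal(size=d)}
rel_vec = {"capital_of": rng.normal(size=d)}

def transe_score(h, r, t):
    """TransE-style plausibility: smaller ||h + r - t|| = more plausible."""
    return np.linalg.norm(h + r - t)

def alignment_loss(entity, name_word):
    """Pull an entity toward its name's word vector, tying the two spaces."""
    return np.linalg.norm(entity_vec[entity] - word_vec[name_word])

# Once the spaces are aligned, a fact can be scored even when the head
# entity is absent from the KG and only its name's word vector exists.
s = transe_score(word_vec["paris"], rel_vec["capital_of"], entity_vec["France"])
```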
Traversing Knowledge Graphs in Vector Space (2015)(About) Knowledge graphs often have missing facts (edges) which disrupts path queries. Recent models for knowledge base completion impute missing facts by embedding knowledge graphs in vector spaces. We show that these models can be recursively applied to answer path queries, but that they suffer from cascading errors. This motivates a new "compositional" training objective, which dramatically improves all models' ability to answer path queries, in some cases more than doubling accuracy.
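The compositional idea can be sketched with TransE-style embeddings, where traversing an edge adds its relation vector, so a whole path query is a single vector sum rather than a chain of noisy single-hop predictions (entities, relations, and dimensions below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
d = 4  # embedding dimension (illustrative)
entities = {"e1": rng.normal(size=d), "e2": rng.normal(size=d)}
relations = {"r1": rng.normal(size=d), "r2": rng.normal(size=d)}

def path_query(start, path):
    """Answer a path query compositionally: apply each relation in turn
    in vector space instead of materializing intermediate entities."""
    v = entities[start].copy()
    for r in path:
        v = v + relations[r]  # TransE-style traversal of one edge
    return v

def nearest_entity(v):
    """Return the entity whose embedding is closest to the predicted point."""
    return min(entities, key=lambda e: np.linalg.norm(entities[e] - v))

answer = nearest_entity(path_query("e1", ["r1", "r2"]))
```

Training directly on such multi-step compositions (rather than only single edges) is what the paper's "compositional" objective adds, reducing the cascading error of recursive single-hop application.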