Knowledge graph embeddings

How can we use a knowledge graph in computing? A knowledge graph is a symbolic, logical system, while applications often involve numerical computing in continuous spaces. Formal logic is neither tractable nor robust when dealing with knowledge graphs. Hence the idea of knowledge graph embeddings.
A knowledge graph is embedded into a low-dimensional continuous vector space while certain properties of it are preserved ([Bordes et al., 2013](http://papers.nips.cc/paper/5071-translating-embeddings-for-modeling-multi-rela); Socher et al., 2013; Chang et al., 2013; Wang et al., 2014). Generally, each entity is represented as a point in that space, while each relation is interpreted as an operation over entity embeddings (e.g., in Bordes et al., a translation). The embedding representations are usually learnt by minimizing a global loss function involving all entities and relations, so that each entity embedding encodes both the local and the global connectivity patterns of the original graph. We can thus reason about new facts from the learnt embeddings.
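The translation idea from Bordes et al. can be sketched in a few lines: a triple (head, relation, tail) is scored by the distance ||h + r − t||, and embeddings are trained with a margin ranking loss that pushes true triples below corrupted ones. A minimal NumPy sketch, with purely illustrative entity and relation names (not a real trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy, randomly initialized embeddings (illustrative names)
entities = {name: rng.normal(size=dim)
            for name in ["Paris", "France", "Berlin", "Germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(h, r, t):
    """TransE-style score of a triple (h, r, t): the distance
    ||h + r - t||. Lower means more plausible."""
    return np.linalg.norm(entities[h] + relations[r] - entities[t])

def margin_loss(pos, neg, margin=1.0):
    """Margin ranking loss over a true triple and a corrupted one;
    training would minimize this over all triples (the 'global loss')."""
    return max(0.0, margin + transe_score(*pos) - transe_score(*neg))
```

In a real system the loss is minimized by SGD over the whole graph, with negatives generated by corrupting heads or tails; libraries such as AmpliGraph (listed below) package this end to end.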

10 Documents (Long List)

- Accenture/AmpliGraph: Python library for Representation Learning on Knowledge Graphs
*(About)*

Open source Python library that predicts links between concepts in a knowledge graph.

2019-03-25 - D2KLab/entity2rec: entity2rec generates item recommendation from knowledge graphs
*(About)*

2018-06-04 - Workshop on Deep Learning for Knowledge Graphs and Semantic Technologies
*(About)*

> A challenging but paramount task for problems ranging from entity classification to entity recommendation or entity linking is that of learning features representing entities in the knowledge graph (building “knowledge graph embeddings”) that can be fed into machine learning algorithms

2018-02-01 - Knowledge Graph Embedding by Translating on Hyperplanes (2014)
*(About)*

> we start by analyzing the problems of TransE on reflexive/one-to-many/many-to-one/many-to-many relations. Accordingly we propose a method named translation on hyperplanes (TransH) which interprets a relation as a translating operation on a hyperplane
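The TransH interpretation quoted above can be sketched numerically: head and tail embeddings are first projected onto a relation-specific hyperplane (unit normal w_r), and the translation d_r is applied on that hyperplane, which lets an entity take different "roles" per relation. A minimal sketch assuming plain NumPy vectors (all names illustrative):

```python
import numpy as np

def transh_score(h, t, w_r, d_r):
    """TransH-style score: project h and t onto the hyperplane with
    normal w_r, then measure ||h_perp + d_r - t_perp||."""
    w = w_r / np.linalg.norm(w_r)          # unit normal of the hyperplane
    h_perp = h - (h @ w) * w               # projection of head
    t_perp = t - (t @ w) * w               # projection of tail
    return np.linalg.norm(h_perp + d_r - t_perp)
```

Because only the projected components are scored, the component of an entity embedding along w_r is free, which is what relaxes TransE's problems on reflexive and one-to-many relations.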

2018-01-30 - WORKSHOP: BigNet @ WWW 2018 Workshop on Learning Representations for Big Networks
*(About)*

2018-01-27 - Knowledge base completion by learning pairwise-interaction differentiated embeddings | SpringerLink (2015)
*(About)*

Embeds entities and relations of the knowledge base into low-dimensional vector representations, then predicts the possible truth of additional facts to extend the knowledge base.

2018-01-27 - LEARNING GRAPH EMBEDDINGS FOR NODE LABELING AND INFORMATION DIFFUSION IN SOCIAL NETWORKS (2017)
*(About)*

2018-01-23 - Knowledge Graph and Text Jointly Embedding (2014)
*(About)*

A method of jointly embedding knowledge graphs and a text corpus, so that entities and words/phrases are represented in the same vector space. Promising improvement in the accuracy of predicting facts, compared to separately embedding knowledge graphs and text (in particular, it enables the prediction of facts containing entities not in the knowledge graph). [Cited by J. Moreno](/doc/?uri=https%3A%2F%2Fhal.archives-ouvertes.fr%2Fhal-01626196%2Fdocument)

2018-01-05 - Awesome Knowledge Graph Embedding Approaches
*(About)*

lists libraries and approaches for knowledge graph embeddings

2018-01-03 - Traversing Knowledge Graphs in Vector Space (2015)
*(About)*

Knowledge graphs often have missing facts (edges) which disrupts path queries. Recent models for knowledge base completion impute missing facts by embedding knowledge graphs in vector spaces. We show that these models can be recursively applied to answer path queries, but that they suffer from cascading errors. This motivates a new "compositional" training objective, which dramatically improves all models' ability to answer path queries, in some cases more than doubling accuracy.
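For a translation-style model, the compositional path-query idea sketched in that abstract amounts to adding each relation vector along the path and returning the nearest entity. A toy sketch (embeddings and names are hypothetical, chosen so the geometry works out exactly):

```python
import numpy as np

def answer_path_query(head, path, entity_emb, rel_emb):
    """Answer a path query head/r1/r2/... in vector space:
    apply each relation's translation in turn, then return the
    entity whose embedding is nearest to the resulting point."""
    v = entity_emb[head].copy()
    for r in path:
        v = v + rel_emb[r]
    names = list(entity_emb)
    dists = [np.linalg.norm(v - entity_emb[n]) for n in names]
    return names[int(np.argmin(dists))]
```

With single-hop models trained only on triples, errors cascade along the path; the paper's "compositional" objective trains on whole paths so the composed translations stay accurate.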

2015-10-31

Properties

- sl:creationDate : 2018-01-03
- sl:creationTime : 2018-01-03T16:42:41Z
- rdf:type : sl:Tag
- skos:prefLabel : Knowledge graph embeddings