[1506.01094] Traversing Knowledge Graphs in Vector Space
Kelvin Guu, John Miller, Percy Liang
arXiv: 1506.01094 (submitted 2015-06-03, updated 2015-08-19; saved 2015-10-31)

Abstract: Path queries on a knowledge graph can be used to answer compositional questions such as "What languages are spoken by people living in Lisbon?". However, knowledge graphs often have missing facts (edges), which disrupt path queries. Recent models for knowledge base completion impute missing facts by embedding knowledge graphs in vector spaces. We show that these models can be recursively applied to answer path queries, but that they suffer from cascading errors. This motivates a new "compositional" training objective, which dramatically improves all models' ability to answer path queries, in some cases more than doubling accuracy. On a standard knowledge base completion task, we also demonstrate that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.

Related bookmarks (saved 2015-10-31):
- The great chain of being sure about things | The Economist
- Le fact-checking peut-il s'automatiser ? ("Can fact-checking be automated?") | J'ai du bon data
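The abstract's core idea, recursively applying an embedding model to traverse a path query in vector space, can be sketched as follows. This is a toy illustration, not the authors' code: it assumes a TransE-style model where traversing a relation is vector addition, and all entity and relation names (and the random embeddings) are made up for the example.

```python
import numpy as np

# Toy embeddings: in TransE-style models, each entity and each relation
# is a d-dimensional vector, and traversing relation r from entity e
# maps e's vector to e + w_r. (Names here are hypothetical.)
rng = np.random.default_rng(0)
dim = 4
entities = {name: rng.normal(size=dim) for name in ["lisbon", "ana", "portuguese"]}
relations = {name: rng.normal(size=dim) for name in ["lives_in_inv", "speaks"]}

def traverse(start, path):
    """Answer a path query by recursively applying the traversal operator.

    Each hop composes one more relation; errors made at early hops
    propagate to later ones, i.e. the "cascading errors" the abstract
    describes.
    """
    v = entities[start]
    for r in path:
        v = v + relations[r]  # TransE composition: vector addition
    return v

def rank_answers(query_vec):
    """Rank candidate entities by Euclidean distance to the query vector."""
    return sorted(entities, key=lambda e: np.linalg.norm(entities[e] - query_vec))

# "What languages are spoken by people living in Lisbon?" as a 2-hop path:
q = traverse("lisbon", ["lives_in_inv", "speaks"])
print(rank_answers(q)[0])  # top-ranked candidate answer
```

The compositional objective the paper proposes trains the model so that multi-hop queries like `q` above score their correct answers highly, rather than training only on single edges.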