About This Document
- sl:arxiv_author : Denis Mazur, Vage Egiazarian, Stanislav Morozov, Artem Babenko
- sl:arxiv_firstAuthor : Denis Mazur
- sl:arxiv_num : 1910.03524
- sl:arxiv_published : 2019-10-08T16:31:11Z
- sl:arxiv_summary : Learning useful representations is a key ingredient to the success of modern machine learning. Currently, representation learning mostly relies on embedding data into Euclidean space. However, recent work has shown that data in some domains is better modeled by non-Euclidean metric spaces, and inappropriate geometry can result in inferior performance. In this paper, we aim to eliminate the inductive bias imposed by the embedding space geometry. Namely, we propose to map data into more general non-vector metric spaces: a weighted graph with a shortest path distance. By design, such graphs can model arbitrary geometry with a proper configuration of edges and weights. Our main contribution is PRODIGE: a method that learns a weighted graph representation of data end-to-end by gradient descent. Greater generality and fewer model assumptions make PRODIGE more powerful than existing embedding-based approaches. We confirm the superiority of our method via extensive experiments on a wide range of tasks, including classification, compression, and collaborative filtering. (An illustrative sketch of the graph-learning idea follows this record.)
- sl:arxiv_title : Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs
- sl:arxiv_updated : 2019-10-16T16:43:20Z
- sl:bookmarkOf : https://arxiv.org/abs/1910.03524
- sl:creationDate : 2019-10-09
- sl:creationTime : 2019-10-09T23:21:08Z
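The abstract's core idea (learn the edge weights of a graph so that its shortest-path distances fit the data, end-to-end by gradient descent) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: it uses a small dense graph, a soft-min Bellman-Ford relaxation to keep shortest paths differentiable, and a Euclidean target metric as the distances to reproduce. All names here (`soft_shortest_paths`, `theta`, `tau`) are our own.

```python
# Minimal sketch (assumed, not the paper's code): fit a weighted graph whose
# shortest-path distances approximate a given target metric, via gradient descent.
import torch

def soft_shortest_paths(W, n_iters=4, tau=0.1):
    """Differentiable all-pairs shortest paths via a soft-min Bellman-Ford.

    W: (n, n) non-negative edge weights with a zero diagonal.
    Each iteration relaxes D[i, j] <- softmin_k(D[i, k] + W[k, j]);
    as tau -> 0 this approaches the hard shortest-path recursion.
    """
    D = W.clone()
    for _ in range(n_iters):
        cand = D.unsqueeze(2) + W.unsqueeze(0)   # cand[i, k, j] = D[i, k] + W[k, j]
        D = -tau * torch.logsumexp(-cand / tau, dim=1)  # soft minimum over k
    return D

torch.manual_seed(0)
n = 16
points = torch.randn(n, 2)                 # hypothetical data
target = torch.cdist(points, points)       # target metric the graph should reproduce

theta = torch.nn.Parameter(torch.randn(n, n))  # free edge-weight parameters
off_diag = 1.0 - torch.eye(n)
opt = torch.optim.Adam([theta], lr=0.05)

for step in range(300):
    W = torch.nn.functional.softplus(theta)    # keep weights positive
    W = 0.5 * (W + W.T) * off_diag             # undirected graph, no self-loops
    D = soft_shortest_paths(W)
    loss = ((D - target) ** 2).mean()          # distance-distortion objective
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final distortion MSE: {loss.item():.4f}")
```

The actual method described in the abstract goes further than this dense relaxation (it learns a compact graph representation, which implies selecting which edges to keep), but the sketch shows the mechanism the summary names: shortest-path distances over learned weights can realize geometries that a fixed vector space cannot.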