About This Document

- sl:arxiv_author :
- sl:arxiv_firstAuthor : Yair Movshovitz-Attias
- sl:arxiv_num : 1703.07464
- sl:arxiv_published : 2017-03-21T23:11:56Z
- sl:arxiv_summary : We address the problem of distance metric learning (DML), defined as learning a distance consistent with a notion of semantic similarity. Traditionally, supervision for this problem is expressed in the form of sets of points that follow an ordinal relationship -- an anchor point $x$ is similar to a set of positive points $Y$ and dissimilar to a set of negative points $Z$, and a loss defined over these distances is minimized. While the specifics of the optimization differ, in this work we collectively call this type of supervision Triplets and all methods that follow this pattern Triplet-Based methods. These methods are challenging to optimize. A main issue is the need to find informative triplets, which is usually achieved by a variety of tricks such as increasing the batch size, hard or semi-hard triplet mining, etc. Even with these tricks, the convergence rate of such methods is slow. In this paper we propose to optimize the triplet loss on a different space of triplets, consisting of an anchor data point and similar and dissimilar proxy points, which are learned as well. These proxies approximate the original data points, so that a triplet loss over the proxies is a tight upper bound of the original loss. This proxy-based loss is empirically better behaved. As a result, the proxy-loss improves on state-of-the-art results for three standard zero-shot learning datasets, by up to 15 percentage points, while converging three times as fast as other triplet-based losses.
- sl:arxiv_title : No Fuss Distance Metric Learning using Proxies
- sl:arxiv_updated : 2017-08-01T19:52:13Z
- sl:bookmarkOf : https://arxiv.org/abs/1703.07464
- sl:creationDate : 2020-02-09
- sl:creationTime : 2020-02-09T18:44:26Z
- sl:relatedDoc : http://www.semanlink.net/doc/2020/01/training_a_speaker_embedding_fr
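The abstract's core idea -- replacing mined triplets with learned per-class proxies, so each anchor is pulled toward its class proxy and pushed away from all others -- can be sketched as follows. This is a minimal NumPy illustration of a proxy-based (Proxy-NCA-style) loss, not the authors' implementation; the function name and the single-anchor formulation are assumptions for clarity.

```python
import numpy as np

def proxy_nca_loss(x, proxies, label):
    """Proxy-based loss for one anchor embedding `x` (a sketch, not the
    paper's code). `proxies` holds one learned proxy vector per class;
    `label` is the anchor's class index.

    Instead of mining informative triplets, the anchor is attracted to
    its own class proxy and repelled from every other proxy.
    """
    # Unit-normalize the anchor and the proxies (the paper works with
    # normalized embeddings).
    x = x / np.linalg.norm(x)
    proxies = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)

    # Squared Euclidean distance from the anchor to every proxy.
    d = np.sum((proxies - x) ** 2, axis=1)

    pos = np.exp(-d[label])                     # affinity to own proxy
    neg = np.sum(np.exp(-np.delete(d, label)))  # affinity to all others
    return -np.log(pos / neg)
```

Because there is exactly one proxy per class, every anchor has its "triplets" available in each batch, which is why no mining tricks are needed and convergence is faster.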
