Metric Learning

557 papers with code • 8 benchmarks • 32 datasets

The goal of Metric Learning is to learn a representation function that maps objects into an embedding space in which distance preserves the objects' similarity: similar objects end up close together and dissimilar objects end up far apart. Various loss functions have been developed for Metric Learning. For example, the contrastive loss pulls objects from the same class toward the same point and pushes objects from different classes apart until their distance exceeds a margin. The triplet loss is also popular; it requires the distance between an anchor sample and a positive sample to be smaller than the distance between the anchor sample and a negative sample.

Source: Road Network Metric Learning for Estimated Time of Arrival
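As a rough illustration, here is a minimal PyTorch sketch of the two losses described above. The function names, the margin values, and the use of Euclidean distance are assumptions made for the example, not settings taken from any particular paper.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(x1, x2, same_class, margin=1.0):
    """Contrastive loss: pull same-class pairs together, push different-class
    pairs apart until their distance exceeds `margin`.
    same_class is a 0/1 float tensor indicating whether each pair shares a class."""
    d = F.pairwise_distance(x1, x2)                      # Euclidean distance per pair
    pos = same_class * d.pow(2)                          # same class: shrink distance
    neg = (1 - same_class) * F.relu(margin - d).pow(2)   # different class: enforce margin
    return 0.5 * (pos + neg).mean()

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss: the anchor-positive distance should be smaller than the
    anchor-negative distance by at least `margin`."""
    d_ap = F.pairwise_distance(anchor, positive)
    d_an = F.pairwise_distance(anchor, negative)
    return F.relu(d_ap - d_an + margin).mean()
```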

Most implemented papers

In Defense of the Triplet Loss for Person Re-Identification

layumi/Person_reID_baseline_pytorch 22 Mar 2017

In the past few years, the field of computer vision has gone through a revolution fueled mainly by the advent of large datasets and the adoption of deep convolutional neural networks for end-to-end learning.

Matching Networks for One Shot Learning

oscarknagg/few-shot NeurIPS 2016

Our algorithm improves one-shot accuracy on ImageNet from 87.6% to 93.2% and from 88.0% to 93.8% on Omniglot compared to competing approaches.

Circle Loss: A Unified Perspective of Pair Similarity Optimization

layumi/Person_reID_baseline_pytorch CVPR 2020

This paper provides a pair similarity optimization viewpoint on deep feature learning, aiming to maximize the within-class similarity $s_p$ and minimize the between-class similarity $s_n$.
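For illustration, here is a compact PyTorch sketch of this pair-similarity view, following the commonly cited Circle loss form in which each $s_p$ and $s_n$ receives its own weighting factor. The hyperparameters `m` and `gamma` are assumed example values; consult the paper and repository for the exact formulation.

```python
import torch
import torch.nn.functional as F

def circle_loss(s_p, s_n, m=0.25, gamma=256):
    """s_p: within-class similarities, s_n: between-class similarities
    (1-D tensors of cosine similarities for one anchor)."""
    alpha_p = F.relu(1 + m - s_p)   # weight s_p more when it is far from its optimum
    alpha_n = F.relu(s_n + m)       # weight s_n more when it is far from its optimum
    delta_p, delta_n = 1 - m, m     # relaxed decision margins
    logit_p = -gamma * alpha_p * (s_p - delta_p)
    logit_n = gamma * alpha_n * (s_n - delta_n)
    # log(1 + sum_n exp(.) * sum_p exp(.)) computed stably via logsumexp + softplus
    return F.softplus(torch.logsumexp(logit_n, dim=0) + torch.logsumexp(logit_p, dim=0))
```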

Additive Margin Softmax for Face Verification

happynear/AMSoftmax 17 Jan 2018

In this work, we introduce a novel additive angular margin for the Softmax loss, which is intuitively appealing and more interpretable than the existing works.
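A hedged sketch of an additive-margin softmax of this kind in PyTorch: the margin `m` is subtracted from the target-class cosine similarity before scaling and cross-entropy. The values `s=30.0` and `m=0.35` are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def am_softmax_loss(embeddings, weight, labels, s=30.0, m=0.35):
    """Additive-margin softmax: subtract margin m from the target-class cosine
    similarity, scale by s, then apply cross-entropy.
    embeddings: (B, D), weight: (C, D) class weight matrix, labels: (B,)."""
    x = F.normalize(embeddings, dim=1)          # unit-norm features
    w = F.normalize(weight, dim=1)              # unit-norm class weights
    cos = x @ w.t()                             # cosine similarities, shape (B, C)
    onehot = F.one_hot(labels, cos.size(1)).float()
    logits = s * (cos - m * onehot)             # margin applied to target class only
    return F.cross_entropy(logits, labels)
```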

Semantic Instance Segmentation with a Discriminative Loss Function

Wizaron/instance-segmentation-pytorch 8 Aug 2017

In this work we propose to tackle the problem with a discriminative loss function, operating at the pixel level, that encourages a convolutional network to produce a representation of the image that can easily be clustered into instances with a simple post-processing step.
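Below is a simplified PyTorch sketch of the pull/push structure such a discriminative loss takes: pixels are pulled toward their instance's mean embedding and instance means are pushed apart. The hinge margins `delta_v` and `delta_d` are assumed example values, background handling and the regularization term on the mean norms are omitted.

```python
import torch
import torch.nn.functional as F

def discriminative_loss(embed, inst_ids, delta_v=0.5, delta_d=1.5):
    """embed: (N, D) pixel embeddings for one image, inst_ids: (N,) instance labels."""
    ids = inst_ids.unique()
    means, pull = [], 0.0
    for i in ids:
        e = embed[inst_ids == i]
        mu = e.mean(dim=0)
        means.append(mu)
        # pull (variance) term: hinged distance of pixels to their instance mean
        pull = pull + F.relu(torch.norm(e - mu, dim=1) - delta_v).pow(2).mean()
    means = torch.stack(means)
    # push (distance) term: hinged distance between every pair of instance means
    d = torch.cdist(means, means)
    off_diag = ~torch.eye(len(ids), dtype=torch.bool)
    push = F.relu(2 * delta_d - d[off_diag]).pow(2).mean() if len(ids) > 1 else 0.0
    return pull / len(ids) + push
```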

Revisiting Training Strategies and Generalization Performance in Deep Metric Learning

Confusezius/Deep-Metric-Learning-Baselines ICML 2020

Deep Metric Learning (DML) is arguably one of the most influential lines of research for learning visual similarities, with many new approaches proposed every year.

Time-Contrastive Networks: Self-Supervised Learning from Video

tensorflow/models 23 Apr 2017

While representations are learned from an unlabeled collection of task-related videos, robot behaviors such as pouring are learned by watching a single 3rd-person demonstration by a human.

Sampling Matters in Deep Embedding Learning

CompVis/metric-learning-divide-and-conquer ICCV 2017

In addition, we show that a simple margin based loss is sufficient to outperform all other loss functions.
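A minimal PyTorch sketch of a margin-based pair loss of this kind: positive pairs are pulled inside a boundary `beta` and negative pairs pushed outside it by `alpha`. Both values here are assumptions for the example (the paper treats the boundary as learnable).

```python
import torch
import torch.nn.functional as F

def margin_based_loss(d, is_positive, beta=1.2, alpha=0.2):
    """d: pairwise distances, is_positive: 0/1 float tensor marking same-class pairs."""
    y = 2 * is_positive - 1                      # +1 for positive pairs, -1 for negative
    return F.relu(alpha + y * (d - beta)).mean() # hinge around the boundary beta
```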

metric-learn: Metric Learning Algorithms in Python

scikit-learn-contrib/metric-learn 13 Aug 2019

metric-learn is an open source Python package implementing supervised and weakly-supervised distance metric learning algorithms.
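A minimal usage sketch of the package with scikit-learn's iris data, using the NCA learner as one example of a supervised algorithm; parameter names may differ across versions, so check the package documentation.

```python
from metric_learn import NCA
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
nca = NCA(max_iter=100)        # supervised Mahalanobis metric learner
nca.fit(X, y)                  # learn the metric from labeled data
X_embedded = nca.transform(X)  # project data into the learned metric space
```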

Batch DropBlock Network for Person Re-identification and Beyond

daizuozhuo/batch-feature-erasing-network ICCV 2019

In this paper, we propose the Batch DropBlock (BDB) Network which is a two branch network composed of a conventional ResNet-50 as the global branch and a feature dropping branch.
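A hedged sketch of the feature-dropping idea: the same randomly chosen block of the feature map is zeroed for every sample in the batch, pushing that branch to attend to the remaining regions. The drop ratios are assumed example values.

```python
import torch

def batch_drop_block(feat, h_ratio=0.3, w_ratio=1.0):
    """Zero out the same randomly chosen block for every sample in the batch.
    feat: feature map of shape (B, C, H, W)."""
    B, C, H, W = feat.shape
    dh, dw = int(H * h_ratio), int(W * w_ratio)
    top = torch.randint(0, H - dh + 1, (1,)).item()
    left = torch.randint(0, W - dw + 1, (1,)).item()
    mask = torch.ones_like(feat)
    mask[:, :, top:top + dh, left:left + dw] = 0
    return feat * mask
```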