A Simple Approach for Zero-Shot Learning based on Triplet Distribution Embeddings

29 Mar 2021 · Vivek Chalumuri, Bac Nguyen

Given the semantic descriptions of classes, Zero-Shot Learning (ZSL) aims to recognize unseen classes without labeled training data by exploiting semantic information, which encodes relationships between seen and unseen classes. Existing ZSL methods mainly represent embeddings in the semantic space as single vectors. Despite their popularity, such point representations limit the expressiveness with which the intra-class variability of each class can be modeled. We address this issue by leveraging distribution embeddings. More specifically, both image embeddings and class embeddings are modeled as Gaussian distributions, and their similarity relationships are preserved through triplet constraints. The key intuition guiding our approach is that, for each image, the embedding of the correct class label should be closer than that of any other class label. Extensive experiments on multiple benchmark data sets show that the proposed method achieves highly competitive results for both the traditional ZSL setting and the more challenging Generalized Zero-Shot Learning (GZSL) setting.
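To make the triplet-over-distributions idea concrete, below is a minimal sketch (not the authors' code) of such a loss. It assumes diagonal covariances, a squared 2-Wasserstein distance between Gaussians, and a standard hinge-based triplet margin; the abstract does not specify the actual distance or loss form, and the function names (`wasserstein2_sq`, `gaussian_triplet_loss`) are hypothetical.

```python
# Sketch of a triplet loss over Gaussian distribution embeddings.
# Assumptions (not confirmed by the abstract): diagonal covariances,
# squared 2-Wasserstein distance, hinge-based triplet margin.
import torch


def wasserstein2_sq(mu1, std1, mu2, std2):
    """Squared 2-Wasserstein distance between diagonal Gaussians.

    mu*, std*: tensors of shape (batch, dim); std* are standard deviations.
    """
    return ((mu1 - mu2) ** 2).sum(dim=-1) + ((std1 - std2) ** 2).sum(dim=-1)


def gaussian_triplet_loss(img_mu, img_std,
                          pos_mu, pos_std,
                          neg_mu, neg_std,
                          margin=1.0):
    """Hinge loss: the correct (positive) class distribution should be closer
    to the image distribution than a wrong (negative) class distribution,
    by at least `margin`."""
    d_pos = wasserstein2_sq(img_mu, img_std, pos_mu, pos_std)
    d_neg = wasserstein2_sq(img_mu, img_std, neg_mu, neg_std)
    return torch.clamp(d_pos - d_neg + margin, min=0.0).mean()


if __name__ == "__main__":
    # Toy usage: a batch of 4 images and class embeddings in a 16-d semantic space.
    b, d = 4, 16
    img_mu, img_std = torch.randn(b, d), torch.rand(b, d) + 0.1
    pos_mu, pos_std = torch.randn(b, d), torch.rand(b, d) + 0.1
    neg_mu, neg_std = torch.randn(b, d), torch.rand(b, d) + 0.1
    print(gaussian_triplet_loss(img_mu, img_std, pos_mu, pos_std, neg_mu, neg_std))
```

In this sketch, the loss pulls an image's Gaussian toward its correct class Gaussian and pushes it away from other class Gaussians, which mirrors the stated intuition that the correct class label's embedding should be the closest one.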
