Search Results for author: Jiayan Qiu

Found 8 papers, 1 paper with code

Hallucinating Visual Instances in Total Absentia

no code implementations ECCV 2020 Jiayan Qiu, Yiding Yang, Xinchao Wang, DaCheng Tao

This seemingly minor difference in fact makes HVITA a much more challenging task, as the restoration algorithm must not only infer the category of the object in total absentia, but also hallucinate an object whose appearance is consistent with the background.

Hallucination · Image Inpainting +1

Scene Essence

no code implementations CVPR 2021 Jiayan Qiu, Yiding Yang, Xinchao Wang, DaCheng Tao

What scene elements, if any, are indispensable for recognizing a scene?

Scene Recognition

Unsupervised Person Re-identification via Simultaneous Clustering and Consistency Learning

no code implementations 1 Apr 2021 Junhui Yin, Jiayan Qiu, Siqing Zhang, Jiyang Xie, Zhanyu Ma, Jun Guo

Unsupervised person re-identification (re-ID) has become an important topic due to its potential to resolve the scalability problem of supervised re-ID models.

Clustering · Unsupervised Person Re-Identification

Learning Propagation Rules for Attribution Map Generation

no code implementations ECCV 2020 Yiding Yang, Jiayan Qiu, Mingli Song, DaCheng Tao, Xinchao Wang

Prior gradient-based attribution-map methods rely on handcrafted propagation rules for the non-linear/activation layers during the backward pass, so as to produce gradients of the input and then the attribution map.
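To make the role of those handcrafted rules concrete, here is a toy numpy sketch of gradient-times-input attribution for a two-layer ReLU net, where swapping the backward rule at the non-linearity changes the resulting map. The function names, shapes, and the deconvnet-style rule variant are illustrative, not the learned rules proposed in the paper:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def attribution(x, W1, W2, rule="gradient"):
    """Gradient-times-input attribution for a tiny two-layer ReLU net.
    The backward rule applied at the ReLU is handcrafted; changing the
    rule changes the attribution map (toy sketch, illustrative names)."""
    h = W1 @ x                           # pre-activation; score = W2 @ relu(h)
    grad_a = W2                          # d(score) / d(relu(h))
    if rule == "gradient":               # standard ReLU backward rule
        grad_h = grad_a * (h > 0)
    else:                                # deconvnet-style: keep only positive gradients
        grad_h = np.maximum(grad_a, 0.0)
    grad_x = W1.T @ grad_h               # propagate gradient to the input
    return grad_x * x                    # gradient x input attribution map

rng = np.random.default_rng(1)
x = rng.normal(size=5)
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=3)
attr = attribution(x, W1, W2)
```

With `rule="gradient"` the map matches the true gradient of the score; the deconvnet-style rule deliberately departs from it, which is exactly the kind of design choice the paper proposes to learn instead of handcraft.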

SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification

no code implementations 13 Sep 2020 Junhui Yin, Jiayan Qiu, Siqing Zhang, Zhanyu Ma, Jun Guo

To this end, we propose a Self-Supervised Knowledge Distillation (SSKD) technique containing two modules, the identity learning and the soft label learning.

Clustering · Domain Adaptive Person Re-Identification +2
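The soft-label side of such a distillation setup is commonly implemented as temperature-scaled cross-entropy against the teacher's softened predictions. A minimal sketch of that standard loss, assuming logits as inputs; the paper's exact formulation may differ:

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T            # temperature scaling
    e = np.exp(z - z.max(axis=-1, keepdims=True)) # numerically stable
    return e / e.sum(axis=-1, keepdims=True)

def soft_label_loss(teacher_logits, student_logits, T=4.0):
    """Cross-entropy of student predictions against the teacher's
    softened labels, averaged over the batch (a sketch of the usual
    distillation objective, not necessarily SSKD's exact loss)."""
    p = softmax(teacher_logits, T)                # soft targets
    q = softmax(student_logits, T)
    return float(-(p * np.log(q)).sum(axis=-1).mean())

teacher = np.array([[2.0, 0.5, -1.0], [0.1, 3.0, 0.2]])
```

A higher temperature `T` flattens the teacher distribution, so the student is supervised by the relative similarities between identities rather than only the hard top-1 label.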

Distilling Knowledge from Graph Convolutional Networks

1 code implementation CVPR 2020 Yiding Yang, Jiayan Qiu, Mingli Song, DaCheng Tao, Xinchao Wang

To enable the knowledge transfer from the teacher GCN to the student, we propose a local structure preserving module that explicitly accounts for the topological semantics of the teacher.

Knowledge Distillation · Transfer Learning
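One way to realize such a local structure preserving objective is to match, for every node, the teacher's and student's similarity distributions over that node's neighbors. A hedged numpy sketch, in which the dot-product similarity and function names are assumptions rather than the authors' exact design:

```python
import numpy as np

def local_structure(emb, neighbors):
    """Per-node softmax over dot-product similarities to the node's
    graph neighbors -- a simple stand-in for local topological semantics."""
    dists = []
    for i, nbrs in enumerate(neighbors):
        sims = emb[nbrs] @ emb[i]
        e = np.exp(sims - sims.max())             # stable softmax
        dists.append(e / e.sum())
    return dists

def lsp_loss(teacher_emb, student_emb, neighbors):
    """Mean KL divergence between teacher and student local-structure
    distributions (a sketch, not the paper's exact loss)."""
    t = local_structure(teacher_emb, neighbors)
    s = local_structure(student_emb, neighbors)
    return float(np.mean([np.sum(ti * np.log(ti / si))
                          for ti, si in zip(t, s)]))

rng = np.random.default_rng(0)
teacher_emb = rng.normal(size=(4, 8))
student_emb = rng.normal(size=(4, 8))
neighbors = [[1, 2], [0, 3], [0, 3], [1, 2]]
```

The loss is zero only when the student reproduces the teacher's neighborhood similarity structure, which is the "topological semantics" the snippet refers to.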

Towards Evolutional Compression

no code implementations 25 Jul 2017 Yunhe Wang, Chang Xu, Jiayan Qiu, Chao Xu, DaCheng Tao

In contrast to directly recognizing subtle weights or filters as redundant in a given CNN, this paper presents an evolutionary method to automatically eliminate redundant convolution filters.
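The core idea can be sketched as a genetic algorithm over binary keep/drop masks for the filters of one layer. Everything below (the toy fitness function, importance scores, and GA hyperparameters) is illustrative, not the paper's algorithm:

```python
import random

def fitness(mask, importance, penalty=0.1):
    """Toy fitness: summed (hypothetical) importance of kept filters,
    minus a compression penalty per retained filter."""
    kept = [imp for m, imp in zip(mask, importance) if m]
    return sum(kept) - penalty * len(kept)

def evolve(importance, pop_size=20, generations=50, seed=0):
    """Minimal elitist GA over binary filter masks (illustrative sketch)."""
    rng = random.Random(seed)
    n = len(importance)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, importance), reverse=True)
        parents = pop[: pop_size // 2]            # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]             # one-point crossover
            child[rng.randrange(n)] ^= 1          # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, importance))

# filters whose importance falls below the penalty should be dropped
importance = [0.9, 0.05, 0.8, 0.02, 0.7]
best = evolve(importance)
```

The point of the evolutionary search is that redundancy is judged by the fitness of whole masks, rather than by inspecting individual weights or filters in isolation.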
