Attention Regularized Laplace Graph for Domain Adaptation

15 Oct 2022 · Lingkun Luo, Liming Chen, Shiqiang Hu

In leveraging manifold learning for domain adaptation (DA), graph embedding-based DA methods have shown their effectiveness in preserving the data manifold through the Laplace graph. However, current graph embedding DA methods suffer from two issues: (1) they are concerned only with preserving the underlying data structure in the embedding and ignore sub-domain adaptation, which requires accounting for intra-class similarity and inter-class dissimilarity, thereby leading to negative transfer; (2) manifold learning is carried out separately across the feature and label spaces, thereby hindering unified, comprehensive manifold learning. In this paper, building on our previous DGA-DA, we propose a novel DA method, namely Attention Regularized Laplace Graph-based Domain Adaptation (ARG-DA), to remedy these issues. Specifically, by weighting the importance of the different sub-domain adaptation tasks, we propose the Attention Regularized Laplace Graph for class-aware DA, thereby achieving attention-regularized DA. Furthermore, using a specifically designed FEEL strategy, our approach dynamically unifies the alignment of manifold structures across the feature and label spaces, thus leading to comprehensive manifold learning. Comprehensive experiments verify the effectiveness of the proposed method, which consistently outperforms state-of-the-art DA methods on 7 standard DA benchmarks, i.e., 37 cross-domain image classification tasks covering object, face, and digit images. An in-depth analysis of the proposed method is also provided, covering sensitivity, convergence, and robustness.
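To make the idea of a class-aware, attention-weighted Laplace graph concrete, here is a minimal illustrative sketch. It assumes an RBF affinity, per-class scalar attention weights, and intra-class-only edges; the function name, the `attention` parameter, and this construction are illustrative assumptions, not the paper's exact ARG-DA formulation.

```python
# Hedged sketch of a class-aware, attention-weighted graph Laplacian.
# Assumptions (not from the paper): RBF affinities, per-class scalar
# attention weights, and edges kept only within each class.
import numpy as np

def attention_laplacian(X, y, attention, sigma=1.0):
    """X: (n, d) features; y: (n,) class labels; attention: {class: weight}."""
    sq_dists = np.square(X[:, None, :] - X[None, :, :]).sum(-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))        # RBF affinity matrix
    same_class = (y[:, None] == y[None, :]).astype(float)
    att = np.array([attention[c] for c in y], dtype=float)
    pair_att = np.sqrt(att[:, None] * att[None, :])   # symmetric pairwise attention
    W = W * same_class * pair_att                     # keep only attention-scaled intra-class edges
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    return D - W                                      # unnormalized Laplacian L = D - W

# Example usage (hypothetical attention weights per class):
# L = attention_laplacian(X, y, attention={0: 1.0, 1: 0.6}, sigma=2.0)
```

The per-class weights play the role of the attention regularization described above: sub-domain (class) adaptation tasks deemed more important contribute more strongly to the graph, while the intra-class masking encodes the class-aware similarity the abstract refers to.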
