Search Results for author: Haohui Wang

Found 5 papers, 2 papers with code

HeroLT: Benchmarking Heterogeneous Long-Tailed Learning

1 code implementation • 17 Jul 2023 • Haohui Wang, Weijie Guan, Jianpeng Chen, Zi Wang, Dawei Zhou

To achieve this, we develop the most comprehensive (to the best of our knowledge) long-tailed learning benchmark named HeroLT, which integrates 13 state-of-the-art algorithms and 6 evaluation metrics on 14 real-world benchmark datasets across 4 tasks from 3 domains.

Benchmarking

GPatcher: A Simple and Adaptive MLP Model for Alleviating Graph Heterophily

no code implementations • 25 Jun 2023 • Shuaicheng Zhang, Haohui Wang, Si Zhang, Dawei Zhou

While graph heterophily has been extensively studied in recent years, a fundamental research question largely remains nascent: How and to what extent will graph heterophily affect the prediction performance of graph neural networks (GNNs)?

Node Classification

Mastering Long-Tail Complexity on Graphs: Characterization, Learning, and Generalization

no code implementations • 17 May 2023 • Haohui Wang, Baoyu Jing, Kaize Ding, Yada Zhu, Wei Cheng, Si Zhang, Yonghui Fan, Liqing Zhang, Dawei Zhou

To bridge this gap, we propose a generalization bound for long-tail classification on graphs by formulating the problem in the fashion of multi-task learning, i.e., each task corresponds to the prediction of one particular class.

Classification • Contrastive Learning • +1

EvoluNet: Advancing Dynamic Non-IID Transfer Learning on Graphs

no code implementations • 1 May 2023 • Haohui Wang, Yuzhen Mao, Yujun Yan, Yaoqing Yang, Jianhui Sun, Kevin Choi, Balaji Veeramani, Alison Hu, Edward Bowen, Tyler Cody, Dawei Zhou

To answer it, we propose a generalization bound for dynamic non-IID transfer learning on graphs, which implies that generalization performance is dominated by domain evolution and the domain discrepancy between source and target graphs.

Transfer Learning

A Benchmark for Federated Hetero-Task Learning

1 code implementation • 7 Jun 2022 • Liuyi Yao, Dawei Gao, Zhen Wang, Yuexiang Xie, Weirui Kuang, Daoyuan Chen, Haohui Wang, Chenhe Dong, Bolin Ding, Yaliang Li

To investigate the heterogeneity of federated learning in real-world scenarios, we generalize classic federated learning to federated hetero-task learning, which emphasizes the inconsistency across participants in terms of both data distribution and learning tasks.

Federated Learning • Meta-Learning • +2
