Search Results for author: Yikun Miao

Found 1 paper, 0 papers with code

Shared Growth of Graph Neural Networks via Prompted Free-direction Knowledge Distillation

no code implementations · 2 Jul 2023 · Kaituo Feng, Yikun Miao, Changsheng Li, Ye Yuan, Guoren Wang

Knowledge distillation (KD) has been shown to be effective at boosting the performance of graph neural networks (GNNs), where the typical objective is to distill knowledge from a deeper teacher GNN into a shallower student GNN.
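For background on the objective mentioned in the abstract: classical soft-target distillation (in the style of Hinton et al., not the free-direction scheme this paper proposes) trains the student to match the teacher's temperature-softened output distribution. A minimal sketch, assuming generic teacher/student logits rather than any GNN-specific detail:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    # Soft-target KD loss: KL(teacher || student) on softened
    # distributions, scaled by T^2 to keep gradient magnitudes stable.
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

When teacher and student agree the loss is zero; the more their softened distributions diverge, the larger the penalty. In practice this term is combined with a standard cross-entropy loss on the ground-truth labels.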

Knowledge Distillation · Transfer Learning
