Search Results for author: Shuyang Yu

Found 6 papers, 5 papers with code

CADE: Cosine Annealing Differential Evolution for Spiking Neural Network

1 code implementation • 4 Jun 2024 Runhua Jiang, Guodong Du, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang

This paper tackles these challenges by introducing Cosine Annealing Differential Evolution (CADE), designed to modulate the mutation factor (F) and crossover rate (CR) of differential evolution (DE) for the SNN model, i.e., Spiking Element Wise (SEW) ResNet.

Transfer Learning
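The abstract above describes annealing DE's mutation factor (F) and crossover rate (CR) with a cosine schedule. A minimal sketch of a standard cosine annealing schedule is below; the bounds `val_max`/`val_min` and the exact form CADE uses are assumptions for illustration, not the paper's implementation.

```python
import math

def cosine_anneal(t, t_max, val_max, val_min):
    """Standard cosine annealing: starts at val_max and decays
    smoothly to val_min over t_max generations."""
    return val_min + 0.5 * (val_max - val_min) * (1 + math.cos(math.pi * t / t_max))

# Illustrative bounds for DE's mutation factor F and crossover rate CR.
F_start = cosine_anneal(0, 100, 0.9, 0.1)    # beginning of the run -> 0.9
CR_end = cosine_anneal(100, 100, 0.9, 0.1)   # end of the run -> 0.1
```

Such a schedule lets DE explore aggressively early on (large F, large CR) and converge more conservatively toward the end of the run.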

Towards Stability of Parameter-free Optimization

no code implementations • 7 May 2024 Yijiang Pang, Shuyang Yu, Bao Hoang, Jiayu Zhou

To tackle this challenge, in this paper, we propose a novel parameter-free optimizer, \textsc{AdamG} (Adam with the golden step size), designed to automatically adapt to diverse optimization problems without manual tuning.

Safe and Robust Watermark Injection with a Single OoD Image

1 code implementation • 4 Sep 2023 Shuyang Yu, Junyuan Hong, Haobo Zhang, Haotao Wang, Zhangyang Wang, Jiayu Zhou

Training a high-performance deep neural network requires large amounts of data and computational resources.

Model extraction

Revisiting Data-Free Knowledge Distillation with Poisoned Teachers

1 code implementation • 4 Jun 2023 Junyuan Hong, Yi Zeng, Shuyang Yu, Lingjuan Lyu, Ruoxi Jia, Jiayu Zhou

Data-free knowledge distillation (KD) helps transfer knowledge from a pre-trained model (known as the teacher model) to a smaller model (known as the student model) without access to the original training data used for training the teacher model.

Backdoor Defense for Data-Free Distillation with Poisoned Teachers
Data-free Knowledge Distillation
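The abstract above describes data-free knowledge distillation, where a student is trained to match a teacher's outputs without the original training data. A minimal sketch of the usual temperature-softened KL distillation objective (the loss such methods typically minimize on synthetic inputs) is below; the temperature value and logits are illustrative assumptions, not this paper's setup.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature**2

# When the student already matches the teacher, the loss is zero.
loss = kd_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
```

The poisoned-teacher setting studied here matters precisely because this objective copies the teacher's behavior wholesale, backdoors included.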
