Search Results for author: Zhu Liao

Found 3 papers, 1 paper with code

The Simpler The Better: An Entropy-Based Importance Metric To Reduce Neural Networks' Depth

no code implementations · 27 Apr 2024 · Victor Quétu, Zhu Liao, Enzo Tartaglione

While deep neural networks are highly effective at solving complex tasks, large pre-trained models are commonly employed even for considerably simpler downstream tasks that do not require such complexity.

Image Classification
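The entropy-based importance idea behind this line of work can be illustrated with a minimal sketch (a hypothetical simplification, not the paper's exact metric): measure how much information a ReLU layer's on/off activation pattern carries over a batch. A layer whose units are almost always on or always off behaves nearly linearly, so it contributes little nonlinearity and is a candidate for depth reduction.

```python
import numpy as np

def layer_entropy(pre_activations):
    """Average Bernoulli entropy of ReLU on/off states across units.

    pre_activations: array of shape (num_samples, num_units) holding a
    layer's pre-activation values over a batch. Illustrative metric
    only, not the authors' exact formulation.
    """
    p_on = (pre_activations > 0).mean(axis=0)          # P(unit active)
    p = np.clip(p_on, 1e-12, 1 - 1e-12)                # avoid log(0)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # per-unit entropy
    return float(h.mean())

rng = np.random.default_rng(0)
noisy = rng.normal(size=(1024, 64))      # units active ~50% of the time
saturated = np.abs(noisy) + 0.1          # units always active
print(layer_entropy(noisy))              # close to 1 bit
print(layer_entropy(saturated))          # essentially 0 bits
```

Under this toy metric, the "saturated" layer scores near zero entropy and would be flagged as removable, while the balanced layer would be kept.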

NEPENTHE: Entropy-Based Pruning as a Neural Network Depth's Reducer

no code implementations · 24 Apr 2024 · Zhu Liao, Victor Quétu, Van-Tam Nguyen, Enzo Tartaglione

While deep neural networks are highly effective at solving complex tasks, their computational demands can hinder their usefulness in real-time applications and on resource-limited systems.

Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?

1 code implementation · 12 Aug 2023 · Zhu Liao, Victor Quétu, Van-Tam Nguyen, Enzo Tartaglione

Pruning is a widely used technique for reducing the size of deep neural networks while maintaining their performance.
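The standard unstructured-pruning baseline this abstract refers to can be sketched as follows (a minimal magnitude-pruning illustration; real pipelines prune iteratively and fine-tune between rounds, and this is not the paper's specific method):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights (unstructured pruning).

    sparsity: fraction of weights to remove, in [0, 1). Ties at the
    threshold may prune slightly more than requested.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest |w|
    mask = np.abs(weights) > threshold             # keep larger weights
    return weights * mask

w = np.array([[0.05, -1.2, 0.3],
              [2.0, -0.01, 0.7]])
pruned = magnitude_prune(w, 0.5)   # drops the 3 smallest-magnitude weights
```

Because the pruning is unstructured, zeros land anywhere in the weight matrix; the paper's question is whether such scattered sparsity can nonetheless be exploited to remove entire layers and reduce depth.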
