no code implementations • 19 Apr 2022 • Niv Nayman, Avram Golbert, Asaf Noy, Tan Ping, Lihi Zelnik-Manor
Encouraged by the recent transferability results of self-supervised models, we propose a method that combines self-supervised and supervised pretraining to generate models with both high diversity and high accuracy, and, as a result, high transferability.
1 code implementation • 24 Oct 2021 • Niv Nayman, Yonathan Aflalo, Asaf Noy, Rong Jin, Lihi Zelnik-Manor
Practical use of neural networks often involves requirements on latency, energy, and memory, among others.
2 code implementations • 23 Feb 2021 • Niv Nayman, Yonathan Aflalo, Asaf Noy, Lihi Zelnik-Manor
Realistic use of neural networks often requires adhering to multiple constraints on latency, energy, and memory, among others.
Ranked #21 on Neural Architecture Search on ImageNet
1 code implementation • NeurIPS 2021 • Jian Tan, Niv Nayman, Mengchang Wang
These virtual points, along with the means and variances of their unknown function values estimated using the simple kernel of the first stage, are then fitted with a more sophisticated kernel model in the second stage.
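The two-stage idea can be illustrated with a minimal Gaussian-process sketch: a simple kernel produces means and variances at virtual points, and a richer kernel is then fitted to the union of real and virtual observations, with the stage-1 variances acting as per-point noise. Everything here (kernels, data, hyperparameters) is hypothetical and only meant to mirror the description above, not the paper's implementation.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Simple squared-exponential kernel (stage 1).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def matern12(a, b, ls=0.5):
    # A different, "more sophisticated" kernel stand-in (stage 2).
    d = np.abs(a[:, None] - b[None, :])
    return np.exp(-d / ls)

def gp_posterior(kern, X, y, Xs, noise):
    # Standard GP posterior mean/variance with per-point noise.
    K = kern(X, X) + np.diag(noise)
    Ks = kern(X, Xs)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y
    var = np.diag(kern(Xs, Xs) - Ks.T @ sol)
    return mu, np.maximum(var, 0.0)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 15)
y = np.sin(3 * X) + 0.05 * rng.standard_normal(15)

# Stage 1: the simple kernel estimates means/variances at virtual points.
Xv = rng.uniform(-2, 2, 25)
mu_v, var_v = gp_posterior(rbf, X, y, Xv, np.full(15, 1e-3))

# Stage 2: fit the richer kernel to real + virtual points, using the
# stage-1 variances as per-point noise for the virtual points.
X2 = np.concatenate([X, Xv])
y2 = np.concatenate([y, mu_v])
noise2 = np.concatenate([np.full(15, 1e-3), var_v + 1e-3])
Xq = np.linspace(-2, 2, 5)
mu_q, var_q = gp_posterior(matern12, X2, y2, Xq, noise2)
```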
2 code implementations • NeurIPS 2019 • Niv Nayman, Asaf Noy, Tal Ridnik, Itamar Friedman, Rong Jin, Lihi Zelnik-Manor
This paper introduces a novel optimization method for differentiable neural architecture search, based on the theory of prediction with expert advice.
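In the expert-advice setting, each candidate operation can be viewed as an "expert" whose weight is updated multiplicatively according to its incurred loss. A minimal multiplicative-weights (Hedge-style) sketch of that idea, with made-up losses and learning rate, not the paper's actual update rule:

```python
import numpy as np

def hedge_update(w, losses, eta=0.5):
    """One multiplicative-weights step: each candidate operation is an
    'expert'; its weight shrinks exponentially with its loss."""
    w = w * np.exp(-eta * losses)
    return w / w.sum()

# Three candidate operations, uniform initial weights.
w = np.ones(3) / 3
for losses in [np.array([0.9, 0.2, 0.5]),
               np.array([0.8, 0.1, 0.6]),
               np.array([0.7, 0.3, 0.4])]:
    w = hedge_update(w, losses)
# The consistently low-loss expert (index 1) accumulates the most weight.
```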
1 code implementation • 8 Apr 2019 • Asaf Noy, Niv Nayman, Tal Ridnik, Nadav Zamir, Sivan Doveh, Itamar Friedman, Raja Giryes, Lihi Zelnik-Manor
In this paper, we propose a differentiable search space that allows the annealing of architecture weights, while gradually pruning inferior operations.
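The annealing-and-pruning idea can be sketched as a softmax over architecture parameters whose temperature decays over time, with operations removed once their weight falls below a threshold. The parameters, schedule, and threshold below are hypothetical placeholders, not the paper's settings:

```python
import numpy as np

def anneal_and_prune(alpha, steps=40, t0=5.0, decay=0.85, thresh=0.02):
    """Sharpen the softmax over architecture weights by annealing the
    temperature, gradually pruning operations below a weight threshold."""
    active = np.ones_like(alpha, dtype=bool)
    temp = t0
    for _ in range(steps):
        # Softmax over active operations only (-inf masks pruned ones).
        logits = np.where(active, alpha / temp, -np.inf)
        w = np.exp(logits - logits[active].max())
        w /= w.sum()
        # Prune inferior operations, always keeping at least one alive.
        for i in np.where(active)[0]:
            if w[i] < thresh and active.sum() > 1:
                active[i] = False
        temp *= decay
    return w, active

alpha = np.array([0.3, 1.2, 0.9, -0.5])  # hypothetical architecture params
w, active = anneal_and_prune(alpha)
```

As the temperature anneals, the softmax concentrates on the strongest operation and the others are pruned one by one, leaving a discrete architecture choice.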