2 code implementations • ICCV 2023 • Ahmad Sajedi, Samir Khaki, Ehsan Amjadian, Lucy Z. Liu, Yuri A. Lawryshyn, Konstantinos N. Plataniotis
Emerging research on dataset distillation aims to reduce training costs by creating a small synthetic set that distills the information of a larger real dataset, so that a model trained on it ultimately achieves test accuracy equivalent to that of a model trained on the whole dataset.
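To make the setting concrete, here is a minimal, hypothetical sketch of one common dataset-distillation formulation, gradient matching: the synthetic samples are optimized so that the training gradients they induce mimic those of the real data. This is a generic illustration under toy data, not the method proposed in this paper; all names and hyperparameters are placeholders.

```python
# Hedged sketch: dataset distillation via gradient matching (generic,
# NOT this paper's method). Synthetic samples are learned so that the
# loss gradients they produce resemble those of the real dataset.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy "real" dataset: 256 samples, 20 features, 4 classes.
X_real = torch.randn(256, 20)
y_real = torch.randint(0, 4, (256,))

# Learnable synthetic set: 4 samples per class, 16 in total.
X_syn = torch.randn(16, 20, requires_grad=True)
y_syn = torch.arange(4).repeat_interleave(4)

model = nn.Linear(20, 4)  # in practice, re-initialized nets are sampled
opt = torch.optim.Adam([X_syn], lr=0.1)

def grads(X, y, create_graph=False):
    # Gradient of the training loss w.r.t. the model parameters.
    loss = F.cross_entropy(model(X), y)
    return torch.autograd.grad(loss, model.parameters(), create_graph=create_graph)

for step in range(200):
    g_real = grads(X_real, y_real)                  # treated as constants
    g_syn = grads(X_syn, y_syn, create_graph=True)  # differentiable in X_syn
    # Minimize cosine distance between synthetic and real gradients.
    match = sum(1 - F.cosine_similarity(a.flatten(), b.flatten(), dim=0)
                for a, b in zip(g_real, g_syn))
    opt.zero_grad()
    match.backward()
    opt.step()
```

After optimization, a fresh model trained only on `X_syn`/`y_syn` would aim to approach the accuracy of a model trained on the full real set.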
1 code implementation • 14 Jun 2023 • Abdulrahman Diaa, Lucas Fenaux, Thomas Humphries, Marian Dietz, Faezeh Ebrahimianghazani, Bailey Kacsmar, Xinda Li, Nils Lukas, Rasoul Akhavan Mahdavi, Simon Oya, Ehsan Amjadian, Florian Kerschbaum
Motivated by the success of previous work co-designing machine learning and MPC, we develop an activation-function co-design.
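For orientation, a standard reason activation functions are co-designed with MPC is that secure protocols evaluate additions and multiplications cheaply but comparisons expensively, so comparison-heavy activations like ReLU are often replaced with low-degree polynomials. The snippet below is a generic illustration of that idea, not the co-design proposed in this paper; the fitting range and degree are assumptions.

```python
# Hedged sketch: an MPC-friendly activation (generic, NOT this paper's
# co-design). A low-degree polynomial needs only additions and
# multiplications, which MPC evaluates cheaply, whereas the max() in
# ReLU requires an expensive secure comparison.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Least-squares degree-2 fit to ReLU on the assumed range [-4, 4].
xs = np.linspace(-4.0, 4.0, 401)
coeffs = np.polyfit(xs, relu(xs), deg=2)
approx = np.polyval(coeffs, xs)

print("fitted coefficients:", np.round(coeffs, 3))
print("max abs error on [-4, 4]:", float(np.abs(approx - relu(xs)).max()))
```

The trade-off shown here is the usual one: some approximation error in the clear in exchange for a circuit that is far cheaper to evaluate under MPC.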
no code implementations • WS 2016 • Ehsan Amjadian, Diana Inkpen, Tahereh Paribakht, Farahnaz Faez
The present paper explores a novel method that integrates efficient distributed representations with terminology extraction.
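As a rough, hypothetical illustration of combining distributed representations with terminology extraction, candidate terms can be ranked by the cosine similarity of their word vectors to the centroid of a few known seed terms. The vectors below are toy placeholders (in practice they would come from embeddings such as word2vec trained on a large corpus), and the seed-centroid ranking is an assumption, not necessarily the paper's method.

```python
# Hedged sketch: embedding-based term ranking (toy vectors, NOT
# necessarily this paper's method). Candidates closer to the centroid
# of seed domain terms are treated as more likely terminology.
import numpy as np

# Placeholder embeddings; real ones would be learned from a corpus.
emb = {
    "phoneme":  np.array([0.9, 0.1, 0.0]),
    "morpheme": np.array([0.8, 0.2, 0.1]),
    "syntax":   np.array([0.7, 0.3, 0.0]),
    "banana":   np.array([0.0, 0.1, 0.9]),
    "lexicon":  np.array([0.8, 0.1, 0.1]),
}

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

seeds = ["phoneme", "morpheme"]  # known domain terms
centroid = np.mean([emb[w] for w in seeds], axis=0)

candidates = ["syntax", "banana", "lexicon"]
ranked = sorted(candidates, key=lambda w: cos(emb[w], centroid), reverse=True)
print(ranked)  # domain-like terms should outrank "banana"
```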