Search Results for author: Hossein Zakerinia

Found 4 papers, 1 paper with code

MG-BERT: Multi-Graph Augmented BERT for Masked Language Modeling

no code implementations • NAACL (TextGraphs) 2021 • Parishad BehnamGhader, Hossein Zakerinia, Mahdieh Soleymani Baghshah

Pre-trained models like Bidirectional Encoder Representations from Transformers (BERT) have recently made a big leap forward in Natural Language Processing (NLP) tasks.

Knowledge Graphs • Language Modelling • +2
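As background for the masked language modeling task in the title, the snippet below is a minimal sketch of plain BERT masked-token prediction using the Hugging Face transformers library, without the paper's multi-graph augmentation; the checkpoint name and example sentence are illustrative choices, not taken from the paper.

import torch
from transformers import BertTokenizer, BertForMaskedLM

# Load a standard pre-trained BERT checkpoint (illustrative choice).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Mask one token and let the model predict it.
inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and take the highest-scoring vocabulary entry.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))  # typically "paris"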

PeFLL: Personalized Federated Learning by Learning to Learn

1 code implementation • 8 Jun 2023 • Jonathan Scott, Hossein Zakerinia, Christoph H. Lampert

We present PeFLL, a new personalized federated learning algorithm that improves over the state of the art in three respects: 1) it produces more accurate models, especially in the low-data regime, and not only for clients present during its training phase but also for any that may emerge in the future; 2) it reduces the amount of on-client computation and client-server communication by providing future clients with ready-to-use personalized models that require no additional finetuning or optimization; 3) it comes with theoretical guarantees that establish generalization from the observed clients to future ones.

Personalized Federated Learning
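The "learning to learn" mechanism summarized above can be illustrated with a small, hypothetical PyTorch sketch: an embedding network condenses a client's local data into a descriptor, and a hypernetwork maps that descriptor to ready-to-use personalized model weights, so a new client needs no finetuning. All module names and dimensions below are invented for illustration and are not taken from the paper's code release.

import torch
import torch.nn as nn

D_IN, D_EMB, D_OUT = 32, 16, 10  # hypothetical dimensions

class EmbeddingNet(nn.Module):
    """Summarizes a client's examples into a fixed-size descriptor."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(D_IN, D_EMB), nn.ReLU())

    def forward(self, x):
        return self.net(x).mean(dim=0)  # average-pool over the client's examples

class HyperNet(nn.Module):
    """Maps a client descriptor to the weights of a personalized linear model."""
    def __init__(self):
        super().__init__()
        self.w_gen = nn.Linear(D_EMB, D_IN * D_OUT)
        self.b_gen = nn.Linear(D_EMB, D_OUT)

    def forward(self, v):
        return self.w_gen(v).view(D_OUT, D_IN), self.b_gen(v)

embed, hyper = EmbeddingNet(), HyperNet()
x_client = torch.randn(20, D_IN)       # one (synthetic) client's local data
W, b = hyper(embed(x_client))          # ready-to-use personalized model
logits = x_client @ W.t() + b          # no client-side finetuning needed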

Communication-Efficient Federated Learning With Data and Client Heterogeneity

no code implementations • 20 Jun 2022 • Hossein Zakerinia, Shayan Talaei, Giorgi Nadiradze, Dan Alistarh

Federated Learning (FL) enables large-scale distributed training of machine learning models, while still allowing individual nodes to maintain data locally.

Federated Learning
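The abstract sentence above states the FL premise rather than the paper's method; as a baseline point of reference (plain FedAvg, not the communication-efficient protocol the paper proposes), one training round of local updates followed by server-side parameter averaging might look like this:

import copy
import torch
import torch.nn as nn

def local_update(global_model, data, targets, lr=0.1, epochs=1):
    """One client's local training; raw data never leaves the node."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.cross_entropy(model(data), targets).backward()
        opt.step()
    return model.state_dict()

def federated_averaging(global_model, client_states):
    """Server aggregates client models by parameter-wise averaging."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    global_model.load_state_dict(avg)
    return global_model

# Synthetic run with two heterogeneous clients.
model = nn.Linear(8, 3)
clients = [(torch.randn(16, 8), torch.randint(0, 3, (16,))) for _ in range(2)]
model = federated_averaging(model, [local_update(model, x, y) for x, y in clients])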
