Search Results for author: Lingzhi Gao

Found 2 papers, 0 papers with code

Text-to-Model: Text-Conditioned Neural Network Diffusion for Train-Once-for-All Personalization

no code implementations • 23 May 2024 • Zexi Li, Lingzhi Gao, Chao Wu

Generative artificial intelligence (GenAI) has made significant progress in understanding world knowledge and generating content from human languages across various modalities, such as text-to-text large language models, text-to-image Stable Diffusion, and text-to-video Sora.

Out-of-Distribution Generalization • World Knowledge

FediOS: Decoupling Orthogonal Subspaces for Personalization in Feature-skew Federated Learning

no code implementations • 30 Nov 2023 • Lingzhi Gao, Zexi Li, Yang Lu, Chao Wu

A typical line of pFL work focuses on label distribution skew and adopts a decoupling scheme in which the model is split into a common feature extractor and two prediction heads (generic and personalized); a minimal sketch of this architecture follows below.

Personalized Federated Learning
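
The decoupling scheme mentioned in the abstract above can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions, not the FediOS implementation: the class name, layer sizes, and which parts are aggregated are illustrative choices, with a shared feature extractor feeding a generic head (shared across clients) and a personalized head (kept local).

# Minimal sketch of the decoupled pFL architecture described above:
# a shared feature extractor with two prediction heads (generic + personalized).
# Class name and layer sizes are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn


class DecoupledClientModel(nn.Module):
    def __init__(self, in_dim: int = 784, feat_dim: int = 128, num_classes: int = 10):
        super().__init__()
        # Shared across clients: typically aggregated by the server.
        self.feature_extractor = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )
        # Generic head: also shared/aggregated across clients.
        self.generic_head = nn.Linear(feat_dim, num_classes)
        # Personalized head: kept local to each client, never aggregated.
        self.personal_head = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        feats = self.feature_extractor(x)
        return self.generic_head(feats), self.personal_head(feats)


# Usage: each client trains locally and decides how to combine the two heads
# (e.g., averaging logits at inference), depending on the pFL recipe used.
model = DecoupledClientModel()
x = torch.randn(4, 784)
generic_logits, personal_logits = model(x)
print(generic_logits.shape, personal_logits.shape)  # torch.Size([4, 10]) twice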
