Search Results for author: Kai Fong Ernest Chong

Found 9 papers, 5 papers with code

Spectral Co-Distillation for Personalized Federated Learning

1 code implementation NeurIPS 2023 Zihan Chen, Howard H. Yang, Tony Q. S. Quek, Kai Fong Ernest Chong

Personalized federated learning (PFL) has been widely investigated to address the challenge of data heterogeneity, especially when a single generic model cannot simultaneously satisfy the diverse performance requirements of local clients.

Personalized Federated Learning

The Role of Federated Learning in a Wireless World with Foundation Models

no code implementations · 6 Oct 2023 · Zihan Chen, Howard H. Yang, Y. C. Tay, Kai Fong Ernest Chong, Tony Q. S. Quek

Foundation models (FMs) are general-purpose artificial intelligence (AI) models that have recently enabled multiple brand-new generative AI applications.

Federated Learning

GenKL: An Iterative Framework for Resolving Label Ambiguity and Label Non-conformity in Web Images Via a New Generalized KL Divergence

1 code implementation · 19 Jul 2023 · Xia Huang, Kai Fong Ernest Chong

To tackle the limitation of entropy maximization, we propose $(\alpha, \beta)$-generalized KL divergence, $\mathcal{D}_{\text{KL}}^{\alpha, \beta}(p\|q)$, which can be used to identify significantly more NC instances.
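The exact form of the $(\alpha, \beta)$-generalized KL divergence is defined in the paper; as a point of reference, the quantity it generalizes is the standard KL divergence between two discrete distributions. A minimal sketch (the function name and epsilon smoothing are illustrative, not from the paper):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Standard KL divergence D_KL(p || q) between two discrete
    probability distributions given as lists of probabilities.
    `eps` guards against log(0); this smoothing is an assumption
    for the sketch, not part of the paper's definition."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Usage: diverging distributions give a positive value.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
d = kl_divergence(p, q)
```

The paper's $(\alpha, \beta)$ parameters modify this baseline to better separate non-conforming (NC) instances than entropy maximization does.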

FedCorr: Multi-Stage Federated Learning for Label Noise Correction

1 code implementation CVPR 2022 Jingyi Xu, Zihan Chen, Tony Q. S. Quek, Kai Fong Ernest Chong

Although there exist methods in centralized learning for tackling label noise, such methods do not perform well on heterogeneous label noise in FL settings, due to the typically smaller sizes of client datasets and data privacy requirements in FL.

Federated Learning · Privacy Preserving

Dynamic Attention-based Communication-Efficient Federated Learning

no code implementations12 Aug 2021 Zihan Chen, Kai Fong Ernest Chong, Tony Q. S. Quek

Federated learning (FL) offers a solution to train a global machine learning model while still maintaining data privacy, without needing access to data stored locally at the clients.
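The standard FL aggregation step this description alludes to is dataset-size-weighted averaging of client model parameters (FedAvg-style); clients never share raw data, only parameters. A minimal sketch with hypothetical names (this is not the paper's dynamic-attention scheme, which adjusts how clients participate):

```python
def federated_average(client_weights, client_sizes):
    """Aggregate per-client parameter vectors into a global model by
    weighting each client's parameters by its local dataset size.
    `client_weights` is a list of equal-length parameter lists;
    `client_sizes` gives each client's number of local samples."""
    total = sum(client_sizes)
    num_params = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(num_params)
    ]

# Usage: a client with more data pulls the average toward its parameters.
global_model = federated_average([[1.0], [3.0]], [3, 1])
```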

Federated Learning

Training Classifiers that are Universally Robust to All Label Noise Levels

1 code implementation · 27 May 2021 · Jingyi Xu, Tony Q. S. Quek, Kai Fong Ernest Chong

In particular, we shall assume that a small subset of any given noisy dataset is known to have correct labels, which we treat as "positive", while the remaining noisy subset is treated as "unlabeled".

Ranked #7 on Image Classification on Clothing1M (using clean data and extra training data)

Image Classification

An information-theoretic framework for learning models of instance-independent label noise

no code implementations1 Jan 2021 Xia Huang, Kai Fong Ernest Chong

At the heart of our framework is a discriminator that predicts whether an input dataset has maximum Shannon entropy, which shall be used on multiple new datasets $\hat{\mathcal{D}}$ synthesized from $\mathcal{D}$ via the insertion of additional label noise.
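The synthesis step described above, inserting additional label noise into a dataset, can be sketched as symmetric (uniform) label flipping; the function below is illustrative, and the paper's framework is not restricted to this particular noise model:

```python
import random

def inject_label_noise(labels, num_classes, noise_rate, seed=0):
    """Return a copy of `labels` in which a `noise_rate` fraction of
    entries is flipped uniformly at random to a *different* class
    (instance-independent, symmetric label noise)."""
    rng = random.Random(seed)
    noisy = list(labels)
    n_flip = int(noise_rate * len(labels))
    for i in rng.sample(range(len(labels)), n_flip):
        choices = [c for c in range(num_classes) if c != noisy[i]]
        noisy[i] = rng.choice(choices)
    return noisy

# Usage: flip 20% of labels in a 3-class problem.
noisy = inject_label_noise([0] * 50 + [1] * 50, num_classes=3, noise_rate=0.2)
```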

A closer look at the approximation capabilities of neural networks

no code implementations ICLR 2020 Kai Fong Ernest Chong

(ii) There exists some $\lambda>0$ (depending only on $f$ and $\sigma$), such that the UAP still holds if we restrict all non-bias weights $w$ in the first layer to satisfy $|w|>\lambda$.
