no code implementations • 6 Feb 2024 • Xiaoxin Su, Yipeng Zhou, Laizhong Cui, Song Guo
Recently, federated learning (FL) has gained momentum because of its capability in preserving data privacy.
no code implementations • 6 Feb 2024 • Xiaoxin Su, Yipeng Zhou, Laizhong Cui, John C. S. Lui, Jiangchuan Liu
In the Federated Learning (FL) paradigm, a parameter server (PS) concurrently communicates with distributed participating clients for model collection, update aggregation, and model distribution over multiple rounds, without touching the private data owned by individual clients.
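The round structure described above can be sketched as follows. This is a minimal illustration assuming FedAvg-style weighted averaging of client updates, not necessarily the exact aggregation rule used in the paper; `fedavg_round` and its parameters are hypothetical names for illustration.

```python
import numpy as np

def fedavg_round(global_model, client_data_sizes, client_updates):
    """One FL round: the PS aggregates client updates weighted by local data size,
    then applies the aggregate to the global model."""
    total = sum(client_data_sizes)
    agg = np.zeros_like(global_model)
    for n_k, update in zip(client_data_sizes, client_updates):
        agg += (n_k / total) * update  # data-size-weighted average
    return global_model + agg
```

The PS never sees raw client data, only the model updates, which is the privacy-preserving property the excerpt refers to.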
no code implementations • 13 Nov 2023 • Rongwei Lu, Yutong Jiang, Yinan Mao, Chen Tang, Bin Chen, Laizhong Cui, Zhi Wang
Assigning varying compression ratios to workers with distinct data distributions and volumes is thus a promising solution.
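One common way to realize per-worker compression ratios is top-k gradient sparsification, where each worker keeps only its largest-magnitude entries. The sketch below is an assumption-laden illustration of that idea, not the paper's actual assignment algorithm; the ratio values are hypothetical.

```python
import numpy as np

def topk_compress(grad, ratio):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries;
    everything else is zeroed out before transmission."""
    k = max(1, int(len(grad) * ratio))
    idx = np.argsort(np.abs(grad))[-k:]  # indices of the k largest magnitudes
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

# Hypothetical assignment: workers with more data or more skewed (non-IID)
# distributions could be given a milder compression ratio.
ratios = {"worker_a": 0.5, "worker_b": 0.1}
```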
no code implementations • 5 Sep 2022 • Dongyuan Su, Yipeng Zhou, Laizhong Cui
To boost the convergence of DFL, each vehicle tunes the aggregation weight of each data source by minimizing the KL divergence of its state vector, and the effectiveness of this approach in diversifying data sources can be proved theoretically.
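The core quantity here is the KL divergence between state vectors (e.g., label distributions). The sketch below shows the KL computation plus one plausible, hypothetical weighting rule that favors more-divergent sources; the paper's actual optimization may differ.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions, with smoothing for zeros."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def tune_weights(own_vec, source_vecs):
    """Hypothetical rule: give larger aggregation weight to sources whose
    state vectors diverge more from the vehicle's own distribution,
    so aggregation draws on diverse data."""
    divs = np.array([kl_divergence(own_vec, v) for v in source_vecs])
    return divs / divs.sum()
```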
no code implementations • 12 Aug 2022 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou
Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because its training process is auditable and its serverless architecture avoids the single point of failure of the parameter server in vanilla federated learning (VFL).
1 code implementation • 21 Jul 2022 • Kui Jiang, Zhongyuan Wang, Chen Chen, Zheng Wang, Laizhong Cui, Chia-Wen Lin
Convolutional neural network (CNN) and Transformer have achieved great success in multimedia applications.
no code implementations • 13 Dec 2021 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Jiangchuan Liu
Federated Learning (FL) incurs high communication overhead, which can be greatly alleviated by compression for model updates.
no code implementations • 10 May 2021 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Yi Pan
Then, we further propose the boosted MUCSC (B-MUCSC) algorithm, a biased compression algorithm that can achieve an extremely high compression rate by grouping insignificant model updates into a super cluster.
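A clustering-based compressor of this kind can be sketched with a simple 1-D k-means over the update values: only the centroids and per-entry cluster IDs need to be transmitted. This is a generic illustration inspired by the description of MUCSC, not a reproduction of the MUCSC or B-MUCSC algorithms themselves.

```python
import numpy as np

def cluster_compress(update, n_clusters=4, iters=10):
    """Compress a model update by 1-D k-means: each value is replaced by its
    cluster centroid, so only centroids + integer cluster IDs are sent."""
    centroids = np.linspace(update.min(), update.max(), n_clusters)
    for _ in range(iters):
        # assign each entry to its nearest centroid
        ids = np.argmin(np.abs(update[:, None] - centroids[None, :]), axis=1)
        for c in range(n_clusters):
            if np.any(ids == c):  # skip empty clusters
                centroids[c] = update[ids == c].mean()
    return centroids, ids

def decompress(centroids, ids):
    """Reconstruct the (lossy) update from centroids and cluster IDs."""
    return centroids[ids]
```

Grouping many near-zero ("insignificant") values into one large cluster, as B-MUCSC is described as doing, would further shrink the ID table at the cost of a biased reconstruction.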