Privacy-Preserving Distributed Optimisation using Stochastic PDMM

13 Dec 2023 · Sebastian O. Jordan, Qiongxiu Li, Richard Heusdens

Privacy-preserving distributed processing has received considerable attention recently. The main purpose of these algorithms is to solve certain signal processing tasks over a network in a decentralised fashion without revealing private/secret data to the outside world. Because of the iterative nature of these distributed algorithms, computationally complex approaches such as (homomorphic) encryption are undesirable. Recently, an information-theoretic method called subspace perturbation has been introduced for synchronous update schemes. The main idea is to exploit a particular structure in the update equations for noise insertion, so that the private data is protected without compromising the algorithm's accuracy. This structure, however, is absent in asynchronous update schemes. In this paper we investigate such asynchronous schemes and derive a lower bound on the noise variance after random initialisation of the algorithm. This bound shows that the privacy level of asynchronous schemes is at least as high as that of synchronous schemes. Computer simulations are conducted to corroborate our theoretical results.
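The subspace-perturbation idea can be illustrated with a small numerical sketch. The Python snippet below is not the authors' implementation; it is a minimal sketch of asynchronous (stochastic) PDMM applied to a distributed averaging problem, where the function name `stochastic_pdmm_average`, the averaging parameter `theta`, the penalty `c`, and the quadratic local cost f_i(x) = (x - a_i)^2 / 2 are illustrative assumptions. The privacy mechanism is mimicked by drawing the auxiliary (dual) variables from a zero-mean Gaussian with large standard deviation at initialisation; the paper's contribution concerns how much of that noise variance survives the stochastic updates.

```python
import numpy as np

def stochastic_pdmm_average(a, edges, c=0.5, theta=0.5, sigma=100.0,
                            iters=5000, seed=0):
    """Asynchronous (stochastic) PDMM sketch for distributed averaging.

    a      : private local values, one per node
    edges  : undirected edges (i, j) of the network graph
    c      : PDMM penalty parameter
    theta  : averaging parameter of the z-update (0.5 = averaged PDMM)
    sigma  : std. dev. of the random initialisation of the auxiliary
             variables z -- the subspace-perturbation noise
    """
    rng = np.random.default_rng(seed)
    n = len(a)
    neighbours = [[] for _ in range(n)]
    for i, j in edges:
        neighbours[i].append(j)
        neighbours[j].append(i)

    # z[(i, j)] is the auxiliary (dual) variable held at node i for edge (i, j).
    # Random initialisation injects noise that masks the private values a[i]
    # exchanged implicitly through the iterations (subspace perturbation).
    z = {(i, j): rng.normal(0.0, sigma)
         for i in range(n) for j in neighbours[i]}
    x = np.zeros(n)

    for _ in range(iters):
        i = int(rng.integers(n))     # asynchronous: one randomly chosen node updates
        d = len(neighbours[i])
        # A_ij = +1 if i < j else -1 encodes the edge constraint x_i = x_j.
        s = sum((1.0 if i < j else -1.0) * z[(i, j)] for j in neighbours[i])
        # Local x-update for the quadratic cost f_i(x) = (x - a[i])**2 / 2.
        x[i] = (a[i] - s) / (1.0 + c * d)
        for j in neighbours[i]:
            A_ij = 1.0 if i < j else -1.0
            # Node i transmits z[(i, j)] + 2*c*A_ij*x[i]; neighbour j averages it
            # into its own auxiliary variable.
            z[(j, i)] = ((1.0 - theta) * z[(j, i)]
                         + theta * (z[(i, j)] + 2.0 * c * A_ij * x[i]))
    return x

if __name__ == "__main__":
    a = np.array([1.0, 4.0, 2.0, 7.0])        # private node values
    ring = [(0, 1), (1, 2), (2, 3), (0, 3)]   # 4-node ring network
    x = stochastic_pdmm_average(a, ring)
    print(x, "target:", a.mean())             # every x_i should approach 3.5
```

With the averaged update (theta = 0.5), each x_i approaches the network average even though the auxiliary variables retain the large random offsets used to mask the private values a_i; in the synchronous setting this masking is exactly the subspace-perturbation structure the abstract refers to.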
