SwiftAgg+: Achieving Asymptotically Optimal Communication Loads in Secure Aggregation for Federated Learning

03/24/2022
by   Tayyebeh Jahani-Nezhad, et al.

We propose SwiftAgg+, a novel secure aggregation protocol for federated learning systems, in which a central server aggregates the local models of N ∈ ℕ distributed users, each of size L ∈ ℕ and trained on local data, in a privacy-preserving manner. SwiftAgg+ significantly reduces the communication overhead without any compromise on security, and achieves the optimal communication loads within diminishing gaps. Specifically, in the presence of at most D dropout users, SwiftAgg+ achieves a per-user communication load of (1+𝒪(1/N))L and a server communication load of (1+𝒪(1/N))L, with a worst-case information-theoretic security guarantee against any subset of up to T semi-honest users who may also collude with the curious server. Moreover, SwiftAgg+ allows a flexible trade-off between the communication loads and the number of active communication links. In particular, for any K ∈ ℕ, SwiftAgg+ can achieve a server communication load of (1+T/K)L and a per-user communication load of up to (1+(T+D)/K)L, where the number of pair-wise active connections in the network is (N/2)(K+T+D+1).
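
As a rough numeric illustration of the stated trade-off, the sketch below simply evaluates the load expressions quoted above for sample parameters; the function name and the chosen values of N, L, T, D, and K are ours, not taken from the paper.

# Illustrative sketch only: plugs sample parameters into the abstract's
# communication-load expressions (1+T/K)L, (1+(T+D)/K)L, and (N/2)(K+T+D+1).
def swiftaggplus_loads(N, L, T, D, K):
    """Return (server load, worst-case per-user load, number of active pairwise links)."""
    server_load = (1 + T / K) * L            # (1 + T/K) L symbols
    per_user_load = (1 + (T + D) / K) * L    # up to (1 + (T+D)/K) L symbols
    active_links = (N / 2) * (K + T + D + 1) # (N/2)(K + T + D + 1) connections
    return server_load, per_user_load, active_links

# Hypothetical example: N = 100 users, model size L = 10^6 symbols,
# T = 10 colluding semi-honest users, D = 5 dropouts, K = 20.
print(swiftaggplus_loads(N=100, L=10**6, T=10, D=5, K=20))
# -> (1500000.0, 1750000.0, 1800.0)

Increasing K drives both loads toward L at the cost of more active pairwise links, which is the trade-off the abstract describes.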
