Compressed-VFL: Communication-Efficient Learning with Vertically Partitioned Data

06/16/2022
by   Timothy Castiglia, et al.

We propose Compressed Vertical Federated Learning (C-VFL) for communication-efficient training on vertically partitioned data. In C-VFL, a server and multiple parties collaboratively train a model on their respective features, running several local iterations and periodically sharing compressed intermediate results. Our work provides the first theoretical analysis of the effect of message compression on distributed training over vertically partitioned data. We prove convergence of non-convex objectives at a rate of O(1/√(T)) when the compression error is bounded over the course of training. We provide specific requirements for convergence with common compression techniques, such as quantization and top-k sparsification. Finally, we experimentally show that compression can reduce communication by over 90% without a significant decrease in accuracy compared to VFL without compression.
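
As a rough illustration of the compression techniques named in the abstract, the sketch below (ours, not the authors' code; all function names and parameters are illustrative assumptions) applies top-k sparsification and stochastic uniform quantization to the kind of intermediate embeddings a party would send to the server in a C-VFL-style exchange.

```python
# Illustrative sketch only: two common compression operators -- top-k
# sparsification and stochastic uniform quantization -- applied to a
# party's intermediate embeddings before communication.
import numpy as np


def top_k_sparsify(x: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of x; zero the rest."""
    flat = x.ravel()
    if k >= flat.size:
        return x.copy()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(x.shape)


def stochastic_quantize(x: np.ndarray, num_bits: int = 8) -> np.ndarray:
    """Unbiased uniform quantization of x to 2**num_bits levels."""
    levels = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x.copy()
    scaled = (x - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    # Round up with probability equal to the fractional part (unbiased).
    quantized = floor + (np.random.rand(*x.shape) < (scaled - floor))
    return quantized / levels * (hi - lo) + lo


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    embeddings = rng.standard_normal((32, 128))  # batch x embedding dim
    sparse = top_k_sparsify(embeddings, k=embeddings.size // 10)
    quantized = stochastic_quantize(embeddings, num_bits=4)
    print("top-k compression error:", np.linalg.norm(embeddings - sparse))
    print("quantization error:", np.linalg.norm(embeddings - quantized))
```

Both operators introduce a bounded compression error of the sort the paper's convergence analysis requires; in practice the choice of k or the bit width trades communication savings against that error.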


Related research

02/06/2021 · Multi-Tier Federated Learning for Vertically Partitioned Data
We consider decentralized model training in tiered communication network...

11/19/2019 · On the Discrepancy between the Theoretical Analysis and Practical Implementations of Compressed Communication for Distributed Deep Learning
Compressed communication, in the form of sparsification or quantization ...

08/19/2021 · Cross-Silo Federated Learning for Multi-Tier Networks with Vertical and Horizontal Data Partitioning
We consider federated learning in tiered communication networks. Our net...

07/20/2021 · CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression
Due to the high communication cost in distributed and federated learning...

11/25/2022 · Analysis of Error Feedback in Federated Non-Convex Optimization with Biased Compression
In federated learning (FL) systems, e.g., wireless networks, the communi...

08/26/2022 · Flexible Vertical Federated Learning with Heterogeneous Parties
We propose Flexible Vertical Federated Learning (Flex-VFL), a distribute...

05/27/2019 · Natural Compression for Distributed Deep Learning
Due to their hunger for big data, modern deep learning models are traine...
