On Second-order Optimization Methods for Federated Learning

09/06/2021
by Sebastian Bischoff, et al.

We consider federated learning (FL), where the training data is distributed across a large number of clients. The standard optimization method in this setting is Federated Averaging (FedAvg), which performs multiple local first-order optimization steps between communication rounds. In this work, we evaluate the performance of several second-order distributed methods with local steps in the FL setting, which promise favorable convergence properties. We (i) show that FedAvg performs surprisingly well against its second-order competitors when evaluated under fair metrics (an equal amount of local computation), in contrast to the results of previous work. Based on our numerical study, we (ii) propose a novel variant that uses second-order information for the local updates and a global line search to counteract the resulting local specificity.
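To make the two schemes concrete, below is a minimal sketch on a toy least-squares problem: a FedAvg round with local first-order steps, and a round of the kind of second-order variant described in the abstract, where clients take local Newton steps and the server damps the averaged update with a backtracking line search on the global loss. All names and hyperparameters here (num_clients, local_steps, the Armijo constants, and so on) are illustrative assumptions, not the paper's actual algorithm or code.

import numpy as np

rng = np.random.default_rng(0)
d, n_per_client, num_clients = 5, 50, 4

# Synthetic client data: each client i holds (A_i, b_i) for the local
# objective 0.5 * mean((A_i w - b_i)^2).
w_true = rng.normal(size=d)
clients = []
for _ in range(num_clients):
    A = rng.normal(size=(n_per_client, d))
    b = A @ w_true + 0.1 * rng.normal(size=n_per_client)
    clients.append((A, b))

def local_grad(A, b, w):
    # Gradient of 0.5 * mean((A w - b)^2) with respect to w.
    return A.T @ (A @ w - b) / len(b)

def local_hess(A, b):
    return A.T @ A / len(b)

def global_loss(w):
    return np.mean([0.5 * np.mean((A @ w - b) ** 2) for A, b in clients])

def global_grad(w):
    return np.mean([local_grad(A, b, w) for A, b in clients], axis=0)

def fedavg_round(w, local_steps=10, lr=0.05):
    # FedAvg: each client runs several local first-order steps between
    # communication rounds; the server averages the resulting models.
    local_models = []
    for A, b in clients:
        w_i = w.copy()
        for _ in range(local_steps):
            w_i = w_i - lr * local_grad(A, b, w_i)
        local_models.append(w_i)
    return np.mean(local_models, axis=0)

def newton_linesearch_round(w, local_steps=2):
    # Sketch of the proposed variant: clients take regularized local Newton
    # steps (second-order local information); the server averages the models
    # and damps the resulting direction with an Armijo backtracking line
    # search on the *global* loss to counteract local specificity.
    local_models = []
    for A, b in clients:
        w_i = w.copy()
        H = local_hess(A, b) + 1e-6 * np.eye(d)  # regularized local Hessian
        for _ in range(local_steps):
            w_i = w_i - np.linalg.solve(H, local_grad(A, b, w_i))
        local_models.append(w_i)
    direction = np.mean(local_models, axis=0) - w
    f0, slope = global_loss(w), global_grad(w) @ direction
    t = 1.0
    while global_loss(w + t * direction) > f0 + 1e-4 * t * slope and t > 1e-8:
        t *= 0.5  # halve the step until sufficient decrease holds
    return w + t * direction

w = np.zeros(d)
for _ in range(20):
    w = fedavg_round(w)
print("FedAvg loss:         ", global_loss(w))

w = np.zeros(d)
for _ in range(20):
    w = newton_linesearch_round(w)
print("Newton + search loss:", global_loss(w))

On this toy problem both loops drive the global loss down; the line-search step size t is what keeps the aggressive local Newton steps from overshooting when the clients' data differ.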


