Optimization using Parallel Gradient Evaluations on Multiple Parameters

02/06/2023
by Yash Chandak, et al.

We propose a first-order method for convex optimization in which, instead of being restricted to the gradient at a single parameter, gradients from multiple parameters can be used during each step of gradient descent. This setup is particularly useful when a small number of processors are available and can be used in parallel for optimization. Our method uses gradients from multiple parameters in synergy to update these parameters together toward the optimum, while ensuring that the computational and memory complexity remain of the same order as that of gradient descent. Empirical results demonstrate that, even using gradients from as few as two parameters, our method can often obtain significant acceleration and provide robustness to hyper-parameter settings. We remark that the primary goal of this work is not theoretical; rather, it is aimed at exploring the understudied case of using multiple gradients during each step of optimization.
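The abstract describes gradient steps that draw on gradients evaluated at several parameter vectors at once. The sketch below is only an illustration of that general idea, not the authors' algorithm: it assumes a simple combination rule in which each parameter copy takes its own gradient step and is additionally pulled toward the lowest-loss copy. The quadratic objective and the names `multi_param_gd`, `lr`, and `pull` are all hypothetical.

```python
import numpy as np

def loss(x):
    # Simple convex objective for demonstration: a quadratic bowl.
    return 0.5 * np.sum(x ** 2)

def grad(x):
    # Gradient of the quadratic bowl.
    return x

def multi_param_gd(x0, k=2, lr=0.1, pull=0.05, steps=100, seed=0):
    """Hedged sketch: keep k parameter copies, evaluate their gradients
    (potentially in parallel), and nudge each copy toward the current best."""
    rng = np.random.default_rng(seed)
    # k parameter copies, perturbed around the starting point.
    params = [x0 + 0.1 * rng.standard_normal(x0.shape) for _ in range(k)]
    for _ in range(steps):
        grads = [grad(p) for p in params]      # these k evaluations could run in parallel
        losses = [loss(p) for p in params]
        best = params[int(np.argmin(losses))]
        # Each copy takes its own gradient step and is pulled toward the best copy.
        params = [p - lr * g + pull * (best - p) for p, g in zip(params, grads)]
    return min(params, key=loss)

x_star = multi_param_gd(np.array([3.0, -2.0]), k=2)
print(x_star)  # should be close to the origin for this quadratic
```

Memory and compute scale with the number of copies k, so for a small constant k (e.g. two) they stay within a constant factor of ordinary gradient descent, consistent with the complexity claim in the abstract.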


