Stochastic Gradient Variance Reduction by Solving a Filtering Problem

12/22/2020
by Xingyi Yang, et al.

Deep neural networks (DNNs) are typically optimized using stochastic gradient descent (SGD). However, gradient estimates computed from stochastic samples tend to be noisy and unreliable, resulting in large gradient variance and poor convergence. In this paper, we propose Filter Gradient Descent (FGD), an efficient stochastic optimization algorithm that produces consistent estimates of the local gradient by solving an adaptive filtering problem with different filter designs. Our method reduces the variance of stochastic gradient descent by incorporating historical states to enhance the current estimate; it can both correct noisy gradient directions and accelerate the convergence of learning. We demonstrate the effectiveness of the proposed Filter Gradient Descent on numerical optimization and on training neural networks, where it achieves superior and robust performance compared with traditional momentum-based methods. To the best of our knowledge, we are the first to provide a practical solution that integrates filtering into gradient estimation by drawing an analogy between gradient estimation and filtering problems in signal processing. (The code is available at https://github.com/Adamdad/Filter-Gradient-Decent)
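The abstract frames stochastic gradient estimation as filtering a noisy gradient signal. As a rough, illustrative sketch only (not the authors' implementation, which explores several filter designs), the snippet below applies a simple moving-average (FIR low-pass) filter to recent stochastic gradients before each descent step; the function names and parameters here are hypothetical.

```python
import numpy as np

def filtered_sgd(grad_fn, x0, lr=0.1, window=5, steps=100):
    """Toy gradient descent where the raw stochastic gradient is replaced
    by a low-pass (moving-average) filtered estimate over recent gradients.
    Illustrative only; the paper considers other filter designs as well."""
    x = np.asarray(x0, dtype=float)
    history = []  # buffer of recent stochastic gradients
    for _ in range(steps):
        g = grad_fn(x)                         # noisy stochastic gradient
        history.append(g)
        if len(history) > window:
            history.pop(0)                     # keep a fixed-length window
        g_filtered = np.mean(history, axis=0)  # FIR low-pass filter output
        x = x - lr * g_filtered                # update with the filtered estimate
    return x

# Usage: minimize f(x) = ||x||^2 given gradients corrupted by additive noise
rng = np.random.default_rng(0)
noisy_grad = lambda x: 2 * x + rng.normal(scale=0.5, size=x.shape)
print(filtered_sgd(noisy_grad, x0=np.ones(3)))
```

Averaging over a window is only one choice of low-pass filter; the same loop structure accommodates other filter designs by swapping the line that computes g_filtered.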

Related research

10/29/2018  Kalman Gradient Descent: Adaptive Variance Reduction in Stochastic Optimization
We introduce Kalman Gradient Descent, a stochastic optimization algorith...

03/26/2018  On the Performance of Preconditioned Stochastic Gradient Descent
This paper studies the performance of preconditioned stochastic gradient...

07/07/2021  KaFiStO: A Kalman Filtering Framework for Stochastic Optimization
Optimization is often cast as a deterministic problem, where the solutio...

10/30/2019  Lsh-sampling Breaks the Computation Chicken-and-egg Loop in Adaptive Stochastic Gradient Estimation
Stochastic Gradient Descent or SGD is the most popular optimization algo...

07/19/2018  A unified theory of adaptive stochastic gradient descent as Bayesian filtering
There are a diverse array of schemes for adaptive stochastic gradient de...

07/23/2018  Particle Filtering Methods for Stochastic Optimization with Application to Large-Scale Empirical Risk Minimization
There is a recent interest in developing statistical filtering methods f...

09/21/2023  Soft Merging: A Flexible and Robust Soft Model Merging Approach for Enhanced Neural Network Performance
Stochastic Gradient Descent (SGD), a widely used optimization algorithm ...
