Adaptive First- and Second-Order Algorithms for Large-Scale Machine Learning

11/29/2021
by Sanae Lotfi, et al.

In this paper, we consider both first- and second-order techniques to address continuous optimization problems arising in machine learning. In the first-order case, we propose a framework for transitioning from deterministic or semi-deterministic to stochastic quadratic regularization methods. We leverage the two-phase nature of stochastic optimization to propose a novel first-order algorithm with adaptive sampling and adaptive step size. In the second-order case, we propose a novel stochastic damped L-BFGS method that improves on previous algorithms in the highly nonconvex context of deep learning. Both algorithms are evaluated on well-known deep learning datasets and exhibit promising performance.
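The abstract does not spell out the damped L-BFGS update. As a rough illustration of the kind of machinery involved, the sketch below shows Powell-damped curvature pairs combined with the standard L-BFGS two-loop recursion; the function names, the constant 0.2 threshold, and the simplification B0 = gamma·I are assumptions of this sketch, not the paper's algorithm.

```python
import numpy as np

def powell_damped_pair(s, y, gamma=1.0):
    """Powell damping: replace y by a convex combination theta*y + (1-theta)*B0@s
    so that s . y_bar > 0, keeping the curvature pair usable for (L-)BFGS even
    under the negative curvature common in nonconvex problems.
    B0 = gamma * I stands in for the Hessian approximation (a simplification)."""
    Bs = gamma * s                   # B0 @ s with B0 = gamma * I
    sBs = s @ Bs
    sy = s @ y
    if sy < 0.2 * sBs:               # curvature condition violated: damp
        theta = 0.8 * sBs / (sBs - sy)
    else:
        theta = 1.0
    return theta * y + (1.0 - theta) * Bs

def lbfgs_direction(grad, pairs, gamma=1.0):
    """Standard L-BFGS two-loop recursion over stored (s, y_bar) pairs.
    Returns a descent direction; H0 = gamma * I initializes the recursion."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):     # newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    q *= gamma                       # apply initial inverse-Hessian scaling
    for (s, y), a in zip(pairs, reversed(alphas)):
        rho = 1.0 / (s @ y)
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                        # quasi-Newton descent direction
```

In a stochastic setting, `grad` would be a mini-batch gradient and the damping keeps the inverse-Hessian approximation positive definite despite noisy curvature estimates; the adaptive sampling and step-size rules of the paper's first-order method are not reproduced here.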


