A Randomised Subspace Gauss-Newton Method for Nonlinear Least-Squares

by Coralia Cartis, et al.

We propose a Randomised Subspace Gauss-Newton (R-SGN) algorithm for solving nonlinear least-squares optimization problems. The method uses a sketched Jacobian of the residual in the variable domain and solves a reduced linear least-squares problem at each iteration. A sublinear global rate of convergence is established, with high probability, for a trust-region variant of R-SGN, matching deterministic counterpart results in the order of the accuracy tolerance. Promising preliminary numerical results are presented for R-SGN on logistic regression and on nonlinear regression problems from the CUTEst collection.
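To make the iteration concrete, here is a minimal NumPy sketch of one subspace Gauss-Newton step in the spirit of the abstract: draw a random sketching matrix S, solve the reduced linear least-squares problem in the sketched Jacobian J(x)S, and step back in the full variable space. The residual function `r`, its Jacobian `jac`, the Gaussian sketch, and the crude backtracking loop (standing in for the paper's trust region) are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def r(x):
    # Residuals of a small illustrative nonlinear least-squares problem.
    return np.array([x[0]**2 + x[1] - 1.0,
                     x[0] + x[1]**2 - 1.0,
                     x[2] - 0.5])

def jac(x):
    # Jacobian of r at x.
    return np.array([[2.0 * x[0], 1.0, 0.0],
                     [1.0, 2.0 * x[1], 0.0],
                     [0.0, 0.0, 1.0]])

def rsgn_step(x, s_dim, rng):
    """One randomised subspace Gauss-Newton step (illustrative sketch)."""
    d = x.size
    # Scaled Gaussian sketching matrix S of size d x s_dim.
    S = rng.standard_normal((d, s_dim)) / np.sqrt(s_dim)
    # Reduced linear least-squares: min_z || jac(x) S z + r(x) ||_2.
    z, *_ = np.linalg.lstsq(jac(x) @ S, -r(x), rcond=None)
    p = S @ z  # lift the subspace step back to the full space
    # Crude backtracking in place of the trust region: shrink the step
    # until the residual norm decreases (or the step becomes negligible).
    t = 1.0
    while t > 1e-8 and np.linalg.norm(r(x + t * p)) >= np.linalg.norm(r(x)):
        t *= 0.5
    return x + t * p

rng = np.random.default_rng(0)
x = np.array([2.0, 2.0, 2.0])
for _ in range(200):
    x = rsgn_step(x, s_dim=2, rng=rng)
print(x, np.linalg.norm(r(x)))
```

Each iteration only factorizes an m-by-s sketched Jacobian rather than the full m-by-d one, which is the source of the method's per-iteration savings when s is much smaller than d.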




A doubly relaxed minimal-norm Gauss-Newton method for nonlinear least-squares

When a physical system is modeled by a nonlinear function, the unknown p...

Symbolic Regression with Fast Function Extraction and Nonlinear Least Squares Optimization

Fast Function Extraction (FFX) is a deterministic algorithm for solving ...

Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients

We show that Newton's method converges globally at a linear rate for obj...

The q-Gauss-Newton method for unconstrained nonlinear optimization

A q-Gauss-Newton algorithm is an iterative procedure that solves nonline...

Continuation Newton method with the trust-region time-stepping scheme

For the problem of nonlinear equations, the homotopy methods (continuati...

Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems

Derivative-free - or zeroth-order - optimization (DFO) has gained recent...

A note on solving nonlinear optimization problems in variable precision

This short note considers an efficient variant of the trust-region algor...
