Accelerating Derivative-Free Optimization with Dimension Reduction and Hyperparameter Learning

01/18/2021
by jordanrhall, et al.

We consider convex, black-box objective functions subject to additive or multiplicative noise, with a high-dimensional parameter space and a lower-dimensional data space, where gradients of the map exist but may be inaccessible. We investigate Derivative-Free Optimization (DFO) in this setting and propose a novel method, Active STARS (ASTARS), blending the DFO algorithm STARS (Chen and Wild, 2015) with dimension reduction in parameter space via Active Subspace (AS) methods (Constantine, 2015). STARS hyperparameters are inversely proportional to the known dimension of the parameter space, resulting in heavy smoothing and small step sizes when that dimension is large. When possible, ASTARS leverages a lower-dimensional AS, which defines the set of directions in parameter space responsible for the majority of the variance in function values. ASTARS updates iterates with steps taken only in the AS, reducing the value of the objective function more efficiently than STARS, which updates iterates in the full parameter space. Computational costs may be reduced further by estimating the ASTARS hyperparameters and the AS itself, lowering the total number of objective evaluations and eliminating the requirement that the user specify hyperparameters and AS's, which may be unknown. We call this method Fully Automated ASTARS (FAASTARS). We show that STARS and ASTARS both converge, with a certain complexity, even with inexact, estimated hyperparameters; we also find that FAASTARS converges with the use of estimated AS's and hyperparameters. We explore the effectiveness of ASTARS and FAASTARS in numerical examples comparing both methods to STARS.
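To make the core idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of an ASTARS-style iteration on a toy problem: a 20-dimensional convex quadratic whose variance is concentrated in its first two coordinates. The active subspace is estimated Constantine-style as the span of the top eigenvectors of the averaged outer product of sampled gradients (approximated here by finite differences for simplicity; FAASTARS instead reuses function evaluations), and each optimization step draws its random smoothing direction inside that subspace only. All names, step sizes, and sample counts below are assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy convex objective: variance concentrated in the first d of D dimensions.
D, d = 20, 2
weights = np.concatenate([np.ones(d), 1e-6 * np.ones(D - d)])

def f(x):
    return float(np.sum(weights * x**2))

def estimate_active_subspace(f, D, n_samples=50, h=1e-4, k=2):
    """Top-k eigenvectors of the average outer product of sampled gradients,
    with gradients approximated by central finite differences."""
    C = np.zeros((D, D))
    for _ in range(n_samples):
        x = rng.standard_normal(D)
        g = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(D)])
        C += np.outer(g, g)
    C /= n_samples
    _, evecs = np.linalg.eigh(C)          # eigenvalues in ascending order
    return evecs[:, -k:]                  # columns spanning the estimated AS

def astars_step(f, x, W, mu=1e-3, step=0.1):
    """STARS-like Gaussian-smoothing step, but the random direction is
    drawn inside the active subspace (the ASTARS idea)."""
    u = W @ rng.standard_normal(W.shape[1])       # random direction in the AS
    g_est = (f(x + mu * u) - f(x)) / mu           # directional derivative estimate
    return x - step * g_est * u

W = estimate_active_subspace(f, D)
x = rng.standard_normal(D)
f_init = f(x)
for _ in range(200):
    x = astars_step(f, x, W)
print(f_init, f(x))
```

Because steps are confined to the two estimated active directions, the iterate descends on the dominant part of the objective without the heavy smoothing a full 20-dimensional STARS step would require; the nearly inactive coordinates are simply left alone.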
