Global sensitivity analysis for optimization with variable selection

11/12/2018
by Adrien Spagnol, et al.

The optimization of high-dimensional functions is a key issue in engineering problems, but it frequently comes at an unacceptable cost since it usually involves a complex and expensive computer code. Engineers often overcome this limitation by first identifying which parameters most drive the function variations: non-influential variables are set to a fixed value and the optimization procedure is carried out with the remaining influential variables. Such variable selection is performed through influence measures that are meaningful for regression problems. However, it does not account for the specific structure of optimization problems, where we would like to identify which variables most lead to constraint satisfaction and low values of the objective function. In this paper, we propose a new sensitivity analysis that accounts for the specific aspects of optimization problems. In particular, we introduce an influence measure based on the Hilbert-Schmidt Independence Criterion to characterize whether a design variable matters for reaching low values of the objective function and satisfying the constraints. This sensitivity measure makes it possible to sort the inputs and reduce the problem dimension. We compare a random and a greedy strategy for setting the values of the non-influential variables before conducting a local optimization. Applications to several test cases show that this variable selection and the greedy strategy significantly reduce the number of function evaluations at a limited cost in terms of solution performance.
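To make the idea concrete, here is a minimal sketch of ranking inputs with an HSIC-style dependence measure. It is not the authors' exact estimator: it uses the standard biased empirical HSIC with Gaussian kernels, and it targets the indicator of "objective below a quantile" as a stand-in for the low-objective event described in the abstract; the kernel bandwidths, the 20% quantile threshold, and the toy objective are all assumptions for illustration.

```python
import numpy as np

def rbf_gram(x, sigma):
    # Gaussian (RBF) kernel Gram matrix for 1-D samples x
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC estimator: trace(K H L H) / n^2,
    # where H centers the Gram matrices. Larger value = stronger dependence.
    n = len(x)
    K = rbf_gram(x, sigma_x)
    L = rbf_gram(y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

# Toy example (assumed setup): x1 drives the objective, x2 does not.
rng = np.random.default_rng(0)
n = 500
x1 = rng.uniform(-1, 1, n)                      # influential input
x2 = rng.uniform(-1, 1, n)                      # non-influential input
f = x1 ** 2 + 0.01 * rng.normal(size=n)         # objective driven by x1
z = (f <= np.quantile(f, 0.2)).astype(float)    # indicator of low objective values

s1 = hsic(x1, z)  # dependence of x1 with the low-objective event
s2 = hsic(x2, z)  # dependence of x2 with the low-objective event
```

Sorting inputs by such scores and fixing the low-scoring ones is the dimension-reduction step the abstract describes; the constraint-satisfaction case would use an indicator of feasibility in place of `z`.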


research
06/18/2018

Variable Importance Assessments and Backward Variable Selection for High-Dimensional Data

Variable selection in high-dimensional scenarios is of great interest ...
research
04/27/2018

Sequential Optimization in Locally Important Dimensions

Optimizing a black-box function is challenging when the underlying funct...
research
06/27/2012

Joint Optimization and Variable Selection of High-dimensional Gaussian Processes

Maximizing high-dimensional, non-convex functions through noisy observat...
research
02/12/2016

Scale-free network optimization: foundations and algorithms

We investigate the fundamental principles that drive the development of ...
research
03/04/2020

The discrete optimization problems with interval objective function on graphs and hypergraphs and the interval greedy algorithm

We consider the discrete optimization problems with interval objective f...
research
09/30/2022

Experts in the Loop: Conditional Variable Selection for Accelerating Post-Silicon Analysis Based on Deep Learning

Post-silicon validation is one of the most critical processes in modern ...
research
09/13/2023

Effect of hyperparameters on variable selection in random forests

Random forests (RFs) are well suited for prediction modeling and variabl...
