BREATHE: Second-Order Gradients and Heteroscedastic Emulation based Design Space Exploration

08/16/2023
by Shikhar Tuli, et al.

Researchers constantly strive to explore larger and more complex search spaces in various scientific studies and physical experiments. However, such investigations often involve sophisticated simulators or time-consuming experiments that make exploring and observing new design samples challenging. Previous works that target such applications are typically sample-inefficient and restricted to vector search spaces. To address these limitations, this work proposes a constrained multi-objective optimization (MOO) framework, called BREATHE, that searches not only traditional vector-based design spaces but also graph-based design spaces to obtain best-performing graphs. It leverages second-order gradients and actively trains a heteroscedastic surrogate model for sample-efficient optimization. In a single-objective vector optimization application, it leads to 64.1% higher performance than the next-best baseline, random forest regression. In graph-based search, BREATHE outperforms the next-best baseline, i.e., a graphical version of Gaussian-process-based Bayesian optimization, with up to 64.9% higher performance. In a MOO task, it achieves up to 21.9× higher hypervolume than the state-of-the-art method, multi-objective Bayesian optimization (MOBOpt). BREATHE also outperforms the baseline methods on most standard MOO benchmark applications.
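The abstract names two key ingredients: a heteroscedastic surrogate (one that predicts an input-dependent noise level alongside its mean performance estimate) and second-order gradients on that surrogate to propose new design samples. The paper defines the full algorithm; what follows is only a minimal illustrative sketch of those two ingredients under simplifying assumptions. All names here (Surrogate, nll_loss, newton_step) and the choice of a damped Newton update on the surrogate mean are hypothetical stand-ins, not the authors' implementation.

    # Sketch (assumed, not from the paper): a heteroscedastic MLP surrogate
    # plus a second-order (damped Newton) update on a candidate design.
    import torch
    import torch.nn as nn

    class Surrogate(nn.Module):
        """Heteroscedastic MLP: predicts a mean and a log-variance per input."""
        def __init__(self, dim: int, hidden: int = 64):
            super().__init__()
            self.body = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden), nn.ReLU())
            self.mean_head = nn.Linear(hidden, 1)
            self.logvar_head = nn.Linear(hidden, 1)

        def forward(self, x):
            h = self.body(x)
            return self.mean_head(h), self.logvar_head(h)

    def nll_loss(mean, logvar, y):
        # Gaussian negative log-likelihood; the learned variance term lets the
        # model capture input-dependent noise instead of one global noise level.
        return (0.5 * (logvar + (y - mean) ** 2 / logvar.exp())).mean()

    def newton_step(model, x, lr: float = 1.0, damping: float = 1e-3):
        """One damped Newton update of a candidate design x, maximizing the
        surrogate mean (a stand-in for the paper's acquisition objective)."""
        x = x.detach().requires_grad_(True)
        mean, _ = model(x.unsqueeze(0))
        obj = -mean.squeeze()                           # minimize negative mean
        grad = torch.autograd.grad(obj, x, create_graph=True)[0]
        # Build the Hessian row by row via second-order autograd.
        hess = torch.stack([torch.autograd.grad(grad[i], x, retain_graph=True)[0]
                            for i in range(x.numel())])
        hess = hess + damping * torch.eye(x.numel())    # damp for invertibility
        step = torch.linalg.solve(hess, grad)
        return (x - lr * step).detach()

In an active-learning loop, one would alternate between fitting the surrogate on the observed (design, performance) pairs with nll_loss and running a few newton_step updates from several starting designs to choose the next simulator queries; the second-order step is what distinguishes this from plain gradient ascent on the surrogate.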


Related research

Multi-Objective Bayesian Optimization over High-Dimensional Search Spaces (09/22/2021)
Large-Batch, Neural Multi-Objective Bayesian Optimization (06/01/2023)
Differentiable Expected Hypervolume Improvement for Parallel Multi-Objective Bayesian Optimization (06/09/2020)
Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization (06/22/2021)
Accelerating Bayesian Optimization for Biological Sequence Design with Denoising Autoencoders (03/23/2022)
ytopt: Autotuning Scientific Applications for Energy Efficiency at Large Scales (03/28/2023)
