Kernel Distributionally Robust Optimization

06/12/2020
by Jia-Jie Zhu, et al.

This paper is an in-depth investigation of how kernel methods can be used to immunize optimization solutions against distributional ambiguity. We propose kernel distributionally robust optimization (K-DRO), drawing on insights from robust optimization theory and functional analysis. Our method uses reproducing kernel Hilbert spaces (RKHS) to construct ambiguity sets, and the resulting problem can be reformulated as a tractable program via the conic duality of moment problems and an extension of the RKHS representer theorem. Our analysis reveals that universal RKHSs are large enough for K-DRO to be effective. The paper provides both theoretical analyses that extend the robustness properties of kernel methods and practical algorithms that apply to general optimization problems, not only kernelized models.
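To make the ambiguity-set idea concrete, here is a minimal toy sketch in Python. It is not the paper's tractable reformulation (which relies on conic duality and a representer theorem); instead it brute-forces the worst-case expected loss over a small, hand-picked family of candidate distributions whose kernel mean embeddings lie within an MMD ball around the empirical distribution. All names (`gaussian_kernel`, `mmd`, the candidate family, the loss) are illustrative assumptions, not the paper's API.

```python
# Toy illustration of an RKHS/MMD ambiguity set for distributionally robust decisions.
# Assumption: a finite candidate family stands in for the (infinite) ambiguity set.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between 1-D sample arrays x and y."""
    d = x[:, None] - y[None, :]
    return np.exp(-(d ** 2) / (2 * bandwidth ** 2))

def mmd(x, y, bandwidth=1.0):
    """Empirical maximum mean discrepancy, i.e. the RKHS distance between kernel mean embeddings."""
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return np.sqrt(max(kxx + kyy - 2 * kxy, 0.0))

def loss(theta, xi):
    """A simple decision loss: squared error of the decision theta against the uncertain xi."""
    return (theta - xi) ** 2

# Empirical samples of the uncertain parameter xi (the nominal distribution).
xi_hat = rng.normal(loc=0.0, scale=1.0, size=200)

# Hand-picked candidate distributions: mean-shifted perturbations of the nominal one.
candidates = [rng.normal(loc=mu, scale=1.0, size=200) for mu in np.linspace(-0.5, 0.5, 11)]

epsilon = 0.2            # radius of the MMD ambiguity ball around the empirical distribution
theta_grid = np.linspace(-1.0, 1.0, 41)

def worst_case_loss(theta):
    """Worst expected loss over candidates inside the MMD ball (toy surrogate for the inner sup)."""
    admissible = [c for c in candidates if mmd(xi_hat, c) <= epsilon]
    return max(loss(theta, c).mean() for c in admissible)

# Outer minimization over decisions: pick the theta with the smallest worst-case loss.
theta_dro = min(theta_grid, key=worst_case_loss)
print("distributionally robust decision:", theta_dro)
```

In the paper's actual method the inner supremum over the MMD ball is not enumerated this way; it is dualized into a finite-dimensional program, which is what makes K-DRO tractable for general losses.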


Related research

03/31/2020 · Worst-Case Risk Quantification under Distributional Ambiguity using Kernel Mean Embedding in Moment Problem
In order to anticipate rare and impactful events, we propose to quantify...

01/27/2021 · Reproducing kernel Hilbert C*-module and kernel mean embeddings
Kernel methods have been among the most popular techniques in machine le...

04/27/2023 · Propagating Kernel Ambiguity Sets in Nonlinear Data-driven Dynamics Models
This paper provides answers to an open problem: given a nonlinear data-d...

08/31/2022 · A general framework for the analysis of kernel-based tests
Kernel-based tests provide a simple yet effective framework that use the...

05/19/2019 · An Online Stochastic Kernel Machine for Robust Signal Classification
We present a novel variation of online kernel machines in which we explo...

02/15/2023 · Distributionally-Robust Optimization with Noisy Data for Discrete Uncertainties Using Total Variation Distance
Stochastic programs where the uncertainty distribution must be inferred ...

07/21/2014 · Scalable Kernel Methods via Doubly Stochastic Gradients
The general perception is that kernel methods are not scalable, and neur...
