Convergence rate of Tsallis entropic regularized optimal transport

04/13/2023
by Takeshi Suguro et al.

In this paper, we consider Tsallis entropic regularized optimal transport and discuss its convergence rate as the regularization parameter ε goes to 0. In particular, we establish the convergence rate of Tsallis entropic regularized optimal transport using the quantization and shadow arguments developed by Eckstein–Nutz. We compare this with the convergence rate of entropic regularized optimal transport with the Kullback–Leibler (KL) divergence and show that, among Tsallis relative entropies, the KL divergence yields the fastest convergence rate.
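For the classical KL-regularized case the paper compares against, the regularized problem can be solved with Sinkhorn iterations, and the regularized cost is observed to approach the unregularized optimal cost as ε decreases. The sketch below is a generic illustration of that KL baseline on two discrete measures, not the authors' Tsallis-regularized scheme; all names and parameters here are illustrative.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps, n_iter=2000):
    """Entropic (KL-)regularized OT plan via Sinkhorn iterations."""
    K = np.exp(-C / eps)           # Gibbs kernel exp(-C/eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)         # scale to match column marginal nu
        u = mu / (K @ v)           # scale to match row marginal mu
    return u[:, None] * K * v[None, :]

# Two uniform measures on a grid in [0, 1] with squared-distance cost.
n = 20
x = np.linspace(0.0, 1.0, n)
mu = np.full(n, 1.0 / n)
nu = np.full(n, 1.0 / n)
C = (x[:, None] - x[None, :]) ** 2

# As eps -> 0 the regularized cost <C, P_eps> decreases toward the
# unregularized optimal cost (here 0, since mu = nu).
for eps in [1.0, 0.1, 0.01]:
    P = sinkhorn(mu, nu, C, eps)
    print(eps, (C * P).sum())
```

The question studied in the paper is how fast this gap closes in ε when the KL term is replaced by a Tsallis relative entropy.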


Related research

- Convergence Rates for Regularized Optimal Transport via Quantization (08/30/2022). We study the convergence of divergence-regularized optimal transport as …
- Error estimate for regularized optimal transport problems via Bregman divergence (09/20/2023). Regularization by the Shannon entropy enables us to efficiently and appr…
- Convergence of Batch Greenkhorn for Regularized Multimarginal Optimal Transport (12/01/2021). In this work we propose a batch version of the Greenkhorn algorithm for …
- Learning Probability Measures with respect to Optimal Transport Metrics (09/05/2012). We study the problem of estimating, in the sense of optimal transport me…
- Learning with Batch-wise Optimal Transport Loss for 3D Shape Recognition (03/21/2019). Deep metric learning is essential for visual recognition. The widely use…
- A Homogeneous Unbalanced Regularized Optimal Transport model with applications to Optimal Transport with Boundary (01/06/2022). This work studies how the introduction of the entropic regularization te…
- A Sinkhorn-Newton method for entropic optimal transport (10/18/2017). We consider the entropic regularization of discretized optimal transport…
