The Survival Bandit Problem

06/07/2022
by Charles Riou, et al.

We study the survival bandit problem, a variant of the multi-armed bandit problem introduced as an open problem by Perotto et al. (2019), in which the cumulative reward is constrained: at each time step, the agent receives a (possibly negative) reward, and if the cumulative reward falls below a prespecified threshold, the procedure stops; this event is called ruin. This is the first paper to study a framework in which ruin may, but need not, occur. We first show that sublinear regret is unachievable under a naive definition of the regret. We then provide tight lower bounds on the probability of ruin, together with matching policies. Based on these lower bounds, we define the survival regret as the objective to minimize, and we exhibit a policy achieving sublinear survival regret (at least in the case of integral rewards) when the time horizon T is known.
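To make the setting concrete, here is a minimal sketch of the survival bandit protocol described above: play proceeds for at most T steps, each pull yields a possibly negative reward, and the run terminates early (ruin) the moment the cumulative reward drops below the threshold. The arm distributions, the `round_robin` policy, and all function names are illustrative assumptions, not the paper's algorithm.

```python
import random

def survival_bandit(arms, threshold, horizon, policy, seed=0):
    """Simulate one run of the survival bandit protocol.

    arms: list of callables, each taking an RNG and returning a
          (possibly negative) reward.
    threshold: ruin occurs when the cumulative reward falls below this.
    horizon: maximum number of time steps T.
    policy: callable (t, history) -> arm index.
    Returns (cumulative_reward, ruined, steps_played).
    """
    rng = random.Random(seed)
    cumulative = 0
    history = []
    for t in range(horizon):
        i = policy(t, history)
        reward = arms[i](rng)
        cumulative += reward
        history.append((i, reward))
        if cumulative < threshold:
            # Ruin: the procedure stops immediately.
            return cumulative, True, t + 1
    return cumulative, False, horizon

# Hypothetical arms with integral +1/-1 rewards, as in the
# integral-reward case mentioned in the abstract.
def make_arm(p):
    return lambda rng: 1 if rng.random() < p else -1

arms = [make_arm(0.6), make_arm(0.4)]
round_robin = lambda t, hist: t % len(arms)
total, ruined, steps = survival_bandit(
    arms, threshold=-5, horizon=1000, policy=round_robin)
```

Note that any sensible policy here must trade off exploration against the risk of ruin: pulling a negative-drift arm too often early on can end the run before the better arm is identified, which is why the paper's objective (survival regret) accounts for the probability of ruin.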


