Adaptive Hedging under Delayed Feedback

02/27/2019
by Alexander Korotin, et al.

This article investigates the application of hedging strategies to online expert weight allocation under delayed feedback. As the main result, we develop the General Hedging algorithm G, based on exponential reweighting of the experts' losses. We build an artificial probabilistic framework and use it to prove adversarial loss bounds for algorithm G in the delayed feedback setting. Algorithm G can be applied to both countable and continuous sets of experts. We also show how algorithm G extends the classical Hedge (Multiplicative Weights) and adaptive Fixed Share algorithms to the delayed feedback setting, and we use our main result to derive their regret bounds in this setting.
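To illustrate the idea of exponential reweighting under delayed feedback, the following is a minimal sketch, not the paper's General Hedging algorithm G: a classical Hedge (Multiplicative Weights) learner that folds each round's loss vector into the weights only after a fixed delay. The class name `DelayedHedge` and the parameters `learning_rate` and `delay` are hypothetical and introduced here only for illustration.

```python
# Minimal illustrative sketch (assumed setup, not the paper's algorithm G):
# Hedge / Multiplicative Weights where the loss of round t is revealed only
# `delay` rounds later.

import numpy as np


class DelayedHedge:
    """Exponential reweighting of experts' cumulative losses, using only
    the losses that have already been revealed (delayed feedback)."""

    def __init__(self, n_experts: int, learning_rate: float, delay: int):
        self.eta = learning_rate
        self.delay = delay
        self.cum_loss = np.zeros(n_experts)  # losses revealed so far
        self.pending = []                    # loss vectors not yet revealed

    def weights(self) -> np.ndarray:
        # Exponential weights over the observed cumulative losses.
        w = np.exp(-self.eta * self.cum_loss)
        return w / w.sum()

    def observe(self, loss_vector: np.ndarray) -> None:
        # Queue the current round's losses; fold them into the weights
        # only once `delay` further rounds have passed.
        self.pending.append(np.asarray(loss_vector, dtype=float))
        if len(self.pending) > self.delay:
            self.cum_loss += self.pending.pop(0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    algo = DelayedHedge(n_experts=5, learning_rate=0.1, delay=3)
    for t in range(100):
        p = algo.weights()             # distribution over experts
        losses = rng.uniform(size=5)   # adversarial losses for this round
        learner_loss = p @ losses      # learner's expected loss
        algo.observe(losses)           # feedback arrives with delay
```

The sketch keeps a queue of unrevealed loss vectors, so the weight update at each round uses only the feedback that has actually arrived, which is the core difficulty the delayed-feedback analysis addresses.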
