Asymptotically Optimal Stochastic Lossy Coding of Markov Sources

11/08/2022
by Ahmed Elshafiy, et al.

This paper proposes an effective 'on-the-fly' mechanism for stochastic lossy coding of Markov sources using string-matching techniques. Earlier work has shown that the rate-distortion bound can be asymptotically achieved by a 'natural type selection' (NTS) mechanism, which iteratively encodes asymptotically long source strings (from an unknown source distribution P) and regenerates the codebook according to a maximum-likelihood distribution framework, after observing a set of K codewords that 'd-match' (i.e., satisfy the distortion constraint for) a respective set of K source words. This result was later generalized to sources with memory under the assumption that each source word consists of a sequence of asymptotically long vectors (or super-symbols) over a source super-alphabet; that is, the source is treated as a vector source. However, the earlier result suffers from a significant practical flaw: it requires the super-symbol length (and correspondingly the super-alphabet size) to grow to infinity in order to achieve the rate-distortion bound, even for finite-memory sources such as Markov sources. The complexity of each NTS iteration therefore grows beyond any practical capability, compromising the promise of the NTS algorithm in practical scenarios for sources with memory. This work describes a considerably more efficient and tractable mechanism that achieves asymptotically optimal performance under a prescribed memory constraint, within a practical framework tailored to Markov sources. More specifically, the algorithm asymptotically finds the optimal codebook reproduction distribution, within the constrained set of distributions having the Markov property of a prescribed order, that achieves the minimum per-letter coding rate while maintaining a specified distortion level.
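To make the NTS iteration concrete, below is a minimal, self-contained Python sketch, not the paper's exact construction, assuming a binary alphabet, per-letter Hamming distortion, and a reproduction distribution constrained to Markov order one (the prescribed-order constraint with r = 1). All names and parameters here (n, K, D, the iteration count, and the source transition matrix trans_p) are illustrative. Each round draws codewords from the current reproduction distribution Q until one d-matches the source word, takes log2 of the first d-match index divided by the block length as an empirical per-letter rate, and re-estimates Q as the maximum-likelihood first-order Markov distribution over the K matching codewords.

import numpy as np

rng = np.random.default_rng(0)

def sample_markov(trans, n):
    # Draw a length-n binary string from a first-order Markov chain;
    # trans[a, b] = P(next symbol = b | current symbol = a).
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for i in range(1, n):
        x[i] = int(rng.random() < trans[x[i - 1], 1])
    return x

def d_match_index(x, trans_q, D, max_tries=200000):
    # Index (and value) of the first codeword drawn from Q that
    # 'd-matches' x, i.e. has per-letter Hamming distortion <= D.
    for j in range(1, max_tries + 1):
        y = sample_markov(trans_q, len(x))
        if np.mean(x != y) <= D:
            return j, y
    raise RuntimeError("no d-match found; raise max_tries or D")

def ml_markov(words):
    # Maximum-likelihood first-order transition matrix estimated from
    # the d-matching codewords; the +1 (Laplace) smoothing keeps Q
    # strictly positive so future d-matches remain possible.
    counts = np.ones((2, 2))
    for y in words:
        for a, b in zip(y[:-1], y[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

n, K, D = 16, 32, 0.2                # block length, matches per round, distortion
trans_p = np.array([[0.9, 0.1],      # source chain, unknown to the coder
                    [0.3, 0.7]])
trans_q = np.full((2, 2), 0.5)       # initial codebook: uniform i.i.d.

for it in range(4):                  # NTS iterations
    matches, rate = [], 0.0
    for _ in range(K):
        x = sample_markov(trans_p, n)
        j, y = d_match_index(x, trans_q, D)
        matches.append(y)
        rate += np.log2(j) / n       # empirical per-letter coding rate
    trans_q = ml_markov(matches)     # regenerate the codebook distribution
    print(f"iter {it}: rate ~ {rate / K:.3f} bits/letter")

Note that the block length here is tiny only so that random d-matching terminates quickly in a toy demo; the theoretical guarantees in the paper concern asymptotically long strings, and the Laplace smoothing is a pragmatic choice to keep every codeword sequence assignable nonzero probability between iterations.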

