Estimation from Quantized Gaussian Measurements: When and How to Use Dither

11/16/2018
by Joshua Rapp, et al.

Subtractive dither is a powerful method for removing the signal dependence of quantization noise for coarsely quantized signals. However, estimation from dithered measurements often naively applies the sample mean or midrange, even when the total noise is not well described by a Gaussian or uniform distribution. We show that the generalized Gaussian distribution approximately describes subtractively dithered, quantized samples of a Gaussian distribution. Furthermore, a generalized Gaussian fit leads to simple estimators based on order statistics that match the performance of more complicated maximum likelihood estimators requiring iterative solvers. The order-statistics-based estimators outperform both the sample mean and the midrange for nontrivial sums of Gaussian and uniform noise. Additional analysis of the generalized Gaussian approximation yields rules of thumb for determining when and how to apply dither to quantized measurements.
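A minimal sketch of the setup the abstract describes, assuming a uniform mid-tread quantizer with step size Delta and dither drawn uniformly over one quantization bin (the parameter values and helper names are illustrative, not taken from the paper). It produces subtractively dithered measurements of a Gaussian signal and evaluates the two baseline estimators of the mean mentioned above, the sample mean and the midrange:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): Gaussian mean mu and std sigma,
# quantizer step Delta comparable to sigma (coarse quantization), n samples.
mu, sigma, Delta, n = 1.3, 0.4, 1.0, 10_000
x = rng.normal(mu, sigma, n)                # underlying Gaussian samples
d = rng.uniform(-Delta / 2, Delta / 2, n)   # dither uniform over one quantizer bin

def quantize(v, step):
    """Uniform mid-tread quantizer with the given step size."""
    return step * np.round(v / step)

# Subtractive dither: add the dither before quantizing, subtract it afterwards.
# The total error y - x is then uniform on [-Delta/2, Delta/2] and independent of x,
# so y is the true value corrupted by a sum of Gaussian and uniform noise.
y = quantize(x + d, Delta) - d

# Baseline estimators of mu discussed in the abstract.
sample_mean = y.mean()
midrange = 0.5 * (y.min() + y.max())
print(f"sample mean: {sample_mean:.4f}, midrange: {midrange:.4f}, true mu: {mu}")
```

The paper's order-statistics estimators, derived from the generalized Gaussian fit, would operate on the sorted values of y; they are not reproduced in this sketch.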
