Information Extraction Under Privacy Constraints

11/07/2015
by Shahab Asoodeh, et al.

A privacy-constrained information extraction problem is considered where for a pair of correlated discrete random variables (X,Y) governed by a given joint distribution, an agent observes Y and wants to convey to a potentially public user as much information about Y as possible without compromising the amount of information revealed about X. To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from Y under a privacy constraint between X and the extracted information, where privacy is measured using either mutual information or maximal correlation. Properties of the rate-privacy function are analyzed and information-theoretic and estimation-theoretic interpretations of it are presented for both the mutual information and maximal correlation privacy measures. It is also shown that the rate-privacy function admits a closed-form expression for a large family of joint distributions of (X,Y). Finally, the rate-privacy function under the mutual information privacy measure is considered for the case where (X,Y) has a joint probability density function by studying the problem where the extracted information is a uniform quantization of Y corrupted by additive Gaussian noise. The asymptotic behavior of the rate-privacy function is studied as the quantization resolution grows without bound and it is observed that not all of the properties of the rate-privacy function carry over from the discrete to the continuous case.
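
For concreteness, the rate-privacy function described above amounts to the following constrained optimization. This is a sketch inferred from the abstract's wording; the symbol g_ε, the auxiliary variable Z standing for the extracted information, and the Markov-chain constraint X - Y - Z are standard in this line of work but should be checked against the full text.

```latex
% Rate-privacy function (mutual-information privacy measure):
% the maximal information extractable from Y via a channel P_{Z|Y},
% subject to revealing at most \epsilon bits about X.
g_{\epsilon}(X;Y) \;=\; \sup_{\substack{P_{Z\mid Y}\,:\; X \to Y \to Z,\;\; I(X;Z)\,\le\,\epsilon}} \; I(Y;Z)
```

The maximal-correlation variant mentioned in the abstract replaces the constraint I(X;Z) ≤ ε with a bound on the maximal correlation between X and Z.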
