Optimal Rates of Teaching and Learning Under Uncertainty

04/14/2021
by Yan Hao Ling, et al.

In this paper, we consider a recently proposed model of teaching and learning under uncertainty, in which a teacher receives independent observations of a single bit corrupted by binary symmetric noise, and sequentially transmits bits to a student through another binary symmetric channel, with each transmitted bit depending on the observations received so far. After a given number n of transmissions, the student outputs an estimate of the unknown bit, and we are interested in the exponential decay rate of the error probability as n increases. We propose a novel block-structured teaching strategy in which the teacher encodes the number of 1s received in each block, and show that the resulting error exponent is the binary relative entropy D(1/2 ‖ max(p,q)), where p and q are the crossover probabilities of the two binary symmetric channels. This matches a trivial converse result based on the data processing inequality, and settles two conjectures of [Jog and Loh, 2021] and [Huleihel et al., 2019]. In addition, we show that the computation time required by the teacher and student is linear in n. We also study a more general setting in which the binary symmetric channels are replaced by general binary-input discrete memoryless channels. We provide an achievability bound and a converse bound, and show that the two coincide in certain cases, including (i) when the two channels are identical, and (ii) when the student-teacher channel is a binary symmetric channel. More generally, we give sufficient conditions under which our achievable learning rate is the best possible for block-structured protocols.
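As a rough illustration of the setting (not the authors' construction), the sketch below computes the claimed error exponent D(1/2 ‖ max(p,q)) and runs a Monte-Carlo simulation of a simplified block strategy: the teacher forwards the majority bit of each block of its BSC(p) observations through BSC(q), and the student takes a final majority vote. The block length, trial count, seed, and the majority-forwarding rule are illustrative assumptions rather than details from the paper, and this simplified strategy need not attain the optimal exponent.

```python
import numpy as np

def binary_kl(a, b):
    """Binary relative entropy D(a || b) in nats."""
    return a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b))

def error_exponent(p, q):
    """The exponent D(1/2 || max(p, q)) stated in the abstract."""
    return binary_kl(0.5, max(p, q))

def simulate_block_majority(p, q, n, block_len=5, trials=20000, seed=0):
    """Simplified, illustrative block strategy (not the paper's protocol):
    the teacher forwards the majority bit of each block of its BSC(p)
    observations through BSC(q); the student outputs the majority of
    the bits it receives (ties break toward 0)."""
    rng = np.random.default_rng(seed)
    n_blocks = n // block_len
    errors = 0
    for _ in range(trials):
        theta = rng.integers(0, 2)                                    # unknown bit
        obs = theta ^ (rng.random((n_blocks, block_len)) < p).astype(int)
        sent = (obs.sum(axis=1) * 2 > block_len).astype(int)          # per-block majority
        recv = sent ^ (rng.random(n_blocks) < q).astype(int)          # student's noisy copies
        est = int(recv.sum() * 2 > n_blocks)                          # final majority vote
        errors += int(est != theta)
    return errors / trials

if __name__ == "__main__":
    p, q = 0.1, 0.2
    print("D(1/2 || max(p, q)) =", error_exponent(p, q))
    for n in (50, 100, 200):
        print("n =", n, "empirical error =", simulate_block_majority(p, q, n))
```

Plotting the empirical error probability on a log scale against n and comparing its slope with the computed exponent gives a quick sanity check of how close such a simplified scheme comes to the optimal rate.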

