Federated Inference with Reliable Uncertainty Quantification over Wireless Channels via Conformal Prediction

by Meiyi Zhu et al.

Consider a setting in which devices and a server share a pre-trained model. The server wishes to make an inference on a new input given the model. Devices have access to data, previously not used for training, and can communicate with the server over a common wireless channel. If the devices have no access to the new input, can communication from devices to the server enhance the quality of the inference decision at the server? Recent work has introduced federated conformal prediction (CP), which leverages device-to-server communication to improve the reliability of the server's decision. With federated CP, devices communicate to the server information about the loss accrued by the shared pre-trained model on the local data, and the server leverages this information to calibrate a decision interval, or set, so that it is guaranteed to contain the correct answer with a pre-defined target reliability level. Previous work assumed noise-free communication, whereby each device can communicate a single real number to the server. In this paper, we study for the first time federated CP in a wireless setting. We introduce a novel protocol, termed wireless federated conformal prediction (WFCP), which builds on type-based multiple access (TBMA) and on a novel quantile correction strategy. WFCP is proved to provide formal reliability guarantees in terms of coverage of the predicted set produced by the server. Using numerical results, we demonstrate the significant advantages of WFCP over digital implementations of existing federated CP schemes, especially in regimes with limited communication resources and/or a large number of devices.
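To make the calibration step concrete, the following is a minimal sketch of federated CP in the noise-free case described above (not the WFCP protocol itself, which additionally handles TBMA and quantile correction over noisy channels). The helper names `federated_calibrate` and `predict_set`, the choice of nonconformity score (one minus the predicted probability of the true label), and the numbers in the example are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def federated_calibrate(scores_per_device, alpha=0.1):
    """Noise-free federated CP sketch (hypothetical helper): each device
    reports nonconformity scores computed on its local calibration data;
    the server pools them and returns the conformal quantile that yields
    (1 - alpha) coverage."""
    scores = np.concatenate(scores_per_device)
    n = len(scores)
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha))/n.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, q_level, method="higher")

def predict_set(model_probs, threshold):
    """Prediction set: all labels whose nonconformity score
    (here, 1 - predicted probability) does not exceed the threshold."""
    return [k for k, p in enumerate(model_probs) if 1.0 - p <= threshold]

# Illustrative example: three devices, each holding 50 calibration scores.
rng = np.random.default_rng(0)
device_scores = [rng.uniform(0.0, 1.0, size=50) for _ in range(3)]
tau = federated_calibrate(device_scores, alpha=0.1)
print(predict_set([0.7, 0.2, 0.1], tau))
```

The pooled-quantile step is what requires each device to send a real number per score; WFCP's contribution is recovering a reliable quantile when this communication happens over a noisy multiple-access channel instead.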

