Differentially Private Secure Multi-Party Computation for Federated Learning in Financial Applications

by David Byrd et al.

Federated learning enables a population of clients, working with a trusted server, to collaboratively learn a shared machine learning model while keeping each client's data within its own local systems. This reduces the risk of exposing sensitive data, but it is still possible to reverse-engineer information about a client's private data set from the communicated model parameters. Most federated learning systems therefore use differential privacy to introduce noise into the parameters. This adds uncertainty to any attempt to reveal private client data, but it also reduces the accuracy of the shared model, limiting the useful scale of privacy-preserving noise. A system can further reduce the coordinating server's ability to recover private client information, without additional accuracy loss, by also incorporating secure multi-party computation. An approach combining both techniques is especially relevant to financial firms, as it opens new possibilities for collaborative learning without exposing sensitive client data. This could produce more accurate models for important tasks like optimal trade execution, credit origination, or fraud detection. The key contributions of this paper are threefold: we present a privacy-preserving federated learning protocol to a non-specialist audience, demonstrate it using logistic regression on a real-world credit card fraud data set, and evaluate it using an open-source simulation platform that we have adapted for the development of federated learning systems.
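To make the combination of the two techniques concrete, the following is a minimal sketch of one aggregation round: each client perturbs its local update with Gaussian noise (differential privacy), then splits the noisy update into additive secret shares (a simple form of secure multi-party computation) so that the server only ever sees partial sums. The noise scale, share count, and share-exchange pattern here are illustrative assumptions, not the paper's exact protocol; a real deployment would calibrate the noise to a clipping bound and a target privacy budget.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_dp_noise(update, sigma):
    """Client-side differential privacy: perturb the local model update
    with Gaussian noise. sigma is a hypothetical noise scale; a real
    system would calibrate it to a norm clip and (epsilon, delta)."""
    return update + rng.normal(0.0, sigma, size=update.shape)

def secret_share(update, n_shares, mask_scale=1e6):
    """Additive secret sharing: split a vector into n_shares random
    shares that sum back to the original. Any subset of fewer than
    n_shares shares reveals nothing about the vector."""
    shares = [rng.uniform(-mask_scale, mask_scale, size=update.shape)
              for _ in range(n_shares - 1)]
    shares.append(update - sum(shares))
    return shares

# --- toy round with 3 clients and a 4-parameter model ---
clients = [rng.normal(size=4) for _ in range(3)]
noisy = [add_dp_noise(u, sigma=0.1) for u in clients]

# each client splits its noisy update into one share per peer;
# shares are exchanged so each party holds one share from everyone
all_shares = [secret_share(u, n_shares=3) for u in noisy]
partial_sums = [sum(all_shares[c][i] for c in range(3)) for i in range(3)]

# the server receives only the partial sums, whose total equals the
# sum of the noisy updates -- no individual update is ever exposed
aggregate = sum(partial_sums)
assert np.allclose(aggregate, sum(noisy))
```

Note the division of labor: secret sharing hides each individual update from the server at no accuracy cost, while the added noise bounds what the *aggregate* itself can leak about any one client, which is why combining the two allows smaller noise than differential privacy alone.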


