Bicoptor: Two-round Secure Three-party Non-linear Computation without Preprocessing for Privacy-preserving Machine Learning

10/05/2022
by Lijing Zhou et al.

The overhead of non-linear functions dominates the performance of privacy-preserving machine learning (PPML) based on secure multiparty computation (MPC). This work introduces two sets of novel secure three-party computation (3PC) protocols, using additive and replicated secret sharing schemes, respectively. We name the whole family of protocols Bicoptor; its basis is a new sign-determination protocol that relies on a clever use of the truncation protocol proposed in SecureML (S&P 2017). Our 3PC sign-determination protocol requires only two communication rounds and involves no preprocessing. Such a sign-determination protocol is well suited for computing non-linear functions in PPML, e.g., the activation function ReLU, Maxpool, and their variants. We develop suitable protocols for these non-linear functions, which form Bicoptor, a family of GPU-friendly protocols. All Bicoptor protocols require only two communication rounds without preprocessing. We evaluate the protocols using additive secret sharing under a 3-party LAN network over a public cloud, and achieve 90,000 DReLU/ReLU operations or 3,200 Maxpool operations (finding the maximum of nine inputs) per second. Under the same settings and environment, our ReLU protocol achieves a one- or even two-order-of-magnitude improvement over the state-of-the-art works Edabits (CRYPTO 2020) and Falcon (PETS 2021), respectively, without batch processing.
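To make the abstract's building blocks concrete, the sketch below shows the plaintext semantics of the non-linear gates Bicoptor secures (DReLU, ReLU, Maxpool) and a toy 2-out-of-2 version of the SecureML-style local truncation that the sign-determination protocol builds on. This is an illustrative sketch only, not the paper's actual 3PC protocol; the ring size, fixed-point parameters, and all function names are our own assumptions.

```python
import random

# Plaintext reference semantics of the gates the paper secures, plus a
# toy two-share additive-sharing truncation in the style of SecureML.
# NOT the Bicoptor protocol itself: here everything is in the clear.

K = 64            # ring Z_{2^64}, a common choice in PPML (assumption)
MOD = 1 << K
F = 13            # fractional bits of the fixed-point encoding (assumption)

def drelu(x: int) -> int:
    """DReLU(x) = 1 if x >= 0 else 0: the sign-determination output."""
    return 1 if x >= 0 else 0

def relu(x: int) -> int:
    """ReLU(x) = x * DReLU(x), so a secure DReLU yields a secure ReLU."""
    return x * drelu(x)

def maxpool(window):
    """Maxpool reduces to repeated comparisons, i.e. repeated DReLUs."""
    m = window[0]
    for v in window[1:]:
        m = m if drelu(m - v) else v
    return m

def share(x: int):
    """2-out-of-2 additive sharing of x over Z_{2^K}."""
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def local_trunc(s0: int, s1: int):
    """SecureML-style truncation: each party shifts its OWN share with
    no interaction; the reconstructed result equals x >> F up to a
    +/-1 error, with overwhelming probability when |x| << 2^K."""
    t0 = s0 >> F
    t1 = (MOD - (((MOD - s1) % MOD) >> F)) % MOD
    return t0, t1
```

The key observation reused by Bicoptor is that this truncation is purely local, which is what lets the sign-determination protocol stay at two rounds with no preprocessing.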
