Paper Title

PrivColl: Practical Privacy-Preserving Collaborative Machine Learning

Authors

Yanjun Zhang, Guangdong Bai, Xue Li, Caitlin Curtis, Chen Chen, Ryan K. L. Ko

Abstract

Collaborative learning enables two or more participants, each with their own training dataset, to collaboratively learn a joint model. It is desirable that the collaboration should not cause the disclosure of either the raw datasets of each individual owner or the local model parameters trained on them. This privacy-preservation requirement has been approached through differential privacy mechanisms, homomorphic encryption (HE), and secure multiparty computation (MPC), but existing attempts may either introduce a loss of model accuracy or imply significant computational and/or communication overhead. In this work, we address this problem with the lightweight additive secret sharing technique. We propose PrivColl, a framework for protecting local data and local models while ensuring the correctness of training processes. PrivColl employs a secret sharing technique for securely evaluating addition operations in a multiparty computation environment, and achieves practicality by employing only homomorphic addition operations. We formally prove that it guarantees privacy preservation even when the majority (n-2 out of n) of participants are corrupted. With experiments on real-world datasets, we further demonstrate that PrivColl retains high efficiency: it achieves a speedup of more than 45X over state-of-the-art MPC/HE-based schemes for training linear/logistic regression, and is 216X faster for training neural networks.
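
The core primitive the abstract relies on is additive secret sharing: a value is split into random shares that sum to the secret, so parties can add shared values locally without ever seeing each other's inputs. The sketch below is a minimal illustration of that primitive only, not PrivColl's actual implementation; the modulus `Q` and the names `share`/`reconstruct` are illustrative assumptions.

```python
# Minimal sketch of additive secret sharing (illustrative; not PrivColl's code).
import secrets

Q = 2**61 - 1  # assumed large modulus; the paper's actual field is not given here

def share(x: int, n: int) -> list[int]:
    """Split x into n additive shares that sum to x mod Q."""
    shares = [secrets.randbelow(Q) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares mod Q."""
    return sum(shares) % Q

# Homomorphic addition: each party adds its own shares locally, so the
# sum of two secrets is computed without revealing either input.
x_shares = share(42, 3)
y_shares = share(100, 3)
sum_shares = [(xs + ys) % Q for xs, ys in zip(x_shares, y_shares)]
assert reconstruct(sum_shares) == 142
```

Because any n-1 shares are uniformly random, no coalition short of all n parties learns anything about a secret, which is the intuition behind the abstract's tolerance of up to n-2 corrupted participants.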
