Paper Title
Differentially Private Stochastic Coordinate Descent
Paper Authors
Paper Abstract
In this paper we tackle the challenge of making the stochastic coordinate descent algorithm differentially private. Compared to the classical gradient descent algorithm where updates operate on a single model vector and controlled noise addition to this vector suffices to hide critical information about individuals, stochastic coordinate descent crucially relies on keeping auxiliary information in memory during training. This auxiliary information provides an additional privacy leak and poses the major challenge addressed in this work. Driven by the insight that under independent noise addition, the consistency of the auxiliary information holds in expectation, we present DP-SCD, the first differentially private stochastic coordinate descent algorithm. We analyze our new method theoretically and argue that decoupling and parallelizing coordinate updates is essential for its utility. On the empirical side we demonstrate competitive performance against the popular stochastic gradient descent alternative (DP-SGD) while requiring significantly less tuning.
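The abstract's key insight is that when the auxiliary information is perturbed with independent zero-mean noise, its consistency with the model variables still holds in expectation. The following is a minimal, hypothetical sketch of that idea using an SDCA-style dual coordinate update for ridge regression; the update rule, clipping bound, noise multiplier, and all names are illustrative assumptions and this is not the paper's DP-SCD algorithm or its privacy accounting.

```python
# Illustrative sketch (not the paper's exact DP-SCD): dual stochastic coordinate
# descent for ridge regression. Each coordinate update is clipped and perturbed
# with Gaussian noise, and the auxiliary primal vector w is maintained with an
# independently perturbed increment. Because the noise is zero-mean, the relation
# w = X.T @ alpha / (lam * n) holds in expectation, even though each realization drifts.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n examples, d features (all values here are illustrative).
n, d = 200, 10
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

lam = 0.1     # ridge regularization strength
clip = 1.0    # clipping bound on each coordinate update
sigma = 0.1   # noise multiplier (privacy calibration omitted in this sketch)

alpha = np.zeros(n)   # dual variables, one per example
w = np.zeros(d)       # auxiliary primal vector, ideally w = X.T @ alpha / (lam * n)

for t in range(2000):
    i = rng.integers(n)            # sample one coordinate (one example)
    x_i, y_i = X[i], y[i]
    # Closed-form SDCA coordinate update for the squared loss.
    delta = (y_i - x_i @ w - alpha[i]) / (1.0 + (x_i @ x_i) / (lam * n))
    # Clip the update and add Gaussian noise (illustrative privacy mechanism).
    delta = np.clip(delta, -clip, clip) + sigma * clip * rng.normal()
    alpha[i] += delta
    # Update the auxiliary vector with an *independently* noised increment;
    # the zero-mean noise keeps w consistent with alpha in expectation.
    w += (delta * x_i) / (lam * n) + (sigma * clip / (lam * n)) * rng.normal(size=d)

print("consistency gap:", np.linalg.norm(w - X.T @ alpha / (lam * n)))
```

The printed gap is nonzero for any single run, since the independent perturbations of alpha and w accumulate separately, but it stays bounded and vanishes in expectation, which is the property the abstract relies on.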