Title

The Cost of Privacy in Generalized Linear Models: Algorithms and Minimax Lower Bounds

Authors

Cai, T. Tony, Wang, Yichen, Zhang, Linjun

Abstract

We propose differentially private algorithms for parameter estimation in both low-dimensional and high-dimensional sparse generalized linear models (GLMs) by constructing private versions of projected gradient descent. We show that the proposed algorithms are nearly rate-optimal by characterizing their statistical performance and establishing privacy-constrained minimax lower bounds for GLMs. The lower bounds are obtained via a novel technique, which is based on Stein's Lemma and generalizes the tracing attack technique for privacy-constrained lower bounds. This lower bound argument can be of independent interest as it is applicable to general parametric models. Simulated and real data experiments are conducted to demonstrate the numerical performance of our algorithms.
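For context, the classical (univariate Gaussian) form of Stein's Lemma that the lower-bound technique builds on is stated below; the extension actually used in the paper's argument for GLMs is not reproduced here.

```latex
% Stein's Lemma (univariate Gaussian form):
% if $X \sim N(\mu, \sigma^2)$ and $g$ is differentiable with
% $\mathbb{E}\,|g'(X)| < \infty$, then
\[
  \mathbb{E}\bigl[(X - \mu)\, g(X)\bigr] = \sigma^{2}\, \mathbb{E}\bigl[g'(X)\bigr].
\]
```

The sketch below illustrates the general idea of noisy projected gradient descent for a logistic-regression GLM: per-sample gradients are norm-clipped, Gaussian noise is added to the averaged gradient, and each iterate is projected onto an l2 ball. The function and parameter names (`dp_projected_gd`, `clip`, `radius`) and the noise calibration are illustrative assumptions; they do not reproduce the authors' exact algorithm or privacy accounting.

```python
import numpy as np

def dp_projected_gd(X, y, radius=1.0, clip=1.0, T=100, eta=0.1,
                    eps=1.0, delta=1e-5, seed=0):
    """Illustrative noisy projected gradient descent for logistic regression.

    Per-sample gradients are clipped to l2 norm `clip`, Gaussian noise is
    added to the averaged gradient, and each iterate is projected back onto
    an l2 ball of radius `radius`.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    # Placeholder noise scale (loose composition heuristic over T steps);
    # the paper's calibration and constants differ.
    sigma = clip * np.sqrt(8.0 * T * np.log(1.0 / delta)) / (n * eps)
    for _ in range(T):
        p = 1.0 / (1.0 + np.exp(-X @ beta))            # logistic mean function
        grads = (p - y)[:, None] * X                   # per-sample gradients, shape (n, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)  # clip each gradient's l2 norm
        g = grads.mean(axis=0) + rng.normal(0.0, sigma, size=d)  # noisy step direction
        beta = beta - eta * g                          # gradient step
        b_norm = np.linalg.norm(beta)
        if b_norm > radius:                            # project onto the l2 ball
            beta *= radius / b_norm
    return beta

# Example usage on synthetic logistic-regression data (illustrative only)
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 5)) / np.sqrt(5)
    beta_true = np.array([1.0, -0.5, 0.0, 0.25, 0.0])
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
    print(dp_projected_gd(X, y, radius=2.0, eps=2.0))
```

The sketch covers only the low-dimensional case; for the high-dimensional sparse setting the paper additionally enforces sparsity of the iterates in a privacy-preserving way rather than using a plain l2 projection.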
