Paper Title
Differentially Private Regression with Unbounded Covariates
Paper Authors
Paper Abstract
We provide computationally efficient, differentially private algorithms for the classical regression settings of Least Squares Fitting, Binary Regression and Linear Regression with unbounded covariates. Prior to our work, privacy constraints in such regression settings were studied under strong a priori bounds on covariates. We consider the case of Gaussian marginals and extend recent differentially private techniques on mean and covariance estimation (Kamath et al., 2019; Karwa and Vadhan, 2018) to the sub-gaussian regime. We provide a novel technical analysis yielding differentially private algorithms for the above classical regression settings. Through the case of Binary Regression, we capture the fundamental and widely-studied models of logistic regression and linearly-separable SVMs, learning an unbiased estimate of the true regression vector, up to a scaling factor.
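To make the setting concrete, the following is a minimal sketch of the classical (non-private) linear regression problem with Gaussian marginals and unbounded covariates that the abstract refers to. It uses plain ordinary least squares as the non-private baseline; it does not reproduce the paper's differentially private algorithm, and all variable names and parameter choices here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: the classical regression setting the paper
# privatizes. Covariates are unbounded (Gaussian), unlike prior DP work
# that assumed a priori bounds. This is NOT the paper's DP algorithm.
rng = np.random.default_rng(0)
d, n = 5, 20000                             # dimension and sample size (assumed)
w_true = rng.normal(size=d)                 # true regression vector
X = rng.normal(size=(n, d))                 # unbounded Gaussian covariates
y = X @ w_true + 0.1 * rng.normal(size=n)   # linear responses with noise

# Ordinary least squares: the non-private baseline estimator
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
err = np.linalg.norm(w_hat - w_true)
```

With enough samples the OLS estimate `w_hat` recovers `w_true` closely; the paper's contribution is achieving comparable estimation under differential privacy without clipping or bounding the covariates in advance.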