Paper Title


Contraction of Locally Differentially Private Mechanisms

Authors

Shahab Asoodeh, Huanyu Zhang

Abstract


We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between the output distributions $PK$ and $QK$ of an $\varepsilon$-LDP mechanism $K$ in terms of a divergence between the corresponding input distributions $P$ and $Q$, respectively. Our first main technical result presents a sharp upper bound on the $\chi^2$-divergence $\chi^2(PK\|QK)$ in terms of $\chi^2(P\|Q)$ and $\varepsilon$. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on $\chi^2(PK\|QK)$ in terms of the total variation distance $\mathsf{TV}(P, Q)$ and $\varepsilon$. We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam's method, Assouad's method, and the mutual information method, which are powerful tools for bounding minimax estimation risks. These results are shown to lead to better privacy analyses than the state of the art in several statistical problems, such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.
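As a minimal numerical sketch of the kind of contraction the abstract describes (not the paper's exact bound), the snippet below pushes two binary distributions through randomized response, the canonical $\varepsilon$-LDP mechanism, and checks that the $\chi^2$-divergence shrinks by at least the factor $\big((e^\varepsilon-1)/(e^\varepsilon+1)\big)^2$, which is the known $\chi^2$-contraction coefficient of this binary symmetric channel. The distributions `P`, `Q` and the value `eps` are arbitrary choices for illustration.

```python
import math

def chi2(p, q):
    # chi^2(P || Q) = sum_i (p_i - q_i)^2 / q_i
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

def randomized_response(dist, eps):
    # Binary randomized response: report the true bit with probability
    # e^eps / (e^eps + 1), flip it otherwise; this kernel is eps-LDP.
    keep = math.exp(eps) / (math.exp(eps) + 1)
    flip = 1 - keep
    p0, p1 = dist
    return (keep * p0 + flip * p1, flip * p0 + keep * p1)

eps = 1.0
P, Q = (0.9, 0.1), (0.3, 0.7)  # arbitrary input distributions

ratio = chi2(randomized_response(P, eps),
             randomized_response(Q, eps)) / chi2(P, Q)
bound = ((math.exp(eps) - 1) / (math.exp(eps) + 1)) ** 2

# The post-mechanism divergence contracts by at least this factor.
assert ratio <= bound
```

The paper's results sharpen and generalize this picture: they bound $\chi^2(PK\|QK)$ for arbitrary $\varepsilon$-LDP mechanisms $K$, not just this binary channel.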
