Paper Title


Membership Inference Attacks Against Text-to-image Generation Models

Authors

Yixin Wu, Ning Yu, Zheng Li, Michael Backes, Yang Zhang

Abstract


Text-to-image generation models have recently attracted unprecedented attention as they unlatch imaginative applications in all areas of life. However, developing such models requires huge amounts of data that might contain privacy-sensitive information, e.g., face identity. While privacy risks have been extensively demonstrated in the image classification and GAN generation domains, privacy risks in the text-to-image generation domain are largely unexplored. In this paper, we perform the first privacy analysis of text-to-image generation models through the lens of membership inference. Specifically, we propose three key intuitions about membership information and design four attack methodologies accordingly. We conduct comprehensive evaluations on two mainstream text-to-image generation models including sequence-to-sequence modeling and diffusion-based modeling. The empirical results show that all of the proposed attacks can achieve significant performance, in some cases even close to an accuracy of 1, and thus the corresponding risk is much more severe than that shown by existing membership inference attacks. We further conduct an extensive ablation study to analyze the factors that may affect the attack performance, which can guide developers and researchers to be alert to vulnerabilities in text-to-image generation models. All these findings indicate that our proposed attacks pose a realistic privacy threat to the text-to-image generation models.
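To make the threat model concrete, below is a minimal, hypothetical sketch of one common membership-inference intuition in this setting: if the image the target model generates from a caption is unusually close to the candidate image, the (caption, image) pair is predicted to be a training member. This is an illustration only, not the paper's actual attack pipeline; the `generate_image` callable and the threshold `tau` are placeholders for whatever query access and calibration an attacker has.

```python
# Minimal, hypothetical sketch of a reconstruction-distance membership
# inference attack against a text-to-image model. Placeholders:
# `generate_image` (black-box access to the target model) and `tau`
# (a decision threshold, e.g. calibrated on known non-members).
import numpy as np


def l2_distance(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Pixel-level L2 distance between two images of the same shape."""
    return float(np.linalg.norm(img_a.astype(np.float32) - img_b.astype(np.float32)))


def infer_membership(caption: str,
                     candidate_image: np.ndarray,
                     generate_image,
                     tau: float) -> bool:
    """Predict 'member' when the model's output for the caption is
    unusually close to the candidate image, reflecting the intuition
    that models reproduce training pairs more faithfully than unseen ones."""
    reconstructed = generate_image(caption)  # query the target model
    return l2_distance(reconstructed, candidate_image) < tau
```

In practice the distance metric (pixel L2, perceptual, or embedding-based) and the threshold calibration are design choices that strongly affect attack performance, which is part of what the paper's ablation study examines.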
