Paper Title
Editing Out-of-domain GAN Inversion via Differential Activations
Paper Authors
Paper Abstract
Despite the demonstrated editing capacity in the latent space of a pretrained GAN model, inverting real-world images remains stuck in a dilemma: the reconstruction cannot be faithful to the original input. The main reason is that the distributions of the training data and real-world data are misaligned, which makes GAN inversion unstable for real image editing. In this paper, we propose a novel GAN-prior-based editing framework that tackles the out-of-domain inversion problem with a composition-decomposition paradigm. In particular, during the composition phase, we introduce a differential activation module for detecting semantic changes from a global perspective, i.e., the relative gap between the features of the edited and unedited images. With the aid of the generated Diff-CAM mask, a coarse reconstruction can intuitively be composited from the paired original and edited images. In this way, the attribute-irrelevant regions are preserved almost entirely, although the quality of such an intermediate result is still limited by an unavoidable ghosting effect. Consequently, in the decomposition phase, we further present a GAN-prior-based deghosting network for separating the final fine edited image from the coarse reconstruction. Extensive experiments demonstrate superiority over state-of-the-art methods in terms of both qualitative and quantitative evaluations. The robustness and flexibility of our method are also validated in both single-attribute and multi-attribute manipulation scenarios.
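To make the composition phase concrete, below is a minimal sketch, not the authors' implementation, of how a differential-activation mask could be derived from the feature gap between the unedited and edited images and used to composite a coarse reconstruction. The function names (diff_cam_mask, composite_coarse), the use of a plain channel-averaged absolute difference, and the tensor shapes are all illustrative assumptions; the paper's actual Diff-CAM module and deghosting network may differ.

```python
import torch
import torch.nn.functional as F


def diff_cam_mask(feat_orig, feat_edit, out_size):
    """Hypothetical differential-activation (Diff-CAM-style) mask.

    feat_orig, feat_edit: (N, C, H, W) feature maps of the unedited and
    edited images taken from the same backbone layer. The mask highlights
    regions whose activations changed, i.e., the attribute-relevant areas.
    """
    # Channel-wise absolute difference, aggregated into a CAM-like heatmap.
    diff = (feat_edit - feat_orig).abs()          # (N, C, H, W)
    cam = F.relu(diff.mean(dim=1, keepdim=True))  # (N, 1, H, W)
    # Normalize to [0, 1] per sample so it can act as a soft blending mask.
    cam_min = cam.amin(dim=(2, 3), keepdim=True)
    cam_max = cam.amax(dim=(2, 3), keepdim=True)
    mask = (cam - cam_min) / (cam_max - cam_min + 1e-8)
    # Upsample to image resolution for pixel-wise blending.
    return F.interpolate(mask, size=out_size, mode="bilinear", align_corners=False)


def composite_coarse(original, edited, mask):
    """Composition phase: take edited content where the mask is high and
    keep the original (attribute-irrelevant) content elsewhere."""
    return mask * edited + (1.0 - mask) * original


if __name__ == "__main__":
    # Dummy tensors standing in for real images and backbone features.
    x_orig, x_edit = torch.rand(1, 3, 256, 256), torch.rand(1, 3, 256, 256)
    f_orig, f_edit = torch.rand(1, 64, 32, 32), torch.rand(1, 64, 32, 32)
    m = diff_cam_mask(f_orig, f_edit, out_size=(256, 256))
    coarse = composite_coarse(x_orig, x_edit, m)  # would be fed to the deghosting network
    print(coarse.shape)  # torch.Size([1, 3, 256, 256])
```

In the decomposition phase, this coarse composite (which still contains ghosting around the blended boundary) would be passed to the GAN-prior-based deghosting network to recover the final fine edited image.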