Paper Title


Unsupervised Controllable Generation with Self-Training

Authors

Grigorios G. Chrysos, Jean Kossaifi, Zhiding Yu, Anima Anandkumar

Abstract


Recent generative adversarial networks (GANs) are able to generate impressive photo-realistic images. However, controllable generation with GANs remains a challenging research problem. Achieving controllable generation requires semantically interpretable and disentangled factors of variation. It is challenging to achieve this goal using simple fixed distributions such as the Gaussian distribution. Instead, we propose an unsupervised framework to learn, through self-training, a distribution of latent codes that controls the generator. Self-training provides iterative feedback in the GAN training, from the discriminator to the generator, and progressively improves the proposal of the latent codes as training proceeds. The latent codes are sampled from a latent variable model that is learned in the feature space of the discriminator. We consider a normalized independent component analysis model and learn its parameters through tensor factorization of the higher-order moments. Our framework exhibits better disentanglement compared to other variants such as the variational autoencoder, and is able to discover semantically meaningful latent codes without any supervision. We demonstrate empirically on both the cars and faces datasets that each group of elements in the learned code controls a mode of variation with a semantic meaning, e.g., a pose or background change. We also demonstrate with quantitative metrics that our method generates better results compared to other approaches.
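The estimation step the abstract names — learning an independent component analysis model by tensor factorization of higher-order moments — can be illustrated with a minimal sketch. This is not the paper's implementation: the synthetic skewed sources, the mixing matrix `A`, and the `power_iter` helper are illustrative assumptions. The sketch whitens mixed observations, forms the empirical third-order moment tensor (which, for whitened data with skewed independent sources, decomposes into rank-one terms along the component directions), and recovers the components by tensor power iteration with deflation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100_000, 3

# Hypothetical setup: independent, zero-mean, skewed (non-Gaussian) sources,
# linearly mixed by an unknown matrix A.
S = rng.exponential(scale=1.0, size=(n, k)) - 1.0  # unit variance, skewness 2
A = rng.normal(size=(k, k))                        # mixing matrix (assumed)
X = S @ A.T                                        # observed mixtures

# Whiten the observations so the effective mixing becomes orthogonal.
cov = X.T @ X / n
d, E = np.linalg.eigh(cov)
W = E @ np.diag(d ** -0.5) @ E.T
Z = X @ W.T

# Empirical third-order moment tensor of the whitened data:
# E[z ⊗ z ⊗ z] ≈ Σ_i κ3_i · m_i ⊗ m_i ⊗ m_i with orthonormal m_i.
T = np.einsum('ni,nj,nk->ijk', Z, Z, Z) / n

def power_iter(T, iters=200):
    """Tensor power iteration: converges to one robust eigenvector of T."""
    v = rng.normal(size=T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = np.einsum('ijk,j,k->i', T, v, v)
        v /= np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v

# Deflation: subtract each recovered rank-one term, then repeat.
comps = []
for _ in range(k):
    lam, v = power_iter(T)
    comps.append(v)
    T = T - lam * np.einsum('i,j,k->ijk', v, v, v)
```

Each recovered vector aligns (up to sign) with one column of the whitened mixing matrix `W @ A`, i.e., with one independent component direction. In the paper's setting, the analogous model is fit in the discriminator's feature space and its samples are fed back to the generator as latent codes.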
