Paper Title
GENIE: Higher-Order Denoising Diffusion Solvers
Authors
Abstract
Denoising diffusion models (DDMs) have emerged as a powerful class of generative models. A forward diffusion process slowly perturbs the data, while a deep model learns to gradually denoise. Synthesis amounts to solving a differential equation (DE) defined by the learnt model. Solving the DE requires slow iterative solvers for high-quality generation. In this work, we propose Higher-Order Denoising Diffusion Solvers (GENIE): Based on truncated Taylor methods, we derive a novel higher-order solver that significantly accelerates synthesis. Our solver relies on higher-order gradients of the perturbed data distribution, that is, higher-order score functions. In practice, only Jacobian-vector products (JVPs) are required and we propose to extract them from the first-order score network via automatic differentiation. We then distill the JVPs into a separate neural network that allows us to efficiently compute the necessary higher-order terms for our novel sampler during synthesis. We only need to train a small additional head on top of the first-order score network. We validate GENIE on multiple image generation benchmarks and demonstrate that GENIE outperforms all previous solvers. Unlike recent methods that fundamentally alter the generation process in DDMs, our GENIE solves the true generative DE and still enables applications such as encoding and guided sampling. Project page and code: https://nv-tlabs.github.io/GENIE.
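The abstract's core mechanism can be sketched concretely: a truncated Taylor (second-order) step for an ODE dx/dt = f(x, t) needs the total derivative df/dt, which is exactly one Jacobian-vector product obtainable via forward-mode automatic differentiation. The sketch below is illustrative only and uses a toy analytic drift in place of GENIE's learned score network; the function names `f` and `taylor2_step` are hypothetical, not from the paper's code.

```python
# Minimal sketch (not the GENIE implementation): a second-order
# truncated Taylor step, with the higher-order term computed as a
# single JVP via jax.jvp, mirroring how the paper extracts JVPs from
# a first-order score network by automatic differentiation.
import jax
import jax.numpy as jnp

def f(x, t):
    # Toy stand-in for the probability-flow ODE drift (hypothetical);
    # in GENIE this would involve the learned first-order score model.
    return -x * t

def taylor2_step(x, t, h):
    fx = f(x, t)
    # Total derivative df/dt = ∂f/∂t + (∂f/∂x) · dx/dt, computed as
    # one JVP of f at (x, t) with tangent (dx/dt, dt/dt) = (fx, 1.0).
    _, dfdt = jax.jvp(f, (x, t), (fx, 1.0))
    # Truncated Taylor expansion: x(t+h) ≈ x + h·f + (h²/2)·df/dt.
    return x + h * fx + 0.5 * h**2 * dfdt

x_next = taylor2_step(jnp.array([1.0]), 0.0, 0.1)
```

In the paper this JVP is further distilled into a small additional network head so that sampling does not pay the cost of automatic differentiation at every step.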