Paper Title
Gradient Boosted Normalizing Flows
Paper Authors
Paper Abstract
By chaining a sequence of differentiable, invertible transformations, normalizing flows (NFs) provide an expressive method for posterior approximation, exact density evaluation, and sampling. The trend in the normalizing flow literature has been to devise deeper, more complex transformations to achieve greater flexibility. We propose an alternative: Gradient Boosted Normalizing Flows (GBNF), which model a density by successively adding new NF components with gradient boosting. Under the boosting framework, each new NF component optimizes a sample-weighted likelihood objective, so that new components are fit to the residuals of the previously trained components. The GBNF formulation yields a mixture model structure whose flexibility increases as more components are added. Moreover, GBNFs offer a wider, as opposed to strictly deeper, approach that improves existing NFs at the cost of additional training rather than more complex transformations. We demonstrate the effectiveness of this technique for density estimation and, by coupling GBNF with a variational autoencoder, for generative modeling of images. Our results show that GBNFs outperform their non-boosted analogs and, in some cases, produce better results with smaller, simpler flows.
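A minimal PyTorch-style sketch of the boosting loop the abstract describes, where the mixture after K stages is g_K(x) = sum_k rho_k * p_k(x). Everything here is an assumption for illustration: `make_flow` is a hypothetical factory returning a flow object with trainable parameters and a `log_prob(x)` method, `batches` is a list of training batches, and the softmax-of-negative-log-density residual weighting is an illustrative stand-in for the paper's exact sample weights and mixture-weight updates.

    import torch


    def log_mixture_prob(components, weights, x):
        """Log-density of the mixture sum_k rho_k * p_k(x)."""
        log_ps = torch.stack([f.log_prob(x) for f in components], dim=0)  # [K, B]
        log_w = torch.log(weights).unsqueeze(-1)                          # [K, 1]
        return torch.logsumexp(log_w + log_ps, dim=0)                     # [B]


    def fit_gbnf(make_flow, batches, n_components, rho=0.5, lr=1e-3):
        components, weights = [], None
        for _ in range(n_components):
            flow = make_flow()  # fresh NF component (hypothetical factory)
            opt = torch.optim.Adam(flow.parameters(), lr=lr)
            for x in batches:
                if components:
                    with torch.no_grad():
                        # Up-weight samples the current mixture explains poorly,
                        # so the new component chases the residuals (illustrative
                        # heuristic; the paper's exact weights differ).
                        w = torch.softmax(-log_mixture_prob(components, weights, x), dim=0)
                        w = w * x.shape[0]  # keep the mean weight at 1
                else:
                    w = torch.ones(x.shape[0])
                loss = -(w * flow.log_prob(x)).mean()  # sample-weighted NLL
                opt.zero_grad()
                loss.backward()
                opt.step()
            components.append(flow)
            # Convex mixture update: g_K = (1 - rho) * g_{K-1} + rho * p_K,
            # so the fixed-flow components form a mixture whose flexibility
            # grows with each added component.
            if weights is None:
                weights = torch.ones(1)
            else:
                weights = torch.cat(((1.0 - rho) * weights, torch.tensor([rho])))
        return components, weights

Note the "wider, not deeper" trade-off in the sketch: each call to `make_flow()` can return a small, simple flow, and capacity is added by growing the mixture rather than by deepening any single transformation.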