Paper Title


Statistical Guarantees and Algorithmic Convergence Issues of Variational Boosting

Paper Authors

Guha, Biraj Subhra; Bhattacharya, Anirban; Pati, Debdeep

Paper Abstract


We provide statistical guarantees for Bayesian variational boosting by proposing a novel small bandwidth Gaussian mixture variational family. We employ a functional version of Frank-Wolfe optimization as our variational algorithm and study frequentist properties of the iterative boosting updates. Comparisons are drawn to the recent literature on boosting, describing how the choice of the variational family and the discrepancy measure affect both convergence and finite-sample statistical properties of the optimization routine. Specifically, we first demonstrate stochastic boundedness of the boosting iterates with respect to the data generating distribution. We next integrate this within our algorithm to provide an explicit convergence rate, ending with a result on the required number of boosting updates.
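As an illustrative sketch only (the notation below is not taken from the paper), a generic functional Frank-Wolfe boosting update over a mixture family $\mathcal{Q}$ greedily selects a new component $h_t$ by linearizing the chosen discrepancy $D(\cdot \,\|\, \pi)$ to the target posterior $\pi$, then mixes it into the current iterate $q_t$ with step size $\gamma_t$:

$$
h_t \in \arg\min_{h \in \mathcal{Q}} \big\langle h - q_t,\ \nabla D(q_t \,\|\, \pi) \big\rangle,
\qquad
q_{t+1} = (1 - \gamma_t)\, q_t + \gamma_t\, h_t, \quad \gamma_t \in [0, 1].
$$

In the setting described by the abstract, $\mathcal{Q}$ would consist of small-bandwidth Gaussian components, so each boosting update adds one narrow Gaussian to the current mixture approximation of the posterior.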
