Paper Title


Minimax Optimal Estimation of KL Divergence for Continuous Distributions

Paper Authors

Puning Zhao, Lifeng Lai

Paper Abstract


Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound of the minimax mean square error and show that the kNN method is asymptotically rate optimal.
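As a concrete illustration of the estimator the abstract refers to, below is a minimal sketch of the standard kNN-based KL divergence estimator (the Wang–Kulkarni–Verdú form, D̂ = (d/n) Σᵢ log(νₖ(i)/ρₖ(i)) + log(m/(n−1))), restricted to one dimension with brute-force distance computation. The function name and the test distributions are mine, not from the paper; the paper's contribution is the convergence and minimax analysis of this estimator, not this code.

```python
import math
import random

def knn_kl_divergence(x, y, k=5):
    """Estimate KL(p || q) from 1-D samples x ~ p and y ~ q
    using k nearest neighbor distances (brute force, O(n*(n+m)) distances).

    rho_k(i): distance from x[i] to its k-th nearest neighbor in x (excluding itself)
    nu_k(i):  distance from x[i] to its k-th nearest neighbor in y
    Estimator (d = 1): (1/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
    """
    n, m = len(x), len(y)
    total = 0.0
    for i, xi in enumerate(x):
        rho = sorted(abs(xi - x[j]) for j in range(n) if j != i)[k - 1]
        nu = sorted(abs(xi - yj) for yj in y)[k - 1]
        total += math.log(nu / rho)
    return total / n + math.log(m / (n - 1))

# Sanity check on two Gaussians; true KL(N(0,1) || N(1,1)) = 0.5.
random.seed(0)
p_samples = [random.gauss(0.0, 1.0) for _ in range(1000)]
q_samples = [random.gauss(1.0, 1.0) for _ in range(1000)]
est = knn_kl_divergence(p_samples, q_samples, k=5)
```

With n = m = 1000 the estimate lands near the true value 0.5, with the bias and fluctuation shrinking as the sample sizes grow, which is the regime the paper's rate analysis characterizes.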
