Paper Title

On the Kullback-Leibler divergence between pairwise isotropic Gaussian-Markov random fields

Authors

Levada, Alexandre L. M.

Abstract

The Kullback-Leibler divergence, or relative entropy, is an information-theoretic measure between statistical models that plays an important role in measuring the distance between random variables. In the study of complex systems, random fields are mathematical structures that model the interaction between these variables by means of an inverse temperature parameter, which is responsible for controlling the spatial dependence structure along the field. In this paper, we derive closed-form expressions for the Kullback-Leibler divergence between two pairwise isotropic Gaussian-Markov random fields in both the univariate and multivariate cases. The proposed equations allow the development of novel similarity measures in image processing and machine learning applications, such as image denoising and unsupervised metric learning.
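As background for the closed-form expressions the abstract refers to, the classical KL divergence between two univariate Gaussians already has a well-known closed form. The sketch below computes it for the independent (zero spatial interaction) special case; the paper's contribution extends this to fields with a nonzero inverse temperature parameter, which is not implemented here.

```python
import math

def kl_gaussian_1d(mu0, sigma0, mu1, sigma1):
    """Closed-form KL(N(mu0, sigma0^2) || N(mu1, sigma1^2)).

    Standard result for univariate Gaussians; sigma0 and sigma1
    are standard deviations and must be positive.
    """
    return (math.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2.0 * sigma1**2)
            - 0.5)

# The divergence is zero between identical models and grows
# as the means or variances diverge; it is not symmetric.
d_same = kl_gaussian_1d(0.0, 1.0, 0.0, 1.0)   # 0.0
d_diff = kl_gaussian_1d(0.0, 1.0, 1.0, 2.0)   # > 0
```

Note the asymmetry: `kl_gaussian_1d(a, b, c, d)` generally differs from `kl_gaussian_1d(c, d, a, b)`, which is why divergence-based similarity measures are often symmetrized in metric-learning applications.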
