Paper Title

Full Kullback-Leibler-Divergence Loss for Hyperparameter-free Label Distribution Learning

Paper Authors

Maurice Günder, Nico Piatkowski, Christian Bauckhage

Paper Abstract

The concept of Label Distribution Learning (LDL) is a technique for stabilizing classification and regression problems with ambiguous and/or imbalanced labels. A prototypical use case of LDL is human age estimation based on profile images. For this regression problem, a so-called Deep Label Distribution Learning (DLDL) method has been developed. The main idea is the joint regression of the label distribution and its expectation value. However, the original DLDL method uses loss components with different mathematical motivations and, thus, different scales, which is why a balancing hyperparameter becomes necessary. In this work, we introduce a loss function for DLDL whose components are defined entirely by Kullback-Leibler (KL) divergences and, thus, are directly comparable to each other without the need for additional hyperparameters. It generalizes the concept of DLDL to further use cases, in particular multi-dimensional or multi-scale distribution learning tasks.
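
The abstract describes the approach only at a high level: the original DLDL loss combines a KL term on the label distribution with an L1 term on its expectation, weighted by a hyperparameter, while the proposed loss expresses both terms as KL divergences so they share a common scale. The sketch below illustrates that idea in PyTorch, assuming discrete label bins; the function names (`full_kl_dldl_loss`, `gaussian_over_bins`) and the Gaussian-over-bins encoding of the expectation term are illustrative assumptions, not the paper's actual construction.

```python
import torch
import torch.nn.functional as F

def discrete_kl(p, q, eps=1e-12):
    """KL(p || q) for batches of discrete distributions over the label bins."""
    return (p * (torch.log(p + eps) - torch.log(q + eps))).sum(dim=-1)

def gaussian_over_bins(mu, bins, sigma):
    """Discretize a Gaussian N(mu, sigma^2) onto the label bins (normalized)."""
    logits = -0.5 * ((bins.unsqueeze(0) - mu.unsqueeze(-1)) / sigma) ** 2
    return F.softmax(logits, dim=-1)

def full_kl_dldl_loss(pred_logits, target_dist, bins):
    """Sum of two KL terms: distribution matching plus expectation matching."""
    pred_dist = F.softmax(pred_logits, dim=-1)

    # Term 1: match the full label distribution (as in the original DLDL).
    loss_dist = discrete_kl(target_dist, pred_dist)

    # Term 2: match the expectation values. Illustrative assumption: encode
    # each expectation as a Gaussian over the label bins, with its width fixed
    # to the bin spacing, and compare the two encodings via a KL divergence.
    mu_pred = (pred_dist * bins).sum(dim=-1)
    mu_true = (target_dist * bins).sum(dim=-1)
    sigma = (bins[1] - bins[0]).abs()
    loss_exp = discrete_kl(gaussian_over_bins(mu_true, bins, sigma),
                           gaussian_over_bins(mu_pred, bins, sigma))

    # Both terms are KL divergences and thus on the same scale, so they are
    # summed directly -- no balancing hyperparameter as in the original
    # "KL + lambda * L1" formulation of DLDL.
    return (loss_dist + loss_exp).mean()

# Toy usage: age estimation over bins 0..100.
bins = torch.linspace(0.0, 100.0, 101)
logits = torch.randn(8, 101)                     # network outputs
target = F.softmax(torch.randn(8, 101), dim=-1)  # ground-truth label distributions
print(full_kl_dldl_loss(logits, target, bins))
```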
