Paper Title
Learning to Re-weight Examples with Optimal Transport for Imbalanced Classification
Paper Authors
Paper Abstract
Imbalanced data pose challenges for deep-learning-based classification models. One of the most widely used approaches for tackling imbalanced data is re-weighting, where training samples are associated with different weights in the loss function. Most existing re-weighting approaches treat the example weights as learnable parameters and optimize them on a meta set, entailing expensive bilevel optimization. In this paper, we propose a novel re-weighting method based on optimal transport (OT) from a distributional point of view. Specifically, we view the training set as an imbalanced distribution over its samples, which is transported by OT to a balanced distribution obtained from the meta set. The weights of the training samples are the probability mass of the imbalanced distribution and are learned by minimizing the OT distance between the two distributions. Compared with existing methods, our proposed method decouples the weight learning from the concerned classifier at each iteration. Experiments on image, text, and point-cloud datasets demonstrate that our proposed re-weighting method has excellent performance, achieving state-of-the-art results in many cases and providing a promising tool for addressing the imbalanced classification problem.
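The following is a minimal PyTorch sketch of the idea summarized in the abstract, not the authors' implementation: per-example weights are parameterized on the probability simplex and learned by minimizing an entropic OT (Sinkhorn) distance between the weighted training distribution and a balanced distribution built from the meta set. The feature vectors, cost matrix, softmax parameterization, and all hyper-parameters below are illustrative assumptions.

```python
# Sketch: learn per-example weights by minimizing an entropic OT distance
# between the weighted (imbalanced) training distribution and a balanced
# distribution over the meta set. Assumed setup, not the paper's exact code.
import torch

def sinkhorn_cost(w, b, C, reg=0.1, n_iters=100):
    """Entropic OT cost <pi, C> between histograms w (train) and b (meta)."""
    K = torch.exp(-C / reg)                      # Gibbs kernel
    u = torch.ones_like(w)
    v = torch.ones_like(b)
    for _ in range(n_iters):                     # Sinkhorn fixed-point updates
        u = w / (K @ v)
        v = b / (K.t() @ u)
    pi = u.unsqueeze(1) * K * v.unsqueeze(0)     # transport plan
    return (pi * C).sum()

# Illustrative data: n imbalanced training samples, m balanced meta samples,
# each represented by an (assumed) feature vector used to build the cost.
n, m, d = 512, 64, 128
train_feat = torch.randn(n, d)
meta_feat = torch.randn(m, d)
C = torch.cdist(train_feat, meta_feat) ** 2      # squared Euclidean cost
C = C / C.max()                                  # normalize for numerical stability

theta = torch.zeros(n, requires_grad=True)       # weight parameters
b = torch.full((m,), 1.0 / m)                    # balanced meta distribution
opt = torch.optim.Adam([theta], lr=0.1)

for step in range(50):
    w = torch.softmax(theta, dim=0)              # weights live on the simplex
    loss = sinkhorn_cost(w, b, C)                # OT distance w.r.t. the weights
    opt.zero_grad()
    loss.backward()
    opt.step()

weights = torch.softmax(theta, dim=0).detach()   # per-example weights for the training loss
```

In this sketch the weight update depends only on the OT objective between the two distributions, which reflects the abstract's point that the weight learning is decoupled from the classifier at each iteration; in practice the features (and hence the cost matrix) would come from the current model rather than random vectors.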