Paper Title

Expand Globally, Shrink Locally: Discriminant Multi-label Learning with Missing Labels

Paper Authors

Zhongchen Ma, Songcan Chen

Paper Abstract

In multi-label learning, missing labels pose a major challenge. Many methods attempt to recover missing labels by exploiting the low-rank structure of the label matrix. However, these methods exploit only the global low-rank label structure, ignoring both the local low-rank label structures and the label discriminant information to some extent, which leaves room for further performance improvement. In this paper, we develop a simple yet effective discriminant multi-label learning (DM2L) method for multi-label learning with missing labels. Specifically, we impose low-rank structures on the predictions of instances sharing the same labels (local shrinking of rank), and a maximally separated structure (high-rank structure) on the predictions of instances from different labels (global expanding of rank). In this way, the imposed low-rank structures help model both local and global low-rank label structures, while the imposed high-rank structure helps provide more underlying discriminability. Our subsequent theoretical analysis also supports these intuitions. In addition, we provide a nonlinear extension of DM2L via the kernel trick and establish a concave-convex objective to learn these models. Compared to other methods, our method involves the fewest assumptions and only one hyper-parameter. Even so, extensive experiments show that our method still outperforms the state-of-the-art methods.
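The abstract describes the objective only in words; the sketch below is one plausible reading of it, not the authors' exact formulation: a squared loss on observed label entries plus a difference-of-nuclear-norms regularizer that shrinks rank within each label's prediction block and expands it over the whole prediction matrix. All names here (`dm2l_objective`, the observed-entry mask `Omega`, the trade-off weight `lam`) and the choice of squared loss are illustrative assumptions.

```python
import numpy as np

def nuclear_norm(M):
    """Sum of singular values, the standard convex surrogate for rank."""
    return np.linalg.svd(M, compute_uv=False).sum()

def dm2l_objective(W, X, Y, Omega, lam=0.1):
    """Sketch of a DM2L-style objective (names and loss are assumptions).

    W     : (d, c) linear predictor weights
    X     : (n, d) feature matrix
    Y     : (n, c) binary label matrix; entries outside Omega are missing
    Omega : (n, c) boolean mask of observed entries
    lam   : the single trade-off hyper-parameter

    Objective = loss on observed entries
              + lam * ( sum_k ||P_k||_*  -  ||P||_* )
    where P = X @ W and P_k stacks the predictions of instances tagged
    with label k: nuclear norms shrink rank locally (per label) while
    the subtracted global nuclear norm expands rank globally.
    """
    P = X @ W                                   # (n, c) prediction matrix
    loss = 0.5 * np.sum(((P - Y) * Omega) ** 2) # fit only observed entries

    local = 0.0
    for k in range(Y.shape[1]):
        rows = (Y[:, k] == 1) & Omega[:, k]     # observed positives of label k
        if rows.any():
            local += nuclear_norm(P[rows])      # local shrinking of rank

    glob = nuclear_norm(P)                      # global expanding of rank
    return loss + lam * (local - glob)
```

Note that the regularizer is a difference of two convex (nuclear-norm) terms, so the overall objective is a difference of convex functions; this is consistent with the abstract's mention of a concave-convex objective, which such formulations are typically optimized with (e.g., CCCP-style procedures).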
