Paper Title

Few-Shot Learning with Intra-Class Knowledge Transfer

Authors

Vivek Roy, Yan Xu, Yu-Xiong Wang, Kris Kitani, Ruslan Salakhutdinov, Martial Hebert

Abstract

We consider the few-shot classification task with an unbalanced dataset, in which some classes have sufficient training samples while other classes only have limited training samples. Recent works have proposed to solve this task by augmenting the training data of the few-shot classes using generative models with the few-shot training samples as the seeds. However, due to the limited number of the few-shot seeds, the generated samples usually have small diversity, making it difficult to train a discriminative classifier for the few-shot classes. To enrich the diversity of the generated samples, we propose to leverage the intra-class knowledge from the neighbor many-shot classes with the intuition that neighbor classes share similar statistical information. Such intra-class information is obtained with a two-step mechanism. First, a regressor trained only on the many-shot classes is used to evaluate the few-shot class means from only a few samples. Second, superclasses are clustered, and the statistical mean and feature variance of each superclass are used as transferable knowledge inherited by the children few-shot classes. Such knowledge is then used by a generator to augment the sparse training data to help the downstream classification tasks. Extensive experiments show that our method achieves state-of-the-art across different datasets and $n$-shot settings.
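The two-step mechanism above can be illustrated with a minimal sketch: cluster many-shot class means into superclasses, let a few-shot class inherit the feature variance of its nearest superclass, and sample augmented features from a Gaussian around the estimated class mean. All names, data, and the simple k-means routine here are illustrative assumptions, not the paper's actual implementation (which uses a trained regressor for the few-shot mean and a learned generator).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy many-shot class statistics in a 2-D feature space (hypothetical data).
many_shot_means = rng.normal(size=(10, 2)) * 5.0
many_shot_vars = np.abs(rng.normal(loc=1.0, scale=0.1, size=(10, 2)))

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over class means, used here to form superclasses."""
    r = np.random.default_rng(seed)
    centers = points[r.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each class mean to its nearest superclass center.
        dists = np.linalg.norm(points[:, None] - centers[None], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# Step 2 of the paper: cluster superclasses and pool per-superclass variance.
centers, labels = kmeans(many_shot_means, k=3)
super_vars = np.stack([many_shot_vars[labels == j].mean(axis=0) for j in range(3)])

# Step 1 in the paper estimates the few-shot class mean with a regressor
# trained on many-shot classes; here it is just a fixed toy vector.
few_shot_mean = np.array([1.0, -2.0])
parent = np.linalg.norm(centers - few_shot_mean, axis=1).argmin()

# "Generator" stand-in: sample augmented features from a Gaussian centered
# at the estimated few-shot mean with the inherited superclass variance.
augmented = rng.normal(loc=few_shot_mean,
                       scale=np.sqrt(super_vars[parent]),
                       size=(50, 2))
print(augmented.shape)
```

The augmented features would then be mixed with the real few-shot seeds to train the downstream classifier; the inherited variance is what gives the generated samples more diversity than the seeds alone provide.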
