Paper Title

Large Margin Mechanism and Pseudo Query Set on Cross-Domain Few-Shot Learning

Authors

Jia-Fong Yeh, Hsin-Ying Lee, Bing-Chen Tsai, Yi-Rong Chen, Ping-Chia Huang, Winston H. Hsu

Abstract


In recent years, few-shot learning problems have received a lot of attention. While methods in most previous works were trained and tested on datasets in one single domain, cross-domain few-shot learning is a brand-new branch of few-shot learning problems, where models handle datasets in different domains between the training and testing phases. In this paper, to solve the problem that the model is pre-trained (meta-trained) on a single dataset while fine-tuned on datasets in four different domains, including common objects, satellite images, and medical images, we propose a novel large-margin fine-tuning method (LMM-PQS), which generates pseudo query images from support images and fine-tunes the feature extraction modules with a large margin mechanism inspired by methods in face recognition. According to the experiment results, LMM-PQS surpasses the baseline models and demonstrates that our approach is robust and can easily adapt pre-trained models to new domains with little data.
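The two ingredients named in the abstract, a face-recognition-style large-margin loss and pseudo queries generated from the support set, can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the additive-cosine-margin form (CosFace-style), the noise-jitter augmentation, and all parameter values are assumptions.

```python
import numpy as np

def large_margin_logits(features, weights, labels, margin=0.35, scale=30.0):
    """Cosine-similarity logits with an additive margin on the target class
    (CosFace-style; an assumed instance of a large margin mechanism)."""
    # Normalize features and class weights to unit length -> cosine similarity
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    cos = f @ w.T  # shape (N, C)
    # Subtract the margin only from each sample's ground-truth class,
    # which tightens the decision boundary during fine-tuning
    adjusted = cos.copy()
    adjusted[np.arange(len(labels)), labels] -= margin
    return scale * adjusted

def pseudo_query_set(support, n_aug=4, noise_std=0.05, seed=0):
    """Build pseudo queries by perturbing support examples (hypothetical
    augmentation standing in for the paper's image transformations)."""
    rng = np.random.default_rng(seed)
    return np.concatenate(
        [support + rng.normal(0.0, noise_std, support.shape) for _ in range(n_aug)]
    )
```

The pseudo queries let the fine-tuning stage evaluate a query-style loss even though only support images are available for the novel classes, while the margin term forces each class's features away from the other class prototypes.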
