Paper Title


Learning to Learn Variational Semantic Memory

Authors

Xiantong Zhen, Yingjun Du, Huan Xiong, Qiang Qiu, Cees G. M. Snoek, Ling Shao

Abstract


In this paper, we introduce variational semantic memory into meta-learning to acquire long-term knowledge for few-shot learning. The variational semantic memory accrues and stores semantic information for the probabilistic inference of class prototypes in a hierarchical Bayesian framework. The semantic memory is grown from scratch and gradually consolidated by absorbing information from tasks it experiences. By doing so, it is able to accumulate long-term, general knowledge that enables it to learn new concepts of objects. We formulate memory recall as the variational inference of a latent memory variable from addressed contents, which offers a principled way to adapt the knowledge to individual tasks. Our variational semantic memory, as a new long-term memory module, confers principled recall and update mechanisms that enable semantic information to be efficiently accrued and adapted for few-shot learning. Experiments demonstrate that the probabilistic modelling of prototypes achieves a more informative representation of object classes compared to deterministic vectors. The consistent new state-of-the-art performance on four benchmarks shows the benefit of variational semantic memory in boosting few-shot recognition.
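To make the recall mechanism described above concrete, here is a minimal, hypothetical sketch of memory recall as variational inference: memory contents are addressed by similarity to a class-level query, the addressed read-out parameterizes a Gaussian over a latent memory variable, and a reparameterized sample conditions the class prototype. The random linear maps standing in for the amortized inference network, and the additive way the sample enriches the prototype, are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def recall_memory(memory, query, dim):
    """Hypothetical sketch: recall a latent memory variable z by
    variational inference from content-addressed memory slots.

    memory: (M, D) array of stored semantic contents.
    query:  (D,) class-level query (e.g. mean of support features).
    Returns a sample z and the inferred Gaussian parameters.
    """
    # Content-based addressing: softmax over similarity scores.
    scores = memory @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    addressed = weights @ memory            # (D,) weighted read-out

    # Stand-in for an amortized inference network (assumption):
    # linear maps from the read-out to Gaussian mean and log-variance.
    W_mu, W_lv = rng.normal(size=(2, dim, dim)) * 0.1
    mu = addressed @ W_mu
    log_var = addressed @ W_lv

    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
    eps = rng.normal(size=dim)
    z = mu + np.exp(0.5 * log_var) * eps
    return z, mu, log_var

# Toy usage: a memory of 5 slots, query built from 3 "support" features.
D = 4
memory = rng.normal(size=(5, D))
support = rng.normal(size=(3, D))
query = support.mean(axis=0)
z, mu, log_var = recall_memory(memory, query, D)

# The recalled variable conditions the class prototype; a simple
# additive combination is used here purely for illustration.
prototype = query + z
```

In the actual model, the inference network would be learned end-to-end and the prototype distribution inferred within the hierarchical Bayesian framework; this sketch only shows the addressing-then-sampling flow.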
