Paper Title

Meta-Learning with Shared Amortized Variational Inference

Paper Authors

Ekaterina Iakovleva, Jakob Verbeek, Karteek Alahari

Paper Abstract

We propose a novel amortized variational inference scheme for an empirical Bayes meta-learning model, where model parameters are treated as latent variables. We learn the prior distribution over model parameters conditioned on limited training data using a variational autoencoder approach. Our framework proposes sharing the same amortized inference network between the conditional prior and variational posterior distributions over the model parameters. While the posterior leverages both the labeled support and query data, the conditional prior is based only on the labeled support data. We show that in earlier work, relying on Monte-Carlo approximation, the conditional prior collapses to a Dirac delta function. In contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on the miniImageNet, CIFAR-FS and FC100 datasets, and present results demonstrating its advantages over previous work.
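As a reading aid, the per-task objective described in the abstract can be sketched as an evidence lower bound in which a single amortized inference network, written here as f_\phi (a hypothetical name, not taken from the paper), maps a labeled set to a distribution over the model parameters \theta. With D^s denoting a task's support set and D^q its query set, a minimal sketch consistent with the abstract is:

\mathcal{L}(\phi) = \mathbb{E}_{q_\phi(\theta \mid D^s \cup D^q)}\big[\log p(y^q \mid x^q, \theta)\big] - \mathrm{KL}\big(q_\phi(\theta \mid D^s \cup D^q)\,\|\,p_\phi(\theta \mid D^s)\big), \qquad q_\phi(\cdot \mid D) = f_\phi(D).

Here the conditional prior p_\phi(\theta \mid D^s) is computed from the support set alone, while the variational posterior q_\phi(\theta \mid D^s \cup D^q) also sees the query data, and both are produced by the same shared inference network. Per the abstract, keeping the prior a full learned distribution rather than a Monte-Carlo point estimate is what prevents it from collapsing to a Dirac delta and preserves uncertainty over the model parameters.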
