Paper Title

Transfer Learning using Neural Ordinary Differential Equations

Authors

Rajath S, Sumukh Aithal K, Natarajan Subramanyam

Abstract

A concept of using Neural Ordinary Differential Equations (NODE) for transfer learning is introduced. In this paper, we use EfficientNets to explore transfer learning on the CIFAR-10 dataset, and we use NODE blocks for fine-tuning the model. Fine-tuning with NODE provides greater stability during training and validation. These continuous-depth blocks also allow a trade-off between numerical precision and speed. Using Neural ODEs for transfer learning results in much more stable convergence of the loss function.
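The sketch below illustrates the general idea described in the abstract: a pretrained feature extractor is frozen and a continuous-depth (NODE) block plus a classifier head are fine-tuned on CIFAR-10, with the ODE solver tolerances giving the precision/speed trade-off. This is a minimal illustration, not the authors' implementation; it assumes torchvision's EfficientNet-B0 and the third-party torchdiffeq package, and the layer sizes and tolerances are arbitrary choices.

```python
import torch
import torch.nn as nn
from torchvision import models
from torchdiffeq import odeint  # pip install torchdiffeq


class ODEFunc(nn.Module):
    """Dynamics f(t, h) that define the continuous-depth block."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, t, h):
        return self.net(h)


class ODEBlock(nn.Module):
    """Integrates the dynamics from t=0 to t=1; rtol/atol trade accuracy for speed."""
    def __init__(self, func, rtol=1e-3, atol=1e-3):
        super().__init__()
        self.func, self.rtol, self.atol = func, rtol, atol
        self.t = torch.tensor([0.0, 1.0])

    def forward(self, h):
        out = odeint(self.func, h, self.t.to(h.device), rtol=self.rtol, atol=self.atol)
        return out[-1]  # hidden state at t = 1


# Frozen pretrained backbone; only the NODE block and classifier are fine-tuned.
backbone = models.efficientnet_b0(weights="IMAGENET1K_V1")
backbone.classifier = nn.Identity()   # expose the 1280-dim pooled features
for p in backbone.parameters():
    p.requires_grad = False

model = nn.Sequential(
    backbone,                  # (N, 1280) features
    ODEBlock(ODEFunc(1280)),   # continuous-depth fine-tuning block
    nn.Linear(1280, 10),       # CIFAR-10 classifier head
)
```

Lowering rtol/atol makes the adaptive solver take fewer steps (faster, less precise), while tightening them increases precision at the cost of speed, which is the trade-off the abstract refers to.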
