Paper Title

Taylor-Lagrange Neural Ordinary Differential Equations: Toward Fast Training and Evaluation of Neural ODEs

Paper Authors

Franck Djeumou, Cyrus Neary, Eric Goubault, Sylvie Putot, Ufuk Topcu

Paper Abstract

Neural ordinary differential equations (NODEs) -- parametrizations of differential equations using neural networks -- have shown tremendous promise in learning models of unknown continuous-time dynamical systems from data. However, every forward evaluation of a NODE requires numerical integration of the neural network used to capture the system dynamics, making their training prohibitively expensive. Existing works rely on off-the-shelf adaptive step-size numerical integration schemes, which often require an excessive number of evaluations of the underlying dynamics network to obtain sufficient accuracy for training. By contrast, we accelerate the evaluation and the training of NODEs by proposing a data-driven approach to their numerical integration. The proposed Taylor-Lagrange NODEs (TL-NODEs) use a fixed-order Taylor expansion for numerical integration, while also learning to estimate the expansion's approximation error. As a result, the proposed approach achieves the same accuracy as adaptive step-size schemes while employing only low-order Taylor expansions, thus greatly reducing the computational cost necessary to integrate the NODE. A suite of numerical experiments, including the modeling of dynamical systems, image classification, and density estimation, demonstrates that TL-NODEs can be trained more than an order of magnitude faster than state-of-the-art approaches, without any loss in performance.
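
For intuition, the sketch below illustrates the core idea the abstract describes: advance the ODE with a fixed-order Taylor expansion of the solution, then add a learned estimate of the truncated (Lagrange remainder) term in place of adaptive step-size control. This is a minimal illustration written in JAX, not the authors' implementation: the truncation at second order, the function names `taylor_lagrange_step` and `integrate`, and the call signature `remainder_net(x, h)` are all assumptions made for this example.

```python
import jax
import jax.numpy as jnp


def taylor_lagrange_step(f, remainder_net, x, h):
    """One fixed-order (here: second-order) Taylor step with a learned
    remainder correction. A minimal sketch of the idea only; the
    remainder network and its signature are illustrative assumptions.

    f             -- learned dynamics network, f(x) ~ dx/dt (autonomous)
    remainder_net -- small network estimating the truncation remainder
    x             -- current state, shape (d,)
    h             -- fixed step size
    """
    f0 = f(x)  # first-order term: f(x)
    # Second-order term: d/dt f(x(t)) = J_f(x) f(x), obtained with one
    # Jacobian-vector product instead of materializing the Jacobian.
    _, f1 = jax.jvp(f, (x,), (f0,))
    # Truncated second-order Taylor expansion of the solution flow.
    x_taylor = x + h * f0 + 0.5 * h**2 * f1
    # Learned remainder estimate, scaled like the O(h^3) truncation error.
    return x_taylor + h**3 * remainder_net(x, h)


def integrate(f, remainder_net, x0, h, num_steps):
    """Roll the fixed-step scheme forward with lax.scan."""
    def step(x, _):
        x_next = taylor_lagrange_step(f, remainder_net, x, h)
        return x_next, x_next

    _, trajectory = jax.lax.scan(step, x0, None, length=num_steps)
    return trajectory
```

Note the contrast with adaptive solvers: each step here costs a fixed, small number of dynamics evaluations (one call to f plus one JVP, plus the cheap remainder network), so the cost of a forward pass is constant rather than growing with the tolerance requested from an adaptive step-size scheme.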
