Paper Title
TorchDyn: A Neural Differential Equations Library
Paper Authors
Paper Abstract
Continuous-depth learning has recently emerged as a novel perspective on deep learning, improving performance in tasks related to dynamical systems and density estimation. Core to these approaches is the neural differential equation, whose forward pass is the solution of an initial value problem parametrized by a neural network. Unlocking the full potential of continuous-depth models requires a different set of software tools, owing to peculiar differences compared to standard discrete neural networks; e.g., inference must be carried out via numerical solvers. We introduce TorchDyn, a PyTorch library dedicated to continuous-depth learning, designed to elevate neural differential equations to be as accessible as regular plug-and-play deep learning primitives. This objective is achieved by identifying and subdividing different variants into common essential components, which can be combined and freely repurposed to obtain complex compositional architectures. TorchDyn further offers step-by-step tutorials and benchmarks designed to guide researchers and contributors.
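To make the central idea concrete, here is a minimal, library-free sketch (not TorchDyn's actual API) of what "the forward pass is the solution of an initial value problem" means: the input x becomes the initial condition z(0) = x of dz/dt = f_theta(z), and the output is z(1), obtained here with a fixed-step Euler solver. The two-layer vector field, its weights, and the step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer "vector field" f_theta(z) with illustrative random weights.
W1, b1 = rng.normal(size=(2, 2)), np.zeros(2)
W2, b2 = rng.normal(size=(2, 2)), np.zeros(2)

def vector_field(z):
    return np.tanh(z @ W1 + b1) @ W2 + b2

def neural_ode_forward(x, t_span=(0.0, 1.0), steps=100):
    """Euler-integrate dz/dt = f_theta(z) over t_span; z at the final time
    is the model's output. Real libraries substitute adaptive solvers
    (e.g., Dormand-Prince) and adjoint-based gradients for this loop."""
    t0, t1 = t_span
    h = (t1 - t0) / steps
    z = np.asarray(x, dtype=float)
    for _ in range(steps):
        z = z + h * vector_field(z)
    return z

out = neural_ode_forward([1.0, 0.0])
```

In a full framework the solver loop above is replaced by a numerical-integration backend, which is exactly the software difference from discrete networks that the abstract highlights.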