Paper Title

Lightweight and Effective Tensor Sensitivity for Atomistic Neural Networks

Paper Authors

Michael Chigaev, Justin S. Smith, Steven Anaya, Benjamin Nebgen, Matthew Bettencourt, Kipton Barros, Nicholas Lubbers

Paper Abstract

Atomistic machine learning focuses on the creation of models which obey fundamental symmetries of atomistic configurations, such as permutation, translation, and rotation invariances. In many of these schemes, translation and rotation invariance are achieved by building on scalar invariants, e.g., distances between atom pairs. There is growing interest in molecular representations that work internally with higher rank rotational tensors, e.g., vector displacements between atoms, and tensor products thereof. Here we present a framework for extending the Hierarchically Interacting Particle Neural Network (HIP-NN) with Tensor Sensitivity information (HIP-NN-TS) from each local atomic environment. Crucially, the method employs a weight tying strategy that allows direct incorporation of many-body information while adding very few model parameters. We show that HIP-NN-TS is more accurate than HIP-NN, with negligible increase in parameter count, for several datasets and network sizes. As the dataset becomes more complex, tensor sensitivities provide greater improvements to model accuracy. In particular, HIP-NN-TS achieves a record mean absolute error of 0.927 kcal/mol for conformational energy variation on the challenging COMP6 benchmark, which includes a broad set of organic molecules. We also compare the computational performance of HIP-NN-TS to HIP-NN and other models in the literature.
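To make the abstract's distinction concrete: scalar-invariant schemes sense only interatomic distances, while tensor-sensitive schemes work with vector displacements and their tensor (outer) products, whose full contractions are rotation-invariant quantities that also encode angular information. The sketch below is an illustration of that general idea, not the HIP-NN-TS architecture itself; all names in it are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Displacement vectors from a central atom to two neighbors.
r1 = rng.normal(size=3)
r2 = rng.normal(size=3)

# Rank-2 tensor features: outer products of the displacements.
T1 = np.outer(r1, r1)
T2 = np.outer(r2, r2)

# Full contraction (Frobenius inner product) of the two tensors.
# It equals (r1 . r2)^2, which depends on the angle between the
# neighbors -- information a pair distance alone does not carry.
invariant = np.tensordot(T1, T2)

# Apply a random orthogonal transform (rotation up to reflection),
# obtained from a QR decomposition, and verify invariance.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
T1_rot = np.outer(Q @ r1, Q @ r1)
T2_rot = np.outer(Q @ r2, Q @ r2)
invariant_rot = np.tensordot(T1_rot, T2_rot)

assert np.isclose(invariant, invariant_rot)
assert np.isclose(invariant, np.dot(r1, r2) ** 2)
```

The point of the sketch is the abstract's motivation for tensor sensitivity: contractions of rank-2 (and higher) tensor features remain invariant under rotation while capturing many-body angular structure.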
