Paper Title

Improving neural network predictions of material properties with limited data using transfer learning

Paper Authors

Schuyler Krawczuk, Daniele Venturi

Paper Abstract

We develop new transfer learning algorithms to accelerate prediction of material properties from ab initio simulations based on density functional theory (DFT). Transfer learning has been successfully utilized for data-efficient modeling in applications other than materials science, and it allows transferable representations learned from large datasets to be repurposed for learning new tasks even with small datasets. In the context of materials science, this opens the possibility to develop generalizable neural network models that can be repurposed on other materials, without the need of generating a large (computationally expensive) training set of materials properties. The proposed transfer learning algorithms are demonstrated on predicting the Gibbs free energy of light transition metal oxides.
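To make the workflow concrete, below is a minimal, hypothetical PyTorch sketch of the general transfer-learning idea the abstract describes: pretrain a network on a large source dataset, then freeze the learned feature extractor and fine-tune only the output layer on a small target dataset. The synthetic data, architecture, and helper names (make_data, train) are illustrative stand-ins and do not reproduce the authors' actual algorithms or DFT datasets.

import torch
import torch.nn as nn

torch.manual_seed(0)

def make_data(n, shift=0.0, noise=0.05):
    # Synthetic stand-in for "material descriptor -> property" pairs.
    x = torch.rand(n, 8)
    y = torch.sin(x.sum(dim=1, keepdim=True) + shift) + noise * torch.randn(n, 1)
    return x, y

def train(model, x, y, params, epochs=300, lr=1e-3):
    # Full-batch training loop; only the parameters passed in are updated.
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    loss = None
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

# Shared feature extractor (the "transferable representation") plus a task-specific head.
features = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
head = nn.Linear(64, 1)
model = nn.Sequential(features, head)

# 1) Pretrain on a large, cheap source dataset.
x_src, y_src = make_data(5000)
train(model, x_src, y_src, model.parameters())

# 2) Transfer: freeze the feature extractor and fine-tune only the head on a
#    small target dataset (analogous to a new material with few DFT samples).
for p in features.parameters():
    p.requires_grad = False
head.reset_parameters()
x_tgt, y_tgt = make_data(50, shift=0.3)
mse = train(model, x_tgt, y_tgt, head.parameters(), epochs=500)
print(f"fine-tuned MSE on the small target set: {mse:.4f}")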
