2023-07-28, 26-100
Scaling up atomistic simulation models is hampered by the expensive calculation of interatomic forces. Machine learning potentials address this challenge, promising the accuracy of first-principles methods at a lower computational cost. This talk presents, as part of the research activities of the CESMIX project, how Julia is used to automate the composition of a novel neural potential based on the Atomic Cluster Expansion.
Simplifying the composition of machine learning (ML) interatomic potentials is key to finding combinations of data, descriptors, and learning methods that exceed the accuracy and performance of the state of the art. The Julia programming language, and its burgeoning atomistic ecosystem, can facilitate the composition of neural networks and other ML models with cutting-edge interatomic potentials, through mechanisms such as multiple dispatch, differentiable programming, ML and GPU abstractions, as well as specialized scientific computing libraries. Here, the use of Julia to automate the composition of a novel neural potential based on the Atomic Cluster Expansion (ACE) is presented as part of the research activities of the Center for the Exascale Simulation of Materials in Extreme Environments (CESMIX). The proposed scheme aims to facilitate the execution of parallel fitting experiments that search for hyper-parameter values that significantly improve training and test metrics (e.g., MAE, MSE, RSQ, mean cos) for energies and forces with respect to different Density Functional Theory (DFT) data sets.
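To make the fit metrics named above concrete, the following is a minimal illustrative sketch (written in Python for brevity; the CESMIX tooling itself is in Julia, and this is not its API) of how MAE, MSE, RSQ, and the mean cosine similarity between reference and predicted forces could be computed. All numeric values are invented toy data, not results from the project.

```python
import math

def mae(y, yhat):
    """Mean absolute error between reference and predicted values."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mse(y, yhat):
    """Mean squared error between reference and predicted values."""
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

def rsq(y, yhat):
    """Coefficient of determination (R^2) of predictions vs. reference."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

def mean_cos(f, fhat):
    """Mean cosine similarity between reference and predicted force vectors."""
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)
    return sum(cos(u, v) for u, v in zip(f, fhat)) / len(f)

# Toy data: per-configuration energies and per-atom force vectors.
e_dft  = [-1.10, -0.95, -1.30]
e_pred = [-1.05, -1.00, -1.28]
f_dft  = [[0.1, 0.0, -0.2], [0.0, 0.3, 0.1]]
f_pred = [[0.12, -0.01, -0.19], [0.02, 0.28, 0.11]]

print("MAE:", mae(e_dft, e_pred))
print("MSE:", mse(e_dft, e_pred))
print("RSQ:", rsq(e_dft, e_pred))
print("mean cos:", mean_cos(f_dft, f_pred))
```

In a hyper-parameter search, metrics like these would be evaluated on both training and held-out test splits of the DFT data for each candidate configuration, so that fits can be compared and the best-performing combinations selected.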
More information about composing ML potentials in Julia here.
Take a look at our growing atomistic CESMIX suite on GitHub here.
Postdoctoral Associate at Julia Lab, MIT.