Cluster expansion by transfer learning for phase stability predictions

Bibliographic details
Published in: Computational Materials Science, 2024-06, Vol. 242, p. 113073, Article 113073
Main authors: Dana, A., Mu, L., Gelin, S., Sinnott, S.B., Dabo, I.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Recent progress towards universal machine-learned interatomic potentials holds considerable promise for materials discovery. Yet the accuracy of these potentials for predicting phase stability may still be limited. In contrast, cluster expansions provide accurate phase stability predictions but are computationally demanding to parameterize from first principles, especially for structures of low dimension or with a large number of components, such as interfaces or multimetal catalysts. We overcome this trade-off via transfer learning. Using Bayesian inference, we incorporate prior statistical knowledge from machine-learned and physics-based potentials, enabling us to sample the most informative configurations and to efficiently fit first-principles cluster expansions. This algorithm is tested on Pt:Ni, showing robust convergence of the mixing energies as a function of sample size with reduced statistical fluctuations.
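The workflow summarized in the abstract (build a prior on the cluster-expansion coefficients from a cheaper surrogate potential, update it with a small number of first-principles energies via Bayesian linear regression, then pick the next configuration by posterior predictive variance) can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the correlation matrix, energies, sample sizes, and hyperparameters below are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: 40 configurations, 6 cluster functions.
# X is the correlation matrix; each row maps ECIs J to an energy E = X @ J.
n_cfg, n_clusters = 40, 6
X = rng.uniform(-1, 1, size=(n_cfg, n_clusters))
J_true = np.array([1.0, -0.5, 0.3, 0.1, -0.05, 0.02])   # "DFT-level" ECIs
# Surrogate (e.g. ML-potential) energies carry a systematic bias in the ECIs.
E_surrogate = X @ (J_true + 0.1 * rng.normal(size=n_clusters))
E_dft = X @ J_true + 0.01 * rng.normal(size=n_cfg)       # noisy first-principles data

# Step 1: fit the surrogate energies to get the prior mean of the ECIs.
J_prior, *_ = np.linalg.lstsq(X, E_surrogate, rcond=None)

# Step 2: Bayesian update with only 10 "DFT" calculations.
idx = rng.choice(n_cfg, size=10, replace=False)
Xs, Es = X[idx], E_dft[idx]
sigma2, tau2 = 0.01**2, 0.1**2        # assumed noise and prior variances
A = Xs.T @ Xs / sigma2 + np.eye(n_clusters) / tau2
b = Xs.T @ Es / sigma2 + J_prior / tau2
J_post = np.linalg.solve(A, b)        # posterior mean of the ECIs
cov_post = np.linalg.inv(A)           # posterior covariance

# Step 3: active sampling -- the most informative next configuration is the
# unsampled one with the largest posterior predictive variance.
var_pred = np.einsum('ij,jk,ik->i', X, cov_post, X)
var_pred[idx] = -np.inf
next_cfg = int(np.argmax(var_pred))
```

With a fixed seed the posterior mean lands much closer to the reference ECIs than the surrogate-derived prior, which mirrors the paper's point: the prior supplies cheap statistical structure, and a handful of targeted first-principles calculations correct its bias.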
ISSN: 0927-0256, 1879-0801
DOI: 10.1016/j.commatsci.2024.113073