Size‐Extensive Molecular Machine Learning with Global Representations
Author(s) -
Jung Hyunwook,
Stocker Sina,
Kunkel Christian,
Oberhofer Harald,
Han Byungchan,
Reuter Karsten,
Margraf Johannes T.
Publication year - 2020
Publication title -
ChemSystemsChem
Language(s) - English
Resource type - Journals
ISSN - 2570-4206
DOI - 10.1002/syst.201900052
Subject(s) - computer science, machine learning, artificial intelligence, theoretical computer science, statistical physics, mathematics, physics, materials science
Machine learning (ML) models are increasingly used in combination with electronic structure calculations to predict molecular properties at a much lower computational cost in high‐throughput settings. Such ML models require representations that encode the molecular structure, which are generally designed to respect the symmetries and invariances of the target property. However, size‐extensivity is usually not guaranteed for so‐called global representations. In this contribution, we show how extensivity can be built into global ML models using, e.g., the Many‐Body Tensor Representation. Properties of extensive and non‐extensive models for the atomization energy are systematically explored by training on small molecules and testing on small, medium, and large molecules. Our results show that non‐extensive models are only useful in the size range of their training set, whereas extensive models provide reasonable predictions across large size differences. Remaining sources of error for extensive models are discussed.
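The core idea of the abstract — that a model built on a representation which scales with system size extrapolates to larger molecules, while a size-normalized global representation does not — can be illustrated with a deliberately simplified sketch. The example below is not the paper's Many-Body Tensor Representation; it uses hypothetical per-atom energies and plain atom counts as a stand-in "representation", purely to demonstrate the extensivity argument.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: each "molecule" is described by counts of two atom types.
# The true atomization energy is extensive: a fixed contribution per atom.
E_C, E_H = -7.4, -1.1  # hypothetical per-atom energies (arbitrary units)

def make_molecules(n, max_atoms):
    counts = rng.integers(1, max_atoms, size=(n, 2)).astype(float)
    energies = counts @ np.array([E_C, E_H])
    return counts, energies

# Train on small molecules, test on much larger ones.
X_small, y_small = make_molecules(200, 8)
X_large, y_large = make_molecules(50, 80)

def fit_linear(X, y):
    # Least-squares fit without an intercept (an intercept would itself
    # break extensivity, since it does not scale with system size).
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Extensive representation: raw atom counts, which grow with the molecule.
w_ext = fit_linear(X_small, y_small)
err_ext = np.abs(X_large @ w_ext - y_large).mean()

# Non-extensive representation: composition fractions, which discard
# all information about molecular size.
def normalize(X):
    return X / X.sum(axis=1, keepdims=True)

w_non = fit_linear(normalize(X_small), y_small)
err_non = np.abs(normalize(X_large) @ w_non - y_large).mean()

print(f"extensive MAE on large molecules:     {err_ext:.3f}")
print(f"non-extensive MAE on large molecules: {err_non:.3f}")
```

The extensive model recovers the per-atom energies exactly and extrapolates with negligible error, while the normalized model's predictions stay in the energy range of the small training molecules and fail badly on the large test set — mirroring the behavior the abstract reports for global molecular representations.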