1.
J Chem Phys. 2020 Jul 28;153(4):044123.
Article in English | MEDLINE | ID: mdl-32752663

ABSTRACT

The emergence of machine learning methods in quantum chemistry provides new tools to revisit an old question: Can the predictive accuracy of electronic structure calculations be decoupled from their numerical bottlenecks? Previous attempts to answer this question have, among other approaches, given rise to semi-empirical quantum chemistry in minimal basis representation. We present an adaptation of the recently proposed SchNet for Orbitals (SchNOrb) deep convolutional neural network model [K. T. Schütt et al., Nat. Commun. 10, 5024 (2019)] for electronic wave functions in an optimized quasi-atomic minimal basis representation. For five organic molecules ranging from 5 to 13 heavy atoms, the model accurately predicts molecular orbital energies and wave functions and provides access to derived properties for chemical bonding analysis. Particularly for larger molecules, the model outperforms the original atomic-orbital-based SchNOrb method in terms of accuracy and scaling. We conclude by discussing the future potential of this approach in quantum chemical workflows.
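
The chemical bonding analysis mentioned above amounts to standard post-processing of the predicted electronic structure. As an illustration only (not the authors' code), the sketch below computes Löwdin atomic populations and partial charges from a hypothetical predicted density matrix P and overlap matrix S in the minimal basis; the ao_to_atom mapping and nuclear charges Z are assumed inputs.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def loewdin_populations(P, S, ao_to_atom, Z):
    """Löwdin atomic populations and partial charges.

    P          : density matrix in the (minimal) AO basis
    S          : AO overlap matrix
    ao_to_atom : index of the atom each AO belongs to
    Z          : nuclear charges of the atoms
    """
    S_half = fractional_matrix_power(S, 0.5)         # S^(1/2)
    PL = S_half @ P @ S_half                         # Löwdin-orthogonalised density
    ao_pop = np.diag(PL)                             # population per orbital
    atom_pop = np.zeros(len(Z))
    for mu, a in enumerate(ao_to_atom):
        atom_pop[a] += ao_pop[mu]                    # sum orbital populations per atom
    charges = np.asarray(Z, float) - atom_pop        # partial charges q_A = Z_A - N_A
    return atom_pop, charges
```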

2.
Nat Commun. 2019 Nov 15;10(1):5024.
Article in English | MEDLINE | ID: mdl-31729373

ABSTRACT

Machine learning advances chemistry and materials science by enabling large-scale exploration of chemical space based on quantum chemical calculations. While these models supply fast and accurate predictions of atomistic chemical properties, they do not explicitly capture the electronic degrees of freedom of a molecule, which limits their applicability for reactive chemistry and chemical analysis. Here we present a deep learning framework for the prediction of the quantum mechanical wavefunction in a local basis of atomic orbitals, from which all other ground-state properties can be derived. This approach retains full access to the electronic structure via the wavefunction at force-field-like efficiency and captures quantum mechanics in an analytically differentiable representation. On several examples, we demonstrate that this opens promising avenues for the inverse design of molecular structures with targeted electronic properties and a clear path towards increased synergy between machine learning and quantum chemistry.
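
The ground-state properties follow from the predicted wavefunction by standard linear algebra. As a hedged sketch of that downstream step, assuming the Hamiltonian matrix H and overlap matrix S in the local atomic-orbital basis have already been predicted by such a model for a given geometry, the snippet below solves the generalized eigenvalue problem H C = S C ε with SciPy to obtain orbital energies, coefficients, and the HOMO-LUMO gap.

```python
import numpy as np
from scipy.linalg import eigh

def orbitals_from_prediction(H, S, n_electrons):
    """Orbital energies and coefficients from predicted H and S (closed shell assumed)."""
    # Generalized eigenvalue problem H C = S C eps in the AO basis
    eps, C = eigh(H, S)
    n_occ = n_electrons // 2
    homo, lumo = eps[n_occ - 1], eps[n_occ]
    gap = lumo - homo                       # HOMO-LUMO gap, a typical derived property
    return eps, C, gap
```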

3.
J Chem Theory Comput. 2019 Jan 8;15(1):448-455.
Article in English | MEDLINE | ID: mdl-30481453

ABSTRACT

SchNetPack is a toolbox for the development and application of deep neural networks that predict potential energy surfaces and other quantum-chemical properties of molecules and materials. It contains basic building blocks of atomistic neural networks, manages their training, and provides simple access to common benchmark datasets, allowing for easy implementation and evaluation of new models. Currently, SchNetPack includes implementations of (weighted) atom-centered symmetry functions and the deep tensor neural network SchNet, as well as ready-to-use scripts for training these models on molecule and material datasets. Built on the PyTorch deep learning framework, SchNetPack can efficiently apply the neural networks to large datasets with millions of reference calculations and parallelize models across multiple GPUs. Finally, SchNetPack provides an interface to the Atomic Simulation Environment to make trained models easily accessible to researchers who are not yet familiar with neural networks.
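
As a rough usage sketch only, the snippet below follows the workflow the abstract describes (benchmark dataset access, a SchNet representation, an atom-wise output module, and an assembled atomistic model); the class names and arguments are taken from the SchNetPack 1.x tutorials that accompanied this paper and may differ in other releases of the package.

```python
import schnetpack as spk

# Benchmark dataset access (downloads and converts QM9 on first use).
qm9 = spk.datasets.QM9('./qm9.db', download=True)
train, val, test = spk.train_test_split(qm9, num_train=100000, num_val=10000,
                                        split_file='split.npz')
loader = spk.AtomsLoader(train, batch_size=100, shuffle=True)

# SchNet representation plus an atom-wise output module for a scalar target.
representation = spk.representation.SchNet(n_atom_basis=128, n_filters=128,
                                           n_interactions=6)
output = spk.atomistic.Atomwise(n_in=128)   # n_in matches the feature size above
model = spk.AtomisticModel(representation=representation, output_modules=output)

# A trained model can be wrapped as an ASE calculator via spk.interfaces
# (the Atomic Simulation Environment interface mentioned in the abstract).
```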

4.
J Chem Phys. 2018 Jun 28;148(24):241709.
Article in English | MEDLINE | ID: mdl-29960372

ABSTRACT

We introduce weighted atom-centered symmetry functions (wACSFs) as descriptors of a chemical system's geometry for use in the prediction of chemical properties such as enthalpies or potential energies via machine learning. The wACSFs are based on conventional atom-centered symmetry functions (ACSFs) but overcome the undesirable scaling of the latter with an increasing number of different elements in a chemical system. The performance of the two descriptors is compared by using them as inputs to high-dimensional neural network potentials (HDNNPs), with the molecular structures and associated enthalpies of the 133 855 molecules in the QM9 database (containing up to five different elements) serving as reference data. A substantially smaller number of wACSFs than ACSFs is needed to obtain a comparable spatial resolution of the molecular structures, and this smaller set of wACSFs leads to significantly better generalization performance of the machine learning potential than the large set of conventional ACSFs. Furthermore, we show that the intrinsic parameters of the descriptors can in principle be optimized with a genetic algorithm in a highly automated manner. For the wACSFs employed here, however, we find that a simple empirical parametrization scheme is sufficient to obtain HDNNPs with high accuracy.
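
To make the element weighting concrete, here is a minimal NumPy sketch of a radial wACSF for one central atom, assuming the simple weighting g(Z_j) = Z_j by atomic number and a cosine cutoff; a conventional ACSF would instead require a separate set of such functions for every element combination, which is the unfavorable scaling the wACSFs avoid.

```python
import numpy as np

def cutoff(r, r_c):
    """Cosine cutoff f_c(r) that decays smoothly to zero at r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def radial_wacsf(r_i, positions, Z, eta, mu, r_c):
    """Radial weighted ACSF centered on the atom at r_i.

    Each neighbour j is weighted by an element-dependent factor g(Z_j);
    here we assume the simple choice g(Z_j) = Z_j, so one function covers
    all elements instead of one function set per element pair.
    """
    d = np.linalg.norm(positions - r_i, axis=1)   # distances to all atoms
    mask = d > 1e-10                              # exclude the central atom itself
    d, w = d[mask], np.asarray(Z, float)[mask]
    return np.sum(w * np.exp(-eta * (d - mu) ** 2) * cutoff(d, r_c))
```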
