Building Blocks of Neural Network Intermolecular Interaction Potentials
Author(s)
Metcalf, Derek
Abstract
The essence of the computational sciences is to find compressed, silicon-ready rulesets of the natural world and use them to predict all of the complexities of reality without actually observing it. Practically, no lossless variant of such a compression is compatible with the computers of today, so we instead focus on choosing a set of approximations that induce nicely cancelling errors. One popular way of concocting approximations is to use well-established physical principles (such as the Schrödinger equation for computing properties of atomistic systems) and progressively remove complexity without introducing dependence on real-world observations. These "first principles" approaches contrast with empirical methods, which often use parameters to encourage their simpler models to match experimental data at reduced computational expense. Although less conceptually pleasant, some empiricism is a mainstay of computational chemistry as a result of the success and usefulness of molecular mechanics (MM), density functional theory (DFT), and, recently, machine learning (ML). This thesis introduces developments in machine learning models, specifically neural networks, that seek to predict the strength of interactions between molecules. We further discuss neural networks imbued with physics, application domains within pharmaceutical discovery, and the all-important data upon which our models are parameterized.
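As an illustration of the general idea described above (and not of the thesis's actual architecture, which the abstract does not specify), one can sketch a minimal neural network that maps a hypothetical feature vector for each atom pair to a pairwise energy contribution, summing the contributions to obtain a total intermolecular interaction energy. All parameters and feature choices below are made up for demonstration.

```python
# Toy sketch of a neural network intermolecular interaction potential.
# Assumptions (not from the source): each atom pair is described by a
# 3-component feature vector, and weights are random rather than trained.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "learned" parameters (random here, for illustration only).
W1 = rng.normal(size=(3, 8))   # pair features -> hidden units
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden units -> pairwise energy
b2 = np.zeros(1)

def pair_energy(features):
    """Energy contribution of one atom pair from its feature vector."""
    hidden = np.tanh(features @ W1 + b1)
    return float(hidden @ W2 + b2)

def interaction_energy(pair_features):
    """Total interaction energy as a sum over atom-pair contributions."""
    return sum(pair_energy(f) for f in pair_features)

# Three made-up atom pairs, each with 3 descriptor values.
pairs = rng.normal(size=(3, 3))
print(interaction_energy(pairs))
```

The sum-over-pairs form mirrors a common design choice in interaction potentials: the total energy decomposes additively over local contributions, which makes the model size-extensive; a trained model would fit `W1`, `b1`, `W2`, `b2` to reference quantum-chemical data rather than draw them at random.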
Date
2022-10-17
Resource Type
Text
Resource Subtype
Dissertation