NAGL2 Neural Network Charge Model Phase 1

Driver

@Alexandra McIsaac

Approver

@Lily Wang @Brent Westbrook

Contributors

 

Other stakeholders

@David Mobley, @Michael Shirts, @Daniel Cole

Objective

A neural network charge model that can assign conformer-independent charges to both small molecules and large systems, at a higher level of theory than AM1-BCC

Time frame

?

Key outcomes

A neural network charge model that:

  • Is trained on data at a higher level of QM theory than AM1-BCC, including polarization effects from a solvent model

  • Can accurately assign charges to small molecules and large systems at a reasonable speed

  • Assigns charges that perform better in simulation than AM1-BCC

  • Corrects issues with sulfur and phosphorus charges

A force field incorporating:

  • NAGL2 charges

  • re-trained vdW terms

  • re-trained valence terms

Key metrics

  • Good reproduction of the underlying data, defined as equivalent or better testing error on ESPs, dipoles, and quadrupoles at the NAGL2 level of theory, compared to NAGL’s testing error against AM1-BCC (see the sketch after this list)

  • Improved performance on “real-world” benchmarks compared to NAGL/AM1-BCC ELF10 (e.g. solvation free energies, protein-ligand benchmarks, or other similar targets), especially for hypervalent atoms
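
The ESP and multipole comparisons above reduce to standard point-charge formulas. The following is a minimal sketch (Python/NumPy) of how such a comparison could be computed; the coordinates, charges, grid, and “QM” reference values are hypothetical placeholders, and in practice the reference ESP would come from the QM electron density rather than from reference point charges. This is not the project’s actual evaluation pipeline.

    # Minimal sketch: compare a charge model's ESP and dipole against reference values.
    # All inputs (coordinates, charges, grid, reference charges) are hypothetical placeholders.
    import numpy as np

    def point_charge_esp(coords, charges, grid):
        # ESP at each grid point j: V_j = sum_i q_i / |r_j - r_i|  (atomic units)
        r = np.linalg.norm(grid[:, None, :] - coords[None, :, :], axis=-1)
        return (charges[None, :] / r).sum(axis=1)

    def point_charge_dipole(coords, charges):
        # Dipole vector mu = sum_i q_i r_i (origin-independent only for neutral molecules)
        return (charges[:, None] * coords).sum(axis=0)

    coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])             # two atoms, Bohr
    grid = np.random.default_rng(0).normal(scale=5.0, size=(500, 3))  # stand-in ESP grid
    qm_charges = np.array([-0.30, 0.30])                              # stand-in for the QM target
    model_charges = np.array([-0.25, 0.25])                           # stand-in model charges

    esp_rmse = np.sqrt(np.mean(
        (point_charge_esp(coords, model_charges, grid)
         - point_charge_esp(coords, qm_charges, grid)) ** 2))
    dipole_error = np.linalg.norm(
        point_charge_dipole(coords, model_charges) - point_charge_dipole(coords, qm_charges))
    print(f"ESP RMSE: {esp_rmse:.4f} a.u., dipole error: {dipole_error:.4f} a.u.")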

Status

In progress

GitHub repo

Slack channel

https://openforcefieldgroup.slack.com/archives/CDR1P66Q2

Designated meeting

FF fitting meeting

Released force field

Publication

 

Problem Statement and Objective

AM1-BCC charges are trained to reproduce RESP charges, which are calculated at a low level of QM theory (HF/6-31G*) and rely on that level of theory’s overpolarization to fortuitously mimic charge polarization in solution. This level of theory is particularly poorly suited to sulfur and phosphorus, which can be hypervalent, as well as to some other functional groups. Additionally, it has been shown that HF/6-31G* does not overpolarize charges by a consistent amount across systems, and within a given system it erroneously polarizes solvent-accessible and buried atoms by the same amount. These polarization issues grow with system size, so they are more severe for large systems than for small molecules.

To model electrostatics accurately, we wish to train a graph neural network (GNN) charge model that addresses these problems. We will train the GNN on data computed at a higher level of QM theory, to more accurately capture the electrostatics of complicated cases like hypervalent atoms, and we will model the effects of solvent polarization directly by using a solvent model.
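
As a rough illustration of what a graph neural network charge model does, the sketch below (Python/PyTorch) passes messages along the molecular graph, reads out one charge per atom, and redistributes any excess so the charges sum to the molecule’s net charge. This is not the NAGL2 architecture; the atom features, layer sizes, and charge-constraint scheme are placeholder assumptions.

    # Minimal sketch of a graph neural network that predicts per-atom partial charges.
    # Illustrative only: not the NAGL2 architecture; features and sizes are placeholders.
    import torch
    import torch.nn as nn

    class ChargeGNN(nn.Module):
        def __init__(self, n_atom_features: int, hidden: int = 64, n_layers: int = 3):
            super().__init__()
            self.embed = nn.Linear(n_atom_features, hidden)
            self.message_layers = nn.ModuleList(
                nn.Linear(2 * hidden, hidden) for _ in range(n_layers)
            )
            self.readout = nn.Linear(hidden, 1)

        def forward(self, atom_features, adjacency, total_charge):
            # atom_features: (n_atoms, n_features); adjacency: (n_atoms, n_atoms) 0/1 bond matrix
            h = torch.relu(self.embed(atom_features))
            for layer in self.message_layers:
                neighbor_sum = adjacency @ h          # aggregate messages from bonded neighbors
                h = torch.relu(layer(torch.cat([h, neighbor_sum], dim=-1)))
            raw = self.readout(h).squeeze(-1)         # one raw charge per atom
            # redistribute the excess so predictions sum exactly to the molecule's net charge
            correction = (total_charge - raw.sum()) / raw.shape[0]
            return raw + correction

    # Hypothetical usage: a 3-atom "molecule" with 4 placeholder features per atom.
    model = ChargeGNN(n_atom_features=4)
    features = torch.randn(3, 4)
    adjacency = torch.tensor([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
    charges = model(features, adjacency, total_charge=torch.tensor(0.0))
    print(charges, charges.sum())  # the charges sum to the net charge by construction

Because the inputs are derived from the molecular graph rather than from 3D coordinates, charges assigned this way are conformer-independent, in line with the objective above.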

Scope

Must have:

  • Neural network charge model that performs equivalently to or better than AM1-BCC ELF10 on very small molecules, small molecules, proteins, lipids, and nucleic acids

  • Minimum element set includes all currently covered elements

  • Charge assignment must scale better with system size than AM1-BCC

  • Assigned charges must reproduce QM ESPs and dipoles better than NAGL1/AM1-BCC

  • Assigned charges must reproduce “real world” benchmarks like solvation free energies and protein-ligand binding better than NAGL1/AM1-BCC

  • Must provide reasonable/physical charges for “buried” atoms, i.e. atoms that are not solvent accessible and are often assigned unphysical charges by unrestrained ESP fitting methods (see the sketch after this list)
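
To illustrate the buried-atom problem named in the last bullet (a minimal sketch, not the project’s analysis): in unrestrained ESP fitting, charges are obtained by a least-squares fit of the point-charge ESP to the QM ESP on a grid outside the molecular surface. An atom far from every grid point contributes a column to the design matrix that is nearly a linear combination of its neighbours’ columns, so the system is ill-conditioned and that atom’s charge is poorly determined unless a restraint (as in RESP) or other regularization is added. The geometry, grid, and reference ESP below are hypothetical placeholders.

    # Minimal sketch: constrained least-squares ESP fit showing the ill-conditioning
    # caused by an atom that is far from every grid point ("buried").
    # Geometry, grid, and reference ESP are hypothetical placeholders.
    import numpy as np

    rng = np.random.default_rng(1)

    # Three "atoms" on a line: two exposed and one sandwiched between them.
    coords = np.array([[-1.5, 0.0, 0.0], [1.5, 0.0, 0.0], [0.05, 0.0, 0.0]])
    n_atoms = len(coords)

    # ESP grid on a shell of radius 6 Bohr, far from all atoms and especially the central one.
    grid = rng.normal(size=(800, 3))
    grid *= 6.0 / np.linalg.norm(grid, axis=1, keepdims=True)

    # Design matrix A[j, i] = 1 / |r_grid_j - r_atom_i| and a noisy synthetic "QM" ESP.
    A = 1.0 / np.linalg.norm(grid[:, None, :] - coords[None, :, :], axis=-1)
    true_q = np.array([-0.4, 0.5, -0.1])
    esp_ref = A @ true_q + rng.normal(scale=1e-3, size=len(grid))

    # Unrestrained fit with only a net-charge constraint (sum_i q_i = 0), solved via
    # the KKT system  [[A^T A, 1], [1^T, 0]] [q; lambda] = [A^T V; 0].
    kkt = np.zeros((n_atoms + 1, n_atoms + 1))
    kkt[:n_atoms, :n_atoms] = A.T @ A
    kkt[:n_atoms, n_atoms] = 1.0
    kkt[n_atoms, :n_atoms] = 1.0
    rhs = np.concatenate([A.T @ esp_ref, [0.0]])
    fit_q = np.linalg.solve(kkt, rhs)[:n_atoms]

    print("condition number of A^T A:", np.linalg.cond(A.T @ A))
    print("true charges:  ", true_q)
    print("fitted charges:", fit_q)  # the central atom's charge is the least well determined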

Nice to have:

  • Expand element coverage to include B, Si, maybe metals?

  • Incorporate virtual sites

  • Confidence metric returned directly by neural network

Not in scope:

  • Large systems that aren’t proteins, e.g. organometallics

Project Approaches

References