Participants

Discussion topics


General updates

  • MT – Met with Zach earlier; I like his attitude. Should he be in more Slack channels?

    • JW – Yes, adding him to #internal and some others now. I think he can help a lot with community contact as a starting point. Amplifying release announcements on social media is the sort of task that's hard/distracting for me but may be straightforward for him.

  • JW – OFFTK 0.14.1 is out, finishing API point removals (ChemicalEnvironment removal) and adding a custom polymer loader (for examples see here). Will continue on to the QCSubmit/QCA interface now.

    • MT – How would I define my own substructures?

    • JW – See code in OFFTK’s utilities/make_substructure_dict/
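A custom substructure dictionary might look roughly like the sketch below. The real schema is defined by the code in OFFTK's utilities/make_substructure_dict/; the residue name, SMARTS pattern, and atom names here are illustrative assumptions, not the confirmed format:

```python
# Hypothetical shape of a custom substructure dictionary. The actual schema
# lives in OFFTK's make_substructure_dict utilities; the keys and values
# below are illustrative assumptions only.
custom_substructures = {
    "ALA": {  # residue name (assumed top-level key)
        # mapped SMARTS pattern -> per-atom names (assumed convention)
        "[N:1][C:2][C:3](=[O:4])": ["N", "CA", "C", "O"],
    },
}

# Sanity check: each pattern should map one name per mapped atom.
for residue, patterns in custom_substructures.items():
    for smarts, names in patterns.items():
        assert smarts.count(":") == len(names)
```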

  • JW – Thanks for pushing SMIRNOFF EP through. I like the PR to implement.

    • MT – One complication with implementation, though I think I’ve worked around the relevant cases. When we did this a year ago with the electrostatics method there was an inherent simplicity to the implementation, since the file loading and OpenMM Force creation happened in the same file (there was never a case where create_force had to check whether the Toolkit knew about a certain version of a ParameterHandler tag). Now Interchange needs to know which version of the file was read. OFFTK could be made to auto-upgrade to vdW 0.4, but this would immediately require a new release of Interchange. So the vanilla approach would be simultaneous releases, but that would be high-risk. What I’d rather do is have OFFTK upconvert to 0.4, and then have Interchange inspect the OFFTK object and DOWNconvert to 0.3. So the process would be:

      • Release of IC that accepts 0.3 AND 0.4 version vdWHandlers (downconverting 0.4s to 0.3s)

      • Release of OFFTK that auto-upconverts all 0.3s to 0.4s

      • Release of IC to accept 0.3s and 0.4s directly.

    • JW – This plan makes sense, though I’d also be fine with simultaneous releases.

    • MT – The simultaneous release plan will make a hard discontinuity in package support, whereas the three-step release plan will offer some flexibility for different versions.

    • JW – Good point - Let’s do the 3 step plan then.

    • MT – Ok, I think the PRs are nearly ready to go, I’ll loop you in on release planning.
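The downconversion in step one could look roughly like this sketch. It assumes that, as in the SMIRNOFF vdW 0.4 change, the 0.3 `method` attribute was split into `periodic_method`/`nonperiodic_method`; the attribute names and defaults here are assumptions, not Interchange's actual implementation:

```python
# Hypothetical sketch of the vdW 0.4 -> 0.3 downconversion Interchange would
# perform in the first release step. Attribute names follow the assumed 0.4
# change (a single 'method' split into periodic/nonperiodic variants).
def downconvert_vdw(attrs: dict) -> dict:
    """Return a 0.3-style attribute dict from a 0.4-style one (illustrative)."""
    attrs = dict(attrs)  # don't mutate the caller's dict
    if attrs.get("version") == "0.4":
        # Collapse the split attributes back into the single 0.3 'method'.
        periodic = attrs.pop("periodic_method", "cutoff")
        attrs.pop("nonperiodic_method", None)
        attrs["method"] = periodic
        attrs["version"] = "0.3"
    return attrs
```

A 0.3-style input passes through unchanged, so one code path handles both versions during the transition.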

  • JW – Advice on pydantic 2? Should we stick with the legacy v1 API in v2 or just keep pinning to v1?

    • MT – I have a plan on this. Our previous plan was to pin everything to v1 and then greenlight each component in our ecosystem for v2 until it’s all done. I was previously unaware of the legacy API in v2; early testing looks promising, and there’s a pattern I can roll out to get this going throughout our ecosystem.

    • MT – Two particularly good things here:

      • A try/except import trick with the legacy API will make the code compatible with pydantic v1 or v2.

      • The code changes to get the backdoor working are pretty minimal - Will just need to roll them out in a lot of python files.

    • MT – Two downsides are:

      • Have to make this change in ALL software that uses pydantic, and release each package. We have a lot of interdependent repos, so that will mean making releases in a careful and specific order. (models - interchange - toolkit (implicitly) - bespokefit - fragmenter - qcsubmit)

      • This just punts on actually updating the code. And pydantic v2 does have some conceptual differences so this will require a lot of hand-holding.
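The try/except import trick mentioned above can be sketched as follows (a minimal sketch; the final fallback to None is only so the snippet runs even where pydantic isn't installed):

```python
# Import pattern for code that must run under pydantic v1 or v2:
# pydantic v2 re-exports the legacy v1 API under the pydantic.v1 namespace.
try:
    from pydantic.v1 import BaseModel, Field, validator  # pydantic v2 installed
except ImportError:
    try:
        from pydantic import BaseModel, Field, validator  # pydantic v1 installed
    except ImportError:
        BaseModel = Field = validator = None  # pydantic missing (sketch only)
```

Model definitions written against the v1 names then work unchanged under either major version, which is what makes the per-file change so mechanical.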

    • MT – My plan is basically to test this whole move using off-label builds. I think I can get help on testing this by sharing a conda yaml that pulls from these labels with our downstreams.

    • MT – I’m comfortable taking responsibility for this top-to-bottom, will probably take a few weeks. I’ll reach out if any toolkit releases are needed but I doubt it. May need releases of bespokefit, fragmenter, qcsubmit.

    • JW – That sounds great.

Trello


Issue/PR clearance

Action items

...