Beautiful new work “DEL-Dock” on marrying DELs and docking, stemming from the summer internship of @KShmilovich with us, led by him and Benson Chen, and the first of many works with @mmsltn and yours truly.
In this work we learn deep representations using attention models on top of sampled docking poses, combined with readouts from DNA-encoded libraries, to predict binding affinity. We introduce a likelihood that accounts for background matrix binding.
See what DELs do here 2/6
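For intuition only, here is a minimal PyTorch sketch of the general idea: attention-pooling over featurized docking poses and splitting the DEL count signal into a target-binding rate and a background (matrix) rate under a Poisson-style likelihood. The module names, featurization, and likelihood form are illustrative assumptions, not DEL-Dock’s actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PoseAttentionAffinity(nn.Module):
    """Illustrative sketch (not the DEL-Dock architecture): attention-pool
    embeddings of sampled docking poses into per-compound binding rates."""

    def __init__(self, pose_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(pose_dim, hidden_dim), nn.ReLU())
        self.attn = nn.Linear(hidden_dim, 1)         # per-pose attention logits
        self.target_head = nn.Linear(hidden_dim, 1)  # target-binding rate
        self.matrix_head = nn.Linear(hidden_dim, 1)  # background (matrix) rate

    def forward(self, pose_feats):
        # pose_feats: (batch, n_poses, pose_dim) featurized docking poses
        h = self.encoder(pose_feats)                 # (B, P, H)
        w = torch.softmax(self.attn(h), dim=1)       # (B, P, 1) pose weights
        pooled = (w * h).sum(dim=1)                  # (B, H) attention-pooled
        target_rate = F.softplus(self.target_head(pooled)).squeeze(-1)
        matrix_rate = F.softplus(self.matrix_head(pooled)).squeeze(-1)
        return target_rate, matrix_rate, w


def del_count_nll(target_rate, matrix_rate, target_counts, matrix_counts):
    """Toy Poisson likelihood (counts as float tensors): counts in the target
    selection mix true binding with matrix binding; a matrix-only control
    isolates the background."""
    nll = (torch.distributions.Poisson(target_rate + matrix_rate).log_prob(target_counts)
           + torch.distributions.Poisson(matrix_rate).log_prob(matrix_counts))
    return -nll.mean()
```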
Why does all this matter? Docking poses are typically sampled from somewhat misspecified energy functions. Using the target binding information, we can learn representations that rank the sampled poses not by their docking likelihood but by how well they bind, re-ranking the poses 3/6
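Continuing the toy sketch above (again purely illustrative, not the paper’s code), re-ranking would amount to sorting a compound’s sampled poses by the learned per-pose attention weights instead of the docking scores that generated them:

```python
# Purely illustrative, reusing the PoseAttentionAffinity sketch above.
model = PoseAttentionAffinity(pose_dim=64)
poses = torch.randn(1, 20, 64)                # 20 sampled poses for one compound
_, _, weights = model(poses)                  # (1, 20, 1) learned pose weights
reranked = torch.argsort(weights.squeeze(), descending=True)
print(reranked[:5])                           # indices of the top re-ranked poses
```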
All of this actually leads to models that work surprisingly well quantitatively 4/6
Here are some more insights into how well this combination performs 5/6
And interestingly, we also gain qualitative insights; have a look at the paper!
This was a fun project, congrats to my co-authors, and we have a lot more ideas to build on this.
If you like this and also want to work on #AI4science with us, we are hiring (see previous) 6/6
I will be taking interns and hiring FTEs for my advanced ML team at @insitro.
If you are interested in #AI4science and want to work on methods inspired by real-world problems in drug discovery, reach out. I am also at #neurips22 until the end of the week.
Topics! Links in 🧵1/4
We’re interested in:
Generative models
Causal inference
Robustness/uncertainty
Domain-specific models for biology/genetics/chemistry/imaging
Probabilistic/deep modeling
Decision making & experimentation