Here, we used #MachineLearning to design high-diversity antibody variants with orders of magnitude greater potency than could be found with conventional directed mutagenesis. #ProteinML #AntibodyEngineering 1/4
Our method uses protein embeddings and Bayesian ML to design optimized antibody variant libraries, and we compare directly with other methods in a head-to-head prospective design study. 2/4
It was wonderful to collaborate with Lin Li, Rajmonda Caceres, Matt Walsh, and the rest of the @MITLL, @MIT, and @AAlphaBio teams on this project! 3/4
If you want to try out these kinds of ML methods in your own work with a user-friendly web interface, check out OpenProtein.AI and sign up for the beta at forms.gle/UeD9wDLvdG9LRw…! 4/4
Excited to announce PoET, our (@timt1630, @tbepler1) retrieval-augmented generative protein language model that achieves state-of-the-art unsupervised variant function prediction performance on #ProteinGym. #MachineLearning #ProteinML 1/9
Inspired by evolution, PoET conditions on observed protein sequences to infer fitness constraints and extrapolate a generative distribution of protein sequences. This allows PoET to focus on any level of homology, from superfamilies to families to subfamilies and beyond. 2/9
The key idea in the design of PoET was to create a transformer that could condition on homologous sequences but did not require aligned inputs. Our solution was to model the generative process of whole protein families as a sequence-of-sequences generative modeling problem. 3/9
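To make the "sequence-of-sequences" idea concrete, here is a minimal sketch of how unaligned homologs might be packed into a single token stream for a decoder-only transformer. The separator tokens and packing function are illustrative assumptions, not PoET's actual vocabulary or implementation:

```python
# Hypothetical sketch: concatenate unaligned homologous sequences into one
# token stream, so a decoder-only transformer can condition on the family
# without requiring a multiple sequence alignment.

def pack_family(homologs, query):
    """Build a sequence-of-sequences input: each homolog is wrapped in
    start/end markers, then the query sequence is appended for generation
    or scoring. No alignment of the homologs is needed."""
    tokens = []
    for seq in homologs:
        tokens.append("<s>")    # start-of-sequence marker (assumed token)
        tokens.extend(seq)      # raw residues, variable length
        tokens.append("</s>")   # end-of-sequence marker (assumed token)
    tokens.append("<s>")
    tokens.extend(query)        # sequence whose likelihood is modeled
    return tokens

stream = pack_family(["MKV", "MRV"], "MKI")
print(stream)
# ['<s>', 'M', 'K', 'V', '</s>', '<s>', 'M', 'R', 'V', '</s>', '<s>', 'M', 'K', 'I']
```

Because the homologs enter as ordinary context tokens rather than alignment columns, the same model can condition on any set of retrieved sequences, from a broad superfamily down to a narrow subfamily.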