Didn’t know that @chughesjohnson started a “working with Claire” doc at @stripe. Brilliant! Makes me want to update and share more broadly the one I did a few years ago:
Interesting point here! Can’t the structure/team/culture be built so that team members (versus just leadership) are incentivized, rewarded and empowered to stop things? Sounds healthier/more scalable! cc @julien_c @Thom_Wolf
It is so cool to see ESM-2 & ESMFold models from @MetaAI, new state-of-the-art Transformer protein language and folding models, available in the @huggingface hub & in transformers thanks to @carrigmat @GuggerSylvain & team!
ESM-2 is trained with a masked language modeling objective and can be easily transferred to sequence- and token-classification tasks for proteins. ESMFold is a single-sequence protein folding model that produces high-accuracy predictions significantly faster than previous approaches.
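A minimal sketch of pulling residue-level embeddings from ESM-2 in transformers, assuming the small public checkpoint facebook/esm2_t6_8M_UR50D (the specific checkpoint and example sequence are illustrative choices, not from the thread):

```python
import torch
from transformers import AutoTokenizer, EsmModel

# Small ESM-2 checkpoint; larger variants follow the same API.
checkpoint = "facebook/esm2_t6_8M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = EsmModel.from_pretrained(checkpoint)
model.eval()

# An arbitrary example protein sequence (one letter per amino acid).
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One embedding per residue token, plus the CLS/EOS special tokens;
# hidden size is 320 for this small checkpoint.
print(outputs.last_hidden_state.shape)
```

The same checkpoint can be loaded with `EsmForSequenceClassification` or `EsmForTokenClassification` to fine-tune on protein-level or residue-level labels.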
Unlike previous protein folding tools such as AlphaFold2 and OpenFold, ESMFold uses a pretrained protein language model to generate token embeddings that are fed to the folding model, so it does not require a multiple sequence alignment (MSA) of related proteins as input.
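A sketch of that single-sequence workflow, assuming the facebook/esmfold_v1 checkpoint and the `EsmForProteinFolding` class in transformers (note this checkpoint is several GB, so treat the snippet as illustrative rather than something to run casually):

```python
from transformers import EsmForProteinFolding

# ESMFold bundles the pretrained language model and the folding head.
model = EsmForProteinFolding.from_pretrained("facebook/esmfold_v1")
model.eval()

# The only input is a raw sequence string: no MSA search, no templates.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
pdb_string = model.infer_pdb(sequence)

# The prediction comes back as a PDB-format structure string.
print(pdb_string[:60])
```

Skipping the MSA step is what makes inference so much faster: the evolutionary context that AlphaFold2 recovers from aligned homologs is instead baked into the language model's learned embeddings.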