Brian Lester
AI Resident at Google working on Natural Language Processing and Understanding. Always looking to collaborate on new research ideas. Views are my own.
Sep 3, 2021 7 tweets 3 min read
My first @GoogleAI residency project was accepted to @emnlpmeeting #EMNLP2021!

Prompt Tuning can condition a frozen T5 XXL model to perform new tasks while only adding 0.003% more parameters and no performance loss.
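The core idea can be sketched in a few lines: a small matrix of trainable "soft prompt" embeddings is prepended to the input's token embeddings, while the backbone model stays frozen. This is a minimal illustration of the mechanism, not the paper's implementation; all shapes here are toy values I chose for the example.

```python
import numpy as np

# Toy sketch of prompt tuning (assumed, illustrative shapes):
# only `soft_prompt` would be trained; the frozen model supplies
# `token_embeds` from its (frozen) embedding table.
rng = np.random.default_rng(0)
embed_dim, prompt_len, seq_len = 8, 4, 6

soft_prompt = rng.normal(size=(prompt_len, embed_dim))   # trainable parameters
token_embeds = rng.normal(size=(seq_len, embed_dim))     # frozen, from input tokens

# The model consumes the concatenation, so gradients only flow
# into the prompt rows during training.
model_input = np.concatenate([soft_prompt, token_embeds], axis=0)
print(model_input.shape)  # (10, 8): prompt_len + seq_len rows
```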

Camera Ready 📸: arxiv.org/abs/2104.08691

Quick Thread 🧵(1/7)

[Image: a graph plotting model performance on SuperGLUE versus the nu…]

Fine-tuning all the parameters of large pre-trained models works well and is at the core of many SotA NLP results right now, but it has some sharp edges. The size can make these models difficult to work with and serve, and each fine-tuning run creates a full fork of the model. (2/7)
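The contrast with the "0.003% more parameters" figure from the first tweet can be checked with back-of-the-envelope arithmetic. The numbers below are my assumptions, not from the thread: T5 XXL at roughly 11B parameters, a soft prompt of 100 tokens, and an embedding dimension of 4096.

```python
# Rough arithmetic behind the "0.003% more parameters" claim.
# Assumed figures: ~11B frozen parameters, 100 prompt tokens, dim 4096.
frozen_params = 11_000_000_000
prompt_tokens = 100
embed_dim = 4096

added = prompt_tokens * embed_dim        # trainable parameters added
ratio = added / frozen_params * 100      # percentage of the frozen model
print(f"{added:,} added params = {ratio:.4f}% of the frozen model")
```

A full fine-tuning fork, by contrast, duplicates all ~11B parameters per task.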