#1 AMMUS: A Survey of Transformer-based Pretrained Language Models #nlproc #nlp #bert #survey
Paper link: arxiv.org/abs/2108.05542
#3 classifies and presents an overview of various pretraining methods
#4 presents a new taxonomy of T-PTLMs (Transformer-based Pretrained Language Models)
#5 classifies and explains in detail various downstream adaptation methods
#6 presents a brief overview of various benchmarks (intrinsic and extrinsic)
#7 presents a brief overview of various libraries for working with T-PTLMs
#8 presents various future directions that can further improve T-PTLMs
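As a concrete illustration of one of the pretraining methods the survey covers (this sketch is not taken from the paper), here is a minimal plain-Python version of BERT-style masked-language-model corruption: roughly 15% of positions are selected as prediction targets, and each selected token is replaced by `[MASK]` 80% of the time, by a random vocabulary token 10% of the time, and left unchanged 10% of the time. The vocabulary and function name are illustrative, not from any library.

```python
import random

MASK = "[MASK]"
# Toy vocabulary for the random-replacement branch (illustrative only).
VOCAB = ["the", "cat", "dog", "sat", "on", "mat"]

def mlm_corrupt(tokens, p=0.15, rng=None):
    """BERT-style MLM corruption: returns (corrupted_tokens, targets),
    where targets[i] is the original token if position i was selected
    for prediction, and None otherwise (no loss at that position)."""
    rng = rng or random.Random(0)
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < p:
            targets.append(tok)  # model must recover the original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)            # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)             # 10%: keep unchanged
        else:
            targets.append(None)
            corrupted.append(tok)
    return corrupted, targets
```

Keeping 10% of selected tokens unchanged forces the model to build contextual representations for every input position, not only for `[MASK]` slots it never sees at fine-tuning time.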