The 2010s were an eventful decade for NLP! Here are ten shocking developments since 2010, and 13 papers* illustrating them, that have changed the field almost beyond recognition.

(* in the spirit of @iamtrask and @FelixHill84, exclusively from other groups :)).
Shock 1 (2010): Remember neural networks? They might be much more useful for NLP than we thought. Please learn about recurrent neural networks (RNNs, [1]) and recursive neural networks (also abbreviated RNNs, [2]).

[1] Tomáš Mikolov et al.: Recurrent Neural Network Based Language Model. Interspeech 2010, fit.vutbr.cz/research/group…
[2] Socher, Richard, Christopher D. Manning, and Andrew Y. Ng: Learning continuous phrase representations and syntactic parsing with recursive neural networks.
@RichardSocher @chrmanning @AndrewYNg
nlp.stanford.edu/pubs/2010Soche…
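
For readers new to [1]: at its core, Mikolov's recurrent network language model is a simple (Elman-style) recurrence followed by a softmax over the vocabulary. A rough sketch in the usual notation (not taken verbatim from the paper):

h_t = f(W x_t + U h_{t-1})
y_t = \mathrm{softmax}(V h_t)

where x_t is the current word (one-hot or embedded), h_t the hidden state, f a sigmoid or tanh nonlinearity, and y_t the predicted distribution over the next word.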
Shock 2 (2013): Forget about simple recurrent and recursive networks, learn all about Long Short-Term Memory instead.

[3] Alex Graves, Generating Sequences With Recurrent Neural Networks
arxiv.org/abs/1308.0850
@DeepMindAI
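
For reference, the LSTM cell that [3] builds on replaces the simple recurrence with gated memory. A sketch of the standard peephole-free formulation (Graves' paper itself uses a peephole variant):

i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)
f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)
c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c)
h_t = o_t \odot \tanh(c_t)

The additive cell update c_t is what lets gradients flow over long distances, which is why LSTMs displaced simple RNNs.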
Shock 3 (2013): Remember distributional semantics? Good, now forget about it and learn all about word2vec instead.

[4] T Mikolov, I Sutskever, K Chen, GS Corrado, J Dean. Distributed representations of words and phrases and their compositionality arxiv.org/abs/1310.4546
@JeffDean
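
To get a feel for [4], here is a minimal sketch using the gensim library (my choice of toolkit, not the original C code; parameter names assume gensim >= 4):

from gensim.models import Word2Vec

# toy corpus: a list of tokenized sentences
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
]

# skip-gram (sg=1) with negative sampling, as in the paper
model = Word2Vec(sentences, vector_size=50, window=2, sg=1,
                 negative=5, min_count=1, epochs=50)

print(model.wv["cat"][:5])           # dense vector for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbours in embedding space

The point of the shock: these dense vectors, trained on nothing but co-occurrence prediction, capture much of the similarity structure that distributional semantics had worked hard for.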
Shock 4 (2014): Forget about phrase-based machine translation, learn all about seq2seq instead. Remember LSTMs? Good, now learn all about GRUs as well: simpler and sometimes better.

[5] Cho, Kyunghyun et al.: Learning Phrase Representations ... arxiv.org/abs/1406.1078
@kchonyc
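
For comparison with the LSTM above, the GRU introduced in [5] uses just two gates. A sketch following the paper's convention (bias terms omitted; the roles of z_t and 1 - z_t are sometimes swapped in other presentations):

z_t = \sigma(W_z x_t + U_z h_{t-1})
r_t = \sigma(W_r x_t + U_r h_{t-1})
\tilde{h}_t = \tanh(W x_t + U (r_t \odot h_{t-1}))
h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t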
Shock 5 (2014): Remember seq2seq? Good, now learn all about attention on top of it as well.

[6] Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio: Neural Machine Translation by Jointly Learning to Align and Translate arxiv.org/abs/1409.0473
@kchonyc @MILAMontreal
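
The attention mechanism of [6] in a nutshell: at each decoder step i, score every encoder state h_j against the previous decoder state s_{i-1}, normalize the scores into weights, and feed the weighted average into the decoder. A sketch with bias terms omitted:

e_{ij} = v^\top \tanh(W s_{i-1} + U h_j)
\alpha_{ij} = \exp(e_{ij}) / \sum_k \exp(e_{ik})
c_i = \sum_j \alpha_{ij} h_j

The context vector c_i replaces the single fixed-length sentence encoding of plain seq2seq, which is what made long sentences translatable.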
Shock 6 (2014): This deep learning thing might become big. Here’s SOTA on speech recognition using end-to-end neural models.

[7] Awni Hannun et al. : Deep Speech: Scaling up end-to-end speech recognition arxiv.org/abs/1412.5567
Shock 7 (2015): Not even classical strongholds of symbolic structure are safe. Recursive neural networks can learn logic [sort of, 8]; biLSTM-based parsers reached SOTA on syntactic parsing [9].

[8] Bowman, Potts & Manning: Recursive Neural Networks Can Learn Logical Semantics. arxiv.org/abs/1406.1827
@sleepinyourhat @ChrisGPotts
[9] Eliyahu Kiperwasser, Yoav Goldberg (2016): Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations aclweb.org/anthology/Q16-…
@elikiper @yoavgo
Shock 8 (2016): Neural MT completely takes over the machine translation field (and soon also Google Translate).

[10] R Sennrich, B Haddow, A Birch: Edinburgh Neural Machine Translation Systems for WMT 16
aclweb.org/anthology/W16-…
@alexandrabirch1 @RicoSennrich
Shock 9 (2017): Know all about LSTMs now? Good, now forget all of it, as Attention Is All You Need (i.e. the Transformer).

[11] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin: Attention Is All You Need. arxiv.org/abs/1706.03762
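
The core operation of [11] is scaled dot-product attention over queries Q, keys K and values V:

\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left( \frac{Q K^\top}{\sqrt{d_k}} \right) V

with d_k the key dimension. Stacked in multiple heads and combined with feed-forward layers, residual connections and positional encodings, this replaces recurrence entirely.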
Shock 10 (2018): Remember word2vec? Good, now forget about it and learn all about contextualized word embeddings (ELMo [12], BERT [13]).

[12] Peters, Neumann, Iyyer, Gardner, Clark, Lee, & Zettlemoyer : Deep contextualized word representations: arxiv.org/abs/1802.05365
@nlpmattg
[13] J. Devlin, M. Chang, K. Lee and K. Toutanova: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arxiv.org/abs/1810.04805
@toutanova
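
To see what "contextualized" means in practice, here is a minimal sketch using the Hugging Face transformers library (my example, not from the thread; assumes transformers >= 4 and PyTorch are installed):

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# the same word ("bank") gets a different vector in each sentence
sentences = ["I sat by the river bank.", "I deposited cash at the bank."]
inputs = tokenizer(sentences, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# one contextual vector per token per sentence,
# unlike the single vector per word type that word2vec gives you
print(outputs.last_hidden_state.shape)  # (2, sequence_length, 768)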
It has been an exhausting 10 years, trying to keep up with the literature, redesigning courses and dealing with an explosion in the number of students. But it has certainly also been a privilege to be in a field where so much was happening!

[Chart: number of students in our Master AI NLP electives]