I have a lot of respect for Quoc - his research is extremely influential, creative and far-reaching. However, both papers referenced here are cited in the ELMo paper, so this cannot be a complaint about academic attribution. (1/7)
So what caused this difference in credit assignment? Why are ELMo, BERT and OpenAI GPT given more credit in current research?

The difference is code, pre-trained models and open-source software. (2/7)
You can use ELMo in TF or PyTorch. You can download 5 different pre-trained models here: allennlp.org/elmo. You can learn how to apply it to your own use case here: github.com/allenai/allenn…
And you can reproduce the original training here: github.com/allenai/bilm-tf (3/7)
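
To give a sense of how low that barrier is, here is a minimal sketch of computing ELMo embeddings with AllenNLP's PyTorch module (allennlp.modules.elmo). The options/weights paths below are placeholders for whichever pre-trained model you download from allennlp.org/elmo:

import torch
from allennlp.modules.elmo import Elmo, batch_to_ids

# Placeholder paths: point these at any of the pre-trained models from allennlp.org/elmo.
options_file = "elmo_options.json"
weight_file = "elmo_weights.hdf5"

# num_output_representations=1 returns a single learned weighted sum of the biLM layers.
elmo = Elmo(options_file, weight_file, num_output_representations=1, dropout=0.0)

sentences = [["Deep", "contextualised", "word", "representations", "."]]
character_ids = batch_to_ids(sentences)  # LongTensor of shape (batch, num_tokens, 50)

with torch.no_grad():
    output = elmo(character_ids)

# A list containing one tensor of shape (batch, num_tokens, 1024) for the standard model.
print(output["elmo_representations"][0].shape)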
The same is true of BERT and OpenAI GPT (mainly due to @Thom_Wolf!).
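
For concreteness, loading BERT is similarly short - a sketch assuming the interface of the pytorch-pretrained-bert package, where the model name is one of the pre-trained checkpoints it ships with:

import torch
from pytorch_pretrained_bert import BertTokenizer, BertModel

# Downloads the vocabulary and pre-trained weights on first use.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

tokens = tokenizer.tokenize("[CLS] silly easy to use [SEP]")
token_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

with torch.no_grad():
    # Returns one hidden-state tensor per transformer layer, plus a pooled [CLS] output.
    encoded_layers, pooled_output = model(token_ids)

print(encoded_layers[-1].shape)  # (1, num_tokens, 768) for bert-base-uncased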

It's definitely possible this is true of "Semi-Supervised Sequence Learning" too! There is code here:
github.com/tensorflow/mod…

(4/7)
However, this doesn't look like the original implementation, as the author of the code is not an author of the paper, so perhaps it is a re-implementation? Happy to be corrected here - perhaps the code was released at the time and I just can't find it online now.
(5/7)
I tried to install it and run the tests with the single "tensorflow>=1.3" requirement. The tests don't pass. I can't run any of the 3 main executable files. Which version of Python does it use? I don't know. I tried with Python 3.5. How do I download a pre-trained model?
(6/7)
So really, the lesson here is that being first is not the sole contributor to academic attribution. If you want people to cite and build on your research:
1) release your code
2) release your models
3) make it silly easy to install
4) make it silly easy to use and extend.
(7/7)