I finally watched all the talks I wanted to, ended up importing 56 papers into my bib, and now present to you:

🎉 My 13 favorite papers (sorted alphabetically) at #EMNLP2020! 🔥

[1/15]
#EMNLP2020 recommendation:

"Attention is Not Only a Weight: Analyzing Transformers with Vector Norms"
@goro_koba, @ttk_kuribayashi, @sho_yokoi_, Kentaro Inui

Small vectors with high attention still have small impact!

aclweb.org/anthology/2020…
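The claim is easy to see once you look at norms instead of raw weights: what actually enters the attention output is the weighted vector αᵢ·vᵢ, not αᵢ alone. A toy sketch (all numbers hypothetical, not from the paper):

```python
import math

# Two "value vectors" and their attention weights: v_small has a high
# attention weight but a tiny norm; v_large the opposite.
v_small = (0.01, 0.01)   # small-norm value vector
v_large = (5.0, 5.0)     # large-norm value vector
a_small, a_large = 0.9, 0.1  # attention weights (sum to 1)

# Each vector's contribution to the output is ||alpha_i * v_i||.
contrib_small = a_small * math.hypot(*v_small)  # ~0.013
contrib_large = a_large * math.hypot(*v_large)  # ~0.707

# Despite its 0.9 attention weight, the small vector barely matters.
assert contrib_small < contrib_large
```

So an attention heatmap alone can mislead; the norm-weighted view is what the paper proposes to analyze.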



[2/15]
#EMNLP2020 recommendation:

"BLEU might be Guilty but References are not Innocent"
@markuseful, David Grangier, @iseeaswell

Translationese references reward the wrong systems!

aclweb.org/anthology/2020…



[3/15]
#EMNLP2020 recommendation:

"Grounded Compositional Outputs for Adaptive Language Modeling"
@nik0spapp, @PhoebeNLP, @nlpnoah

Use definitions, related words, and spellings to improve word representations in LMing!

aclweb.org/anthology/2020…



[4/15]
#EMNLP2020 recommendation:

"How do Decisions Emerge across Layers in Neural Models? Interpretation with Differentiable Masking"
@nicola_decao, @michael_sejr, @wilkeraziz, @iatitov

Learn when to "erase" irrelevant words!

aclweb.org/anthology/2020…



[5/15]
#EMNLP2020 recommendation:

"How Much Knowledge Can You Pack Into the Parameters of a Language Model?"
@ada_rob, @colinraffel, Noam Shazeer

Giant model. Tons of experiments. Closed-book QA.

aclweb.org/anthology/2020…



[6/15]
#EMNLP2020 recommendation:

"Learning Music Helps You Read: Using Transfer to Study Linguistic Structure in Language Models"
Isabel Papadimitriou, Dan Jurafsky

Training LSTMs on music or code first helps for language!

aclweb.org/anthology/2020…

(no tweet :'( )

[7/15]
#EMNLP2020 recommendation:

"OCR Post Correction for Endangered Language Texts"
@shrutirij, @anas_ant, @gneubig

Off-the-shelf OCR systems *can* be used for endangered languages, but require lots more thought and effort!

aclweb.org/anthology/2020…



[8/15]
#EMNLP2020 recommendation:

"Pareto Probing: Trading Off Accuracy for Complexity"
@tpimentelms, @nsaphra, @adinamwilliams, @ryandcotterell

Scatter probes by their accuracy / complexity and look at the Pareto frontier!

aclweb.org/anthology/2020…
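The Pareto idea is simple to sketch: keep only the probes that no other probe beats on both axes at once. A toy version (probe names and all numbers are made up for illustration):

```python
# Hypothetical probes as (complexity, accuracy) points.
probes = {
    "linear":    (1.0, 0.78),
    "mlp-small": (3.0, 0.84),
    "mlp-big":   (9.0, 0.83),  # more complex AND less accurate than mlp-small
}

def pareto_frontier(points):
    """Keep points not dominated by another point that is both
    simpler (lower complexity) and at least as accurate."""
    return {
        name: (c, a)
        for name, (c, a) in points.items()
        if not any(c2 <= c and a2 >= a and (c2, a2) != (c, a)
                   for c2, a2 in points.values())
    }

frontier = pareto_frontier(probes)
# "mlp-big" is dominated by "mlp-small", so it falls off the frontier.
```

Plotting only the frontier is what lets you compare probe families without committing to a single complexity budget.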



[9/15]
#EMNLP2020 recommendation:

"Quantifying Intimacy in Language"
@jiaxin_pei, @david__jurgens

Using a new dataset of questions labeled for intimacy, they analyze content and social context! Gender bias! Distance! Hedging!

aclweb.org/anthology/2020…



[10/15]
#EMNLP2020 recommendation:

"Reformulating Unsupervised Style Transfer as Paraphrase Generation"
@kalpeshk2011, @johnwieting2, @MohitIyyer

Big survey, slamming past eval metrics (hell yeah!). Oh, and a SotA model.

aclweb.org/anthology/2020…



[11/15]
#EMNLP2020 recommendation:

"Scaling Hidden Markov Language Models"
Justin Chiu, @srush_nlp

Make HMMs big (blocks & neuralization) and they're pretty powerful!

aclweb.org/anthology/2020…



[12/15]
#EMNLP2020 recommendation:

"Sparse Text Generation"
@pedrohenmartins, Zita Marinho, @andre_t_martins

Make your autoregressive LM use sparsemax instead of softmax! No need for top-k / nucleus hacks.

aclweb.org/anthology/2020…
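For intuition, here's a minimal pure-Python sketch of sparsemax (Martins & Astudillo, 2016), the sparse softmax replacement the tweet refers to — the textbook simplex projection, not the authors' code. Unlike softmax, it can put exactly zero probability on low-scoring tokens, which is why no top-k / nucleus truncation is needed:

```python
def sparsemax(z):
    """Euclidean projection of logits z onto the probability simplex."""
    z_sorted = sorted(z, reverse=True)
    cumsums, total = [], 0.0
    for zj in z_sorted:
        total += zj
        cumsums.append(total)
    # Size of the support: largest k with 1 + k*z_(k) > sum of top-k logits.
    k = max(j for j in range(1, len(z) + 1)
            if 1 + j * z_sorted[j - 1] > cumsums[j - 1])
    tau = (cumsums[k - 1] - 1) / k  # threshold subtracted from every logit
    return [max(zi - tau, 0.0) for zi in z]

p = sparsemax([1.0, 0.9, 0.1])  # ~[0.55, 0.45, 0.0]: last token gets exactly 0
```

A valid distribution (sums to 1) with hard zeros, straight from the logits.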



[13/15]
#EMNLP2020 recommendation:

"With Little Power Comes Great Responsibility"
@dallascard, @PeterHndrsn, @ukhndlwl, @robinomial, @kmahowald, @jurafsky

Think about statistical power when *constructing* datasets!

aclweb.org/anthology/2020…
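For intuition on why this matters: a back-of-the-envelope two-proportion power calculation, normal approximation only — the accuracies and test-set size below are illustrative, not numbers from the paper:

```python
import math

def power_two_proportions(p1, p2, n, z_alpha=1.96):
    """Approximate power of a two-sided two-proportion z-test,
    n examples per system (normal approximation)."""
    pbar = (p1 + p2) / 2
    se0 = math.sqrt(2 * pbar * (1 - pbar) / n)               # SE under H0
    se1 = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)   # SE under H1
    z = (abs(p1 - p2) - z_alpha * se0) / se1
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))            # Phi(z)

# Detecting a 90% vs. 92% accuracy gap on a 1000-example test set:
power = power_two_proportions(0.90, 0.92, 1000)  # well below the usual 0.8
```

With a typical-sized test set, the power to detect a 2-point accuracy gap comes out around 0.35 — which is exactly the kind of underpowered comparison the paper warns about.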



[14/15]
Overall, there were so many fascinating papers at #EMNLP2020 and I wonder how many more were in Findings but didn't get to upload a video (why not, really?)

Hopefully next time, I'll have the energy to actually prepare and attend. See y'all at #NeurIPS2020? 👋

[15/15]

Thread by Sabrina J. Mielke (@sjmielke)