Discover and read the best of Twitter Threads about #rep4nlp

Most recent (4)

#rep4nlp Yulia Tsvetkov, talk #4

"Modeling output spaces of NLP models," in contrast to the common #BERTology that focuses on modeling input spaces only.

#ACL2019nlp
The presentation focuses on conditional language generation:
#MT #summarization etc.
"To be able to build diverse NLP models for 1000s of users, we would have to build 100Ks of models for combinations of:

* Languages
* Tasks
* Domains
* User preferences"
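The multiplicative blow-up behind the "100Ks of models" claim can be made concrete with a back-of-the-envelope count. The per-axis numbers below are hypothetical illustration values, not figures from the talk:

```python
# Back-of-the-envelope count of models needed if every combination of
# language, task, domain, and user preference gets its own model.
# All counts are hypothetical, chosen only to illustrate the blow-up.
n_languages = 100    # a modestly multilingual deployment
n_tasks = 10         # MT, summarization, QA, ...
n_domains = 20       # news, medical, legal, ...
n_preferences = 5    # coarse buckets of user preference

total_models = n_languages * n_tasks * n_domains * n_preferences
print(total_models)  # 100000 -- already "100Ks of models"
```

Even with these modest per-axis counts, training one model per combination is clearly infeasible, which motivates shared, conditional generation models instead.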
Talk3: @raquelfdzrovira talking about representations shaped by dialogue interaction data.

#ACL2019nlp #rep4nlp
"Task-oriented dialogue" is the setup under discussion because it gives us a notion of success for analyzing the dialogue.
The plan:

Instead of pre-defined symbolic representations for dialogue systems, let's model visually grounded agents that learn to "see, ask and guess".
#rep4nlp Invited talk 2: @mohitban47, "Adversarially Robust Representation Learning"

#acl2019nlp
@mohitban47 Adversarial examples can break reading-comprehension systems.

"Adversarial Examples for Evaluating Reading Comprehension Systems", Jia and Liang, 2017
arxiv.org/abs/1707.07328
To fix that: "AddSentDiverse", a modification of AddSent (Jia and Liang, 2017) that produces adversarial examples for robust training, based on semantic rules

Robust Machine Comprehension Models via Adversarial Training (NAACL2018 short)
arxiv.org/pdf/1804.06473…
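The AddSent attack referenced above can be illustrated with a toy sketch (not the authors' implementation): a distractor sentence that lexically mimics the question, with entities swapped so it does not actually answer it, is appended to the passage, and a brittle reader may then extract the distractor's entity. The strings below paraphrase the well-known example from Jia and Liang (2017):

```python
# Toy sketch of the AddSent idea (Jia & Liang, 2017): append a distractor
# sentence that shares surface form with the question but has its entities
# perturbed, so the correct answer is unchanged for a human reader.
# This is an illustration only, not the paper's rule-based generation pipeline.

passage = ("Peyton Manning became the first quarterback ever to lead two "
           "different teams to multiple Super Bowls.")
question = ("What is the name of the quarterback who was 38 in "
            "Super Bowl XXXIII?")

# Distractor mimicking the question with swapped entities (from the paper's
# famous example): it answers a *different* question than the one asked.
distractor = ("Quarterback Jeff Dean had jersey number 37 in "
              "Champ Bowl XXXIV.")

adversarial_passage = passage + " " + distractor
print(adversarial_passage.endswith(distractor))  # True: distractor appended last
```

A model trained only on clean passages may now answer "Jeff Dean"; AddSentDiverse varies where and how such distractors are inserted so that training on them yields more robust readers.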
Marco Baroni is starting the first invited talk at #rep4nlp
"Language is representation by itself." #ACL2019nlp @sigrep_acl
@sigrep_acl Marco is discussing his previous work on the emergence of language in communication between agents.

References:

"Multi-Agent Cooperation and the Emergence of (Natural) Language"
arxiv.org/pdf/1612.07182…

"How agents see things"
aclweb.org/anthology/D18-…

#ACL2019nlp
@sigrep_acl Efficient encoding of input information:

"We explore providing some information to both the sender and receiver agents and look at whether the emerging language develops to ignore the redundant parts of the input."

Kharitonov et al. 2019
arxiv.org/abs/1905.13687

#ACL2019nlp #rep4nlp
