📄 Link to paper PDF: arxiv.org/abs/2503.03965
https://twitter.com/ADuvalinho/status/1734904524429529300

And I learnt so much from my passionate and kind co-authors on this long journey of writing something together, without targeting any venues or aiming to publish 💙
How powerful are geometric GNNs? How do design choices influence expressivity?
https://twitter.com/karpathy/status/1305302243449516032

Firstly (and this is the most well known), large-scale Transformers have seemingly replaced RNNs in commercial NLP pipelines as they scale better: jalammar.github.io/illustrated-be…
The key idea: Sentences are fully-connected graphs of words, and Transformers are very similar to Graph Attention Networks (GATs) which use multi-head attention to aggregate features from their neighborhood nodes (i.e., words).
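The analogy above can be sketched in a few lines of NumPy: treat each word as a node, connect every pair of words (a fully-connected graph), and let one attention head aggregate features over that neighborhood. This is a minimal illustrative sketch with random weights and a single head, not the GAT or Transformer formulation from any particular paper; all names (`attention_update`, `Wq`, `Wk`, `Wv`) are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_update(H, Wq, Wk, Wv):
    """One attention round: every node (word) aggregates features
    from all nodes in the fully-connected word graph."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise edge scores
    A = softmax(scores, axis=-1)              # attention weights; each row sums to 1
    return A @ V                              # weighted aggregation over all "neighbors"

rng = np.random.default_rng(0)
n_words, d = 5, 8                             # a 5-word "sentence", 8-dim features
H = rng.normal(size=(n_words, d))             # initial word features
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H_new = attention_update(H, Wq, Wk, Wv)
print(H_new.shape)                            # one updated feature vector per word
```

Because every word attends to every other word, this is exactly attention-based message passing on a complete graph; a real GAT restricts the same mechanism to the edges of a sparse graph, and a Transformer stacks many such heads and layers.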