Graph neural networks (GNNs) are driving rapid progress in machine learning by extending deep learning to complex graph-structured data and applications.

Let’s take a look at a few methods ↓
1) A Graph Convolutional Network (GCN) is an approach for semi-supervised learning on graph-structured data. It is based on an efficient variant of convolutional neural networks that operates directly on graphs, and it is widely used for semi-supervised node classification.

paperswithcode.com/method/gcn
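The GCN layer rule can be sketched in a few lines of numpy. This is a rough illustration of the propagation step H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W) from the GCN paper, not the authors' implementation; all names here are illustrative:

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One GCN propagation step: ReLU(D^-1/2 (A + I) D^-1/2 H W).

    adj: (N, N) binary adjacency matrix, features: (N, F), weights: (F, F_out).
    """
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)            # symmetric normalization
    norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return np.maximum(norm @ features @ weights, 0.0)  # ReLU
```

Each layer mixes a node's features with its neighbours', so stacking k layers lets information flow k hops across the graph.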
2) Diffusion-convolutional neural networks (DCNNs) introduce a diffusion-convolution operation that extends CNNs to graph data, learning diffusion-based representations that serve as an effective basis for node classification.

paperswithcode.com/method/dcnn
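The diffusion idea can be sketched as follows: propagate node features through powers of the degree-normalized transition matrix and keep one feature map per hop. This is a simplified illustration of the diffusion step (the full DCNN also applies learned per-hop weights and a nonlinearity); names are illustrative:

```python
import numpy as np

def diffusion_features(adj, features, num_hops):
    """Hop-wise diffused features: for each hop k, compute P^k X, where
    P = D^-1 A is the transition matrix (assumes no isolated nodes)."""
    p = adj / adj.sum(axis=1, keepdims=True)   # row-normalize adjacency
    pk = np.eye(adj.shape[0])
    hops = []
    for _ in range(num_hops):
        pk = pk @ p                            # one more diffusion step
        hops.append(pk @ features)
    return np.stack(hops, axis=1)              # (N, num_hops, F)
```

The stacked tensor gives each node a view of its neighbourhood at several diffusion scales at once.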
3) The Graph Attention Network (GAT) is a graph neural network built from masked self-attention layers: each node's hidden representation is computed by attending over its neighbours. It achieves SOTA results on node classification.

paperswithcode.com/method/gat
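A single attention head can be sketched like this. It follows the GAT scoring rule e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) (written in split form), with the softmax masked to each node's neighbourhood; this is a numpy sketch, not the paper's code, and the names are illustrative:

```python
import numpy as np

def gat_layer(adj, h, w, a_src, a_dst, slope=0.2):
    """Single-head graph attention: score every edge, softmax over each
    node's neighbourhood only (masked self-attention), then aggregate."""
    z = h @ w                                       # project features, (N, F')
    s, t = z @ a_src, z @ a_dst                     # split form of a^T [z_i || z_j]
    e = s[:, None] + t[None, :]                     # raw score e_ij, (N, N)
    e = np.where(e > 0, e, slope * e)               # LeakyReLU
    mask = adj + np.eye(adj.shape[0])               # neighbours + self-loop
    e = np.where(mask > 0, e, -np.inf)              # mask out non-edges
    att = np.exp(e - e.max(axis=1, keepdims=True))  # row-wise stable softmax
    att /= att.sum(axis=1, keepdims=True)
    return att @ z                                  # attention-weighted sum
```

Because the weights are computed per edge, GAT assigns different importances to different neighbours instead of averaging them uniformly.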
4) GraphSAGE is a general inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate useful node embeddings for previously unseen data.

paperswithcode.com/method/graphsa…
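The sample-and-aggregate step can be sketched as: draw a fixed-size sample of each node's neighbours, average their features, and combine with the node's own features. This is a rough illustration of the mean-aggregator variant, with illustrative names, not the reference implementation:

```python
import numpy as np

def sage_layer(neigh_lists, h, w_self, w_neigh, rng, sample_size=5):
    """GraphSAGE-style mean aggregation: sample neighbours, average their
    features, combine with the node's own features, then L2-normalize."""
    out = []
    for v, neigh in enumerate(neigh_lists):
        k = min(sample_size, len(neigh))
        sampled = rng.choice(neigh, size=k, replace=False)  # fixed-size sample
        agg = h[sampled].mean(axis=0)                       # mean aggregator
        out.append(h[v] @ w_self + agg @ w_neigh)
    out = np.maximum(np.array(out), 0.0)                    # ReLU
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.maximum(norms, 1e-12)                   # L2-normalize rows
```

Because the layer is a function of features rather than of fixed node identities, it can embed nodes that were never seen during training.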
5) The Graph Transformer is a generalization of transformer neural networks for arbitrary graphs. The architecture introduces new properties that leverage the graph connectivity inductive bias to perform well on problems where graph topology is important.

paperswithcode.com/method/graph-t…
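The key connectivity bias can be sketched as standard scaled dot-product attention restricted to graph neighbours. This is a simplified single-head illustration (the full architecture also uses positional encodings, feed-forward blocks, and normalization); the names are illustrative:

```python
import numpy as np

def graph_transformer_attn(adj, h, wq, wk, wv):
    """Sparse transformer attention: scaled dot-product scores, but each
    node attends only to its graph neighbours (plus itself)."""
    q, k, v = h @ wq, h @ wk, h @ wv
    scores = q @ k.T / np.sqrt(k.shape[1])              # scaled dot-product
    mask = adj + np.eye(adj.shape[0])                   # graph connectivity bias
    scores = np.where(mask > 0, scores, -np.inf)        # mask non-neighbours
    att = np.exp(scores - scores.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)               # row-wise softmax
    return att @ v
```

Masking attention to the graph's edges is what distinguishes this from a vanilla transformer, which would attend over all node pairs regardless of topology.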
And last but not least... here is an extended list of graph neural networks with their associated papers, benchmark datasets, trends, and open-source code.

paperswithcode.com/methods/catego…
