Namuk Park
Feb 15 · 7 tweets
Our paper “How Do Vision Transformers Work?” was accepted as a Spotlight at #ICLR2022!!

We show that the success of ViTs is NOT due to their weak inductive bias or to capturing long-range dependencies.

paper: openreview.net/forum?id=D78Go…
code & summary: github.com/xxxnell/how-do…

👇 (1/7)
We address three key questions about multi-head self-attentions (MSAs) and ViTs:

Q1. What properties of MSAs do we need to better optimize NNs? 
Q2. Do MSAs act like Convs? If not, how are they different?
Q3. How can we harmonize MSAs with Convs?

(2/7)
Q1. What Properties of MSAs Do We Need?

MSAs have both pros and cons. MSAs improve NNs by flattening their loss landscapes; the key property is their data specificity, not long-range dependency. On the other hand, ViTs suffer from non-convex losses.

(3/7)
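The flatness claim comes from a Hessian spectrum analysis: flatter landscapes have smaller top Hessian eigenvalues. As a rough illustration only (a minimal sketch, not the paper's code), here is one standard way to estimate the largest Hessian eigenvalue of the loss with power iteration on Hessian-vector products; the model, loss, and batch are placeholders you'd supply.

```python
import torch

def top_hessian_eigenvalue(model, loss_fn, x, y, iters=20):
    """Estimate the largest eigenvalue of the loss Hessian via power iteration.
    Smaller top eigenvalues indicate a flatter loss landscape."""
    params = [p for p in model.parameters() if p.requires_grad]
    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, params, create_graph=True)

    # Start from a random unit vector shaped like the parameters.
    v = [torch.randn_like(p) for p in params]
    norm = torch.sqrt(sum((u * u).sum() for u in v))
    v = [u / norm for u in v]

    eig = 0.0
    for _ in range(iters):
        # Hessian-vector product: differentiate (grad . v) w.r.t. the parameters.
        gv = sum((g * u).sum() for g, u in zip(grads, v))
        hv = torch.autograd.grad(gv, params, retain_graph=True)
        # Rayleigh quotient v^T H v (v is a unit vector here).
        eig = sum((h * u).sum() for h, u in zip(hv, v)).item()
        norm = torch.sqrt(sum((h * h).sum() for h in hv))
        v = [h / norm for h in hv]
    return eig
```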
Q2. Do MSAs Act Like Convs?

MSAs and Convs exhibit opposite behaviors, so the two are complementary. For example, MSAs are low-pass filters, while Convs are high-pass filters. This suggests that MSAs are shape-biased, whereas Convs are texture-biased.

(4/7)
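The low-pass/high-pass claim rests on a Fourier analysis of feature maps. A minimal sketch of that kind of measurement, assuming activations of shape (B, C, H, W) grabbed with forward hooks; the cutoff and the comparison in the usage comment are illustrative assumptions, not the paper's exact protocol.

```python
import torch

def high_freq_ratio(feat, cutoff=0.5):
    """Fraction of Fourier amplitude above a normalized frequency cutoff.
    Low-pass-ish features (MSAs) should score lower than
    high-pass-ish features (Convs)."""
    # 2D FFT over the spatial dims, low frequencies shifted to the center.
    f = torch.fft.fftshift(torch.fft.fft2(feat.float()), dim=(-2, -1))
    amp = f.abs()

    h, w = feat.shape[-2:]
    fy = torch.linspace(-1, 1, h).view(-1, 1).expand(h, w)
    fx = torch.linspace(-1, 1, w).view(1, -1).expand(h, w)
    radius = torch.sqrt(fx ** 2 + fy ** 2)  # normalized frequency radius

    high = amp[..., radius > cutoff].sum()
    return (high / amp.sum()).item()

# Hypothetical usage: compare activations hooked after an MSA block
# and after a Conv block of the same network.
# msa_feat, conv_feat = ...  # (B, C, H, W) tensors from forward hooks
# print(high_freq_ratio(msa_feat), high_freq_ratio(conv_feat))
```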
Q3. How Can We Harmonize MSAs With Convs?

MSAs at the end of a stage (not at the end of a model) play the key role. We therefore introduce AlterNet, built by replacing the Convs at the end of each stage with MSAs. AlterNet outperforms CNNs not only in large data regimes but also in small data regimes.

(5/7)
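A minimal sketch of the AlterNet idea, not the repo's actual implementation: a ResNet-style stage whose last block is an MSA block instead of a Conv block. `MSABlock` here is a bare-bones stand-in built from `nn.MultiheadAttention` with pre-norm and a residual connection; the real blocks live in the repo (github.com/xxxnell/how-do…), and `conv_block` is an assumed factory for your own Conv block.

```python
import torch
import torch.nn as nn

class MSABlock(nn.Module):
    """Stand-in self-attention block over spatial positions (pre-norm, residual)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                     # x: (B, C, H, W)
        b, c, h, w = x.shape
        t = x.flatten(2).transpose(1, 2)      # (B, H*W, C) token sequence
        t = self.norm(t)
        out, _ = self.attn(t, t, t)
        out = out.transpose(1, 2).reshape(b, c, h, w)
        return x + out                        # residual connection

def alternet_stage(dim, num_blocks, conv_block):
    """Build a stage whose LAST block is an MSA block (the AlterNet pattern)."""
    blocks = [conv_block(dim) for _ in range(num_blocks - 1)]
    blocks.append(MSABlock(dim))              # MSA at the end of the stage
    return nn.Sequential(*blocks)
```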
Then, how do you apply MSAs to your own CNN model?

1. Alternately replace Conv blocks with MSA blocks, starting from the end of a baseline CNN.
2. If an added MSA block does not improve predictive performance, replace a Conv block at the end of an earlier stage instead.

(6/7)
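That recipe can be read as a greedy loop. Here is a sketch under stated assumptions: `replace_block(model, stage, idx)` and `evaluate(model)` are hypothetical helpers you'd implement for your own architecture; the loop just mirrors steps 1–2 above.

```python
def alternate_msa(model, stages, replace_block, evaluate):
    """Greedy version of the recipe above (a sketch, not the paper's code).

    stages: list of (stage_name, num_blocks) pairs, LAST stage first.
    replace_block(model, stage, idx): hypothetical helper returning a copy of
        `model` with Conv block `idx` in `stage` swapped for an MSA block.
    evaluate(model): hypothetical helper returning validation accuracy.
    """
    best_acc = evaluate(model)
    for stage, num_blocks in stages:
        # Step 1: try every other block, starting from the end of the stage.
        for idx in range(num_blocks - 1, -1, -2):
            candidate = replace_block(model, stage, idx)
            acc = evaluate(candidate)
            if acc > best_acc:
                model, best_acc = candidate, acc
            else:
                # Step 2: no gain here, so move on to an earlier stage.
                break
    return model
```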
In summary, MSA ≠ Conv with a weak inductive bias.
The self-attention formulation is ANOTHER inductive bias, one that complements Convs.

slide: bit.ly/3gNkV7e

(7/7)
