Excited to share my new paper on representational drift! Thanks to my collaborators @CPehlevan @fgh_shiva @dlipshutz @AnirvanMS @chklovskii #tweeprint (biorxiv.org/content/10.110…)
1) Long-term memories and stable task performance have long been thought to rely on stable neuronal representations. Surprisingly, recent experiments have shown that neural activity in several brain regions changes continuously, even after animals have fully learned and stably perform their tasks. The underlying mechanisms and dynamics of this "representational drift" remain largely unknown.
2) We focused on drift in a neural population that learns to represent stimuli by optimizing a representational objective. We hypothesized that if this objective has degenerate optima, noisy synaptic updates during learning will drive the network to explore the region of synaptic weight space corresponding to (near-)optimal neural representations. As a result, the neural representation drifts within the space of optimal representations.
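To make the intuition concrete, here is a toy sketch (not the paper's model): noisy gradient descent on a 2D objective whose minima form a continuous ring. The objective pins the weight norm near 1, while the angle, along which the loss is flat, performs a random walk, i.e., drift within the space of optima. The objective, parameter values, and noise form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
eta, sigma = 1e-2, 0.05      # learning rate and synaptic-noise amplitude (toy values)
w = rng.standard_normal(2)   # toy 2D "synaptic weight" vector

trajectory = []
for t in range(100_000):
    # L(w) = (|w|^2 - 1)^2 has a ring of degenerate minima at |w| = 1
    grad = 4.0 * (w @ w - 1.0) * w
    w += -eta * grad + np.sqrt(eta) * sigma * rng.standard_normal(2)
    trajectory.append(w.copy())

traj = np.asarray(trajectory)
radius = np.linalg.norm(traj, axis=1)                   # pinned near 1 by the objective
angle = np.unwrap(np.arctan2(traj[:, 1], traj[:, 0]))   # diffuses freely along the ring
print(f"final radius: {radius[-1]:.3f}, net angular drift: {angle[-1] - angle[0]:.2f} rad")
```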
3) Hebbian/anti-Hebbian networks provide a concrete example of such networks and are ideal to study for several reasons: they are biologically plausible, they optimize similarity-matching objectives, and they learn localized RFs that tile the input data manifold, providing a minimal model for hippocampal place cells and neurons in posterior parietal cortex.
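For readers who want to experiment, below is a minimal sketch of an online Hebbian/anti-Hebbian similarity-matching network in the spirit of the Pehlevan–Chklovskii framework. The exact local rules, normalization, and noise injection vary across papers, so treat these update equations as one common variant, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 2, 12
eta, sigma = 0.02, 1e-3     # learning rate; synaptic-noise amplitude (drives drift)

W = 0.1 * rng.standard_normal((n_out, n_in))  # feedforward weights (Hebbian)
M = np.eye(n_out)                             # lateral weights (anti-Hebbian)

def respond(x, n_steps=100, dt=0.1):
    # relax the recurrent dynamics dy/dt = Wx - My to a nonnegative fixed point
    y = np.zeros(n_out)
    for _ in range(n_steps):
        y = np.maximum(0.0, y + dt * (W @ x - M @ y))
    return y

for t in range(5_000):
    theta = rng.uniform(0.0, 2.0 * np.pi)         # stimuli on a ring (1D manifold)
    x = np.array([np.cos(theta), np.sin(theta)])
    y = respond(x)
    # local updates: Hebbian for W, anti-Hebbian for M; noise on W drives drift
    W += eta * (np.outer(y, x) - W) + sigma * rng.standard_normal(W.shape)
    M += eta * (np.outer(y, y) - M)
```

With nonnegative outputs and ring-structured inputs, the units develop localized bump-like tuning that tiles the ring; turning up sigma lets those RFs wander over time.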
4) We explored the long-term dynamics of the learned receptive fields in these networks in the presence of synaptic noise. We found that the drifting receptive fields can be characterized as a coordinated random walk, with effective diffusion constants that depend on parameters such as the learning rate, noise amplitude, and input statistics.
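One simple way to quantify such drift (an illustrative recipe, not necessarily the paper's exact analysis) is to fit the mean-squared displacement of receptive-field centers to a linear law, MSD(t) ≈ 2·dim·D·t, and read off an effective diffusion constant D:

```python
import numpy as np

def diffusion_constant(centers, dt=1.0):
    """Estimate an effective diffusion constant from RF-center trajectories.

    centers: array (T, n_neurons, dim) of receptive-field centers over time.
    Fits MSD(t) ~ 2 * dim * D * t, as for a pure random walk.
    """
    T, _, dim = centers.shape
    lags = np.arange(1, T // 4)
    msd = np.array([
        np.mean(np.sum((centers[lag:] - centers[:-lag]) ** 2, axis=-1))
        for lag in lags
    ])
    slope = np.polyfit(lags * dt, msd, 1)[0]
    return slope / (2.0 * dim)

# sanity check on a synthetic random walk with known D = sigma^2 / 2
rng = np.random.default_rng(2)
steps = rng.normal(0.0, 0.1, size=(1000, 5, 1))
print(diffusion_constant(np.cumsum(steps, axis=0)))  # ~ 0.1**2 / 2 = 0.005
```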
5) Despite the drift, the representational similarity of the population code remains stable over time. Our model recapitulates recent experimental observations in hippocampus and posterior parietal cortex, and makes testable predictions, some of which we verified on existing data and others that can be probed in future experiments.
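A simple metric for this kind of population-level stability (again an illustrative choice, not necessarily the paper's) is the correlation between representational similarity matrices computed at two time points:

```python
import numpy as np

def rsm(Y):
    """Representational similarity matrix of responses Y (n_stimuli x n_neurons)."""
    return Y @ Y.T

def rsm_stability(Y_early, Y_late):
    """Pearson correlation between off-diagonal RSM entries at two time points;
    values near 1 mean the population code is stable despite single-cell drift."""
    a, b = rsm(Y_early), rsm(Y_late)
    mask = ~np.eye(a.shape[0], dtype=bool)
    return np.corrcoef(a[mask], b[mask])[0, 1]
```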
