Hierarchical Recurrent State Space Models Reveal Discrete and Continuous Dynamics of Neural Activity in C. elegans biorxiv.org/content/10.110…

This has been a big part of my postdoc and I'm glad to finally share it! Huge thanks to my coauthors/mentors Annika, Dave, Manuel, and Liam.
We took a generative approach to modeling whole brain imaging data from @neurotheory et al (2015) [cell.com/abstract/S0092…] and Nichols et al (2017) [science.sciencemag.org/node/695860.fu…] with the goal of better understanding latent states of neural activity and their dynamics.
Based on Kato et al and Nichols et al, switching linear dynamical systems (SLDS) seemed like a natural starting point. Essentially, SLDS capture the idea that neural activity reflects a low-dimensional latent state, and that the state dynamics switch between a few simple, linear regimes.
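As a rough sketch of that generative story (hypothetical parameter values and names, not the paper's actual model code): a sticky Markov chain picks a discrete state, each state has its own linear dynamics on the continuous latents, and the observed neural traces are a linear readout of those latents.

```python
import numpy as np

rng = np.random.default_rng(0)
K, D, N, T = 3, 2, 10, 500          # discrete states, latent dim, neurons, time steps

# Sticky transition matrix: stay in the current state with high probability
P = np.full((K, K), 0.02 / (K - 1))
np.fill_diagonal(P, 0.98)

def rotation(theta, decay=0.99):
    """A slowly decaying 2-D rotation -- one simple choice of linear dynamics."""
    c, s = np.cos(theta), np.sin(theta)
    return decay * np.array([[c, -s], [s, c]])

A = [rotation(th) for th in (0.05, 0.15, -0.10)]   # per-state dynamics matrices
b = rng.normal(0, 0.1, (K, D))                     # per-state drift
C = rng.normal(0, 1.0, (N, D))                     # shared emission matrix

z = np.zeros(T, dtype=int)       # discrete states
x = np.zeros((T, D))             # continuous latents
for t in range(1, T):
    z[t] = rng.choice(K, p=P[z[t - 1]])
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + rng.normal(0, 0.05, D)

# Observed traces: linear readout of the latents plus noise
y = x @ C.T + rng.normal(0, 0.1, (T, N))
```

Fitting goes in the other direction, of course: given only y, infer z, x, and the per-state dynamics.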
This is what a model fit looks like. We combine partial recordings of multiple worms to learn a shared model of neural activity. For each worm we get low-dimensional continuous states and an automatic segmentation of activity based on how those states are changing over time.
Interestingly, our inferred discrete states (i.e. segmentations) line up really well with the expert segmentations from past work. We get more than just a segmentation, though: we also get an estimate of the linear dynamics within each state and a quantitative method for model comparison.
What's more, the discrete states correspond to patterns of neural activity that, in unconstrained worms, are associated with different behaviors, like forward crawling, reversals, turns, etc. (This actually follows from the overlap with expert labels.)
This connects to another line of work with @vulcnethologist et al in the @Datta_Lab and @blsabatini lab at Harvard. There we started from the other direction and found latent states in behavioral data that mapped to neural activity in striatum. sciencedirect.com/science/articl…
Worm data presents a bunch of new challenges, though:
1. Each worm is slightly different. Hierarchical Bayesian models are a natural solution.
2. Discrete and continuous dynamics are tightly coupled. Our work on "recurrent" SLDS was designed for this. proceedings.mlr.press/v54/linderman1…
[Also check out the awesome tree-structured extensions by @JosueNassar!]
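The "recurrent" part means the discrete-state transition probabilities depend on where the continuous state currently is, so switches can be triggered by the trajectory itself. A minimal sketch of that coupling (the weights R and r here are hypothetical, just to show the softmax form):

```python
import numpy as np

rng = np.random.default_rng(1)
K, D = 3, 2
R = rng.normal(0, 1.0, (K, D))   # hypothetical recurrent weights (state x latent)
r = np.zeros(K)                  # per-state biases

def transition_probs(x):
    """Transition probabilities over the K discrete states as a function
    of the continuous latent x -- the recurrent coupling in an rSLDS."""
    logits = R @ x + r
    logits -= logits.max()               # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# Different regions of latent space now favor different switches
p = transition_probs(np.array([1.0, -0.5]))
```

In a standard SLDS this distribution would be a fixed row of the transition matrix; here it moves with x, which is exactly what you want when, say, a reversal ends once the latent trajectory reaches a particular region.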
The Nichols et al (2017) dataset adds another interesting twist: it includes 2 genetic strains of worms at 2 developmental stages, with time-varying sensory input. The probabilistic framework makes it easy to handle this complexity with a deeper hierarchy and covariates.
With hierarchy, inputs, and recurrent dynamics, we have a powerful model for extracting latent states of neural activity, but are we missing anything? One way to test this is by simulating new neural traces from the model. Here are some samples of real & sim. data.
The simulations recapitulate many features of the real data: transitions between forward, reverse, and turning states; the dynamics within each state; and the differences between worms. There's plenty of room for improvement, of course, and we're thinking hard about the next generation of models!
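One simple way to make "recapitulates the data" quantitative is to compare summary statistics of real vs. simulated discrete-state sequences, e.g. how long the model dwells in each state before switching. This is an illustrative check, not the paper's exact analysis:

```python
import numpy as np

def dwell_times(z):
    """Lengths of consecutive runs in a discrete-state sequence z.
    Comparing real vs. simulated dwell-time distributions is one
    basic posterior predictive check for a switching model."""
    z = np.asarray(z)
    change = np.flatnonzero(np.diff(z)) + 1          # indices where the state switches
    bounds = np.concatenate(([0], change, [len(z)]))
    return np.diff(bounds)

# e.g. a segmentation 0,0,0,1,1,2,2,2,2,0 has runs of length 3, 2, 4, 1
runs = dwell_times([0, 0, 0, 1, 1, 2, 2, 2, 2, 0])
```

The same idea extends to any statistic you care about: switch rates, state occupancies, or within-state autocorrelations, computed on real and sampled traces alike.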
What's next? We're thinking about how to relate states to algorithms and implementations; how to extend these methods to simultaneous neural and behavioral recordings; how to incorporate (artificial) neural networks into the model (a la LFADS, @SussilloDavid); and lots more.
Finally, thanks again to all my labmates, mentors, and collaborators who've made this project (and all the related work it builds on) possible, and thank you to the @SCglobalbrain for all your support.
PS: If you're a prospective grad student or postdoc working at the intersection of machine learning/statistics/neuroscience and you're into this kind of work, look me up at Stanford next month!