DurstewitzLab · Nov 5 · 12 tweets · 6 min read
Interested in reconstructing computational dynamics from neural data using RNNs?

Here we review dynamical systems (DS) concepts, recent #ML/ #AI methods for recovering DS from data, their evaluation, interpretation, and analysis, and applications in #Neuroscience:
biorxiv.org/content/10.110…
A 🧵...
Some important take homes:
1) To formally constitute a true state space or a reconstructed DS, certain mathematical conditions need to be met. PCA and other dimensionality-reduction tools often won't give you a state space in the DS sense (they may even destroy it). Just training an RNN on data may not either.
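One way such conditions can be met is delay-coordinate embedding: by the embedding theorems (Whitney/Takens), suitably chosen delay coordinates of even a single observed variable yield a valid reconstructed state space, whereas the raw 1-D observation does not. A minimal sketch on the Lorenz system (the delay tau and dimension m below are illustrative choices, not tuned values):

```python
import numpy as np

# Simulate the Lorenz system (classic chaotic benchmark) with an RK4 step.
def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    def f(s):
        x, y, z = s
        return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])
    k1 = f(s); k2 = f(s + 0.5*dt*k1); k3 = f(s + 0.5*dt*k2); k4 = f(s + dt*k3)
    return s + dt/6.0*(k1 + 2*k2 + 2*k3 + k4)

T = 20000
traj = np.empty((T, 3))
traj[0] = np.array([1.0, 1.0, 1.0])
for t in range(1, T):
    traj[t] = lorenz_step(traj[t - 1])

# Observe only x(t): this 1-D projection is NOT a state space
# (distinct states collapse onto the same observed value).
x = traj[:, 0]

# Delay-coordinate embedding: (x(t), x(t - tau), x(t - 2*tau)). For suitable
# tau and embedding dimension m this is diffeomorphic to the original
# attractor, i.e. a valid reconstructed state space.
tau, m = 8, 3
emb = np.stack([x[(m - 1)*tau - i*tau : T - i*tau] for i in range(m)], axis=1)
print(emb.shape)
```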
2) For DS reconstruction (DSR), a trained RNN should be able to reproduce the *invariant geometrical and temporal properties* of the underlying system when run on its own.
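To illustrate the evaluation logic only (not any specific DSR method), here is a toy stand-in: a hand-crafted 2-unit RNN with hypothetical, not fitted, weights that produces a stable limit cycle. Run freely with no teacher forcing, it is judged by an invariant temporal property (its dominant period, read off the power spectrum) rather than by pointwise trajectory error:

```python
import numpy as np

# Toy stand-in for a trained RNN: z_{t+1} = tanh(W z_t), with W a scaled
# rotation (hypothetical weights). This yields a limit cycle of ~50 steps.
theta = 2*np.pi/50
W = 1.05*np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])

z = np.array([0.1, 0.0])
traj = []
for t in range(5000):            # run the RNN on its own, no teacher forcing
    z = np.tanh(W @ z)
    traj.append(z.copy())
traj = np.array(traj)[1000:]     # discard the initial transient

# Evaluate an *invariant temporal property*: the dominant oscillation period,
# read off the power spectrum of one unit -- not pointwise trajectory error.
spec = np.abs(np.fft.rfft(traj[:, 0]))**2
freqs = np.fft.rfftfreq(len(traj))
period = 1.0/freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(period)
```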
3) Correlation (R^2) with the data or MSE may not be good measures of reconstruction quality, especially if the system is chaotic: nearby trajectories then quickly diverge even for the *very same system with the same parameters*. Geometrical and invariant measures are needed.
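A quick numerical illustration of why: two Lorenz trajectories from the identical system, with initial conditions differing by 1e-8, end up with a huge pointwise MSE, while an invariant geometric summary (here a crude state-space occupancy histogram; the bin edges are ad-hoc choices) still agrees closely:

```python
import numpy as np

def lorenz_rk4(s0, T, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    def f(s):
        x, y, z = s
        return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])
    out = np.empty((T, 3)); s = np.array(s0, float)
    for t in range(T):
        k1 = f(s); k2 = f(s + 0.5*dt*k1); k3 = f(s + 0.5*dt*k2); k4 = f(s + dt*k3)
        s = s + dt/6.0*(k1 + 2*k2 + 2*k3 + k4)
        out[t] = s
    return out

# Same system, same parameters -- initial conditions differ by 1e-8.
a = lorenz_rk4([1.0, 1.0, 1.0], 20000)
b = lorenz_rk4([1.0, 1.0, 1.0 + 1e-8], 20000)

# Pointwise error: exponential divergence makes late-time MSE huge,
# even though both trajectories come from the *identical* system.
mse_late = np.mean((a[10000:] - b[10000:])**2)

# Geometric/invariant comparison: state-space occupancy histograms
# (a crude proxy for attractor geometry) agree closely.
edges = [np.linspace(-25, 25, 9), np.linspace(-35, 35, 9), np.linspace(0, 55, 9)]
pa, _ = np.histogramdd(a[5000:], bins=edges)
pb, _ = np.histogramdd(b[5000:], bins=edges)
pa, pb = pa/pa.sum(), pb/pb.sum()
overlap = np.minimum(pa, pb).sum()   # 1.0 = identical occupancy
print(mse_late, overlap)
```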
4) DSR is a burgeoning field in #ML/ #AI & #DataScience: everything from basis expansions/library methods, graph NNs, Transformers, and reservoir computing to neural ODEs and numerous types of RNNs has been suggested, often with training algorithms purpose-tailored for DSR.
5) RNNs are dynamically and computationally universal: they can approximate any DS and implement any Turing machine. They can emulate biophysical neurons in detail despite having a very different functional form → you don't need a biophysical model if you'd like to simulate biophysics.
6) RNNs can learn to mimic physiological data and to behave just like the underlying DS, including single-unit recordings on single trials. If DSR was successful, we can analyze and simulate the RNN further as a surrogate for the system it was trained on.
7) By coupling the RNN to the data via specifically designed observation models, we can interpret operations in the RNN in biological terms and connect them to the biological substrate.
8) RNNs are great for multimodal data integration: you can couple many observed data streams to the same latent RNN for DSR through sets of data-specific decoder models.
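Structurally, this can be sketched as one shared latent RNN feeding several modality-specific observation models; all shapes, weights, and noise models below are hypothetical stand-ins, not fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a latent RNN with M=10 units decoded into three data
# streams -- spike counts (Poisson), a continuous signal (Gaussian), and a
# binary choice (Bernoulli). All weights are random stand-ins.
M, N_spk, N_cont = 10, 30, 4
W = 1.5*np.linalg.qr(rng.standard_normal((M, M)))[0]   # latent RNN weights
B_spk  = 0.3*rng.standard_normal((N_spk, M))           # spike decoder
B_cont = rng.standard_normal((N_cont, M))              # continuous decoder
b_choice = rng.standard_normal(M)                      # choice decoder

z = rng.standard_normal(M)
spikes, cont, choices = [], [], []
for t in range(200):
    z = np.tanh(W @ z)                                  # shared latent dynamics
    spikes.append(rng.poisson(np.exp(B_spk @ z)))       # Poisson observation model
    cont.append(B_cont @ z + 0.1*rng.standard_normal(N_cont))   # Gaussian
    choices.append(rng.random() < 1/(1 + np.exp(-b_choice @ z)))  # Bernoulli
spikes, cont, choices = np.array(spikes), np.array(cont), np.array(choices)
print(spikes.shape, cont.shape, choices.shape)
```

The point of the structure: all three decoders read the *same* latent state z, so fitting them jointly forces the latent RNN to explain every modality at once.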
9) There are many types of dynamics and attractors, some with complex geometric structure like separatrix cycles or chaos. As we get closer to true DSR, and as neuroscience moves to more ecologically valid tasks, the picture of neural dynamics may become more complex than the simple point attractors that still dominate.
10) There are many open issues in DSR, perhaps most importantly generalization beyond the dynamical regimes seen in training.
We would love to hear your thoughts on this, suggestions for where to send it, or pointers to important literature omissions so we can integrate them. We apologize for likely having missed a lot, especially from the #ComputationalNeuroscience literature!
