Assistant Professor @Stanford Statistics and @StanfordBrain. Computational Neuroscience, Machine Learning, Bayesian Statistics.
Dec 18, 2019 • 4 tweets • 2 min read
Enjoyed this new work from Sean Bittner et al. They use normalizing flows to approximate a max-ent distribution over parameters of a black box model, subject to the constraint that the model produce a specified "emergent property" of interest.
They apply the method to a bunch of neat examples: the stomatogastric ganglion (everyone's favorite), a four-population model of V1, a task switching model of SC, and abstract RNNs trained to perform posterior inference.
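To make the idea concrete, here is a minimal numpy sketch of the max-ent setup described above, not the authors' implementation: an affine flow (whose entropy is available in closed form) pushed through a toy `emergent_property` function standing in for the black-box model, with a Lagrange term penalizing deviation from the target property. All names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the black-box model: maps parameters z to a scalar
# "emergent property" (this function is an assumption, not from the paper).
def emergent_property(z):
    return z[:, 0] ** 2 + 0.5 * z[:, 1]

# Simplest possible "flow": affine map z = mu + L @ eps, eps ~ N(0, I).
# Real normalizing flows compose many invertible layers; an affine map
# keeps the entropy computable in closed form for this sketch.
def sample_flow(mu, L, n):
    eps = rng.standard_normal((n, len(mu)))
    return mu + eps @ L.T

def flow_entropy(L):
    # Entropy of N(mu, L L^T): 0.5 d (1 + log 2*pi) + log |det L|.
    d = L.shape[0]
    return 0.5 * d * (1 + np.log(2 * np.pi)) + np.log(np.abs(np.diag(L))).sum()

# Max-ent objective: maximize entropy subject (via multiplier eta) to
# E[emergent_property(z)] matching a specified target value.
def maxent_objective(mu, L, eta, target, n=10_000):
    z = sample_flow(mu, L, n)
    violation = emergent_property(z).mean() - target
    return flow_entropy(L) - eta * violation
```

In the actual method the flow parameters and multipliers would be optimized jointly by stochastic gradient ascent; this sketch only evaluates the objective for fixed parameters.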
May 3, 2019 • 15 tweets • 6 min read
Hierarchical Recurrent State Space Models Reveal Discrete and Continuous Dynamics of Neural Activity in C. elegans biorxiv.org/content/10.110…
This has been a big part of my postdoc and I'm glad to finally share it! Huge thanks to my coauthors/mentors Annika, Dave, Manuel, and Liam.
We took a generative approach to modeling whole brain imaging data from @neurotheory et al (2015) [cell.com/abstract/S0092…] and Nichols et al (2017) [science.sciencemag.org/node/695860.fu…] with the goal of better understanding latent states of neural activity and their dynamics.
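For readers unfamiliar with this model class, here is a hedged numpy sketch of the generative story: a discrete state switches among linear dynamical regimes for a continuous latent, which drives the observed fluorescence traces. The recurrent twist is that the switching probabilities depend on the continuous state. Every parameter below is a random toy (an assumption for illustration), not anything fit to the C. elegans data, and the hierarchical sharing across worms is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_rslds(T, K=3, D=2, N=10):
    """Sample from a toy recurrent switching linear dynamical system:
    discrete state z_t, continuous latent x_t, observations y_t."""
    # Per-state linear dynamics, mildly contracting for stability.
    A = np.stack([0.95 * np.eye(D) + 0.05 * rng.standard_normal((D, D))
                  for _ in range(K)])
    b = 0.1 * rng.standard_normal((K, D))   # per-state drift
    R = rng.standard_normal((K, D))         # recurrent transition weights
    C = rng.standard_normal((N, D))         # emission matrix

    z = np.zeros(T, dtype=int)
    x = np.zeros((T, D))
    y = np.zeros((T, N))
    x[0] = rng.standard_normal(D)
    for t in range(T):
        # Linear-Gaussian emission from the continuous latent.
        y[t] = C @ x[t] + 0.1 * rng.standard_normal(N)
        if t + 1 < T:
            # "Recurrent" switching: the next discrete state's
            # distribution is a softmax of the current continuous state.
            logits = R @ x[t]
            p = np.exp(logits - logits.max())
            p /= p.sum()
            z[t + 1] = rng.choice(K, p=p)
            k = z[t + 1]
            x[t + 1] = A[k] @ x[t] + b[k] + 0.05 * rng.standard_normal(D)
    return z, x, y
```

Inference (recovering z and x from y, and learning the parameters) is the hard part the paper addresses; this sketch only shows the forward generative process.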