Very excited to share my first #tweeprint today!

Work by me, Daniel Trotter, @NeuroNaud and @mossy_fibers.

We address short-term plasticity with a linear-nonlinear model and find interesting algorithmic similarities between single synapses and CNNs.

biorxiv.org/content/10.110…
2/ In this paper, we ask how best to describe complex short-term plasticity (STP) dynamics in a computational model.

Synapses are typically categorized as either facilitating (STF) or depressing (STD).
3/ This STF-STD dichotomy, however, is an oversimplification. Some synapses display more complex dynamics.

At hippocampal mossy fiber synapses, for example, facilitation is supra-linear in low (arguably more physiological) extracellular [Ca2+].
4/ Such dynamics emerge because multiple interdependent mechanisms contribute to vesicle release at this synapse (jneurosci.org/content/34/33/…).
5/ In traditional computational models of STP, such as the Tsodyks-Markram (TM) model, time-dependent changes in synaptic efficacy are explained by only two factors: release probability (u) and depletion (R).
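For reference, here is a minimal discrete-time sketch of TM-style dynamics. The parameter values and the exact update order are illustrative; conventions vary across papers, and the function name is ours.

```python
import math

def tm_amplitudes(spike_times, U=0.5, tau_f=0.05, tau_d=0.5, A=1.0):
    """Tsodyks-Markram-style updates at spike times (illustrative sketch).

    u: release probability (facilitation), R: available resources (depletion).
    """
    u, R = U, 1.0
    amps, last_t = [], None
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u = U + (u - U) * math.exp(-dt / tau_f)      # u relaxes back to U
            R = 1.0 - (1.0 - R) * math.exp(-dt / tau_d)  # R recovers toward 1
        amps.append(A * u * R)   # postsynaptic amplitude at this spike
        R *= (1.0 - u)           # vesicle depletion by release
        u += U * (1.0 - u)       # facilitation jump (one common convention)
        last_t = t
    return amps
```

With high release probability and slow recovery this produces depression; a low U with slow u-decay produces facilitation.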
6/ The TM model framework is elegant, efficient and easily interpretable. However, describing complex dynamics that rely on multiple biophysical mechanisms (e.g. vesicle priming, use-dependent replenishment, intracellular calcium stores...) is problematic.
7/ Of course, one could add an extra factor for each relevant biophysical property. In the process, however, one would accumulate a catalog of parameters that complicates fitting the model to experimental data.
8/ Here, we take a different approach: Sacrificing biophysical interpretability, we aim to accurately describe complex synaptic dynamics.

The result is a new computational model of STP, based on a linear-nonlinear operation, which we call the Spike Response Plasticity (SRP) model.
9/ Briefly, a presynaptic efficacy kernel (B) defines the change in synaptic efficacy after a spike. The kernel is convolved with a presynaptic spike train (C). The efficacy of spikes is defined as a nonlinear readout of the accumulated efficacy (D), sampled at spike times (E).
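A minimal deterministic sketch of that linear-nonlinear readout. The sigmoid nonlinearity and the normalization by the baseline response follow one common convention; the names are ours, not the paper's notation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def srp_efficacies(spike_times, kernel, b=0.0):
    """Efficacy at each spike: nonlinear readout of kernel-filtered spike history.

    kernel(dt): efficacy kernel evaluated dt seconds after a past spike.
    b: baseline drive. Normalized so an isolated spike has efficacy 1.
    """
    effs = []
    for i, t in enumerate(spike_times):
        drive = b + sum(kernel(t - s) for s in spike_times[:i])
        effs.append(sigmoid(drive) / sigmoid(b))
    return effs
```

A positive exponential kernel then yields efficacies that grow over a spike train (facilitation), a negative one yields decay (depression).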
10/ Here, the efficacy kernel is an exponential decay. Changing the sign allows us to describe STF and STD.

Similarly, a kernel based on the sum of two exponential decays with opposite sign and different time constants gives rise to facilitation followed by depression.
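Those kernel choices can be written down directly; the amplitudes and time constants below are invented for illustration.

```python
import math

def exp_kernel(a, tau):
    # single exponential decay; the sign of a selects STF (a > 0) or STD (a < 0)
    return lambda dt: a * math.exp(-dt / tau)

def sum_kernel(k1, k2):
    # fast positive + slow negative component -> facilitation, then depression
    return lambda dt: k1(dt) + k2(dt)
```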
11/ What about more complex dynamics, such as the supra-linear facilitation mentioned before?

We can model those through changes in baseline synaptic efficacy (b). A lower baseline parameter (in red) gives rise to supra-linear facilitation.
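The role of the baseline can be seen from the sigmoid readout alone: far below threshold the sigmoid is approximately exponential, so the same kernel drive is amplified supra-linearly. A toy calculation, not the paper's fitted values:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gain(b, drive):
    # normalized efficacy after accumulating `drive` from the efficacy kernel
    return sigmoid(b + drive) / sigmoid(b)
```

For b far below 0, sigmoid(b) ≈ exp(b), so gain(b, d) ≈ exp(d): the same drive produces a much larger relative facilitation than at a high baseline.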
12/ Notably, our efficacy kernel is not restricted to an exponential decay. Some dynamics require different functions that take into account multiple timescales.

We illustrate this with another example: post-burst potentiation at MF-IN synapses (jneurosci.org/content/38/7/1…).
13/ This synapse is funky. During a presynaptic burst, the synapse displays weak facilitation. However, this facilitation continues to rise for about 2 seconds after the burst, suggesting that at least one much slower process is involved.
14/ In the SRP framework, we can model such delayed dynamics by taking the time-course into account in the efficacy kernel. The post-burst facilitation can be captured by constructing the efficacy kernel as the sum of three Gaussians.
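A sketch of such a multi-timescale kernel as a Gaussian mixture; the amplitudes, centers, and widths here are invented for illustration, not the fitted values.

```python
import math

def gaussian_mixture_kernel(amps, mus, sigmas):
    """Efficacy kernel built as a sum of Gaussians in time.

    A slow component centered seconds after a spike lets efficacy keep
    rising well after a presynaptic burst has ended.
    """
    def kernel(dt):
        return sum(a * math.exp(-0.5 * ((dt - m) / s) ** 2)
                   for a, m, s in zip(amps, mus, sigmas))
    return kernel
```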
15/ Let's go a step further. So far, we have only considered the effect of spike history on mean synaptic efficacy.

But the variability of synaptic transmission, not just its amplitude, also depends on recent activation history, which leads to complex heteroscedasticity.
16/ To account for these stochastic properties, we extend the SRP model to a stochastic framework by considering gamma-distributed PSCs (why gamma? see here → doi.org/10.3389/fnsyn.…) and adding a second presynaptic kernel, which regulates variability.
17/ In the stochastic model, two independent efficacy kernels regulate the mean and variance of a gamma distribution.

In this example, three synapses with the same mean-, but different variance kernels display uniform synaptic depression, but different variability dynamics.
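Mechanically, sampling a PSC amplitude from a gamma distribution with a given mean and standard deviation looks like this; in the model, both moments would come from their own kernel readouts. The moment-matching is standard; the function name is ours.

```python
import random

def sample_psc(mu, sigma, rng=random):
    """Draw one PSC amplitude from a gamma distribution with mean mu, std sigma."""
    shape = (mu / sigma) ** 2   # gamma shape from moment matching
    scale = sigma ** 2 / mu     # gamma scale: shape * scale == mu
    return rng.gammavariate(shape, scale)
```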
18/ Okay, the SRP model can describe some complex dynamics. But can we fit it to data?

Yes! We developed a maximum likelihood approach to infer the model parameters from naturalistic spike trains.
19/ On surrogate data generated from Poisson spike trains, our inference method identifies good parameter estimates after 100-150 spikes.
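The core of such a fit is the gamma negative log-likelihood of the observed amplitudes. A sketch, assuming in the actual model that the mean and standard deviation at each spike are functions of the kernel parameters being optimized:

```python
import math

def gamma_nll(amplitudes, mu, sigma):
    """Negative log-likelihood of observed PSC amplitudes under a gamma model."""
    shape = (mu / sigma) ** 2
    scale = sigma ** 2 / mu
    nll = 0.0
    for a in amplitudes:
        nll -= ((shape - 1.0) * math.log(a) - a / scale
                - shape * math.log(scale) - math.lgamma(shape))
    return nll
```

Minimizing this quantity over the kernel parameters with any standard optimizer yields the maximum-likelihood estimates.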
20/ In summary, we described the dynamics of synaptic transmission as a convolution followed by a nonlinear readout.

Coincidentally, this sequence of operations is also the central operation of convolutional neural networks (CNNs)...🤔
21/ We asked which aspect of neural network architecture corresponds to the operation performed by the linear-nonlinear synapse model, and found a striking similarity between the SRP formalism and the dropout masks used to randomly silence inputs in CNNs.
22/ Maybe this algorithmic similarity suggests that the linear-nonlinear processing capabilities of single synapses are functionally related to the use of dropout followed by fully connected layers in CNNs? 🤷
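For readers unfamiliar with it, standard inverted dropout looks like this; a generic sketch, not code from the paper.

```python
import random

def dropout_mask(n, p, rng=random):
    """Bernoulli mask: each input is silenced with probability p.

    Surviving inputs are rescaled by 1/(1-p) (the usual inverted-dropout
    convention) so the expected input to the next layer is unchanged.
    """
    return [0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in range(n)]
```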
23/ As an alternative viewpoint, we consider that dendritic integration can be described by a cascade of linear-nonlinear processing. Our model suggests an additional layer to the cascade, in which presynaptic spikes undergo a linear-nonlinear operation before entering dendrites.
24/ But why care?

STP can add substantial complexity to neuronal circuitry by allowing the same axon to communicate different signals to postsynaptic partners.

Ergo, to understand information flow in networks, we need an understanding of both connectivity and STP properties.
25/ We think our framework can help in two ways:

(1) flexible characterization of synaptic dynamics on a large scale through parameter inference

(2) investigations of how complex dynamics affect information processing in networks
Thread by Julian Rossbroich