New preprint alert!

“Going Beyond the Point Neuron: Active Dendrites and Sparse Representations for Continual Learning”

Work by Karan Grewal, @jerem_forest, Ben Cohen, and @SubutaiAhmad.

#tweeprint below 👇 (1/13)

biorxiv.org/content/10.110…
Can the properties of biological dendrites add value to artificial neural networks?

TL;DR: Yes. We augmented standard artificial neurons with properties of active dendrites and found that the resulting model learns continually much better than standard ANNs. (2/13)
The commonly used point neuron model is nothing like its biological counterpart. It assumes a simple linear integrate-and-fire mechanism, while biological neurons are significantly more sophisticated and display a wide range of complex non-linear integrative properties. (3/13)
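For reference, a point neuron just sums its weighted inputs and applies a nonlinearity. A minimal sketch (illustrative, not code from the paper):

```python
import numpy as np

def point_neuron(x, w, b):
    """Standard point neuron: weighted linear sum of the inputs, then a nonlinearity."""
    return np.maximum(0.0, np.dot(w, x) + b)  # ReLU(w . x + b)
```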
We hypothesized that dendrites convey fundamental computational benefits to ANNs beyond a simple increase in complexity or power. Our goal is to show a proof-of-concept working system that illustrates how active dendritic concepts can be incorporated into deep learning. (4/13)
We did this by incorporating several properties of biological neural networks into an ANN (a rough sketch follows this list):
🧠 active dendrites
🧠 local inhibition and sparsity
(5/13)
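Roughly, each neuron keeps its usual feedforward weights and gains a set of dendritic segments that listen to a context signal; the best-matching segment gates the neuron's output. A hypothetical NumPy sketch of that idea (names and details are ours, not the exact model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def active_dendrite_neuron(x, w, b, segments, context):
    """One neuron with active dendrites (illustrative sketch).

    x:        feedforward input vector
    w, b:     feedforward weights and bias, as in a point neuron
    segments: array of shape (num_segments, context_dim), one row per dendritic segment
    context:  context vector that the dendritic segments respond to
    """
    feedforward = np.dot(w, x) + b
    dendritic = np.max(segments @ context)   # response of the best-matching segment
    return feedforward * sigmoid(dendritic)  # dendrites modulate the output rather than drive it
```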
We then explored how these changes impacted the ANN's resilience to catastrophic forgetting (a phenomenon in which ANNs are unable to learn new information without forgetting what they've previously learned).

You can read more about it in this blog post: numenta.com/blog/2021/02/0…

(6/13)
Using dendritic modulation together with a k-Winner-Take-All mechanism, we found that subsets of neurons minimally interfere with each other during learning. This figure shows how different sub-populations of neurons are active for different tasks. (7/13)
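For intuition, the layer-level step is roughly: modulate each unit as above, then keep only the top-k units so the representation stays sparse. A simplified sketch (assumed details, not the paper's implementation):

```python
import numpy as np

def k_winner_take_all(activations, k):
    """Keep the k strongest activations in a layer and zero out the rest."""
    winners = np.argsort(activations)[-k:]   # indices of the top-k units
    sparse = np.zeros_like(activations)
    sparse[winners] = activations[winners]
    return sparse
```

Because different contexts boost different units, different tasks end up using largely non-overlapping sub-populations of neurons.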
We combined our Active Dendrites network with Synaptic Intelligence and applied it to permutedMNIST. We found that the combination achieves higher test accuracy than either method on its own. (8/13)
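(For context, permutedMNIST builds each task by applying a fixed random pixel permutation to MNIST. A tiny illustrative sketch, not the benchmark code we used:)

```python
import numpy as np

def make_permuted_task(images, seed):
    """permutedMNIST: one task = MNIST with a fixed random pixel permutation."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(images.shape[1])  # images flattened to shape (N, 784)
    return images[:, perm]

# A different seed per task gives a sequence of "new" datasets that the network
# must learn one after another without forgetting the earlier ones.
```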
This suggests that biological mechanisms at the synapse, neuron, and network levels can operate together to handle continual learning. (9/13)
We also compared against another continual learning method, XdG (context-dependent gating), on permutedMNIST, and our results with a large number of tasks were significantly better. This further shows that an Active Dendrites Network is competitive with benchmark methods in continual learning. (10/13)
We now know that the Active Dendrites Network performs extremely well in continual learning scenarios.

Is it possible to achieve the same results by simply adding more layers to a standard feedforward network?

(11/13)
As it turns out, overall continual learning accuracy for our 3-layer Active Dendrites Network was significantly better than that of regular feedforward networks with more layers. So no, our active dendrites network is not equivalent to a standard network with more layers. (12/13)
In conclusion: we have shown that augmenting point neurons with biological properties such as active dendrites and sparse representations significantly improves a network's ability to learn continually.

Let us know if you have any questions or comments! (13/13)
