@jerem_forest @SubutaiAhmad Can the properties of biological dendrites add value to artificial neural networks?
TLDR: Yes, we augmented standard artificial neurons with properties of active dendrites and found that our model can learn continually much better than standard ANNs. (2/13)
The commonly used point neuron model is nothing like its biological counterpart. It assumes a simple linear integrate-and-fire mechanism, while biological neurons are significantly more sophisticated and display a wide range of complex, non-linear integrative properties. (3/13)
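For reference, the point neuron boils down to a weighted sum pushed through a nonlinearity. A minimal sketch of that baseline (this is just the standard textbook form, not code from our paper):

```python
import numpy as np

def point_neuron(x, w, b):
    """Standard ANN 'point neuron': linear integration of inputs, then a ReLU."""
    return max(0.0, float(np.dot(w, x) + b))
```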
We hypothesized that dendrites convey fundamental computational benefits to ANNs beyond a simple increase in complexity or power. Our goal is to show a proof-of-concept working system that illustrates how active dendritic concepts can be incorporated into deep learning. (4/13)
We did this by incorporating two key properties of biological neural networks into an ANN (rough sketch after this list):
🧠 active dendrites
🧠 local inhibition and sparsity
(5/13)
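The core idea: each neuron gets a set of dendritic segments that receive a context signal (e.g., an embedding of the current task), and the best-matching segment gates the neuron's feedforward activation up or down. A minimal sketch of that idea, assuming sigmoid gating and the variable names below (an illustration, not the exact code from the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def active_dendrite_neuron(x, context, w, b, segments):
    """One neuron augmented with active dendrites.

    x        : feedforward input vector
    context  : context vector (e.g., a task embedding)
    w, b     : feedforward weights and bias (the usual point-neuron part)
    segments : matrix of dendritic segment weights, one row per segment
    """
    feedforward = np.dot(w, x) + b            # standard linear integration
    dendritic = np.max(segments @ context)    # response of the best-matching segment
    return feedforward * sigmoid(dendritic)   # dendritic match modulates the output
```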
We then explored how these changes impacted the ANN's resilience to catastrophic forgetting (a phenomenon in which ANNs are unable to learn new information without forgetting what they've previously learned). (6/13)
Using dendritic modulation together with a k-Winner-Take-All mechanism, we found that subsets of neurons minimally interfere with each other during learning. This figure shows how different sub-populations of neurons are active for different tasks. (7/13)
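The k-Winner-Take-All step is simple to sketch: after dendritic modulation, only the top-k units in a layer stay active, so different contexts end up activating different sparse subsets of neurons. A rough illustration (names and shapes are ours):

```python
import numpy as np

def k_winner_take_all(activations, k):
    """Keep only the k largest activations in a layer; zero out the rest."""
    out = np.zeros_like(activations)
    winners = np.argsort(activations)[-k:]    # indices of the k strongest units
    out[winners] = activations[winners]
    return out

# Because different contexts modulate units differently, the winning subset
# (and hence the subnetwork that gets updated) changes from task to task.
```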
We combined our Active Dendrites network with Synaptic Intelligence and applied it to permutedMNIST. We found that combining the two leads to higher test accuracy than either method achieves on its own. (8/13)
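Synaptic Intelligence works at the synapse level: it adds a regularizer that discourages changing weights that were important for earlier tasks. Schematically (the per-parameter importance and anchor values come from SI's own estimates; this only shows the shape of the loss, not the paper's implementation):

```python
def si_regularized_loss(task_loss, params, anchor_params, importance, c=0.1):
    """Synaptic Intelligence-style objective: the current task's loss plus a
    quadratic pull toward parameter values that mattered for previous tasks."""
    penalty = sum(
        (imp * (p - p_old) ** 2).sum()
        for p, p_old, imp in zip(params, anchor_params, importance)
    )
    return task_loss + c * penalty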
This suggests that biological mechanisms at the synapse, neuron, and network levels can operate together to handle continual learning. (9/13)
We also tested against another continual-learning method, XdG, on permutedMNIST, and with a large number of tasks our results were significantly better. This further shows that an Active Dendrites Network is competitive with benchmark methods in continual learning. (10/13)
We now know that the Active Dendrites Network performs extremely well in continual learning scenarios.
Is it possible to achieve the same results by simply adding more layers to a standard feedforward network?
(11/13)
As it turns out, the overall continual-learning accuracy of our 3-layer Active Dendrites Network was significantly better than that of regular feedforward networks with more layers. So no, our Active Dendrites Network is not equivalent to a standard network with more layers. (12/13)
In conclusion: we have shown that augmenting point neurons with biological properties such as active dendrites and sparse representations significantly improves a network's ability to learn continually.
Let us know if you have any questions or comments! (13/13)