Neuromorphic computing

Neuromorphic engineering

Computing systems that imitate the workings of Neuronal networks at the hardware and/or software level. A basic model is the Spiking neural network. One advantage is that they tend to be more energy- and resource-efficient.
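
As a rough illustration of the basic spiking-neuron idea (not any particular chip's model; the parameter names and values below are arbitrary choices for the sketch), a leaky integrate-and-fire neuron can be simulated in a few lines of Python:

```python
import numpy as np

def lif_spikes(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: integrate input, spike on threshold.

    input_current: array of input values, one per time step.
    Returns a 0/1 spike train of the same length.
    """
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-v / tau + i_t)
        if v >= v_thresh:          # threshold crossed: emit a spike
            spikes[t] = 1.0
            v = v_reset            # reset membrane potential
    return spikes

# Example: a constant input produces a regular spike train.
print(lif_spikes(np.full(100, 0.08)).sum(), "spikes in 100 steps")
```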

https://www.wikiwand.com/en/Neuromorphic_engineering

Numenta

IBM TrueNorth

Convolutional Networks for Fast, Energy-Efficient Neuromorphic Computing

This is direct evidence that an “integrate-and-spike” mechanism has computational capability similar to that of the more established ANNs. The IBM paper, however, highlighted one major weakness of SNNs: training the TrueNorth system required simulating back-propagation on conventional GPUs:

Training was performed offline on conventional GPUs, using a library of custom training layers built upon functions from the MatConvNet toolbox. Network specification and training complexity using these layers is on par with standard deep learning.
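
The paper's actual pipeline uses custom MatConvNet layers; purely as a loose sketch of the general pattern (train continuous weights offline, then map them onto the coarse synapses the chip supports), something like the following numpy snippet captures the deployment step. The trinary mapping and pruning threshold here are illustrative assumptions, not TrueNorth's real procedure.

```python
import numpy as np

def deploy_weights(w_trained, threshold=0.05):
    """Map continuous trained weights onto trinary {-1, 0, +1} synapses.

    Small weights are pruned to 0; the rest keep only their sign,
    mimicking the limited synaptic precision of neuromorphic hardware.
    """
    w = np.where(np.abs(w_trained) < threshold, 0.0, np.sign(w_trained))
    return w.astype(np.int8)

# Example: weights learned offline (e.g. on a GPU) get discretized for the chip.
w_gpu = np.random.randn(4, 4) * 0.1
print(deploy_weights(w_gpu))
```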

See more interesting stuff here: Microglia: A Biologically Plausible Basis for Back-Propagation

There has, however, been no biological evidence of a structural mechanism for “back-propagation” in biological brains. Yoshua Bengio published a paper in 2015, “Towards Biologically Plausible Deep Learning” (see: http://arxiv.org/abs/1502.04156 ), which attempts to explain how a mechanism for back-propagation could arise from Spike-Timing-Dependent Plasticity (STDP) in biological neurons.
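
For reference, the canonical pair-based STDP rule strengthens a synapse when the presynaptic spike precedes the postsynaptic spike and weakens it otherwise; the amplitudes and time constant below are illustrative values, not those from Bengio's paper.

```python
import numpy as np

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for one pre/post spike pair.

    dt > 0 (pre fires before post) -> potentiation (+)
    dt < 0 (post fires before pre) -> depression  (-)
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * np.exp(-dt / tau)
    else:
        return -a_minus * np.exp(dt / tau)

# A causal pair (pre 5 ms before post) strengthens; an anti-causal pair weakens.
print(stdp_delta_w(t_pre=10.0, t_post=15.0))   # > 0
print(stdp_delta_w(t_pre=15.0, t_post=10.0))   # < 0
```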

It is, however, questionable whether neurons are able to learn on their own without an external feedback pathway that spans multiple layers.

A recently discovered alternative mechanism may offer a more convincing argument, one based on a structure that is independent of the brain's neurons. The Brain contains a large class of cells called Microglia ( see: https://www.technologyreview.com/s/601137/the-rogue-immune-cells-that-wreck-the-brain ) that are responsible for regulating neurons and their connectivity.

In summary, biological brains have a regulatory mechanism in the form of microglia, which are highly dynamic in regulating synapse connectivity and pruning neural growth; this activity is most pronounced during sleep. SNNs have been shown to have inference capabilities equivalent to Convolutional Networks. SNNs, however, have not been shown to learn effectively on their own without a back-propagation mechanism. This mechanism is most plausibly provided by the microglia.

Energy-efficient neural network chips approach human recognition capabilities