Better than any words, you can get an idea of how SNNs work from this animation.
But here's a downside:
...in order to reach the accuracy of its ANN counterpart, it usually requires long spike trains. Traditionally, a spike train needs around one thousand time steps to approach the accuracy of its ANN counterpart
- http://www.xavierdupre.fr/app/ensae_teaching_cs/helpsphinx/ml2a/td2a_mlplus_snn.html
- A minimal spiking neural network https://thesai.org/Downloads/IJARAI/Volume4No7/Paper_1-A_Minimal_Spiking_Neural_Network_to_Rapidly_Train.pdf
- online book https://neuronaldynamics.epfl.ch/online/Ch7.S1.html
- Local connectivity and synaptic dynamics in mouse and human neocortex https://www.science.org/stoken/author-tokens/ST-374/full
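To make the quoted downside concrete, here is a minimal sketch of rate coding (everything here is assumed for illustration: Bernoulli spiking and a made-up target activation of 0.37). The longer the spike train, the closer the spike-count estimate gets to the underlying activation, which is why short trains lose accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
target = 0.37  # hypothetical ANN activation to approximate as a firing rate

for steps in (10, 100, 1000):
    # Rate coding: at each time step the neuron fires with probability `target`,
    # so the empirical spike rate is an estimate of the activation.
    spikes = rng.random(steps) < target
    estimate = spikes.mean()
    print(f"{steps:5d} steps -> rate estimate {estimate:.3f} "
          f"(error {abs(estimate - target):.3f})")
```

With 10 steps the estimate can only take values in multiples of 0.1, so long trains are needed just to represent the activation precisely.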
Implementation
How to write a spiking neural network simulation from scratch in Python
- Izhikevich model, Tensorflow https://github.com/kaizouman/tensorsandbox/blob/master/snn/simple_spiking_model.ipynb
- SpykeTorch ? https://github.com/miladmozafari/SpykeTorch
- snnTorch https://snntorch.readthedocs.io/en/latest/tutorials/tutorial_3.html (white paper)
- Guillaume Chevalier's experiment in his blog
- Notebook simulating spiking neurons with Tensorflow
- The Attentional Routing Circuit by Bobier
- Eliasmith's Spaun model: the original article from Science (PDF version), plus the supplementary materials for "A Large-Scale Model of the Functioning Brain". The main technique is called the Neural Engineering Framework. How memory works is explained here (reference 26 from the supplementary materials), which may need one more step into neural integrators. I actually started with hypervectors and Hyperdimensional Computing, reviewed in two articles (part 1, part 2), which led me to "An Introduction to Hyperdimensional Computing for Robotics", where for an image-recognition task they take intermediate layers of CNNs and operate on high-dimensional vectors
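As a companion to the "from scratch in Python" links above, a minimal sketch of the Izhikevich model with Euler integration. The (a, b, c, d) values are the regular-spiking set from Izhikevich's 2003 paper; the constant input current I = 10 and the 1 ms step are my choices for illustration:

```python
# Izhikevich neuron, regular-spiking parameters (Izhikevich 2003)
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = c, b * c          # membrane potential (mV) and recovery variable
I = 10.0                 # constant input current (illustrative value)
dt = 1.0                 # time step in ms

spike_times = []
for t in range(1000):    # simulate 1 second
    if v >= 30.0:        # spike: record it, reset v, bump u
        spike_times.append(t)
        v, u = c, u + d
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)

print(f"{len(spike_times)} spikes, first at t = {spike_times[0]} ms")
```

The reset-before-update ordering mirrors the reference MATLAB code from the original paper; a smaller dt (or half-stepping v) gives a more faithful trace.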
Training methods
A list from this blog
Unsupervised Learning
- Spike-timing-dependent plasticity (STDP)
- Unsupervised Learning with Self-Organizing Spiking Neural Networks (Hazan et al., 2018). Growing spiking neural networks with activation based on spatial proximity, which makes similar classes transition into one another.
- Artola, Bröcher, Singer (ABS) rule
- Bienenstock, Cooper, Munro (BCM) rule
- Relationship between BCM and STDP rules
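Of the rules above, pair-based STDP is the easiest to sketch: the weight change depends only on the time difference between a pre- and postsynaptic spike, potentiating causal pairings and depressing anti-causal ones. The constants A_plus, A_minus, and tau below are illustrative, not taken from any specific paper:

```python
import numpy as np

# Pair-based exponential STDP (illustrative constants, times in ms)
A_plus, A_minus, tau = 0.01, 0.012, 20.0

def stdp_dw(t_pre, t_post):
    dt = t_post - t_pre
    if dt >= 0:   # pre fires before post: causal pairing, potentiate
        return A_plus * np.exp(-dt / tau)
    else:         # post fires before pre: anti-causal pairing, depress
        return -A_minus * np.exp(dt / tau)

print(stdp_dw(10.0, 15.0))   # pre 5 ms before post -> positive weight change
print(stdp_dw(15.0, 10.0))   # post 5 ms before pre -> negative weight change
```

Making A_minus slightly larger than A_plus (as here) is a common way to keep weights from saturating under uncorrelated spiking.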
Supervised Learning
- SpikeProp
- Remote Supervised Method (ReSuMe)
- FreqProp
- Local error-driven associative biologically realistic algorithm (LEABRA)
- Supervised Hebbian Learning
Reinforcement Learning
- Spiking Actor-Critic method: "A Spiking Neural Network Model of an Actor-Critic Learning Agent" by W. Potjans, A. Morrison, and M. Diesmann (2008)
- Reinforcement Learning through reward-modulated STDP
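The reward-modulated STDP idea can be sketched roughly like this: STDP pairings do not change the weight directly but accumulate in a decaying eligibility trace, and a delayed global reward signal gates whether the trace becomes an actual weight change. Every constant below is made up for illustration:

```python
import numpy as np

# Reward-modulated STDP sketch (all values illustrative, 1 ms steps)
tau_e = 200.0   # eligibility-trace time constant in ms
lr = 0.1        # learning rate

w, e = 0.5, 0.0
for t in range(1000):
    e *= np.exp(-1.0 / tau_e)          # trace decays every step
    if t % 50 == 0:
        e += 0.01                      # pretend a causal pre/post pairing occurred
    reward = 1.0 if t == 600 else 0.0  # delayed reward arrives at t = 600 ms
    w += lr * reward * e               # weight changes only when rewarded
print(round(w, 4))
```

The trace bridges the gap between a pairing and the reward that arrives hundreds of milliseconds later, which is the credit-assignment trick this family of methods relies on.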
Papers
- 2002 Gerstner Kistler SPIKING NEURON MODELS: Single Neurons, Populations, Plasticity
- 2010 Basu, Hasler - Nullcline-Based Design of a Silicon Neuron
- 2019 Thiele SpikeGrad: An ANN-equivalent Computation Model for Implementing Backpropagation with Spikes
- 2020 Comsa et al - Temporal Coding in Spiking Neural Networks with Alpha Synaptic Function: Learning with Backpropagation https://arxiv.org/pdf/1907.13223.pdf
My links
A Minimal Spiking Neural Network to Rapidly Train and Classify Handwritten Digits in Binary and 10-Digit Tasks (paper). Guess how spikes work in this work: pixels from a black-and-white picture are unwrapped into spike trains (facepalm)
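For what it's worth, that "pixels unwrapped into spike trains" encoding can be sketched in a few lines. The 3x3 image, the 0.8 firing rate, and the 20-step window below are all made up; the point is just that each pixel becomes one input neuron whose spike probability tracks the pixel value:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 3x3 "binary image" (hypothetical data): 1 = black pixel, 0 = white
image = np.array([[1, 0, 1],
                  [0, 1, 0],
                  [1, 0, 1]])

T = 20      # time steps per spike train
rate = 0.8  # per-step firing probability for a black pixel (assumed)

# One input neuron per pixel: black pixels fire stochastically, white stay silent
trains = rng.random((image.size, T)) < (image.flatten()[:, None] * rate)
print(trains.astype(int))
```

Grayscale images get the same treatment with the pixel intensity scaling the rate, which is essentially Poisson/rate encoding.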
Questions
- How does STDP start if no synaptic connections exist a priori?