year: 2021
paper: https://proceedings.neurips.cc/paper/2021/file/d79c8788088c2193f0244d8f1f36d2db-Paper.pdf (Semantic Scholar version)
website:
code:
connections: chaos theory, astrocyte, liquid state machine, reservoir computing, critical state, biologically inspired


LSMs, their limitations, and the critical state

LSMs avoid training via backpropagation by using a sparse, recurrent, spiking neural network (the liquid) with fixed synaptic connection weights to project inputs into a high-dimensional space, from which a single trained neural layer can learn the correct outputs. These advantages over deep networks, however, come at the expense of 1) sub-par accuracy and 2) extensive, data-specific hand-tuning of the liquid weights. Interestingly, several studies have targeted these two limitations, but each tackles one or the other, not both.
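A minimal sketch of this architecture (illustrative parameters and a toy task, not the paper's setup): a fixed, sparse, random liquid of crude binary threshold units standing in for spiking neurons, whose per-neuron firing rates feed a single linear readout — the only component that is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # liquid size (illustrative)
p_conn = 0.1   # sparse connectivity (illustrative)
T = 50         # time steps per input sequence

# Fixed, sparse, random recurrent weights -- never trained.
W = rng.normal(0.0, 1.0, (N, N)) * (rng.random((N, N)) < p_conn)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rough scaling toward stable dynamics
W_in = rng.normal(0.0, 1.0, N)

def liquid_rates(u):
    """Run the fixed liquid on input sequence u; return per-neuron firing rates."""
    x = np.zeros(N)
    rates = np.zeros(N)
    for t in range(T):
        # Binary threshold units stand in for the spiking (e.g. LIF) neurons.
        x = ((W @ x + W_in * u[t] + rng.normal(0.0, 0.01, N)) > 0.5).astype(float)
        rates += x
    return rates / T

# Toy task: classify the sign of the input drift from the liquid's rates.
X, y = [], []
for _ in range(150):
    sign = rng.choice([-1.0, 1.0])
    u = rng.normal(0.3 * sign, 1.0, T)
    X.append(liquid_rates(u))
    y.append(float(sign > 0))
X, y = np.array(X), np.array(y)

# The single trained component: a linear readout fit by ridge regression.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
acc = float((((X @ W_out) > 0.5) == y).mean())  # training accuracy
print(f"training accuracy of the readout: {acc:.2f}")
```

Note that only `W_out` is learned; `W` and `W_in` stay fixed, which is what removes the need for backpropagation through the recurrent network.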
As a general heuristic, LSM accuracy is maximized when LSM dynamics are positioned at the edge of chaos, specifically in the vicinity of a critical phase transition that separates 1) the sub-critical phase, where network activity decays, from 2) the super-critical (chaotic) phase, where network activity is exponentially amplified. Strikingly, brain networks have also been found to operate near a critical phase transition, modeled as a branching process. Current LSM tuning methods organize network dynamics at the critical branching factor by adding forward and backward communication channels on top of the liquid. This, however, significantly increases training complexity and violates the LSM's brain-inspired self-organization principles. For example, these methods lack local plasticity rules, which are widely observed in the brain and considered a key component of both biological and neuromorphic learning.
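The critical branching factor can be illustrated with a toy branching process (hypothetical parameters, not taken from the paper): each active unit triggers Poisson(sigma) descendants per step, and sigma estimated as the mean descendants-to-ancestors ratio separates the decaying sub-critical (sigma < 1), critical (sigma = 1), and exploding super-critical (sigma > 1) regimes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_branching(sigma, steps=15, a0=10_000):
    """Branching process: each active unit triggers Poisson(sigma) descendants."""
    activity = [a0]
    for _ in range(steps):
        prev = activity[-1]
        activity.append(rng.poisson(sigma * prev) if prev > 0 else 0)
    return np.array(activity)

def branching_ratio(activity):
    """Estimate sigma as the mean descendants-to-ancestors ratio."""
    ancestors, descendants = activity[:-1], activity[1:]
    mask = ancestors > 0
    return float(np.mean(descendants[mask] / ancestors[mask]))

estimates = {}
for sigma in (0.8, 1.0, 1.2):  # sub-critical, critical, super-critical
    act = simulate_branching(sigma)
    estimates[sigma] = branching_ratio(act)
    print(f"sigma={sigma}: estimated {estimates[sigma]:.3f}, "
          f"activity {act[0]} -> {act[-1]}")
```

Tuning an LSM to criticality amounts to steering this effective ratio toward 1, where activity neither dies out nor explodes.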
A particular local learning rule, spike-timing-dependent plasticity (STDP), is known to improve LSM accuracy. Yet current methods of incorporating STDP into LSMs further exacerbate the data-specific hand-tuning problem, as they require additional mechanisms to compensate for the STDP-imposed saturation of synaptic weights. This underscores the scarcity of LSM tuning methods that are both computationally efficient and data-independent.
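The saturation problem can be sketched with a pair-based additive STDP rule under hard weight bounds (illustrative constants, not the paper's model): with uncorrelated pre/post spike timing and a slight depression bias, weights drift to the bounds rather than settling at informative intermediate values, which is why compensating mechanisms are needed.

```python
import numpy as np

rng = np.random.default_rng(1)

A_plus, A_minus = 0.010, 0.015  # potentiation / depression amplitudes (illustrative)
tau = 20.0                      # STDP time constant in ms (illustrative)
w_max = 1.0                     # hard upper bound on synaptic weight

def stdp_dw(dt):
    """Pair-based STDP update for dt = t_post - t_pre (ms)."""
    return np.where(dt > 0,
                    A_plus * np.exp(-dt / tau),    # pre before post: potentiate
                    -A_minus * np.exp(dt / tau))   # post before pre: depress

# 100 synapses driven by uncorrelated pre/post spike-time differences.
w = rng.uniform(0.3, 0.7, 100)
for _ in range(5000):
    dt = rng.normal(0.0, tau, 100)
    w = np.clip(w + stdp_dw(dt), 0.0, w_max)

# Additive STDP with hard bounds pushes weights toward the bounds,
# so extra mechanisms are needed to keep them in an informative range.
frac_saturated = float(np.mean((w < 0.1) | (w > 0.9)))
print(f"fraction of weights stuck near a bound: {frac_saturated:.2f}")
print(f"mean weight: {w.mean():.3f} (started near 0.5)")
```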