When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased. - Donald Hebb
Neurons that fire together wire together.
This statement from Donald Hebb is usually interpreted as the strengthening of existing synapses, but it can also mean the creation of new synapses!
Hebbian weight update rule:
$$\Delta w_{ij} = \eta\, x_i\, x_j$$
- $\Delta w_{ij}$: change in the synaptic weight between neurons $i$ and $j$.
- $\eta$: learning rate (small positive constant).
- $x_i, x_j$: activations of the pre- and post-synaptic neurons.
It is used, for example, in Hopfield networks and other associative memory models.
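A minimal sketch of this use: storing bipolar patterns in a Hopfield network via the Hebbian outer-product rule. The patterns, network size, and single-bit corruption below are illustrative assumptions, not from the note.

```python
import numpy as np

# Two orthogonal bipolar patterns to store (illustrative choice)
patterns = np.array([[1, -1,  1, -1, 1, -1,  1, -1],
                     [1,  1, -1, -1, 1,  1, -1, -1]])
N = patterns.shape[1]

# Hebbian outer-product rule: neurons active together get a positive weight
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)  # no self-connections

# Recall: corrupt one bit of the first pattern, then update synchronously
probe = patterns[0].copy()
probe[0] = -probe[0]
recalled = np.sign(W @ probe)
print(np.array_equal(recalled, patterns[0]))  # → True
```

One synchronous update already restores the stored pattern here because the two patterns are orthogonal and the corruption is small.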
Oja's Rule
A normalized version of Hebbian learning, which addresses the issue of unbounded growth of synaptic weights:
$$\Delta w_{ij} = \eta\, x_i\,(x_j - x_i w_{ij})$$
With standardized inputs, Oja's rule keeps the weights in the correlation range $[-1, 1]$. Averaging over data:
$$\mathbb{E}[\Delta w_{ij}]
= \eta\big(\underbrace{\mathbb{E}[x_i x_j]}_{C_{ij}}
- \underbrace{\mathbb{E}[x_i^2]}_{v_i}\, w_{ij}\big).$$
For small $\eta$ (stochastic approximation), the weights follow the [[ordinary differential equation|ODE]]
$$\dot w_{ij} = \eta\,(C_{ij} - v_i w_{ij}).$$
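This first-order linear ODE can be solved in closed form (a standard result, added here to make the convergence rate explicit):

$$w_{ij}(t) = \frac{C_{ij}}{v_i} + \left(w_{ij}(0) - \frac{C_{ij}}{v_i}\right) e^{-\eta v_i t},$$

so each weight relaxes exponentially toward $C_{ij}/v_i$ with time constant $1/(\eta v_i)$.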
A fixed point of these dynamics satisfies $\dot w_{ij} = 0$, i.e. the mean change is zero:
$$0 = C_{ij} - v_i w_{ij} \quad\Rightarrow\quad
w_{ij}^\star = \frac{C_{ij}}{v_i}.$$
**Interpretation:** each synapse converges to "correlation divided by presynaptic variance", i.e. the regression coefficient of $x_j$ on $x_i$. The decay term prevents blow-up and scales the weight to a finite, data-determined value.
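The fixed point can be checked numerically. The sketch below simulates the stochastic update $\Delta w_{ij} = \eta\,(x_i x_j - x_i^2 w_{ij})$ for a single synapse; the data distribution, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, eta = 50_000, 0.001
w = 0.0  # single synaptic weight from presynaptic i to postsynaptic j

for _ in range(n_steps):
    x_i = rng.normal(0.0, 2.0)              # presynaptic activity, v_i = E[x_i^2] = 4
    x_j = 0.5 * x_i + rng.normal(0.0, 1.0)  # postsynaptic activity, correlated with x_i
    w += eta * (x_i * x_j - x_i**2 * w)     # stochastic Oja-style update from the note

# Predicted fixed point: C_ij / v_i = E[x_i x_j] / E[x_i^2] = (0.5 * 4) / 4 = 0.5
print(f"w = {w:.3f} (predicted fixed point 0.5)")
```

The weight hovers near the regression coefficient $0.5$, matching $w_{ij}^\star = C_{ij}/v_i$ up to stochastic fluctuations of order $\sqrt{\eta}$.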
Another version is spike-timing-dependent plasticity.
Transclude of principal-component-analysis#^852d71