Hebbian learning is one of the oldest learning algorithms, and is based in large part on the dynamics of biological systems.
A synapse between two neurons is strengthened when the neurons on either side of the synapse (input and output) have highly correlated outputs.
In essence, when an input neuron fires, if it frequently leads to the firing of the output neuron, the synapse is strengthened.
Carrying the analogy over to an artificial system, the tap weight between two connected units is increased when their outputs are highly correlated.
Mathematically, we can describe Hebbian learning as:

w_ij[n+1] = w_ij[n] + η x_i[n] x_j[n]

Here, η is a learning rate coefficient, and x_i and x_j are the outputs of the ith and jth elements.
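The update rule can be sketched in a few lines of NumPy. This is a minimal illustration, not a canonical implementation: the function name, the learning rate value, and the example activations are all chosen for this sketch.

```python
import numpy as np

def hebbian_update(w, x_pre, x_post, eta=0.1):
    """One Hebbian step: w[i, j] += eta * x_post[i] * x_pre[j],
    so a weight grows in proportion to the correlation between
    the pre-synaptic (input) and post-synaptic (output) activity.
    (Function name and eta are illustrative, not from the source.)"""
    return w + eta * np.outer(x_post, x_pre)

w = np.zeros((2, 3))                 # weights from 3 inputs to 2 outputs
x_pre = np.array([1.0, 0.0, 1.0])    # input activations
x_post = np.array([1.0, 0.0])        # output activations
w = hebbian_update(w, x_pre, x_post)
# Only weights linking co-active units are strengthened; weights
# touching a silent unit are unchanged.
```

Note that this plain rule only ever increases weights when activities are positively correlated, which is why practical variants add decay or normalization terms to keep the weights bounded.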