Research on neural networks and other experimental preparations has shown that synaptic strength can increase through Hebbian learning: when one neuron stimulates another while the second neuron is also firing -- indicating a high correlation between the two neurons' activity -- the connection between them becomes stronger. When this strengthening persists over time, it is known as long-term potentiation (LTP). (Similarly, synapses can weaken when the activity of the two neurons is poorly correlated -- an effect called long-term depression when it persists.)
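The basic rule can be sketched in a few lines of Python. This is a minimal illustration, not a model of any particular preparation: the function name and the rate-style activity values are my own, and real synaptic plasticity depends on timing and biochemistry this ignores.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """Strengthen a synapse in proportion to correlated pre/post activity.

    w    -- current synaptic weight
    pre  -- activity of the pre-synaptic neuron (e.g. 0.0 or 1.0)
    post -- activity of the post-synaptic neuron
    lr   -- learning rate (illustrative value)
    """
    return w + lr * pre * post

w = 0.5
# Both neurons active together -> the synapse strengthens (LTP-like).
w = hebbian_update(w, pre=1.0, post=1.0)
# Pre-synaptic neuron fires alone -> no change under this simple rule.
w = hebbian_update(w, pre=1.0, post=0.0)
```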

This is all well and good, until you consider that your neurons are firing all the time. If synapses were strengthened every time two or more neurons fired in concert, a runaway positive feedback loop would result: the probability of a post-synaptic cell firing would climb with every Hebbian event until every cell was excited beyond any reasonable point. Clearly, some regulatory mechanism must be at work. This is what we call synaptic scaling.
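The feedback loop is easy to see in a toy simulation. Here I assume (purely for illustration) that the post-synaptic response grows with the weight, so each correlated firing event makes the next update larger:

```python
# Repeated Hebbian strengthening with no regulation: because the
# post-synaptic response scales with the weight, the weight multiplies
# by a constant factor each step and grows exponentially.
w = 0.5
for step in range(20):
    pre = 1.0
    post = w * pre          # stronger synapse -> stronger response
    w += 0.5 * pre * post   # Hebbian update keeps amplifying it

print(w)  # grows without bound as the number of steps increases
```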

It has been shown that in addition to the mechanisms governing learning and LTP, each neuron has checks in place that keep its overall firing rate within a sane range. If the cell's firing rate drastically increases, the neuron proportionally weakens each of its synapses; if firing becomes too depressed, the neuron proportionally strengthens all of its synapses. This means that if one of a neuron's many synapses is very strong -- strong enough to significantly raise the cell's overall firing rate -- that synapse's relative strength is preserved during synaptic scaling, but the cell's many weaker synapses may lose much of their long-term effect as all the synapses are proportionally "downsized." This synaptic weakening may be linked to synapse competition and the eventual elimination of very weak synapses.
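The key property -- every synapse is rescaled by the same factor, so relative strengths survive -- can be sketched as a multiplicative normalization. The function name and target-rate formulation are assumptions for illustration; real scaling operates over hours to days via receptor trafficking.

```python
def scale_synapses(weights, firing_rate, target_rate):
    """Multiplicatively scale all synapses toward a target firing rate.

    If the cell fires too fast, every weight shrinks by the same factor;
    if it fires too slowly, every weight grows. Because the factor is
    shared, the ratios between synapses are unchanged.
    """
    factor = target_rate / firing_rate
    return [w * factor for w in weights]

weights = [0.9, 0.2, 0.1]  # one strong synapse, two weak ones
# Firing at twice the target rate -> every weight is halved.
scaled = scale_synapses(weights, firing_rate=20.0, target_rate=10.0)
```

Note what happens to the weakest synapse: it keeps its relative rank but its absolute strength shrinks, which is the sense in which scaling may push already-weak synapses toward elimination.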