used to process stochastic data and decide whether it should be on or off

helps determine information

activates information

determines stochastic data

calculates information

BACK PROPAGATION NEURAL NETWORK by David Rumelhart

B.P. ALGORITHM

FEED FORWARD COMPUTATION

BACK PROPAGATION TO THE OUTPUT LAYER

BACK PROPAGATION TO THE HIDDEN LAYER

WEIGHT UPDATES

EVALUATION AGAINST THE EXPECTED OUTPUT
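The five steps above can be sketched as a minimal pure-Python example. The network shape (2 inputs, 2 hidden units, 1 output), the XOR training data, the learning rate, and the epoch count are illustrative assumptions, not part of the notes:

```python
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative 2-2-1 network trained on XOR (sizes are an assumption).
n_in, n_hid = 2, 2
w_h = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b_h = [0.0] * n_hid
w_o = [random.uniform(-1, 1) for _ in range(n_hid)]
b_o = 0.0
lr = 0.5
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    # 1. Feed-forward computation
    h = [sigmoid(sum(w_h[j][i] * x[i] for i in range(n_in)) + b_h[j])
         for j in range(n_hid)]
    y = sigmoid(sum(w_o[j] * h[j] for j in range(n_hid)) + b_o)
    return h, y

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

err_before = total_error()
for epoch in range(20000):
    for x, t in data:
        h, y = forward(x)
        # 2. Back-propagation to the output layer: delta = error * sigmoid'
        d_o = (t - y) * y * (1 - y)
        # 3. Back-propagation to the hidden layer
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
        # 4. Weight updates (gradient descent on the squared error)
        for j in range(n_hid):
            w_o[j] += lr * d_o * h[j]
            b_h[j] += lr * d_h[j]
            for i in range(n_in):
                w_h[j][i] += lr * d_h[j] * x[i]
        b_o += lr * d_o

# 5. Evaluation against the expected outputs
err_after = total_error()
print(f"error before {err_before:.3f} -> after {err_after:.3f}")
```

Training should drive the total squared error well below its initial value, which is the simplest check that the four update steps are wired correctly.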

ADAPTIVE RESONANCE THEORY (ART)

describes a number of neural network models that use supervised and unsupervised learning methods and address problems such as pattern recognition and prediction.

object identification and recognition generally occur as a result of the interaction of 'top-down' observer expectations with 'bottom-up' sensory information.

ART 1, ART 2, ART 2-A, ART 3, Fuzzy ART, ARTMAP, Fuzzy ARTMAP

two learning methods: slow and fast

With the slow learning method, the degree of training of the recognition neuron's weights towards the input vector is computed over continuous values with differential equations, and is thus dependent on the length of time the input vector is presented.

With fast learning, algebraic equations are used to calculate the degree of weight adjustment to be made, and binary values are used.
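A minimal sketch of ART 1 with fast learning, assuming binary inputs: each input is matched against existing categories, a vigilance test decides whether the match "resonates", and on resonance the weights snap to the intersection in a single algebraic step. The vigilance value, the choice-function constant `beta`, and the toy input vectors are illustrative assumptions:

```python
# Minimal ART 1 sketch with fast learning (binary inputs, algebraic updates).
def art1(inputs, vigilance=0.5, beta=1.0):
    categories = []  # each category is a binary weight vector
    labels = []
    for vec in inputs:
        # Rank categories by the choice function |I AND w| / (beta + |w|)
        order = sorted(
            range(len(categories)),
            key=lambda j: -sum(a & b for a, b in zip(vec, categories[j]))
            / (beta + sum(categories[j])),
        )
        for j in order:
            overlap = sum(a & b for a, b in zip(vec, categories[j]))
            # Vigilance test: does the category cover enough of the input?
            if overlap / sum(vec) >= vigilance:
                # Fast learning: weights snap to the intersection in one step
                categories[j] = [a & b for a, b in zip(vec, categories[j])]
                labels.append(j)
                break
        else:
            # No category resonates: recruit a new one set to the input
            categories.append(list(vec))
            labels.append(len(categories) - 1)
    return labels, categories

labels, cats = art1([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])
print(labels, cats)
```

With these toy inputs the first two vectors resonate into one category and the third recruits a new one, showing how the one-shot intersection update contrasts with the time-dependent slow-learning rule.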

THE HOPFIELD NETWORK (John Hopfield, 1982)

content-addressable memory systems with binary threshold nodes

Hopfield networks converge to a local minimum of the energy function, but will sometimes converge to a false pattern (a spurious state, i.e. the wrong local minimum) rather than the stored pattern (the expected local minimum).
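The two properties above, content-addressable recall with binary threshold nodes and convergence to a local minimum, can be sketched in a few lines. Hebbian storage of bipolar patterns and asynchronous updates are standard for Hopfield networks; the particular patterns, network size, and step count are illustrative assumptions:

```python
import random

random.seed(1)

def train(patterns):
    # Hebbian rule: W[i][j] = sum over patterns of x_i * x_j, zero diagonal
    n = len(patterns[0])
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns)
             for j in range(n)] for i in range(n)]

def recall(w, state, steps=100):
    state = list(state)
    n = len(state)
    for _ in range(steps):
        i = random.randrange(n)  # asynchronous update of one random node
        field = sum(w[i][j] * state[j] for j in range(n))
        state[i] = 1 if field >= 0 else -1  # binary threshold node
    return state

stored = [[1, 1, 1, -1, -1, -1], [-1, -1, -1, 1, 1, 1]]
w = train(stored)
noisy = [1, -1, 1, -1, -1, -1]  # stored[0] with the second bit flipped
result = recall(w, noisy)
print(result)
```

Starting from the corrupted probe, the state settles into the nearest stored pattern: recall is driven by content, not by an address. With more stored patterns or a probe equidistant between attractors, the same dynamics can settle into a spurious state instead.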