Backpropagation Algorithm
Weights are adjusted after each (input, output) pair in the training set is presented.
In the forward pass, activation flows from the input units through the hidden layers to the output units.
In the backward pass, the actual output is compared with the desired output, and error estimates are computed for the output units. The weights on the output units can then be adjusted to reduce these errors; propagating the errors backwards through those weights gives error estimates for the hidden units, whose weights are adjusted in turn, and so on.
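A minimal sketch of one such per-pair update, assuming a single hidden layer with sigmoid units and a squared-error measure (the network shape, learning rate, and function name train_pair are illustrative choices, not part of the notes above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_pair(x, t, W1, b1, W2, b2, lr=0.1):
    """One backprop weight adjustment for a single (input, target) pair.

    Assumed setup: one hidden layer of sigmoid units; W1, b1 map input
    to hidden, W2, b2 map hidden to output; squared error.
    """
    # Forward pass: activation flows input -> hidden -> output.
    h = sigmoid(W1 @ x + b1)          # hidden activations
    y = sigmoid(W2 @ h + b2)          # actual output

    # Backward pass: compare actual with desired output.
    delta_out = (y - t) * y * (1 - y)             # error estimates, output units
    delta_hid = (W2.T @ delta_out) * h * (1 - h)  # error estimates, hidden units

    # Adjust weights (and biases) to reduce the error.
    W2 -= lr * np.outer(delta_out, h)
    b2 -= lr * delta_out
    W1 -= lr * np.outer(delta_hid, x)
    b1 -= lr * delta_hid

    return 0.5 * np.sum((y - t) ** 2)             # squared error for this pair
```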
One epoch is defined as one weight adjustment for every training pair in the set; nets usually require many epochs of training.
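A usage sketch of the training loop, reusing the train_pair function above; the XOR-style data, 2-2-1 network size, epoch count, and learning rate are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 2 inputs, 1 output (XOR-like, illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Small random initial weights for a 2-2-1 network.
W1, b1 = rng.normal(0.0, 0.5, (2, 2)), np.zeros(2)
W2, b2 = rng.normal(0.0, 0.5, (1, 2)), np.zeros(1)

for epoch in range(5000):                  # many epochs are usually needed
    total_error = 0.0
    for x, t in zip(X, T):                 # one epoch = all training pairs
        total_error += train_pair(x, t, W1, b1, W2, b2, lr=0.5)
```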