Backpropagation
Backpropagation is an algorithm for computing derivatives in a network of mathematical operations (such as an artificial neural network) by iteratively propagating known derivatives "backward" from the output of the network toward its inputs, applying the chain rule from calculus at each operation.
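The following is a minimal sketch of this backward propagation of derivatives, assuming a network with one sigmoid hidden layer, a linear output, and a squared-error loss; these architectural details are illustrative choices, not part of the definition above.

```python
# Backpropagation sketch: forward pass stores intermediate values,
# backward pass applies the chain rule from the output toward the input.
# The architecture (sigmoid hidden layer, linear output, squared-error
# loss) is an illustrative assumption.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: one input vector and one target value.
x = np.array([0.5, -1.2])        # input
t = np.array([1.0])              # target output

# Randomly initialised weights for the hidden and output layers.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))     # hidden layer: 3 units, 2 inputs
W2 = rng.normal(size=(1, 3))     # output layer: 1 unit, 3 hidden inputs

# Forward pass.
z1 = W1 @ x                      # hidden pre-activations
h = sigmoid(z1)                  # hidden activations
y = W2 @ h                       # network output (linear output unit)
loss = 0.5 * np.sum((y - t) ** 2)

# Backward pass: start from the known derivative of the loss with respect
# to the output, then propagate it backward through each operation.
dL_dy = y - t                    # dL/dy for the squared-error loss
dL_dW2 = np.outer(dL_dy, h)      # chain rule through the output layer weights
dL_dh = W2.T @ dL_dy             # derivative w.r.t. hidden activations
dL_dz1 = dL_dh * h * (1 - h)     # chain rule through the sigmoid
dL_dW1 = np.outer(dL_dz1, x)     # chain rule through the hidden layer weights

print("loss:", loss)
print("dL/dW2:\n", dL_dW2)
print("dL/dW1:\n", dL_dW1)
```

Note that each backward step reuses quantities computed and stored during the forward pass (here h and y), which is what makes the iterative backward sweep efficient compared with differentiating each weight independently.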