Backpropagation is a gradient computation algorithm used to train neural networks. It computes the gradient of a cost function with respect to the network's parameters, and an optimization algorithm such as gradient descent then uses this gradient to update those parameters. Backpropagation works by applying the chain rule, since a neural network can at its core be modelled as a chain of successive functions.
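The chain-rule idea above can be sketched in a minimal, hypothetical example: a "network" of two chained scalar functions with weights w1 and w2, a squared-error cost, and gradients propagated backward layer by layer. The functions, weights, and values here are illustrative assumptions, not taken from the text; the backward pass is checked against a finite-difference gradient.

```python
import math

# Hypothetical two-"layer" network: y = tanh(w2 * (w1 * x)),
# with squared-error cost L = (y - t)^2.

def forward(x, w1, w2, t):
    h = w1 * x               # first layer (linear)
    y = math.tanh(w2 * h)    # second layer (tanh activation)
    loss = (y - t) ** 2      # squared-error cost
    return h, y, loss

def backward(x, w1, w2, t):
    # Backpropagation: apply the chain rule from the cost back to each weight.
    h, y, _ = forward(x, w1, w2, t)
    dL_dy = 2 * (y - t)       # dL/dy
    dy_dz = 1 - y ** 2        # derivative of tanh at z = w2 * h
    dL_dz = dL_dy * dy_dz     # chain rule into the pre-activation
    dL_dw2 = dL_dz * h        # gradient with respect to w2
    dL_dh = dL_dz * w2        # propagate back through the second layer
    dL_dw1 = dL_dh * x        # gradient with respect to w1
    return dL_dw1, dL_dw2

# Sanity check: compare to a central-difference numerical gradient.
x, w1, w2, t = 0.5, 0.8, -1.2, 0.3
g1, g2 = backward(x, w1, w2, t)
eps = 1e-6
num_g1 = (forward(x, w1 + eps, w2, t)[2] - forward(x, w1 - eps, w2, t)[2]) / (2 * eps)
num_g2 = (forward(x, w1, w2 + eps, t)[2] - forward(x, w1, w2 - eps, t)[2]) / (2 * eps)
print(abs(g1 - num_g1) < 1e-6 and abs(g2 - num_g2) < 1e-6)
```

An optimizer would then take a step such as w1 -= lr * g1; real frameworks automate exactly this backward pass over many layers.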
Example of Backpropagation