Gradient descent is an iterative optimization algorithm for finding a local minimum of a differentiable multivariable function. It does this by repeatedly taking small steps in the direction opposite to the gradient of the function at the current point, since this is the direction of steepest descent.
Gradient descent can be defined by the update rule

    x_{n+1} = x_n − γ ∇F(x_n),

where γ > 0 is the step size (also called the learning rate) and ∇F(x_n) is the gradient of the objective function F evaluated at the current point x_n. If γ is small enough, then F(x_{n+1}) ≤ F(x_n), so repeating the update produces a sequence of points with non-increasing function values.
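As an illustration, the following is a minimal sketch of this update rule in Python. The quadratic objective, gradient, starting point, step size, and iteration count are illustrative assumptions chosen for the example, not taken from the text.

```python
# Minimal gradient descent sketch: minimize F(x, y) = x^2 + 2*y^2,
# whose unique minimum is at (0, 0). All concrete values below
# (objective, step size, starting point) are illustrative assumptions.

def grad_F(x):
    """Gradient of F(x, y) = x[0]**2 + 2*x[1]**2."""
    return [2.0 * x[0], 4.0 * x[1]]

def gradient_descent(x0, step_size=0.1, num_steps=100):
    """Repeatedly step opposite the gradient: x <- x - step_size * grad_F(x)."""
    x = list(x0)
    for _ in range(num_steps):
        g = grad_F(x)
        x = [xi - step_size * gi for xi, gi in zip(x, g)]
    return x

print(gradient_descent([3.0, -2.0]))  # approaches the minimizer (0, 0)
```

With the step size fixed at 0.1, each iteration shrinks both coordinates toward zero; too large a step size would instead cause the iterates to diverge, which is why γ is typically chosen small or tuned.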
Variations