Mini-batch gradient descent is a variant of gradient descent in which each parameter update uses the average of the gradients computed over a small subset (a "mini-batch") of the training examples, rather than a single example or the full dataset.
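As a minimal sketch of the idea, assuming a least-squares objective and NumPy (the function name `minibatch_gd` and all hyperparameter values here are illustrative, not from the text):

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=32, epochs=100, seed=0):
    """Fit linear weights w to minimize mean squared error with mini-batch GD.

    Illustrative hyperparameters: learning rate, batch size, and epoch
    count are arbitrary choices for this toy example.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)              # reshuffle once per epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # average gradient of 0.5 * (Xb @ w - yb)**2 over the batch
            grad = Xb.T @ (Xb @ w - yb) / len(idx)
            w -= lr * grad
    return w

# toy data generated from known weights [2, -3]
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -3.0])
w = minibatch_gd(X, y)
```

Averaging over a batch smooths the noisy single-example gradient while remaining far cheaper per step than a full-dataset gradient, which is why batch sizes such as 32 or 64 are a common middle ground.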