Learning rate decay is a method for gradually reducing the learning rate of an algorithm over time. This helps the algorithm "settle" into a local minimum instead of repeatedly jumping back and forth over it.
There are three main types of learning rate decay: time based, exponential, and step based.
Time based decay is a type of learning rate decay which decreases the learning rate at each step relative to the previous learning rate.
Time based learning rate decay can be defined by the following equation
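The equation itself is not given in the source; a commonly used form (an assumption here, matching the update used by Keras' legacy time-based decay) is lr_n = lr_{n-1} / (1 + d * n), where lr_n is the learning rate at step n and d is a decay constant. A minimal sketch under that assumption:

```python
def time_based_decay(lr0, decay, epochs):
    """Time-based decay: each new rate is the previous rate
    divided by (1 + decay * step), so the decrease depends on
    the last learning rate. Returns the rate for each epoch."""
    lr = lr0
    schedule = []
    for n in range(1, epochs + 1):
        lr = lr / (1 + decay * n)  # lr_n = lr_{n-1} / (1 + d * n)
        schedule.append(lr)
    return schedule

print(time_based_decay(0.1, 0.01, 3))
```

Because each step divides the previous rate, the schedule decreases monotonically, and larger decay constants shrink it faster.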
Exponential decay is similar to time based decay except that the learning rate decreases exponentially as a function of time.
Exponential learning rate decay can be defined by the following equation
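The equation is missing from the source; the standard form of exponential decay is lr_t = lr_0 * e^(-k * t), where lr_0 is the initial learning rate, k is the decay constant, and t is the epoch. A minimal sketch under that assumption:

```python
import math

def exponential_decay(lr0, k, t):
    """Exponential decay: lr_t = lr0 * exp(-k * t).
    lr0 is the initial rate, k the decay constant, t the epoch."""
    return lr0 * math.exp(-k * t)

# At t = 0 the rate equals lr0; it then shrinks exponentially.
print([exponential_decay(0.1, 0.1, t) for t in range(4)])
```

Unlike time based decay, the rate at epoch t depends only on the initial rate and t, not on the previous rate.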
Step based decay is a type of learning rate decay which decreases the learning rate by a constant factor every fixed number of epochs.
Step based learning rate decay can be defined by the following equation
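The equation is again absent from the source; a standard form of step based decay is lr_t = lr_0 * d^floor(t / r), where d is the drop factor (e.g. 0.5) and r is the number of epochs between drops. A minimal sketch under that assumption:

```python
import math

def step_decay(lr0, drop, epochs_per_drop, epoch):
    """Step-based decay: lr = lr0 * drop ** floor(epoch / epochs_per_drop).
    The rate is constant within each block of epochs_per_drop epochs,
    then drops by the factor `drop`."""
    return lr0 * drop ** math.floor(epoch / epochs_per_drop)

# With drop=0.5 and epochs_per_drop=10, the rate halves every 10 epochs.
print([step_decay(0.1, 0.5, 10, e) for e in (0, 9, 10, 25)])
```

This produces a piecewise-constant "staircase" schedule, in contrast to the smooth curves of the time based and exponential forms.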