Step-based decay is a type of learning rate decay which decreases the learning rate by a constant factor every fixed number of steps. Step-based learning rate decay can be defined by the following equation:

$$\eta_t = \eta_0 \, d^{\lfloor t / r \rfloor}$$
- Definitions
	- $\eta_t$ is the learning rate for the $t$-th iteration
	- $\eta_0$ is the initial learning rate
	- $t$ is the current iteration step
	- $r$ is the number of iterations between learning rate changes (ie. $r = 1$ means that the learning rate will drop every iteration)
	- $d$ is the "decay parameter" and controls how quickly the learning rate will change.
	- $\lfloor \cdot \rfloor$ denotes the floor operator which rounds the input down (ie. $\lfloor 1.9 \rfloor$ is rounded to $1$)
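The schedule above can be sketched in a few lines of Python; the function name and parameter values here are illustrative choices, not part of any particular library:

```python
def step_decay(eta0, d, r, t):
    """Step-based learning rate decay.

    eta0: initial learning rate
    d:    decay parameter (factor applied at each drop)
    r:    number of iterations between learning rate changes
    t:    current iteration step
    """
    # t // r is integer division, which implements the floor operator
    return eta0 * d ** (t // r)


# Example: start at 0.1 and halve the rate every 10 iterations
print(step_decay(0.1, 0.5, 10, 0))   # iterations 0-9 use eta0
print(step_decay(0.1, 0.5, 10, 10))  # first drop: eta0 * d
print(step_decay(0.1, 0.5, 10, 25))  # floor(25/10) = 2 drops: eta0 * d**2
```

Because the exponent only changes when $t$ crosses a multiple of $r$, the learning rate stays constant within each block of $r$ iterations and then drops by the factor $d$, producing the characteristic staircase shape.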