Learning-Rate Scheduler

A learning-rate scheduler is an algorithm that updates the learning rate of a stochastic gradient descent optimizer as a function of the training iteration or epoch. Because these schedules typically decrease the learning rate over the course of training, the term "learning-rate decay" is often used.
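As a concrete illustration, below is a minimal sketch of one common schedule, step decay, in which the learning rate is cut by a constant factor at fixed epoch intervals. The function name `step_decay_lr` and its parameters are illustrative, not from any particular library.

```python
def step_decay_lr(initial_lr: float, drop_factor: float,
                  epochs_per_drop: int, epoch: int) -> float:
    """Return the learning rate for a given epoch: the initial rate is
    multiplied by `drop_factor` once every `epochs_per_drop` epochs."""
    return initial_lr * drop_factor ** (epoch // epochs_per_drop)

# Example: start at 0.1 and halve the learning rate every 10 epochs.
for epoch in range(30):
    lr = step_decay_lr(initial_lr=0.1, drop_factor=0.5,
                       epochs_per_drop=10, epoch=epoch)
    # ... run one epoch of SGD with this learning rate ...
```

Other common schedules follow the same pattern of computing the rate from the iteration or epoch count, e.g. exponential decay or cosine annealing.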
Related concepts:
Learning-Rate Warmup, Gradient Descent, Epoch