Schedulers
Schedulers are wrappers around PyTorch learning rate schedulers. They are used to update the learning rate of an optimizer during training.
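As a point of reference, the sketch below shows how a raw PyTorch learning rate scheduler drives an optimizer during training; the wrapped schedulers build on this mechanism. The model, optimizer, and StepLR settings are illustrative, not prescribed by this library.

```python
from torch import nn, optim

# A raw PyTorch scheduler driving an optimizer (illustrative values).
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(30):
    # ... forward pass, loss.backward() ...
    optimizer.step()    # update the parameters first
    scheduler.step()    # then advance the learning rate schedule
    print(epoch, scheduler.get_last_lr())  # current learning rate(s)
```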
Schedulers implemented by default:
- ConstantScheduler
- CosineAnnealingScheduler
- CyclicScheduler
- ExponentialScheduler
- LinearScheduler
- MultiStepScheduler
- OneCycleScheduler
- ReduceOnPlateauScheduler
- StepScheduler
- WarmRestartsScheduler
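The names above suggest one-to-one counterparts in `torch.optim.lr_scheduler`. The mapping sketched below is an assumption inferred from the wrapper names, not this library's documented API; the constructor arguments are illustrative.

```python
from torch import nn, optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 1)
opt = optim.SGD(model.parameters(), lr=0.1)

# Assumed PyTorch counterparts, inferred from the wrapper names.
# (In practice you would attach a single scheduler per optimizer.)
schedulers = {
    "ConstantScheduler": lr_scheduler.ConstantLR(opt, factor=0.5, total_iters=5),
    "CosineAnnealingScheduler": lr_scheduler.CosineAnnealingLR(opt, T_max=100),
    "CyclicScheduler": lr_scheduler.CyclicLR(opt, base_lr=0.01, max_lr=0.1),
    "ExponentialScheduler": lr_scheduler.ExponentialLR(opt, gamma=0.99),
    "LinearScheduler": lr_scheduler.LinearLR(opt, start_factor=0.1),
    "MultiStepScheduler": lr_scheduler.MultiStepLR(opt, milestones=[30, 80]),
    "OneCycleScheduler": lr_scheduler.OneCycleLR(opt, max_lr=0.1, total_steps=100),
    "ReduceOnPlateauScheduler": lr_scheduler.ReduceLROnPlateau(opt, patience=5),
    "StepScheduler": lr_scheduler.StepLR(opt, step_size=10),
    "WarmRestartsScheduler": lr_scheduler.CosineAnnealingWarmRestarts(opt, T_0=10),
}
```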
You can change a scheduler's parameters at runtime.
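With plain PyTorch schedulers, hyperparameters are stored as attributes that are re-read on every step, so they can be modified mid-training. The sketch below shows this for `ExponentialLR`; whether the wrappers expose the same attributes is an assumption.

```python
from torch import nn, optim

model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)

for epoch in range(20):
    # ... training steps ...
    optimizer.step()
    scheduler.step()
    if epoch == 9:
        # gamma is read on every step, so changing the attribute takes
        # effect immediately (assumed to carry over to the wrappers).
        scheduler.gamma = 0.9
```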