
Optimizers

Optimizers are wrappers around PyTorch optimizers; they update a model's parameters during training.
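
A minimal sketch of what the underlying PyTorch optimizer does on each training step (the wrapper's own construction API is not shown here; the model, data, and hyperparameters below are illustrative):

```python
import torch
from torch import nn

# Plain PyTorch: the optimizer holds the model's parameters and updates
# them from their gradients on every step.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(100):
    optimizer.zero_grad()                           # clear previous gradients
    loss = nn.functional.mse_loss(model(x), y)      # forward pass and loss
    loss.backward()                                 # compute gradients
    optimizer.step()                                # update the parameters
```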

Optimizers implemented by default:

You can change the optimizer's parameters (for example, the learning rate) at runtime.
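
As a sketch of how this works with a plain PyTorch optimizer (the wrapper may expose its own setters), hyperparameters such as the learning rate live in `param_groups` and can be overwritten between steps:

```python
import torch
from torch import nn

# Illustrative setup; values are assumptions, not the library's defaults.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Change a parameter at runtime: lower the learning rate mid-training.
for group in optimizer.param_groups:
    group["lr"] = 0.001
```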