Optimizer callbacks#
- class SwitchOptimizer(new_optimizers, new_optimizers_kwargs, epoch_switch)[source]#
Bases: Callback
PINA implementation of a Lightning Callback to switch the optimizer during training.
This callback switches between different optimizers during training, making it possible to explore multiple optimization strategies without stopping training.
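To illustrate the idea, the sketch below shows how an epoch-based optimizer switch can be written as a plain Lightning callback. This is not PINA's implementation: the class name MinimalSwitch is hypothetical, and the trainer.strategy.optimizers assignment assumes a recent Lightning (2.x) release.

import lightning.pytorch as pl
import torch


class MinimalSwitch(pl.Callback):
    """Illustrative epoch-based optimizer switch (not PINA's implementation)."""

    def __init__(self, new_optimizer_cls, new_optimizer_kwargs, epoch_switch):
        if epoch_switch < 1:
            raise ValueError("epoch_switch must be at least 1.")
        self.new_optimizer_cls = new_optimizer_cls
        self.new_optimizer_kwargs = new_optimizer_kwargs
        self.epoch_switch = epoch_switch

    def on_train_epoch_start(self, trainer, pl_module):
        # When the switch epoch is reached, rebuild the optimizer over the
        # module's parameters and hand it to the trainer's strategy.
        if trainer.current_epoch == self.epoch_switch:
            new_optimizer = self.new_optimizer_cls(
                pl_module.parameters(), **self.new_optimizer_kwargs
            )
            trainer.strategy.optimizers = [new_optimizer]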
- Parameters:
new_optimizers (torch.optim.Optimizer | list) – The model optimizers to switch to. Can be a single torch.optim.Optimizer or a list of them for multiple model solvers.
new_optimizers_kwargs (dict | list) – The keyword arguments for the new optimizers. Can be a single dictionary or a list of dictionaries corresponding to each optimizer.
epoch_switch (int) – The epoch at which to switch to the new optimizer.
- Raises:
ValueError – If epoch_switch is less than 1 or if there is a mismatch in the number of optimizers and their corresponding keyword argument dictionaries.
Example
>>> switch_callback = SwitchOptimizer(new_optimizers=[optimizer1, optimizer2],
>>>                                   new_optimizers_kwargs=[{'lr': 0.001}, {'lr': 0.01}],
>>>                                   epoch_switch=10)
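A minimal usage sketch follows, assuming a PINA solver named solver has already been built elsewhere and that SwitchOptimizer expects the optimizer class together with its keyword arguments; the exact import path can differ between PINA releases.

import torch
from pina import Trainer
from pina.callbacks import SwitchOptimizer  # import path may differ between PINA releases

# `solver` is assumed to be an already-built PINA solver (e.g. a PINN).
switch_callback = SwitchOptimizer(
    new_optimizers=torch.optim.SGD,       # optimizer to switch to
    new_optimizers_kwargs={'lr': 0.005},  # keyword arguments for the new optimizer
    epoch_switch=50,                      # perform the switch at epoch 50
)

trainer = Trainer(
    solver=solver,
    max_epochs=100,
    callbacks=[switch_callback],  # the switch happens automatically during training
)
trainer.train()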