otx.algo.plugins#
Plugin for mixed-precision training on XPU.
Classes
MixedPrecisionXPUPlugin — Plugin for Automatic Mixed Precision (AMP) training with torch.xpu.autocast.
- class otx.algo.plugins.MixedPrecisionXPUPlugin(scaler: GradScaler | None = None)[source]#
Bases: Precision
Plugin for Automatic Mixed Precision (AMP) training with torch.xpu.autocast.
- Parameters:
scaler – An optional torch.cuda.amp.GradScaler to use.
- clip_gradients(optimizer: Optimizer, clip_val: int | float = 0.0, gradient_clip_algorithm: GradClipAlgorithmType = GradClipAlgorithmType.NORM) None [source]#
Handle gradient clipping with the scaler.
- optimizer_step(optimizer: Optimizable, model: pl.LightningModule, closure: Callable, **kwargs: dict) None | dict [source]#
Perform an optimizer step, using the scaler if one was provided.
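Neither method is normally called by hand; Lightning invokes them during training. The scaler-driven sequence they implement (scale the loss, unscale before clipping, step and update through the scaler) can be sketched in plain PyTorch. This is a minimal illustration of the pattern, not the plugin's actual code: it uses `GradScaler(enabled=False)` so it also runs on CPU, whereas the plugin targets XPU devices.

```python
import torch
from torch import nn

# Toy model and optimizer; any module would do for the sketch.
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# enabled=False makes the scaler a pass-through, so this runs without
# CUDA/XPU while keeping the same call sequence the plugin follows.
scaler = torch.cuda.amp.GradScaler(enabled=False)

x = torch.randn(8, 4)
loss = model(x).pow(2).mean()

before = model.weight.detach().clone()

optimizer.zero_grad()
scaler.scale(loss).backward()   # scale the loss before backward
scaler.unscale_(optimizer)      # unscale so gradients can be clipped safely
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
scaler.step(optimizer)          # optimizer step routed through the scaler
scaler.update()                 # adjust the scale factor for the next step

changed = not torch.equal(before, model.weight)
```

When no scaler is passed to the plugin, `optimizer_step` falls back to a plain step, which corresponds to dropping the `scaler.*` calls above.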