otx.core.model.anomaly#
Anomaly Lightning OTX model.
Classes
AnomalyMixin – Mixin inherited before AnomalibModule to override OTXModel methods.
OTXAnomaly – Methods used to make the OTX model compatible with the Anomalib model.
- class otx.core.model.anomaly.AnomalyMixin[source]#
Bases:
object
Mixin inherited before AnomalibModule to override OTXModel methods.
- configure_optimizers() tuple[list[Optimizer], list[Optimizer]] | None [source]#
Call AnomalibModule’s configure_optimizers.
- forward(inputs: AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch) AnomalyClassificationBatchPrediction | AnomalySegmentationBatchPrediction | AnomalyDetectionBatchPrediction [source]#
Wrap forward method of the Anomalib model.
- predict_step(inputs: AnomalyModelInputs, batch_idx: int = 0, **kwargs) STEP_OUTPUT [source]#
Call test step of the anomalib model.
- test_step(inputs: AnomalyModelInputs, batch_idx: int = 0, **kwargs) STEP_OUTPUT [source]#
Call test step of the anomalib model.
- training_step(inputs: AnomalyModelInputs, batch_idx: int = 0) STEP_OUTPUT [source]#
Call training step of the anomalib model.
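Because Python resolves methods left to right, placing AnomalyMixin before the Anomalib module in the base-class list lets its training_step, forward, and related methods take precedence over the OTXModel defaults. A minimal sketch, assuming an illustrative Anomalib backbone and class name (not the exact OTX definitions):

    # Sketch only: the Anomalib import and the concrete class name are
    # illustrative assumptions, not the exact OTX definitions.
    from anomalib.models import Padim as AnomalibPadim

    from otx.core.model.anomaly import AnomalyMixin, OTXAnomaly


    class PadimForOTX(AnomalyMixin, AnomalibPadim, OTXAnomaly):
        """Hypothetical concrete model.

        Because AnomalyMixin precedes the Anomalib module in the MRO, its
        training_step/forward/predict_step shadow the OTXModel versions.
        """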
- class otx.core.model.anomaly.OTXAnomaly[source]#
Bases:
OTXModel
Methods used to make the OTX model compatible with the Anomalib model.
- Parameters:
input_size (tuple[int, int] | None) – Model input size in the order of height and width. Defaults to None.
- configure_callbacks() list[Callback] [source]#
Get all necessary callbacks required for training and post-processing on Anomalib models.
- export(output_dir: Path, base_name: str, export_format: OTXExportFormatType, precision: OTXPrecisionType = OTXPrecisionType.FP32, to_exportable_code: bool = False) Path [source]#
Export this model to the specified output directory.
- Parameters:
output_dir (Path) – directory for saving the exported model
base_name (str) – base name for the exported model file. The extension is determined by the target export format
export_format (OTXExportFormatType) – format of the output model
precision (OTXPrecisionType) – precision of the output model
to_exportable_code (bool) – flag to export model in exportable code with demo package
- Returns:
path to the exported model.
- Return type:
Path
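A possible export call is sketched below; the enum import paths and the OPENVINO member are assumptions, and model is taken to be a concrete anomaly model built on OTXAnomaly:

    from pathlib import Path

    # Assumed import locations for the enums used in the signature above.
    from otx.core.types.export import OTXExportFormatType
    from otx.core.types.precision import OTXPrecisionType

    # `model` is assumed to be a concrete anomaly model built on OTXAnomaly.
    exported_path = model.export(
        output_dir=Path("exported_models"),
        base_name="anomaly_model",
        export_format=OTXExportFormatType.OPENVINO,  # assumed enum member
        precision=OTXPrecisionType.FP32,             # default precision per the signature
        to_exportable_code=False,
    )
    print(exported_path)  # path to the exported model file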
- get_dummy_input(batch_size: int = 1) AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch [source]#
Return a dummy input for the anomaly model.
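A dummy batch is handy for smoke-testing the forward pass; a short sketch, assuming model is a concrete anomaly model:

    # `model` is assumed to be a concrete OTXAnomaly-based model.
    dummy_batch = model.get_dummy_input(batch_size=2)
    predictions = model.forward(dummy_batch)  # e.g. AnomalyClassificationBatchPrediction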
- on_predict_batch_end(outputs: dict, batch: AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch, batch_idx: int, dataloader_idx: int = 0) None [source]#
Wrap the outputs to OTX format.
Since the outputs need to be replaced in place, their datatype cannot be changed; instead, the outputs container is cleared and repopulated with the converted predictions. A side effect is that engine.test() returns [{prediction: BatchPrediction}, {…}, …] instead of [BatchPrediction, …].
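If a plain list of predictions is needed downstream, the nested structure can be flattened manually; a sketch, with the dictionary key taken from the description above and not otherwise verified:

    # Assumes `engine` is an OTX Engine already configured with this model.
    results = engine.test()  # [{prediction: BatchPrediction}, ...]
    batch_predictions = [item["prediction"] for item in results]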
- save_hyperparameters(*args: Any, ignore: Sequence[str] | str | None = None, frame: types.FrameType | None = None, logger: bool = True) None [source]#
Ignore task from hyperparameters.
The task needs to be ignored from the hyperparameters because it is passed as a string from the CLI, which causes log_hyperparameters to fail since the string does not match the OTXTaskType instance from OTXDataModule.
- property task: TaskType#
Return the task type of the model.
- property trainable_model: str | None#
Use this to return the name of the model that needs to be trained.
This might not be the cleanest solution.
Some models have multiple sub-models and only one of them needs to be trained; however, the optimizer is configured in Anomalib’s lightning model. This property can be used to inform the OTX lightning model which sub-model to train.
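For instance, a model whose Anomalib implementation trains only one of its sub-networks could expose that name through this property; a hypothetical sketch (the class and the "student" name are illustrative assumptions):

    from __future__ import annotations

    from otx.core.model.anomaly import OTXAnomaly


    class HypotheticalStudentTeacherModel(OTXAnomaly):
        """Illustrative subclass whose Anomalib backbone has teacher/student
        sub-models but only trains the student."""

        @property
        def trainable_model(self) -> str | None:
            # Name of the sub-model the optimizer should update; "student"
            # is an assumed name used only for illustration.
            return "student"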