otx.core.model.anomaly#

Anomaly Lightning OTX model.

Classes

AnomalyMixin()

Mixin inherited before AnomalibModule to override OTXModel methods.

OTXAnomaly()

Methods used to make OTX model compatible with the Anomalib model.

class otx.core.model.anomaly.AnomalyMixin[source]#

Bases: object

Mixin inherited before AnomalibModule to override OTXModel methods.
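
The "inherited before" phrasing refers to Python's method resolution order: by listing the mixin first in the bases, its methods shadow those further down the MRO. A minimal sketch of this pattern, using hypothetical stand-in classes (the real OTXModel and AnomalibModule have far richer interfaces):

```python
# Stand-ins for the real base classes, for illustration only.
class OTXModel:
    def training_step(self, batch):
        return "otx"

class AnomalibModule:
    def training_step(self, batch):
        return "anomalib"

class AnomalyMixin:
    """Placed before AnomalibModule so its overrides win in the MRO."""
    def training_step(self, batch):
        # Delegate to the Anomalib implementation instead of OTXModel's.
        return AnomalibModule.training_step(self, batch)

# The mixin comes first, so AnomalyMixin.training_step shadows the others.
class AnomalyModel(AnomalyMixin, AnomalibModule, OTXModel):
    pass

model = AnomalyModel()
print(model.training_step(None))  # "anomalib"
```

Because Python resolves attributes left to right along the MRO, every hook the mixin defines (training_step, on_save_checkpoint, …) takes precedence without editing either base class.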

configure_optimizers() tuple[list[Optimizer], list[Optimizer]] | None[source]#

Call AnomalibModule’s configure_optimizers.

forward(inputs: AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch) AnomalyClassificationBatchPrediction | AnomalySegmentationBatchPrediction | AnomalyDetectionBatchPrediction[source]#

Wrap forward method of the Anomalib model.

on_load_checkpoint(checkpoint: dict[str, Any]) None[source]#

Callback on loading checkpoint.

on_save_checkpoint(checkpoint: dict[str, Any]) None[source]#

Callback on saving checkpoint.

on_train_epoch_end() None[source]#

Callback triggered when the training epoch ends.

on_validation_start() None[source]#

Callback triggered when the validation starts.

predict_step(inputs: AnomalyModelInputs, batch_idx: int = 0, **kwargs) STEP_OUTPUT[source]#

Call test step of the anomalib model.

test_step(inputs: AnomalyModelInputs, batch_idx: int = 0, **kwargs) STEP_OUTPUT[source]#

Call test step of the anomalib model.

training_step(inputs: AnomalyModelInputs, batch_idx: int = 0) STEP_OUTPUT[source]#

Call training step of the anomalib model.

validation_step(inputs: AnomalyModelInputs, batch_idx: int = 0) STEP_OUTPUT[source]#

Call validation step of the anomalib model.

property input_size: tuple[int, int]#

Returns the input size of the model.

Returns:

The input size of the model as a tuple of (height, width).

Return type:

tuple[int, int]

class otx.core.model.anomaly.OTXAnomaly[source]#

Bases: OTXModel

Methods used to make OTX model compatible with the Anomalib model.

Parameters:

input_size (tuple[int, int] | None) – Model input size in the order of height and width. Defaults to None.

configure_callbacks() list[Callback][source]#

Get all necessary callbacks required for training and post-processing on Anomalib models.

configure_metric() None[source]#

This does not follow OTX metric configuration.

export(output_dir: Path, base_name: str, export_format: OTXExportFormatType, precision: OTXPrecisionType = OTXPrecisionType.FP32, to_exportable_code: bool = False) Path[source]#

Export this model to the specified output directory.

Parameters:
  • output_dir (Path) – directory for saving the exported model

  • base_name (str) – base name for the exported model file; the extension is defined by the target export format

  • export_format (OTXExportFormatType) – format of the output model

  • precision (OTXPrecisionType) – precision of the output model

  • to_exportable_code (bool) – flag to export model in exportable code with demo package

Returns:

path to the exported model.

Return type:

Path
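
One detail worth noting is that the exported file's extension follows from export_format, not from base_name. A simplified sketch of that mapping, with hypothetical stand-in enums and assumed extensions (OpenVINO IR uses .xml, ONNX uses .onnx):

```python
from enum import Enum
from pathlib import Path

# Hypothetical, simplified stand-in for OTX's export format enum.
class OTXExportFormatType(Enum):
    OPENVINO = "openvino"
    ONNX = "onnx"

def export_path(output_dir: Path, base_name: str,
                export_format: OTXExportFormatType) -> Path:
    """Illustrate how the output path is derived; the real export()
    also writes the converted model to disk."""
    ext = {"openvino": ".xml", "onnx": ".onnx"}[export_format.value]
    return output_dir / f"{base_name}{ext}"

print(export_path(Path("exports"), "padim", OTXExportFormatType.ONNX))
# exports/padim.onnx
```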

get_dummy_input(batch_size: int = 1) AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch[source]#

Returns a dummy input for anomaly model.

on_predict_batch_end(outputs: dict, batch: AnomalyClassificationDataBatch | AnomalySegmentationDataBatch | AnomalyDetectionDataBatch, batch_idx: int, dataloader_idx: int = 0) None[source]#

Wrap the outputs to OTX format.

Since outputs must be replaced in place, the datatype of outputs cannot be changed. Instead, outputs is cleared and re-populated with the new predictions. As a side effect, engine.test() returns [{prediction: BatchPrediction}, {…}, …] rather than [BatchPrediction, …].
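
The in-place constraint exists because Lightning keeps a reference to the same outputs dict; rebinding the name would not propagate. A sketch of the wrapping step, with BatchPrediction as a hypothetical stand-in for the OTX prediction type:

```python
# Hypothetical stand-in for an OTX batch prediction type.
class BatchPrediction:
    def __init__(self, scores):
        self.scores = scores

def wrap_outputs_inplace(outputs: dict, scores) -> None:
    """Mutate `outputs` in place: Lightning holds a reference to this
    exact dict, so reassignment would be invisible to the caller."""
    prediction = BatchPrediction(scores)
    outputs.clear()                      # drop the Anomalib-format fields
    outputs["prediction"] = prediction   # re-populate in OTX format

outputs = {"anomaly_maps": [[0.1, 0.2]], "pred_scores": [0.9]}
wrap_outputs_inplace(outputs, [0.9])
print(list(outputs))  # ['prediction']
```

This is why the collected predictions come back as a list of single-key dicts instead of a list of BatchPrediction objects.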

on_test_epoch_end() None[source]#

Don’t call OTXModel’s on_test_epoch_end.

on_test_epoch_start() None[source]#

Don’t call OTXModel’s on_test_epoch_start.

on_validation_epoch_end() None[source]#

Don’t call OTXModel’s on_validation_epoch_end.

on_validation_epoch_start() None[source]#

Don’t call OTXModel’s on_validation_epoch_start.

save_hyperparameters(*args: Any, ignore: Sequence[str] | str | None = None, frame: types.FrameType | None = None, logger: bool = True) None[source]#

Ignore task from hyperparameters.

The task argument must be excluded from the saved hyperparameters because the CLI passes it as a plain string. That string does not match the OTXTaskType instance recorded by OTXDataModule, which causes log_hyperparameters to fail.

property task: TaskType#

Return the task type of the model.

property trainable_model: str | None#

Use this to return the name of the model that needs to be trained.

This might not be the cleanest solution.

Some models consist of multiple architectures, and only one of them needs to be trained. However, the optimizer is configured in Anomalib’s lightning model. This property can be used to inform the OTX lightning model which sub-model to train.