otx.algorithms.anomaly.tasks#
Initialization of OTX Anomalib.
Classes

InferenceTask | Base Anomaly Task.

NNCFTask | Base Anomaly Task.

TrainingTask | Base Anomaly Task.

OpenVINOTask | OpenVINO inference task.
- class otx.algorithms.anomaly.tasks.InferenceTask(task_environment: TaskEnvironment, output_path: str | None = None)[source]#
Bases: IInferenceTask, IEvaluationTask, IExportTask, IUnload
Base Anomaly Task.
Train, Infer, Export, Optimize and Deploy an Anomaly Classification Task.
- Parameters:
task_environment (TaskEnvironment) – OTX Task environment.
output_path (Optional[str]) – output path where task outputs are saved.
- cancel_training() None [source]#
Cancel the training after_batch_end.
This terminates the training; however, validation is still performed.
- evaluate(output_resultset: ResultSetEntity, evaluation_metric: str | None = None) None [source]#
Evaluate the performance on a result set.
- Parameters:
output_resultset (ResultSetEntity) – Result Set from which the performance is evaluated.
evaluation_metric (Optional[str], optional) – Evaluation metric. Defaults to None. Instead, metric is chosen depending on the task type.
- export(export_type: ExportType, output_model: ModelEntity, precision: ModelPrecision = ModelPrecision.FP32, dump_features: bool = False) None [source]#
Export model to OpenVINO IR.
- Parameters:
export_type (ExportType) – Export type should be ExportType.OPENVINO
output_model (ModelEntity) – The model entity in which to write the OpenVINO IR data
precision (ModelPrecision) – Output model weights and inference precision
dump_features (bool) – Flag to return “feature_vector” and “saliency_map”.
- Raises:
Exception – If export_type is not ExportType.OPENVINO
- get_config() DictConfig | ListConfig [source]#
Get Anomalib Config from task environment.
- Returns:
Anomalib config.
- Return type:
Union[DictConfig, ListConfig]
- infer(dataset: DatasetEntity, inference_parameters: InferenceParameters) DatasetEntity [source]#
Perform inference on a dataset.
- Parameters:
dataset (DatasetEntity) – Dataset to infer.
inference_parameters (InferenceParameters) – Inference parameters.
- Returns:
Output dataset with predictions.
- Return type:
DatasetEntity
- load_model(otx_model: ModelEntity | None) AnomalyModule [source]#
Create and Load Anomalib Module from OTX Model.
This method checks whether the task environment has a saved OTX model and creates one if it does not. If the OTX model already exists, it returns the model with the saved weights.
- Parameters:
otx_model (Optional[ModelEntity]) – OTX Model from the task environment.
- Returns:
Anomalib classification or segmentation model with/without weights.
- Return type:
AnomalyModule
- model_info() Dict [source]#
Return model info to save the model weights.
- Returns:
Model info.
- Return type:
Dict
- save_model(output_model: ModelEntity) None [source]#
Save the model after training is completed.
- Parameters:
output_model (ModelEntity) – Output model onto which the weights are saved.
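Taken together, the InferenceTask methods above form a load → infer → evaluate → export lifecycle. The sketch below only illustrates that call order with a hypothetical recorder class; it is not OTX code, and the real task additionally needs a configured TaskEnvironment, real dataset entities, and inference parameters.

```python
class LifecycleRecorder:
    """Hypothetical stand-in recording the documented call order."""

    def __init__(self):
        self.calls = []

    def load_model(self, otx_model=None):
        self.calls.append("load_model")   # build/load the AnomalyModule
        return "anomaly_module"

    def infer(self, dataset, inference_parameters=None):
        self.calls.append("infer")        # annotate the dataset with predictions
        return dataset

    def evaluate(self, output_resultset, evaluation_metric=None):
        self.calls.append("evaluate")     # score predictions vs. ground truth

    def export(self, export_type, output_model,
               precision="FP32", dump_features=False):
        self.calls.append("export")       # write OpenVINO IR into output_model


task = LifecycleRecorder()
task.load_model(otx_model=None)                      # no saved model: create one
predictions = task.infer(dataset=["image"], inference_parameters=None)
task.evaluate(output_resultset=predictions)
task.export(export_type="OPENVINO", output_model={})
print(task.calls)
```

The argument names mirror the signatures documented above, so swapping the recorder for a real `InferenceTask` keeps the same call shape.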
- class otx.algorithms.anomaly.tasks.NNCFTask(task_environment: TaskEnvironment, **kwargs)[source]#
Bases: InferenceTask, IOptimizationTask
Base Anomaly Task.
Task for compressing models using NNCF.
- Parameters:
task_environment (TaskEnvironment) – OTX Task environment.
**kwargs – Additional keyword arguments.
- load_model(otx_model: ModelEntity | None) AnomalyModule [source]#
Create and Load Anomalib Module from OTX Model.
This method checks whether the task environment has a saved OTX model and creates one if it does not. If the OTX model already exists, it returns the model with the saved weights.
- Parameters:
otx_model (Optional[ModelEntity]) – OTX Model from the task environment.
- Returns:
Anomalib classification or segmentation model with/without weights.
- Return type:
AnomalyModule
- model_info() Dict [source]#
Return model info to save the model weights.
- Returns:
Model info.
- Return type:
Dict
- optimize(optimization_type: OptimizationType, dataset: DatasetEntity, output_model: ModelEntity, optimization_parameters: OptimizationParameters | None = None)[source]#
Optimize the anomaly model using NNCF compression.
- Parameters:
optimization_type (OptimizationType) – Type of optimization.
dataset (DatasetEntity) – Input dataset.
output_model (ModelEntity) – Output model to save the model weights.
optimization_parameters (OptimizationParameters) – Training parameters.
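The call shape of NNCFTask.optimize can be sketched as follows. `NNCFRecorder` is a hypothetical stand-in, not OTX code; only the parameter names and the effect of writing compressed weights onto `output_model` follow the documentation above.

```python
from enum import Enum


class OptimizationType(Enum):
    """Mirrors the documented optimization_type values."""
    POT = "pot"
    NNCF = "nncf"


class NNCFRecorder:
    """Hypothetical stand-in for NNCFTask showing the documented inputs."""

    def optimize(self, optimization_type, dataset, output_model,
                 optimization_parameters=None):
        # NNCF compresses the loaded model and fine-tunes it on `dataset`;
        # the compressed weights are then saved onto `output_model`.
        assert optimization_type is OptimizationType.NNCF
        output_model["weights"] = f"nncf({len(dataset)} samples)"


task = NNCFRecorder()
model = {}
task.optimize(OptimizationType.NNCF, dataset=["img1", "img2"], output_model=model)
print(model["weights"])
```

Because NNCFTask inherits from InferenceTask, the optimized task can still run infer, evaluate, and export afterwards.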
- class otx.algorithms.anomaly.tasks.OpenVINOTask(task_environment: TaskEnvironment)[source]#
Bases: IInferenceTask, IEvaluationTask, IOptimizationTask, IDeploymentTask
OpenVINO inference task.
- Parameters:
task_environment (TaskEnvironment) – task environment of the trained anomaly model
- deploy(output_model: ModelEntity) None [source]#
Exports the weights from output_model along with exportable code.
- Parameters:
output_model (ModelEntity) – Model with openvino.xml and .bin keys
- Raises:
Exception – If task_environment.model is None
- evaluate(output_resultset: ResultSetEntity, evaluation_metric: str | None = None)[source]#
Evaluate the performance of the model.
- Parameters:
output_resultset (ResultSetEntity) – Result set storing ground truth and predicted dataset.
evaluation_metric (Optional[str], optional) – Evaluation metric. Defaults to None.
- get_config() Dict [source]#
Get Anomalib Config from task environment.
- Returns:
Anomalib config
- Return type:
ADDict
- get_openvino_model() AnomalyDetection [source]#
Create the OpenVINO inferencer object.
- Returns:
AnomalyDetection model
- infer(dataset: DatasetEntity, inference_parameters: InferenceParameters) DatasetEntity [source]#
Perform Inference.
- Parameters:
dataset (DatasetEntity) – Inference dataset
inference_parameters (InferenceParameters) – Inference parameters.
- Returns:
Output dataset storing inference predictions.
- Return type:
DatasetEntity
- optimize(optimization_type: OptimizationType, dataset: DatasetEntity, output_model: ModelEntity, optimization_parameters: OptimizationParameters | None)[source]#
Optimize the model.
- Parameters:
optimization_type (OptimizationType) – Type of optimization [POT or NNCF]
dataset (DatasetEntity) – Input Dataset.
output_model (ModelEntity) – Output model.
optimization_parameters (Optional[OptimizationParameters]) – Optimization parameters.
- Raises:
ValueError – When the optimization type is not POT, which is the only supported type at the moment.
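The ValueError contract documented above can be sketched with a minimal stand-in. Only the parameter names and the POT-only restriction come from the documentation; the function body is hypothetical (the real OpenVINOTask runs POT quantization over the dataset).

```python
from enum import Enum


class OptimizationType(Enum):
    """Mirrors the documented optimization_type values."""
    POT = "pot"
    NNCF = "nncf"


def optimize(optimization_type, dataset, output_model,
             optimization_parameters=None):
    # Per the Raises note above, POT is the only supported type here.
    if optimization_type is not OptimizationType.POT:
        raise ValueError("OpenVINOTask only supports POT optimization")
    # The real task quantizes the OpenVINO model using `dataset` and
    # saves the result onto `output_model`.
    output_model["weights"] = "int8-quantized"
    return output_model


result = optimize(OptimizationType.POT, dataset=["img"], output_model={})
print(result["weights"])

try:
    optimize(OptimizationType.NNCF, dataset=["img"], output_model={})
except ValueError as err:
    print("raised:", err)
```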
- class otx.algorithms.anomaly.tasks.TrainingTask(task_environment: TaskEnvironment, output_path: str | None = None)[source]#
Bases: InferenceTask, ITrainingTask
Base Anomaly Task.
Train, Infer, Export, Optimize and Deploy an Anomaly Classification Task.
- Parameters:
task_environment (TaskEnvironment) – OTX Task environment.
output_path (Optional[str]) – output path where task outputs are saved.
- load_model(otx_model: ModelEntity | None) AnomalyModule [source]#
Create and Load Anomalib Module from OTX Model.
This method checks whether the task environment has a saved OTX model and creates one if it does not. If the OTX model already exists, it returns the model with the saved weights.
- Parameters:
otx_model (Optional[ModelEntity]) – OTX Model from the task environment.
- Returns:
Anomalib classification or segmentation model with/without weights.
- Return type:
AnomalyModule
- train(dataset: DatasetEntity, output_model: ModelEntity, train_parameters: TrainParameters, seed: int | None = None, deterministic: bool = False) None [source]#
Train the anomaly classification model.
- Parameters:
dataset (DatasetEntity) – Input dataset.
output_model (ModelEntity) – Output model to save the model weights.
train_parameters (TrainParameters) – Training parameters.
seed (Optional[int]) – Setting the seed to a value other than 0.
deterministic (bool) – Whether to set PyTorch Lightning trainer's deterministic flag.
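A train() call following the signature above can be sketched like this. `TrainRecorder` is a hypothetical stand-in, not OTX code; it only shows the documented inputs and the effect of saving trained weights onto `output_model`.

```python
class TrainRecorder:
    """Hypothetical stand-in recording the documented train() inputs."""

    def __init__(self):
        self.seed = None
        self.deterministic = False
        self.trained = False

    def train(self, dataset, output_model, train_parameters=None,
              seed=None, deterministic=False):
        # The real task fits the AnomalyModule on `dataset` and then
        # saves the resulting weights onto `output_model`.
        self.seed = seed
        self.deterministic = deterministic
        self.trained = True
        output_model["weights"] = "trained"
        return output_model


task = TrainRecorder()
model = task.train(dataset=["image"], output_model={},
                   seed=42, deterministic=True)
print(task.trained, task.seed)
```

Passing a fixed seed together with deterministic=True is the usual way to make successive runs reproducible.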