CFlow

This is the implementation of the CFlow paper.

Model Type: Segmentation

Description

The CFLOW model is based on a conditional normalizing flow framework adapted for anomaly detection with localization. It consists of a discriminatively pretrained encoder followed by multi-scale generative decoders. The encoder extracts features with multi-scale pyramid pooling to capture both global and local semantic information, with receptive fields that grow from the top of the pyramid to the bottom. The pooled features are processed by a set of decoders to explicitly estimate the likelihood of the encoded features. The estimated multi-scale likelihoods are upsampled to the input size and summed to produce the anomaly map.
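
As a rough illustration of the aggregation described above (a minimal sketch, not the library code; the function name, tensor shapes, and the final negation are assumptions):

import torch
import torch.nn.functional as F

def aggregate_likelihoods(likelihood_maps: list[torch.Tensor], input_size: tuple[int, int]) -> torch.Tensor:
    """Sketch: upsample per-scale likelihood maps of shape (B, 1, h_k, w_k) to the input size and sum them."""
    upsampled = [
        F.interpolate(m, size=input_size, mode="bilinear", align_corners=False)
        for m in likelihood_maps
    ]
    summed = torch.stack(upsampled, dim=0).sum(dim=0)
    # Higher likelihood means "more normal"; negate so that larger values mark anomalies.
    return -summed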

Architecture

CFlow Architecture

Usage

python tools/train.py --model cflow
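
Training can also be driven programmatically through the Lightning module documented below. This is only a sketch: the backbone, layer names, and epoch count are illustrative assumptions, and my_datamodule stands in for any compatible LightningDataModule you provide.

from pytorch_lightning import Trainer
from anomalib.models.cflow.lightning_model import Cflow

model = Cflow(
    input_size=(256, 256),
    backbone="resnet18",
    layers=["layer2", "layer3", "layer4"],
)
trainer = Trainer(max_epochs=50)
# trainer.fit(model, datamodule=my_datamodule)  # my_datamodule: your LightningDataModule (placeholder)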

PyTorch model for the CFlow model implementation.

class anomalib.models.cflow.torch_model.CflowModel(input_size: tuple[int, int], backbone: str, layers: list[str], pre_trained: bool = True, fiber_batch_size: int = 64, decoder: str = 'freia-cflow', condition_vector: int = 128, coupling_blocks: int = 8, clamp_alpha: float = 1.9, permute_soft: bool = False)[source]

Bases: Module

CFLOW: Conditional Normalizing Flows.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

forward(images) → Any[source]

Forward-pass images into the network to extract encoder features and compute probability.

Parameters:

images – Batch of images.

Returns:

Predicted anomaly maps.

training: bool
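
A minimal sketch of constructing the torch model documented above and running a forward pass; the backbone, layer names, and input resolution are assumptions, not prescribed values.

import torch
from anomalib.models.cflow.torch_model import CflowModel

model = CflowModel(
    input_size=(256, 256),
    backbone="resnet18",
    layers=["layer2", "layer3", "layer4"],
)
model.eval()  # inference mode for producing anomaly maps
with torch.no_grad():
    anomaly_map = model(torch.rand(1, 3, 256, 256))  # predicted anomaly map for a dummy batch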

CFLOW: Real-Time Unsupervised Anomaly Detection via Conditional Normalizing Flows.

https://arxiv.org/pdf/2107.12571v1.pdf

class anomalib.models.cflow.lightning_model.Cflow(input_size: tuple[int, int], backbone: str, layers: list[str], pre_trained: bool = True, fiber_batch_size: int = 64, decoder: str = 'freia-cflow', condition_vector: int = 128, coupling_blocks: int = 8, clamp_alpha: float = 1.9, permute_soft: bool = False, lr: float = 0.0001)[source]

Bases: AnomalyModule

PL Lightning Module for the CFLOW algorithm.

configure_optimizers() → Optimizer[source]

Configures optimizers for each decoder.

Note

This method is used for the existing CLI. When the PL CLI is introduced, the configure_optimizers method will be deprecated, and optimizers will be configured from either the config.yaml file or from the CLI.

Returns:

Adam optimizer for each decoder

Return type:

Optimizer
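
As a hedged illustration of the note above, a single Adam optimizer covering all decoder parameters could be built along these lines; the helper name and the exact parameter grouping are assumptions.

from itertools import chain
import torch

def make_decoder_optimizer(decoders: list[torch.nn.Module], lr: float = 0.0001) -> torch.optim.Optimizer:
    # One Adam optimizer over the chained parameters of every decoder.
    return torch.optim.Adam(params=chain.from_iterable(d.parameters() for d in decoders), lr=lr)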

training_step(batch: dict[str, str | Tensor], *args, **kwargs) → STEP_OUTPUT[source]

Training Step of CFLOW.

For each batch, the decoder layers are trained with a dynamic fiber batch size. The training step is performed manually, as multiple optimization steps are carried out per batch of input images.

Parameters:

batch (dict[str, str | Tensor]) – Input batch

Returns:

Loss value for the batch
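
The fiber-batching idea can be sketched as follows: the pooled feature map is flattened into per-position feature vectors ("fibers"), and a decoder is optimized on chunks of at most fiber_batch_size fibers, one optimizer step per chunk. The flow interface and the omission of the positional condition vector are simplifying assumptions.

import torch

def train_decoder_on_fibers(decoder, optimizer, fibers: torch.Tensor, fiber_batch_size: int = 64) -> None:
    """Sketch: fibers is an (N, C) tensor of per-position encoder features."""
    perm = torch.randperm(fibers.shape[0])
    for chunk in perm.split(fiber_batch_size):
        optimizer.zero_grad()
        # Assumed flow interface: returns latent z and the log-determinant of the Jacobian.
        z, log_jac_det = decoder(fibers[chunk])
        # Negative log-likelihood under a standard normal base distribution (up to a constant).
        loss = 0.5 * torch.sum(z**2, dim=1) - log_jac_det
        loss.mean().backward()
        optimizer.step()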

validation_step(batch: dict[str, str | Tensor], *args, **kwargs) → STEP_OUTPUT[source]

Validation Step of CFLOW.

Similar to the training step, encoder features are extracted from the CNN for each batch, and the anomaly map is computed.

Parameters:

batch (dict[str, str | Tensor]) – Input batch

Returns:

Dictionary containing images, anomaly maps, true labels and masks. These are required in validation_epoch_end for feature concatenation.
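
In spirit, the validation step amounts to something like the following sketch; the dictionary key names ("image", "anomaly_maps") are assumptions, not the library's exact schema.

import torch

def validation_step_sketch(model: torch.nn.Module, batch: dict) -> dict:
    # Run the torch model on the batch images and attach the predicted anomaly map.
    with torch.no_grad():
        batch["anomaly_maps"] = model(batch["image"])
    return batch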

class anomalib.models.cflow.lightning_model.CflowLightning(hparams: DictConfig | ListConfig)[source]

Bases: Cflow

PL Lightning Module for the CFLOW algorithm.

Parameters:

hparams (DictConfig | ListConfig) – Model params

configure_callbacks() → list[EarlyStopping][source]

Configure model-specific callbacks.

Note

This method is used for the existing CLI. When the PL CLI is introduced, the configure_callbacks method will be deprecated, and callbacks will be configured from either the config.yaml file or from the CLI.

Anomaly Map Generator for the CFlow model implementation.

class anomalib.models.cflow.anomaly_map.AnomalyMapGenerator(image_size: ListConfig | tuple, pool_layers: list[str])[source]

Bases: Module

Generate Anomaly Heatmap.

Initializes internal Module state, shared by both nn.Module and ScriptModule.

compute_anomaly_map(distribution: list[Tensor], height: list[int], width: list[int]) → Tensor[source]

Compute the layer map based on likelihood estimation.

Parameters:
  • distribution – Probability distribution for each decoder block

  • height – Height of each block's feature map

  • width – Width of each block's feature map

Returns:

Final Anomaly Map

forward(**kwargs: list[Tensor] | list[int] | list[list]) → Tensor[source]

Returns anomaly_map.

Expects the distribution, height and width keywords to be passed explicitly.

Example:

>>> anomaly_map_generator = AnomalyMapGenerator(
...     image_size=tuple(hparams.model.input_size),
...     pool_layers=pool_layers,
... )
>>> output = self.anomaly_map_generator(distribution=dist, height=height, width=width)

Raises:

ValueError – If the distribution, height and width keys are not found.

Returns:

anomaly map

Return type:

torch.Tensor

training: bool