otx.api.utils.segmentation_utils#

This module implements segmentation related utilities.

Functions

create_annotation_from_segmentation_map(...)

Creates polygons from the soft predictions.

create_hard_prediction_from_soft_prediction(...)

Creates a hard prediction containing the final label index per pixel.

get_subcontours(contour)

Splits a contour into subcontours that do not have self-intersections.

mask_from_annotation(annotations, labels, ...)

Generates a segmentation mask as a numpy array from a list of annotation shapes.

mask_from_dataset_item(dataset_item, labels)

Creates a mask from a dataset item.

mask_from_file(dataset_item)

Loads masks directly from the annotation image.

otx.api.utils.segmentation_utils.create_annotation_from_segmentation_map(hard_prediction: ndarray, soft_prediction: ndarray, label_map: dict) List[Annotation][source]#

Creates polygons from the soft predictions.

The background label will be ignored and not converted to polygons.

Parameters:
  • hard_prediction – hard prediction containing the final label index per pixel. See function create_hard_prediction_from_soft_prediction.

  • soft_prediction – soft prediction with shape H x W x N_labels, where soft_prediction[:, :, 0] is the soft prediction for background. If soft_prediction is of H x W shape, it is assumed that this soft prediction will be applied for all labels.

  • label_map – dictionary mapping labels to an index. It is assumed that the first item in the dictionary corresponds to the background label and will therefore be ignored.

Returns:

List of shapes
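A minimal sketch of how the inputs fit together, using hypothetical label names (the real `label_map` keys/values follow the library's own label types; this only illustrates the shapes and the background-skipping convention):

```python
import numpy as np

# Soft prediction for a 2x2 image with 3 classes (channel 0 = background).
soft_prediction = np.array([
    [[0.9, 0.05, 0.05], [0.1, 0.8, 0.1]],
    [[0.2, 0.1, 0.7],   [0.85, 0.1, 0.05]],
])

# Hard prediction: the winning class index per pixel.
hard_prediction = np.argmax(soft_prediction, axis=2)

# Hypothetical label map; the first entry (background) is skipped
# when converting to polygons.
label_map = {0: "background", 1: "car", 2: "person"}

# Only non-background classes present in the hard prediction would be
# turned into polygons.
foreground_classes = [
    idx for idx in label_map
    if idx != 0 and np.any(hard_prediction == idx)
]
```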

otx.api.utils.segmentation_utils.create_hard_prediction_from_soft_prediction(soft_prediction: ndarray, soft_threshold: float, blur_strength: int = 5) ndarray[source]#

Creates a hard prediction containing the final label index per pixel.

Parameters:
  • soft_prediction – Output from segmentation network. Assumes floating point values, between 0.0 and 1.0. Can be a 2d-array of shape (height, width) or per-class segmentation logits of shape (height, width, num_classes)

  • soft_threshold – minimum class confidence for each pixel. The higher the value, the more strict the segmentation is (usually set to 0.5)

  • blur_strength – The higher the value, the smoother the segmentation output will be, but less accurate

Returns:

Numpy array of the hard prediction
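The thresholding and argmax logic can be sketched in plain NumPy. This is a simplified illustration only: it omits the blurring step that the real function applies via `blur_strength`.

```python
import numpy as np

def hard_from_soft(soft_prediction: np.ndarray, soft_threshold: float) -> np.ndarray:
    """Simplified sketch: derive per-pixel label indices from soft scores."""
    if soft_prediction.ndim == 3:
        # Multi-class: zero out scores below the threshold, then take the
        # highest-scoring class per pixel; pixels with no class above the
        # threshold fall back to index 0 (background).
        soft = np.where(soft_prediction >= soft_threshold, soft_prediction, 0.0)
        return np.argmax(soft, axis=2)
    # Single-channel: pixels above the threshold become foreground (1).
    return (soft_prediction >= soft_threshold).astype(np.uint8)

# 1x2 image, 3 classes: the left pixel is confidently class 2,
# the right pixel is below the threshold everywhere.
soft = np.array([[[0.1, 0.2, 0.7], [0.4, 0.3, 0.3]]])
hard = hard_from_soft(soft, soft_threshold=0.5)  # [[2, 0]]
```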

otx.api.utils.segmentation_utils.get_subcontours(contour: List[Tuple[float, float]]) List[List[Tuple[float, float]]][source]#

Splits a contour into subcontours that do not have self-intersections.
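One way to split a self-intersecting contour is to cut it wherever a vertex is revisited. The sketch below illustrates that idea on a "figure eight"; it is an assumption about the approach, not the library's actual implementation.

```python
from typing import Dict, List, Tuple

Contour = List[Tuple[float, float]]

def split_at_repeated_points(contour: Contour) -> List[Contour]:
    """Sketch: split a contour into simple loops at revisited vertices."""
    subcontours: List[Contour] = []
    stack: Contour = []
    seen: Dict[Tuple[float, float], int] = {}
    for point in contour:
        if point in seen:
            # Close the loop formed since the first visit of this point.
            start = seen[point]
            loop = stack[start:]
            if len(loop) >= 3:  # keep only loops with area
                subcontours.append(loop)
            for p in loop:
                seen.pop(p, None)
            stack = stack[:start]
        seen[point] = len(stack)
        stack.append(point)
    if len(stack) >= 3:
        subcontours.append(stack)
    return subcontours

# A "figure eight": two triangles sharing the vertex (1.0, 1.0).
eight = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0), (0.0, 2.0), (2.0, 2.0), (1.0, 1.0)]
parts = split_at_repeated_points(eight)  # two triangular subcontours
```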

otx.api.utils.segmentation_utils.mask_from_annotation(annotations: List[Annotation], labels: List[LabelEntity], width: int, height: int) ndarray[source]#

Generates a segmentation mask as a numpy array from a list of annotation shapes.

The mask will be two-dimensional, and the value of each pixel matches the class index with offset 1. The background class index is zero. labels[0] matches pixel value 1, etc. The class index is determined based on the order of labels.

Parameters:
  • annotations – List of annotations to plot in mask

  • labels – List of labels. The index position of the label determines the class number in the segmentation mask.

  • width – Width of the mask

  • height – Height of the mask

Returns:

2d numpy array of mask
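The pixel-value convention (class index + 1, background = 0) can be illustrated with a toy rasterizer that fills axis-aligned boxes. This is a hypothetical helper for illustration only; the real function rasterizes the actual annotation shapes.

```python
import numpy as np

def mask_from_boxes(boxes, labels, width, height):
    """Toy sketch: `boxes` is a list of (label_name, x1, y1, x2, y2) in pixels.

    Pixel value = index of the label in `labels` + 1; background stays 0.
    """
    mask = np.zeros((height, width), dtype=np.uint8)
    for label_name, x1, y1, x2, y2 in boxes:
        class_value = labels.index(label_name) + 1
        mask[y1:y2, x1:x2] = class_value
    return mask

labels = ["car", "person"]
mask = mask_from_boxes([("person", 1, 1, 3, 3)], labels, width=4, height=4)
# "person" is labels[1], so its pixels get value 2; everything else stays 0.
```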

otx.api.utils.segmentation_utils.mask_from_dataset_item(dataset_item: DatasetItemEntity, labels: List[LabelEntity], use_otx_adapter: bool = True) ndarray[source]#

Creates a mask from a dataset item.

The mask will be two dimensional, and the value of each pixel matches the class index with offset 1. The background class index is zero. labels[0] matches pixel value 1, etc. The class index is determined based on the order of ‘labels’.

Parameters:
  • dataset_item – Item to make mask for

  • labels – The labels to use for creating the mask. The order of the labels determines the class index.

Returns:

Numpy array of mask

otx.api.utils.segmentation_utils.mask_from_file(dataset_item: DatasetItemEntity) ndarray[source]#

Loads masks directly from the annotation image.

Only the Common Semantic Segmentation format is supported.