Sam Models#

class model_api.models.sam_models.SAMDecoder(model_adapter, configuration={}, preload=False)#

Bases: SegmentationModel

Image Decoder for SAM: https://arxiv.org/abs/2304.02643

Image model constructor

It extends the Model constructor.

Parameters:
  • inference_adapter (InferenceAdapter) – allows working with the specified executor

  • configuration (dict, optional) – values for parameters accepted by the specific wrapper (confidence_threshold, labels, etc.), which are set as data attributes

  • preload (bool, optional) – a flag that defines whether the model is loaded onto the device during initialization. If preload=False, the model must be loaded via the load method before inference

Raises:

WrapperError – if the wrapper fails to define appropriate inputs for images
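
For illustration, a decoder can be built on top of an inference adapter and an OpenVINO IR file. This is a minimal sketch rather than an excerpt from the library's documentation: the model path and device are placeholders, and the OpenvinoAdapter / create_core helpers are assumed to be importable from model_api.adapters following the generic model_api usage pattern.

    # Illustrative construction sketch; path, device, and adapter helpers are assumptions.
    from model_api.adapters import OpenvinoAdapter, create_core
    from model_api.models.sam_models import SAMDecoder

    adapter = OpenvinoAdapter(create_core(), "sam_mask_decoder.xml", device="CPU")
    decoder = SAMDecoder(adapter, configuration={}, preload=True)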

apply_coords(coords, orig_size)#

Rescales prompt coordinates to the preprocessed image size using the image metadata.

Return type:

ndarray
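
Conceptually, the prompt coordinates are rescaled from the original image frame to the resized frame produced during preprocessing. The sketch below illustrates this, assuming the standard SAM longest-side resize; the 1024 target length and the helper name are illustrative assumptions, not attributes of the wrapper.

    # Illustrative coordinate rescaling under an assumed longest-side resize to 1024.
    import numpy as np

    def rescale_coords(coords: np.ndarray, orig_size: tuple[int, int], target_length: int = 1024) -> np.ndarray:
        old_h, old_w = orig_size
        scale = target_length / max(old_h, old_w)
        new_h, new_w = int(old_h * scale + 0.5), int(old_w * scale + 0.5)
        out = coords.astype(np.float64).copy()
        out[..., 0] *= new_w / old_w  # x coordinates
        out[..., 1] *= new_h / old_h  # y coordinates
        return out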

classmethod parameters()#

Defines the description and type of configurable data parameters for the wrapper.

See types.py for the available types of data parameters. For each parameter, the type, default value, and description must be provided.

An example of a possible data parameter:

    'confidence_threshold': NumericalValue(
        default_value=0.5, description="Threshold value for detection box confidence"
    )

The method must be implemented in each specific inherited wrapper.

Return type:

dict[str, Any]

Returns:

  • the dictionary with defined wrapper data parameters
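
A typical override in an inherited wrapper extends the parent schema and returns the merged dictionary, as sketched below. The subclass is hypothetical, and the NumericalValue import path is assumed from the types.py reference above.

    # Illustrative parameters() override in a hypothetical wrapper subclass.
    from model_api.models.sam_models import SAMDecoder
    from model_api.models.types import NumericalValue

    class MyWrapper(SAMDecoder):
        @classmethod
        def parameters(cls):
            parameters = super().parameters()
            parameters.update(
                {
                    "confidence_threshold": NumericalValue(
                        default_value=0.5,
                        description="Threshold value for detection box confidence",
                    ),
                }
            )
            return parameters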

postprocess(outputs, meta)#

Postprocesses the output to convert the soft prediction into a hard prediction.

Parameters:
  • outputs (dict[str, np.ndarray]) – The output of the model.

  • meta (dict[str, Any]) – Contains the label and original size.

Returns:

The postprocessed output of the model.

Return type:

dict[str, np.ndarray]
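
In practice this means binarizing the soft mask scores. A minimal sketch of such a conversion, assuming the common SAM convention of thresholding mask logits at 0.0 (the threshold value and function name are illustrative, not the wrapper's documented behavior):

    # Illustrative soft-to-hard mask conversion; the 0.0 threshold is an assumption.
    import numpy as np

    def to_hard_prediction(mask_logits: np.ndarray, mask_threshold: float = 0.0) -> np.ndarray:
        return (mask_logits > mask_threshold).astype(np.uint8)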

preprocess(inputs)#

Preprocess prompts.

Return type:

list[dict]

class model_api.models.sam_models.SAMImageEncoder(inference_adapter, configuration={}, preload=False)#

Bases: ImageModel

Image Encoder for SAM: https://arxiv.org/abs/2304.02643

Image model constructor

It extends the Model constructor.

Parameters:
  • inference_adapter (InferenceAdapter) – allows working with the specified executor

  • configuration (dict, optional) – values for parameters accepted by the specific wrapper (confidence_threshold, labels, etc.), which are set as data attributes

  • preload (bool, optional) – a flag that defines whether the model is loaded onto the device during initialization. If preload=False, the model must be loaded via the load method before inference

Raises:

WrapperError – if the wrapper fails to define appropriate inputs for images
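
As with the decoder, the encoder is built on top of an inference adapter and, once loaded, can be invoked on an image to produce embeddings. The sketch below follows the generic model_api usage pattern; the model path, device, and the callable-wrapper invocation are assumptions rather than SAM-specific documentation.

    # Illustrative encoder usage; path, device, and the call pattern are assumptions.
    import cv2
    from model_api.adapters import OpenvinoAdapter, create_core
    from model_api.models.sam_models import SAMImageEncoder

    adapter = OpenvinoAdapter(create_core(), "sam_image_encoder.xml", device="CPU")
    encoder = SAMImageEncoder(adapter, configuration={}, preload=True)

    image = cv2.imread("example.jpg")
    embeddings = encoder(image)  # ndarray returned by postprocess()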

classmethod parameters()#

Defines the description and type of configurable data parameters for the wrapper.

See types.py for the available types of data parameters. For each parameter, the type, default value, and description must be provided.

An example of a possible data parameter:

    'confidence_threshold': NumericalValue(
        default_value=0.5, description="Threshold value for detection box confidence"
    )

The method must be implemented in each specific inherited wrapper.

Return type:

dict[str, Any]

Returns:

  • the dictionary with defined wrapper data parameters

postprocess(outputs, meta)#

Interface for postprocess method.

Parameters:
  • outputs (dict) – model raw output in the following format:

    {
        'output_layer_name_1': raw_result_1,
        'output_layer_name_2': raw_result_2,
        ...
    }

  • meta (dict) – the input metadata obtained from the preprocess method

Return type:

ndarray

Returns:

  • postprocessed data in the format defined by the wrapper

preprocess(inputs)#

Update meta for image encoder.

Return type:

list[dict]