otx.algo.samplers#

Custom samplers for OTX 2.0.

Classes

BalancedSampler(dataset[, efficient_mode, ...])

Balanced sampler for imbalanced data for class-incremental task.

class otx.algo.samplers.BalancedSampler(dataset: OTXDataset, efficient_mode: bool = False, num_replicas: int = 1, rank: int = 0, drop_last: bool = False, n_repeats: int = 1, generator: torch.Generator | None = None)[source]#

Bases: Sampler

Balanced sampler for imbalanced data for class-incremental task.

This sampler creates an effective batch for imbalanced data. In efficient (reduce) mode, it reduces the iteration size by estimating the number of trials needed so that all samples in the tail class are selected more than once with probability 0.999.
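That estimate can be read as a coupon-collector-style bound. A minimal, hypothetical sketch of the calculation (assuming it bounds the chance that a tail-class sample is never drawn; this is not necessarily the library's exact formula):

```python
import math


def estimate_num_trials(num_tail: int, miss_prob: float = 0.001) -> int:
    """Estimate how many uniform draws from a class of size ``num_tail`` are
    needed so that a given sample is missed with probability below ``miss_prob``.

    (1 - 1/num_tail) ** n < miss_prob  =>  n > log(miss_prob) / log(1 - 1/num_tail)
    """
    base = 1 - 1 / num_tail
    return int(math.log(miss_prob, base))


# Example: with 20 samples in the tail class, roughly 134 draws suffice.
print(estimate_num_trials(20))
```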

Parameters:
  • dataset (OTXDataset) – The built dataset to sample from

  • efficient_mode (bool) – Whether to use efficient (reduce) mode

  • num_replicas (int, optional) – Number of processes participating in distributed training. By default, world_size is retrieved from the current distributed group.

  • rank (int, optional) – Rank of the current process within num_replicas. By default, rank is retrieved from the current distributed group.

  • drop_last (bool, optional) – If True, the sampler drops the tail of the data to make it evenly divisible across the number of replicas. If False, the sampler adds extra indices to make the data evenly divisible across the replicas. Default: False.

  • n_repeats (int, optional) – Number of iterations when the epoch length is set manually. Default: 1.
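A hedged usage sketch: since BalancedSampler yields dataset indices like any torch Sampler, it can be passed to a standard torch.utils.data.DataLoader. The dataset variable, batch size, and loader configuration below are illustrative assumptions, not part of this class's documented API:

```python
from torch.utils.data import DataLoader

from otx.algo.samplers import BalancedSampler

# `dataset` is assumed to be an already-built OTXDataset instance.
sampler = BalancedSampler(
    dataset,
    efficient_mode=True,  # shrink the per-epoch iteration count
    num_replicas=1,       # single-process (non-distributed) training
    rank=0,
    drop_last=False,
)

# The sampler yields dataset indices, so it plugs into a regular DataLoader.
loader = DataLoader(dataset, batch_size=16, sampler=sampler)

for batch in loader:
    ...  # training step
```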