Loggers¶
Load PyTorch Lightning Loggers.
- class anomalib.utils.loggers.AnomalibCometLogger(api_key: Optional[str] = None, save_dir: Optional[str] = None, project_name: Optional[str] = None, rest_api_key: Optional[str] = None, experiment_name: Optional[str] = None, experiment_key: Optional[str] = None, offline: bool = False, prefix: str = '', **kwargs)[source]¶
Bases: ImageLoggerBase, CometLogger
Logger for Comet.
Adds an add_image interface to the logger rather than calling the experiment object.
Note
Same as the CometLogger provided by PyTorch Lightning; the docstring is reproduced below.
Track your parameters, metrics, source code and more using Comet.
Install it with pip:
pip install comet-ml
Comet requires either an API Key (online mode) or a local directory path (offline mode).
- Parameters:
api_key – Required in online mode. API key, found on Comet.ml. If not given, this will be loaded from the environment variable COMET_API_KEY or ~/.comet.config if either exists.
save_dir – Required in offline mode. The path for the directory to save local comet logs. If given, this also sets the directory for saving checkpoints.
project_name – Optional. Send your experiment to a specific project. Otherwise will be sent to Uncategorized Experiments. If the project name does not already exist, Comet.ml will create a new project.
rest_api_key – Optional. Rest API key found in Comet.ml settings. This is used to determine the version number.
experiment_name – Optional. String representing the name for this particular experiment on Comet.ml.
experiment_key – Optional. If set, restores from existing experiment.
offline – If api_key and save_dir are both given, this determines whether the experiment will be in online or offline mode. This is useful if you use save_dir to control the checkpoints directory and have a ~/.comet.config file but still want to run offline experiments.
prefix – A string to put at the beginning of metric keys.
kwargs – Additional arguments like workspace, log_code, etc. used by CometExperiment can be passed as keyword arguments in this logger.
- Raises:
ModuleNotFoundError – If the required Comet package is not installed on the device.
MisconfigurationException – If neither api_key nor save_dir is passed as an argument.
Example
>>> from anomalib.utils.loggers import AnomalibCometLogger
>>> from pytorch_lightning import Trainer
>>> comet_logger = AnomalibCometLogger()
>>> trainer = Trainer(logger=comet_logger)
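For offline use, a local directory is enough; a minimal sketch, assuming a writable ./comet_logs path (the path is illustrative):
>>> comet_logger = AnomalibCometLogger(save_dir="./comet_logs", offline=True)
>>> trainer = Trainer(logger=comet_logger)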
- add_image(image: numpy.ndarray | matplotlib.figure.Figure, name: str | None = None, **kwargs: Any) → None [source]¶
Interface to add image to comet logger.
- Parameters:
image (np.ndarray | Figure) – Image to log
name (str | None) – The tag of the image
kwargs – Accepts only global_step (int). The step at which to log the image.
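A minimal sketch of the interface, assuming the comet_logger from the example above and a placeholder NumPy array:
>>> import numpy as np
>>> anomaly_map = np.random.rand(256, 256, 3)  # placeholder image array
>>> comet_logger.add_image(image=anomaly_map, name="anomaly_map", global_step=0)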
- class anomalib.utils.loggers.AnomalibTensorBoardLogger(save_dir: str, name: str | None = 'default', version: Optional[Union[int, str]] = None, log_graph: bool = False, default_hp_metric: bool = True, prefix: str = '', **kwargs)[source]¶
Bases: ImageLoggerBase, TensorBoardLogger
Logger for TensorBoard.
Adds an add_image interface to the logger rather than calling the experiment object.
Note
Same as the TensorBoardLogger provided by PyTorch Lightning; the docstring is reproduced below.
Logs are saved to os.path.join(save_dir, name, version). This is the default logger in Lightning; it comes preinstalled.
Example
>>> from pytorch_lightning import Trainer
>>> from anomalib.utils.loggers import AnomalibTensorBoardLogger
>>> logger = AnomalibTensorBoardLogger("tb_logs", name="my_model")
>>> trainer = Trainer(logger=logger)
- Parameters:
save_dir (str) – Save directory
name (Optional, str) – Experiment name. Defaults to 'default'. If it is the empty string then no per-experiment subdirectory is used.
version (Optional, int, str) – Experiment version. If version is not specified the logger inspects the save directory for existing versions, then automatically assigns the next available version. If it is a string then it is used as the run-specific subdirectory name, otherwise 'version_${version}' is used.
log_graph (bool) – Adds the computational graph to tensorboard. This requires that the user has defined the self.example_input_array attribute in their model.
default_hp_metric (bool) – Enables a placeholder metric with key hp_metric when log_hyperparams is called without a metric (otherwise calls to log_hyperparams without a metric are ignored).
prefix (str) – A string to put at the beginning of metric keys.
**kwargs – Additional arguments like comment, filename_suffix, etc. used by SummaryWriter can be passed as keyword arguments in this logger.
- add_image(image: numpy.ndarray | matplotlib.figure.Figure, name: str | None = None, **kwargs: Any)[source]¶
Interface to add image to tensorboard logger.
- Parameters:
image (np.ndarray | Figure) – Image to log
name (str | None) – The tag of the image
kwargs – Accepts only global_step (int). The step at which to log the image.
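A minimal sketch of logging a matplotlib figure, assuming the logger from the example above (the figure contents and tag are illustrative):
>>> import matplotlib.pyplot as plt
>>> fig, ax = plt.subplots()
>>> _ = ax.imshow([[0.1, 0.9], [0.5, 0.3]])  # placeholder heatmap
>>> logger.add_image(image=fig, name="prediction_heatmap", global_step=10)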
- class anomalib.utils.loggers.AnomalibWandbLogger(name: Optional[str] = None, save_dir: Optional[str] = None, offline: bool | None = False, id: Optional[str] = None, anonymous: Optional[bool] = None, version: Optional[str] = None, project: Optional[str] = None, log_model: str | bool = False, experiment=None, prefix: str | None = '', **kwargs)[source]¶
Bases: ImageLoggerBase, WandbLogger
Logger for wandb.
Adds an add_image interface to the logger rather than calling the experiment object.
Note
Same as the WandbLogger provided by PyTorch Lightning; the docstring is reproduced below.
Log using Weights and Biases.
Install it with pip:
$ pip install wandb
- Parameters:
name – Display name for the run.
save_dir – Path where data is saved (wandb dir by default).
offline – Run offline (data can be streamed later to wandb servers).
id – Sets the version, mainly used to resume a previous run.
version – Same as id.
anonymous – Enables or explicitly disables anonymous logging.
project – The name of the project to which this run will belong.
log_model – Save checkpoints in wandb dir to upload on W&B servers.
prefix – A string to put at the beginning of metric keys.
experiment – WandB experiment object. Automatically set when creating a run.
**kwargs – Arguments passed to wandb.init() like entity, group, tags, etc.
- Raises:
ImportError – If the required WandB package is not installed on the device.
MisconfigurationException – If both log_model and offline are set to True.
Example
>>> from anomalib.utils.loggers import AnomalibWandbLogger
>>> from pytorch_lightning import Trainer
>>> wandb_logger = AnomalibWandbLogger()
>>> trainer = Trainer(logger=wandb_logger)
Note: When logging manually through wandb.log or trainer.logger.experiment.log, make sure to use commit=False so the logging step does not increase.
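For example, a manual call might look like this (the metric name is illustrative):
>>> trainer.logger.experiment.log({"custom/metric": 0.95}, commit=False)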
See also
Tutorial on how to use W&B with PyTorch Lightning
- anomalib.utils.loggers.configure_logger(level: int | str = 20) → None [source]¶
Configure the console logger.
- Parameters:
level (int | str, optional) – Logging level. Defaults to logging.INFO.
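A minimal usage sketch:
>>> import logging
>>> from anomalib.utils.loggers import configure_logger
>>> configure_logger(level=logging.DEBUG)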
- anomalib.utils.loggers.get_experiment_logger(config: omegaconf.dictconfig.DictConfig | omegaconf.listconfig.ListConfig) → Union[Logger, Iterable[Logger], bool] [source]¶
Return a logger based on the choice of logger in the config file.
- Parameters:
config (DictConfig) – config.yaml file for the corresponding anomalib model.
- Raises:
ValueError – For any logger type apart from false and tensorboard.
- Returns:
Logger
- Return type:
Logger | Iterable[Logger] | bool
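A rough usage sketch, assuming a config file whose logger entry selects the experiment logger (the file name and exact schema are illustrative):
>>> from omegaconf import OmegaConf
>>> from pytorch_lightning import Trainer
>>> from anomalib.utils.loggers import get_experiment_logger
>>> config = OmegaConf.load("config.yaml")  # hypothetical model config
>>> experiment_logger = get_experiment_logger(config)
>>> trainer = Trainer(logger=experiment_logger)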