otx.core.utils.cache
Cache Class for Trainer kwargs.
Classes
TrainerArgumentsCache(**kwargs) – Cache arguments.
- class otx.core.utils.cache.TrainerArgumentsCache(**kwargs)
Bases:
object
Cache arguments.
Since the Engine class accepts PyTorch Lightning Trainer arguments, we store these arguments using this class before the trainer is instantiated.
- Parameters:
**kwargs – Trainer arguments that are cached.
Example
>>> conf = OmegaConf.load("config.yaml")
>>> cache = TrainerArgumentsCache(**conf)
>>> cache.args
{
    ...
    'max_epochs': 100,
    'val_check_interval': 0
}
>>> config = {"max_epochs": 1, "val_check_interval": 1.0}
>>> cache.update(config)
Overriding max_epochs from 100 with 1
Overriding val_check_interval from 0 with 1.0
>>> cache.args
{
    ...
    'max_epochs': 1,
    'val_check_interval': 1.0
}
- static get_trainer_constructor_args() → set[str]
Get the set of arguments accepted by the Trainer class constructor.
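Example (illustrative sketch)
The snippet below assumes max_epochs is among the Trainer constructor arguments and uses an ordinary dict comprehension to drop unsupported keys; neither the key nor the filtering idiom is part of this method's documented API.
>>> valid = TrainerArgumentsCache.get_trainer_constructor_args()
>>> "max_epochs" in valid
True
>>> kwargs = {"max_epochs": 10, "not_a_trainer_arg": 1}
>>> {k: v for k, v in kwargs.items() if k in valid}
{'max_epochs': 10}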
- requires_update(**kwargs) → bool
Checks if the cached arguments need to be updated based on the provided keyword arguments.
- Parameters:
**kwargs – The keyword arguments to compare with the cached arguments.
- Returns:
True if any of the cached arguments need to be updated, False otherwise.
- Return type:
bool
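Example (illustrative sketch)
The values below are assumptions for illustration; they presume requires_update compares the provided values against those already cached, consistent with the class-level example above.
>>> cache = TrainerArgumentsCache(max_epochs=100)
>>> cache.requires_update(max_epochs=100)
False
>>> cache.requires_update(max_epochs=1)
True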