benchmark.utils
- setup_args(parser: ArgumentParser) → Namespace
- setup_logger(logpath: Path | str = PosixPath('../log'), level_console: int = 10, level_file: int = 15, quiet: bool = True, fmt='{message}')
- setup_logpath(dir: Path | str = PosixPath('../log'), folder_args: tuple | None = None, quiet: bool = True)
Resolve log path for saving.
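Example. A minimal sketch combining setup_logpath and setup_logger; it assumes that setup_logpath joins the base directory with the folder_args tuple and that setup_logger returns the configured logging.Logger, neither of which is stated by the signatures above.
    from pathlib import Path
    from benchmark.utils import setup_logpath, setup_logger

    # Assumed behavior: resolve a subdirectory of '../log' from the folder_args tuple.
    logpath = setup_logpath(dir=Path('../log'), folder_args=('cora', 'gcn'), quiet=True)
    # Assumed behavior: attach a console handler (level 10 = DEBUG) and a file handler
    # (level 15) and return the configured logger.
    logger = setup_logger(logpath, level_console=10, level_file=15, quiet=True)
    logger.info('run started')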
- class ResLogger(logpath: Path | str = PosixPath('../log'), prefix: str = 'summary', suffix: str | None = None, quiet: bool = True)
Bases: object
Logger that wraps a pd.DataFrame table and formats results into strings (a usage sketch follows this class entry).
- Parameters:
  - logpath (Path | str, default: PosixPath('../log')) – Path to the log saving directory.
  - prefix (str, default: 'summary') – Prefix for the result file name.
  - suffix (str | None, default: None) – Optional suffix for the result file name.
  - quiet (bool, default: True) – Whether to run in quiet mode.
- static guess_fmt(key: str, val) → Callable
Guess the string format function for a value based on its key name.
- concat(vals: list[tuple[str, Any, Callable]] | dict, row: int = 0, suffix: str | None = None)
Concatenate data entries of a single row into the table.
- merge(logger: ResLogger, rows: list[int] | None = None, suffix: str | None = None)
Merge from another logger.
- _get(col: list | str | None = None, row: list | str | None = None) → DataFrame | Series | str
Retrieve a single entry or a slice of the data and apply string formatting.
- Parameters:
  - col (list | str | None, default: None) – Column label(s) to retrieve.
  - row (list | str | None, default: None) – Row label(s) to retrieve.
- Returns:
  val – Formatted data. The type follows the return type of DataFrame.loc[row, col]; each entry is a formatted string.
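Example. A minimal usage sketch for ResLogger; the exact formatting behavior and what the logger writes to disk are assumptions for illustration only.
    from benchmark.utils import ResLogger

    res = ResLogger(logpath='../log', prefix='summary', quiet=True)
    # (key, value, format_fn) tuples go into row 0 of the underlying DataFrame.
    res.concat([('acc', 0.8312, lambda v: f'{v:.4f}'),
                ('loss', 0.4175, lambda v: f'{v:.2f}')], row=0)
    # Dict entries rely on guess_fmt to pick a format function (assumed).
    res.concat({'epoch': 100}, row=0)
    # Retrieve a single formatted entry; the return type follows DataFrame.loc[row, col].
    print(res._get(col='acc', row=0))   # e.g. '0.8312'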
- class CkptLogger(logpath: Path | str, patience: int = -1, period: int = 0, prefix: str = 'model', storage: str = 'state_gpu', metric_cmp: Callable[[float, float], bool] | str = 'max')
Bases: object
Checkpoint logger for saving and loading models and for managing early stopping during training (a usage sketch follows this class entry).
- Parameters:
  - logpath (Path | str) – Path to the checkpoint saving directory.
  - patience (int, default: -1) – Patience for early stopping. Defaults to no early stopping.
  - period (int, default: 0) – Periodic saving interval. Defaults to no periodic saving.
  - prefix (str, default: 'model') – Prefix for the checkpoint file names.
  - storage (str, default: 'state_gpu') – Storage scheme for saving the checkpoints.
    * 'model' vs 'state': save the model object or its state_dict.
    * '_file', '_ram', '_gpu': save to file, to RAM, or to GPU memory.
  - metric_cmp (Callable[[float, float], bool] | str, default: 'max') – Comparison function for the metric. Can be 'max' or 'min'.
- save(*suffix, model: Module)
Save the model according to the storage scheme.
- Parameters:
  - suffix – Variable-length argument for the suffix in the model file name.
  - model (nn.Module) – The model to be saved.
- load(*suffix, model: Module, map_location='cpu') → Module
Load the model from storage.
- Parameters:
  - suffix – Variable-length argument for the suffix in the model file name.
  - model (Module) – The model structure to load into.
  - map_location (default: 'cpu') – map_location argument passed to torch.load.
- Returns:
  model (nn.Module) – The loaded model.
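Example. A minimal sketch of saving and restoring a checkpoint with CkptLogger; the file name derived from the suffix and any early-stopping bookkeeping are assumptions beyond the documented save/load interface.
    import torch.nn as nn
    from benchmark.utils import CkptLogger

    ckpt = CkptLogger(logpath='../log', patience=50, prefix='model',
                      storage='state_gpu', metric_cmp='max')

    model = nn.Linear(16, 2)
    # Variadic suffix arguments name the checkpoint, e.g. 'model_best' (assumed naming).
    ckpt.save('best', model=model)
    # Reload into a freshly constructed model of the same structure.
    model = ckpt.load('best', model=nn.Linear(16, 2), map_location='cpu')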
benchmark.utils.config