espnet2.schedulers.exponential_decay_warmup.ExponentialDecayWarmup
class espnet2.schedulers.exponential_decay_warmup.ExponentialDecayWarmup(optimizer: Optimizer, max_lr: float, min_lr: float, total_steps: int, warmup_steps: int = 0, warm_from_zero: bool = False, last_epoch: int = -1)
Bases: _LRScheduler, AbsBatchStepScheduler
Exponential Decay with Warmup.
```python
if step < warmup_steps:
    if warm_from_zero:
        lr = initial_lr * (step / warmup_steps)
    else:
        lr = initial_lr
else:
    decay_factor = (step - warmup_steps) / (total_steps - warmup_steps)
    lr = initial_lr * exp(decay_factor * log(final_lr / initial_lr))
```
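The schedule above can be sketched as a standalone function (a hypothetical helper, not part of ESPnet; `max_lr` plays the role of `initial_lr` and `min_lr` the role of `final_lr`):

```python
import math


def lr_at(step, max_lr=1e-3, min_lr=1e-5, total_steps=10000,
          warmup_steps=1000, warm_from_zero=True):
    """Learning rate at a given step under the warmup + exponential decay rule."""
    if step < warmup_steps:
        if warm_from_zero:
            # linear ramp from 0 up to max_lr over the warmup phase
            return max_lr * (step / warmup_steps)
        # constant warmup at max_lr
        return max_lr
    decay_factor = (step - warmup_steps) / (total_steps - warmup_steps)
    # exponential interpolation: max_lr at decay_factor=0, min_lr at decay_factor=1
    return max_lr * math.exp(decay_factor * math.log(min_lr / max_lr))


print(lr_at(0))        # 0.0 with warm_from_zero=True
print(lr_at(1000))     # 0.001, i.e. max_lr at the end of warmup
print(lr_at(10000))    # ≈ 1e-05, i.e. min_lr at the end of training
```

Note that the decay is log-linear: the learning rate is a geometric interpolation between `max_lr` and `min_lr`, so it falls by a constant factor per step.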
- Parameters:
- optimizer (Optimizer) – Wrapped optimizer.
- max_lr (float) – Initial learning rate before decay (initial_lr in the pseudocode above).
- min_lr (float) – Final learning rate after decay (final_lr in the pseudocode above).
- total_steps (int) – Total number of steps (epochs * iters per epoch).
- warmup_steps (int) – Number of warmup steps. Default: 0.
- warm_from_zero (bool) – If True, warmup starts from 0 to initial_lr.
- last_epoch (int) – The index of last step. Default: -1.
get_lr()
init_lr()
step(epoch: int | None = None)
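As an AbsBatchStepScheduler, `step()` is intended to be called once per batch rather than once per epoch. A minimal pure-Python trace of that usage pattern (a sketch only, assuming `warm_from_zero=True`; the real class updates `optimizer.param_groups` inside `step()`):

```python
import math


def trace_schedule(total_steps=100, warmup_steps=20, max_lr=1e-3, min_lr=1e-5):
    """Record the lr produced at each batch step over a whole run."""
    lrs = []
    for step in range(1, total_steps + 1):  # one step() call per batch
        if step < warmup_steps:
            lr = max_lr * (step / warmup_steps)
        else:
            decay_factor = (step - warmup_steps) / (total_steps - warmup_steps)
            lr = max_lr * math.exp(decay_factor * math.log(min_lr / max_lr))
        lrs.append(lr)
    return lrs


lrs = trace_schedule()
# The peak lr is reached exactly when warmup ends, then decays geometrically:
# the ratio lrs[i + 1] / lrs[i] is constant throughout the decay phase.
```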