espnet2.schedulers.noam_lr.NoamLR
class espnet2.schedulers.noam_lr.NoamLR(optimizer: Optimizer, model_size: int | float = 320, warmup_steps: int | float = 25000, last_epoch: int = -1)
Bases: _LRScheduler, AbsBatchStepScheduler
The LR scheduler proposed by Noam
Ref: “Attention Is All You Need”, https://arxiv.org/pdf/1706.03762.pdf
FIXME(kamo): PyTorch doesn’t provide _LRScheduler as a public class, so the behaviour isn’t guaranteed in future PyTorch versions.
NOTE(kamo): The “model_size” in the original implementation is derived from the model, but in this implementation it is a constant value. You need to change it if the model is changed.
get_lr()
lr_for_WarmupLR(lr: float) → float
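For intuition, the Noam schedule scales the base learning rate by the inverse square root of the model size, increases the rate linearly during warmup, and decays it proportionally to the inverse square root of the step number thereafter. The sketch below is a plain-Python rendering of that formula using this page’s parameter names and defaults (`model_size=320`, `warmup_steps=25000`); the exact ESPnet implementation may differ in details such as step indexing.

```python
def noam_lr(
    base_lr: float,
    step: int,
    model_size: float = 320,
    warmup_steps: float = 25000,
) -> float:
    """Sketch of the Noam learning-rate formula from "Attention Is All
    You Need":

        lr = base_lr * model_size^-0.5
                     * min(step^-0.5, step * warmup_steps^-1.5)

    The two terms inside min() cross at step == warmup_steps, so the
    rate ramps up linearly, peaks at the end of warmup, then decays
    as 1/sqrt(step).  (step is assumed to start at 1.)
    """
    return (
        base_lr
        * model_size ** -0.5
        * min(step ** -0.5, step * warmup_steps ** -1.5)
    )
```

Example behaviour: `noam_lr(1.0, 12500)` (mid-warmup) is about half of the peak value `noam_lr(1.0, 25000)`, and the rate then shrinks monotonically for larger steps.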