espnet.nets.pytorch_backend.transformer.label_smoothing_loss.LabelSmoothingLoss
class espnet.nets.pytorch_backend.transformer.label_smoothing_loss.LabelSmoothingLoss(size, padding_idx, smoothing, normalize_length=False, criterion=KLDivLoss())
Bases: Module
Label-smoothing loss.
- Parameters:
- size (int) – number of classes
- padding_idx (int) – class id to be ignored (padding)
- smoothing (float) – smoothing rate (0.0 means conventional cross-entropy)
- normalize_length (bool) – normalize loss by sequence length if True
- criterion (torch.nn.Module) – loss function to be smoothed
Construct a LabelSmoothingLoss object.
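To make the smoothing rate concrete, here is a self-contained sketch (not ESPnet source) of the smoothed target distribution for a single token: the true class keeps `1 - smoothing` of the probability mass and the remainder is spread uniformly over the other classes.

```python
import torch

size, smoothing, target_id = 5, 0.1, 2  # illustrative values, not defaults
confidence = 1.0 - smoothing

# Off-target classes share the smoothing mass uniformly.
dist = torch.full((size,), smoothing / (size - 1))
dist[target_id] = confidence
# dist is a valid probability distribution (sums to 1)
```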
forward(x, target)
Compute loss between x and target.
- Parameters:
- x (torch.Tensor) – prediction (batch, seqlen, class)
- target (torch.Tensor) – target signal masked with self.padding_idx (batch, seqlen)
- Returns: scalar loss value
- Return type: torch.Tensor
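The signatures above can be exercised with a minimal re-implementation sketch. This is hedged code consistent with the documented interface, not ESPnet's exact source; the `padding_idx=-1` and `size=10` values are illustrative assumptions.

```python
# Hedged sketch consistent with the documented signature; not ESPnet's source.
import torch
from torch import nn


class LabelSmoothingLoss(nn.Module):
    def __init__(self, size, padding_idx, smoothing, normalize_length=False,
                 criterion=None):
        super().__init__()
        self.criterion = criterion or nn.KLDivLoss(reduction="none")
        self.padding_idx = padding_idx
        self.confidence = 1.0 - smoothing
        self.smoothing = smoothing
        self.size = size
        self.normalize_length = normalize_length

    def forward(self, x, target):
        batch = x.size(0)
        x = x.view(-1, self.size)     # (batch*seqlen, class)
        target = target.view(-1)      # (batch*seqlen,)
        # Smoothed distribution: confidence on the true class, rest uniform.
        true_dist = x.new_full(x.size(), self.smoothing / (self.size - 1))
        ignore = target == self.padding_idx
        total = len(target) - ignore.sum().item()
        target = target.masked_fill(ignore, 0)  # placeholder index for pads
        true_dist.scatter_(1, target.unsqueeze(1), self.confidence)
        kl = self.criterion(torch.log_softmax(x, dim=1), true_dist)
        # Normalize by token count or by batch size, per normalize_length.
        denom = total if self.normalize_length else batch
        return kl.masked_fill(ignore.unsqueeze(1), 0).sum() / denom


# Usage: logits shaped (batch, seqlen, class); pads marked with padding_idx.
loss_fn = LabelSmoothingLoss(size=10, padding_idx=-1, smoothing=0.1)
x = torch.randn(2, 5, 10)
target = torch.randint(0, 10, (2, 5))
target[0, -1] = -1                    # one masked position
loss = loss_fn(x, target)             # scalar torch.Tensor
```

With `smoothing=0.0` this reduces to conventional cross-entropy summed over non-padded tokens and divided by the batch size (or by the token count when `normalize_length=True`).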