espnet.nets.pytorch_backend.transducer.transformer_decoder_layer.TransformerDecoderLayer
class espnet.nets.pytorch_backend.transducer.transformer_decoder_layer.TransformerDecoderLayer(hdim: int, self_attention: MultiHeadedAttention, feed_forward: PositionwiseFeedForward, dropout_rate: float)
Bases: Module
Transformer decoder layer module for custom Transducer model.
- Parameters:
- hdim – Hidden dimension.
- self_attention – Self-attention module.
- feed_forward – Feed forward module.
- dropout_rate – Dropout rate.
Construct a TransformerDecoderLayer object.
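For orientation, a minimal construction sketch follows. The import paths and the MultiHeadedAttention / PositionwiseFeedForward constructor arguments follow the ESPnet source tree but may differ between ESPnet versions; the dimensions and dropout rate are illustrative, not prescribed values.

```python
from espnet.nets.pytorch_backend.transducer.transformer_decoder_layer import (
    TransformerDecoderLayer,
)
from espnet.nets.pytorch_backend.transformer.attention import MultiHeadedAttention
from espnet.nets.pytorch_backend.transformer.positionwise_feed_forward import (
    PositionwiseFeedForward,
)

hdim = 256  # decoder hidden dimension (D_dec), illustrative

layer = TransformerDecoderLayer(
    hdim=hdim,
    self_attention=MultiHeadedAttention(n_head=4, n_feat=hdim, dropout_rate=0.1),
    feed_forward=PositionwiseFeedForward(hdim, 1024, 0.1),
    dropout_rate=0.1,
)
```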
forward(sequence: Tensor, mask: Tensor, cache: Tensor | None = None)
Compute decoder output sequences, optionally reusing cached outputs from previous decoding steps.
- Parameters:
- sequence – Transformer input sequences. (B, U, D_dec)
- mask – Transformer input mask sequences. (B, U)
- cache – Cached decoder output sequences. (B, (U - 1), D_dec)
- Returns:
- sequence – Transformer output sequences. (B, U, D_dec)
- mask – Transformer output mask sequences. (B, U)
- Return type: Tuple[Tensor, Tensor]
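A usage sketch for forward(), continuing from the construction example above. The tensor shapes follow this docstring; the causal (lower-triangular) mask layout and the way the cache is sliced from a previous output are assumptions about how the surrounding custom Transducer decoder drives the layer, not behavior stated on this page.

```python
import torch

batch, u = 8, 12  # B label sequences of U positions (illustrative)

sequence = torch.randn(batch, u, hdim)  # (B, U, D_dec)

# Causal mask so position i only attends to positions <= i; built here as
# (B, U, U) and broadcast inside the attention module (assumed layout).
mask = torch.tril(torch.ones(u, u)).bool().unsqueeze(0).expand(batch, -1, -1)

out, out_mask = layer(sequence, mask)  # out: (B, U, D_dec)

# Incremental decoding sketch: pass previously computed outputs as `cache`
# so only the newest label position is recomputed (assumed usage pattern).
cache = out[:, :-1, :]                 # (B, U - 1, D_dec)
out, out_mask = layer(sequence, mask, cache=cache)
```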