espnet.nets.pytorch_backend.transducer.rnn_encoder.RNNP
class espnet.nets.pytorch_backend.transducer.rnn_encoder.RNNP(idim: int, rnn_type: str, elayers: int, eunits: int, eprojs: int, subsample: ndarray, dropout_rate: float, aux_output_layers: List = [])
Bases: Module
RNN with projection layer module.
- Parameters:
- idim – Input dimension.
- rnn_type – RNN units type (e.g. "lstm" or "gru"; prefix "b" for bidirectional).
- elayers – Number of RNNP layers.
- eunits – Number of units ((2 * eunits) if bidirectional).
- eprojs – Number of projection units.
- subsample – Subsampling rate per layer.
- dropout_rate – Dropout rate for RNNP layers.
- aux_output_layers – Layer IDs for auxiliary RNNP output sequences.
Initialize RNNP module.
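The `subsample` argument gives a per-layer frame-decimation factor. As a hedged sketch (not part of ESPnet), assuming each layer with factor `sub` keeps every `sub`-th frame (i.e. `x[:, ::sub]`), the output sequence length can be computed as:

```python
def subsampled_length(t: int, subsample) -> int:
    """Hypothetical helper: output length after per-layer subsampling,
    assuming a layer with factor `sub` keeps every sub-th frame."""
    for sub in subsample:
        if sub > 1:
            t = (t + sub - 1) // sub  # ceil division, matches x[:, ::sub]
    return t

# 100 input frames through factors [1, 2, 2, 1]: 100 -> 50 -> 25
print(subsampled_length(100, [1, 2, 2, 1]))  # 25
```

A total subsampling factor of 4, as above, is a common choice for reducing encoder frame rate in transducer models.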
forward(rnn_input: Tensor, rnn_len: Tensor, prev_states: List[Tensor] | None = None) → Tuple[Tensor, List[Tensor], Tensor]
RNNP forward.
Parameters:
- rnn_input – RNN input sequences. (B, T, D_in)
- rnn_len – RNN input sequences lengths. (B,)
- prev_states – RNN hidden states. [N x (B, T, D_proj)]
Returns:
- rnn_output – RNN output sequences. (B, T, D_proj) With auxiliary output layers, ((B, T, D_proj), [N x (B, T, D_proj)]).
- rnn_len – RNN output sequences lengths. (B,)
- current_states – RNN hidden states. [N x (B, T, D_proj)]