espnet2.enh.layers.dptnet.DPTNet
class espnet2.enh.layers.dptnet.DPTNet(rnn_type, input_size, hidden_size, output_size, att_heads=4, dropout=0, activation='relu', num_layers=1, bidirectional=True, norm_type='gLN')
Bases: Module
Dual-path transformer network.
- Parameters:
- rnn_type (str) – type of the recurrent layer used inside each transformer block; select from ‘RNN’, ‘LSTM’ and ‘GRU’.
- input_size (int) – dimension of the input feature. Input size must be a multiple of att_heads.
- hidden_size (int) – dimension of the hidden state.
- output_size (int) – dimension of the output.
- att_heads (int) – number of attention heads.
- dropout (float) – dropout ratio. Default is 0.
- activation (str) – activation function applied at the output of the RNN.
- num_layers (int) – number of stacked RNN layers. Default is 1.
- bidirectional (bool) – whether the RNN layers are bidirectional. Default is True.
- norm_type (str) – type of normalization to use after each inter- or intra-chunk Transformer block.
Initializes internal Module state, shared by both nn.Module and ScriptModule.
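A minimal construction sketch using the signature above; the hyperparameter values are illustrative choices, not library defaults:

```python
import torch

from espnet2.enh.layers.dptnet import DPTNet

# Illustrative hyperparameters; input_size (64) must be a multiple
# of att_heads (4), as required by the parameter list above.
model = DPTNet(
    rnn_type="LSTM",
    input_size=64,
    hidden_size=128,
    output_size=64,
    att_heads=4,
    dropout=0.1,
    activation="relu",
    num_layers=2,
    bidirectional=True,
    norm_type="gLN",
)
```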
forward(input)
Defines the computation performed at every call.
Should be overridden by all subclasses.
NOTE
Although the forward pass must be defined within this function, call the Module instance itself rather than forward() directly: the former runs any registered hooks, while the latter silently ignores them.
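A rough usage sketch, reusing the model constructed above. Dual-path models typically consume a 4-D tensor of pre-segmented chunks shaped (batch, input_size, chunk_size, n_chunks); this layout is an assumption based on the dual-path design, not stated in the signature:

```python
import torch

# Assumed layout: (batch, input_size, chunk_size, n_chunks); the exact
# shape convention is an assumption, not documented above.
x = torch.randn(2, 64, 100, 30)

# Call the module itself instead of model.forward(x), so that
# registered hooks are run (see the note above).
y = model(x)
print(y.shape)  # the feature dimension becomes output_size
```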
inter_chunk_process(x, layer_index)
Applies the layer_index-th inter-chunk transformer block, modeling dependencies across chunks (global modeling).
intra_chunk_process(x, layer_index)
Applies the layer_index-th intra-chunk transformer block, modeling dependencies within each chunk (local modeling).
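These two methods implement the two halves of the dual-path pattern. A minimal standalone sketch of the underlying reshaping idea, assuming the 4-D layout above; the helper names and exact permutations are illustrative, not the module's actual code:

```python
import torch

def intra_chunk_view(x: torch.Tensor) -> torch.Tensor:
    # Fold the chunk index into the batch dimension so a sequence
    # layer processes each chunk independently (local modeling).
    batch, n_feat, chunk_size, n_chunks = x.shape
    return x.permute(0, 3, 2, 1).reshape(batch * n_chunks, chunk_size, n_feat)

def inter_chunk_view(x: torch.Tensor) -> torch.Tensor:
    # Fold the intra-chunk position into the batch dimension so a
    # sequence layer sees the same position across all chunks
    # (global modeling).
    batch, n_feat, chunk_size, n_chunks = x.shape
    return x.permute(0, 2, 3, 1).reshape(batch * chunk_size, n_chunks, n_feat)
```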