espnet2.enh.layers.dprnn.DPRNN_TAC
class espnet2.enh.layers.dprnn.DPRNN_TAC(rnn_type, input_size, hidden_size, output_size, dropout=0, num_layers=1, bidirectional=True)
Bases: Module
Deep dual-path RNN with transform-average-concatenate (TAC) applied to each layer/block.
- Parameters:
- rnn_type – string, select from ‘RNN’, ‘LSTM’ and ‘GRU’.
- input_size – int, dimension of the input feature. The input should have shape (batch, seq_len, input_size).
- hidden_size – int, dimension of the hidden state.
- output_size – int, dimension of the output size.
- dropout – float, dropout ratio. Default is 0.
- num_layers – int, number of stacked RNN layers. Default is 1.
- bidirectional – bool, whether the RNN layers are bidirectional. Default is True.
Initializes internal Module state, shared by both nn.Module and ScriptModule.
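A minimal construction sketch; the layer sizes below are illustrative and not taken from any particular ESPnet recipe:

```python
import torch

from espnet2.enh.layers.dprnn import DPRNN_TAC

# Illustrative hyperparameters, chosen only to make the example concrete.
model = DPRNN_TAC(
    rnn_type="LSTM",   # one of 'RNN', 'LSTM', 'GRU'
    input_size=64,     # feature dimension of the input
    hidden_size=128,   # hidden state size of each RNN
    output_size=64,    # feature dimension of the output
    dropout=0.1,
    num_layers=2,
    bidirectional=True,
)
```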
forward(input, num_mic)
Defines the computation performed at every call.
Should be overridden by all subclasses.
NOTE
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself afterwards instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
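A sketch of calling the layer, assuming the 5-D chunked layout (batch, num_channels, feature, chunk_size, num_chunks) used by FaSNet-style multi-channel front ends, with num_mic a (batch,)-shaped tensor giving the number of valid microphones per utterance; these shape conventions are assumptions for illustration, not guarantees of this docstring:

```python
# Hypothetical shapes, assuming a (batch, ch, N, dim1, dim2) input layout.
batch, n_ch, feat, chunk_size, n_chunks = 2, 4, 64, 100, 30
x = torch.randn(batch, n_ch, feat, chunk_size, n_chunks)

# Number of valid microphones per utterance; 0 is commonly used to mark a
# fixed-geometry array in which all channels are valid.
num_mic = torch.zeros(batch)

y = model(x, num_mic)
# y keeps the chunked (chunk_size, n_chunks) layout with output_size feature
# channels; see the implementation for the exact leading dimensions.
```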