espnet2.enh.layers.uses.ChannelAttention
class espnet2.enh.layers.uses.ChannelAttention(input_dim, att_heads=4, att_dim=256, activation='relu', eps=1e-05)
Bases: torch.nn.Module
Channel Attention module.
- Parameters:
- input_dim (int) – dimension of the input feature.
- att_heads (int) – number of attention heads in self-attention.
- att_dim (int) – projection dimension for query and key before self-attention.
- activation (str) – non-linear activation function.
- eps (float) – epsilon for layer normalization.
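A minimal construction sketch, assuming ESPnet is installed and that input_dim corresponds to the feature dimension N of the input tensor described in forward() below; all argument values are illustrative.

```python
from espnet2.enh.layers.uses import ChannelAttention

# input_dim is assumed to match the feature dimension N of the
# (batch, C, N, freq, time) input; 64 here is purely illustrative.
attn = ChannelAttention(
    input_dim=64, att_heads=4, att_dim=256, activation="relu", eps=1e-5
)
```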
forward(x, ref_channel=None)
Forward pass of ChannelAttention.
- Parameters:
- x (torch.Tensor) – input feature of shape (batch, C, N, freq, time)
- ref_channel (None or int) – index of the reference channel.
- Returns: output feature of shape (batch, C, N, freq, time)
- Return type: torch.Tensor
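A hedged end-to-end sketch of the forward call, assuming input_dim equals the feature dimension N and that the output preserves the input shape, as the docstring states; the shape values below are illustrative.

```python
import torch

from espnet2.enh.layers.uses import ChannelAttention

batch, C, N, freq, time = 2, 4, 64, 65, 100  # illustrative shapes
attn = ChannelAttention(input_dim=N)

x = torch.randn(batch, C, N, freq, time)

# Attention is applied across the C (microphone channel) axis,
# optionally anchored to a reference channel index.
y = attn(x, ref_channel=0)
assert y.shape == (batch, C, N, freq, time)  # shape preserved per docstring
```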