espnet.nets.pytorch_backend.transformer.lightconv.LightweightConvolution
class espnet.nets.pytorch_backend.transformer.lightconv.LightweightConvolution(wshare, n_feat, dropout_rate, kernel_size, use_kernel_mask=False, use_bias=False)
Bases: Module
Lightweight Convolution layer.
This implementation is based on https://github.com/pytorch/fairseq/tree/master/fairseq
- Parameters:
- wshare (int) – the number of shared convolution kernels
- n_feat (int) – the number of features
- dropout_rate (float) – dropout probability
- kernel_size (int) – kernel size (length)
- use_kernel_mask (bool) – whether to apply a causal mask to the convolution kernel
- use_bias (bool) – whether to add a bias term
Construct Lightweight Convolution layer.
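A minimal construction sketch, assuming ESPnet is installed; the hyperparameter values below are illustrative, not recipe defaults, and n_feat is chosen as a multiple of wshare so the kernels can be shared evenly across groups of channels.

```python
import torch

from espnet.nets.pytorch_backend.transformer.lightconv import (
    LightweightConvolution,
)

# Illustrative hyperparameters (not recipe defaults); n_feat is a
# multiple of wshare so kernels are shared evenly across channels.
conv = LightweightConvolution(
    wshare=4,          # number of shared convolution kernels
    n_feat=256,        # feature (d_model) dimension
    dropout_rate=0.1,  # dropout probability
    kernel_size=15,    # convolution kernel length
    use_kernel_mask=False,
    use_bias=True,
)
```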
forward(query, key, value, mask)
Forward pass of the Lightweight Convolution layer.
This function takes query, key, and value but uses only query; the extra arguments exist solely for interface compatibility with the self-attention layer (attention.py).
- Parameters:
- query (torch.Tensor) – (batch, time1, d_model) input tensor
- key (torch.Tensor) – (batch, time2, d_model) NOT USED
- value (torch.Tensor) – (batch, time2, d_model) NOT USED
- mask (torch.Tensor) – (batch, time1, time2) mask
- Returns: (batch, time1, d_model) output
- Return type: torch.Tensor
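Continuing the construction sketch above, a hedged example of calling forward. The (batch, 1, time) non-padding mask shape is an assumption following the convention of ESPnet's attention modules; key and value are passed only to satisfy the interface.

```python
batch, time1, d_model = 2, 50, 256
x = torch.randn(batch, time1, d_model)

# Non-padding mask; the (batch, 1, time) shape is assumed here,
# following the convention used by ESPnet's attention modules.
mask = torch.ones(batch, 1, time1)

# key and value are ignored; only query is convolved.
out = conv(query=x, key=x, value=x, mask=mask)
assert out.shape == (batch, time1, d_model)
```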