espnet.nets.pytorch_backend.rnn.attentions.initial_att
espnet.nets.pytorch_backend.rnn.attentions.initial_att(atype, eprojs, dunits, aheads, adim, awin, aconv_chans, aconv_filts, han_mode=False)
Instantiates a single attention module.
- Parameters:
- atype (str) – attention type
- eprojs (int) – number of encoder projection units
- dunits (int) – number of decoder units
- aheads (int) – number of heads for multi-head attention
- adim (int) – attention dimension
- awin (int) – attention window size
- aconv_chans (int) – number of channels of the attention convolution
- aconv_filts (int) – filter size of the attention convolution
- han_mode (bool) – flag to switch on hierarchical attention network (HAN) mode
- Returns: The instantiated attention module
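The function is a factory: it maps the `atype` string to an attention class and constructs it with the relevant subset of the arguments above. The sketch below illustrates that dispatch pattern in plain Python; the class names `NoAttSketch` and `AttDotSketch` are placeholders for illustration, not the actual ESPnet classes, and the set of accepted type strings here is an assumption.

```python
# Hypothetical sketch of the factory-dispatch pattern behind initial_att.
# Placeholder classes stand in for the real attention modules.

class NoAttSketch:
    """Placeholder for a no-attention module."""
    def __init__(self, eprojs, dunits):
        self.eprojs, self.dunits = eprojs, dunits

class AttDotSketch:
    """Placeholder for a dot-product attention module."""
    def __init__(self, eprojs, dunits, adim):
        self.eprojs, self.dunits, self.adim = eprojs, dunits, adim

def initial_att_sketch(atype, eprojs, dunits, adim):
    """Instantiate a single attention module by type name (sketch)."""
    if atype == "noatt":
        return NoAttSketch(eprojs, dunits)
    if atype == "dot":
        return AttDotSketch(eprojs, dunits, adim)
    raise ValueError(f"unknown attention type: {atype}")

# Example: build a dot-product attention module (parameter values are arbitrary).
att = initial_att_sketch("dot", eprojs=320, dunits=300, adim=320)
```

Each attention class only needs some of the arguments, which is why the factory takes the full superset and forwards the relevant ones per type.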