espnet2.gan_tts.parallel_wavegan.parallel_wavegan.ParallelWaveGANDiscriminator
class espnet2.gan_tts.parallel_wavegan.parallel_wavegan.ParallelWaveGANDiscriminator(in_channels: int = 1, out_channels: int = 1, kernel_size: int = 3, layers: int = 10, conv_channels: int = 64, dilation_factor: int = 1, nonlinear_activation: str = 'LeakyReLU', nonlinear_activation_params: Dict[str, Any] = {'negative_slope': 0.2}, bias: bool = True, use_weight_norm: bool = True)
Bases: Module
Parallel WaveGAN Discriminator module.
Initialize ParallelWaveGANDiscriminator module.
- Parameters:
- in_channels (int) – Number of input channels.
- out_channels (int) – Number of output channels.
- kernel_size (int) – Kernel size of the dilated conv layers.
- layers (int) – Number of conv layers.
- conv_channels (int) – Number of channels in conv layers.
- dilation_factor (int) – Dilation factor. For example, if dilation_factor = 2, the dilation will be 2, 4, 8, …, and so on.
- nonlinear_activation (str) – Nonlinear function after each conv.
- nonlinear_activation_params (Dict[str, Any]) – Parameters of the nonlinear activation function.
- bias (bool) – Whether to use bias parameter in conv.
- use_weight_norm (bool) – If set to true, weight normalization will be applied to all of the conv layers.
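As a rough sketch of how the dilation_factor parameter shapes the network, the helpers below compute the per-layer dilation schedule and the resulting receptive field. These helpers are illustrative and not part of espnet2; they assume the first layer is undilated, dilation grows as dilation_factor ** i for later layers, and dilation_factor = 1 falls back to a linearly increasing dilation.

```python
def dilation_schedule(layers: int, dilation_factor: int) -> list:
    """Illustrative helper (not part of espnet2): per-layer dilations.

    Assumes the first layer is undilated, later layers use
    dilation_factor ** i, and dilation_factor == 1 falls back to a
    linearly increasing dilation i.
    """
    dilations = []
    for i in range(layers):
        if i == 0:
            dilations.append(1)
        elif dilation_factor == 1:
            dilations.append(i)
        else:
            dilations.append(dilation_factor ** i)
    return dilations


def receptive_field(layers: int, kernel_size: int, dilation_factor: int) -> int:
    """Receptive field of the stacked dilated convs (illustrative)."""
    rf = 1
    for d in dilation_schedule(layers, dilation_factor):
        rf += (kernel_size - 1) * d
    return rf


# With dilation_factor = 2, dilations double at each layer after the first:
print(dilation_schedule(5, 2))    # [1, 2, 4, 8, 16]
# so the receptive field grows exponentially with the number of layers:
print(receptive_field(10, 3, 2))  # 2047
```

This exponential growth is why a modest stack of 10 layers with kernel size 3 can cover thousands of waveform samples.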
apply_weight_norm()
Apply weight normalization to all of the conv layers.
forward(x: Tensor) → Tensor
Calculate forward propagation.
- Parameters: x (Tensor) – Input audio signal (B, 1, T).
- Returns: Output tensor (B, 1, T).
- Return type: Tensor
remove_weight_norm()
Remove weight normalization from all of the conv layers.