espnetez.config.update_finetune_config
espnetez.config.update_finetune_config(task, pretrain_config, path)
Update the fine-tuning configuration with values from a specified YAML file.
This function loads the fine-tuning configuration from a YAML file and updates the provided pre-training configuration dictionary. It prioritizes values from the fine-tuning configuration, while ensuring that any distributed-related settings are reset to their defaults. Additionally, it integrates default configurations from the specified task.
- Parameters:
- task (str) – The name of the task for which the configuration is being updated.
- pretrain_config (dict) – The existing pre-training configuration dictionary to be updated.
- path (str) – The file path to the YAML file containing the fine-tuning configuration.
- Returns: The updated pre-training configuration dictionary after merging with the fine-tuning configuration and defaults from the specified task.
- Return type: dict
Examples
>>> pretrain_cfg = {
... "learning_rate": 0.001,
... "batch_size": 32,
... "dist_backend": "nccl"
... }
>>> updated_cfg = update_finetune_config("asr", pretrain_cfg,
...                                      "finetune_config.yaml")
>>> print(updated_cfg)
{
"learning_rate": 0.0001, # updated from finetune_config.yaml
"batch_size": 32,
"dist_backend": "nccl",
"other_config": "default_value" # from task defaults
}
- Raises:
- FileNotFoundError – If the specified YAML file does not exist.
- yaml.YAMLError – If the YAML file is improperly formatted.
NOTE
The function assumes that the task class provides a method get_default_config() which returns the default configuration as a dictionary.
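The merge behavior described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual ESPnet-EZ implementation: the helper name `merge_finetune`, the `DIST_DEFAULTS` dictionary, and the specific keys shown are all assumptions made for the example.

```python
# Illustrative sketch of the merge priority (assumed, not the real code):
# task defaults < pre-training config < fine-tuning config,
# with distributed-related settings reset to defaults at the end.

DIST_DEFAULTS = {"dist_backend": "nccl"}  # assumed distributed defaults

def merge_finetune(pretrain_cfg, finetune_cfg, task_defaults):
    merged = dict(task_defaults)   # start from the task's default config
    merged.update(pretrain_cfg)    # pre-training values override defaults
    merged.update(finetune_cfg)    # fine-tuning values take top priority
    merged.update(DIST_DEFAULTS)   # reset distributed settings to defaults
    return merged

cfg = merge_finetune(
    {"learning_rate": 0.001, "batch_size": 32, "dist_backend": "gloo"},
    {"learning_rate": 0.0001},          # stands in for the parsed YAML file
    {"other_config": "default_value"},  # stands in for get_default_config()
)
```

Here `cfg["learning_rate"]` comes from the fine-tuning config, `batch_size` from the pre-training config, `other_config` from the task defaults, and `dist_backend` is reset to its default regardless of the pre-training value.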