LmTrainerMixin#

class montreal_forced_aligner.language_modeling.trainer.LmTrainerMixin(prune_method='relative_entropy', order=3, method='kneser_ney', prune_thresh_small=3e-07, prune_thresh_medium=1e-07, **kwargs)[source]#

Bases: DictionaryMixin, TrainerMixin, MfaWorker

Abstract mixin class for training language models

Parameters:
  • prune_method (str) – Pruning method for pruning the ngram model, defaults to “relative_entropy”

  • order (int) – Order of the ngram model, defaults to 3

  • method (str) – Smoothing method for the ngram model, defaults to “kneser_ney”

  • prune_thresh_small (float) – Pruning threshold for the small language model, defaults to 3e-07

  • prune_thresh_medium (float) – Pruning threshold for the medium language model, defaults to 1e-07

See also

DictionaryMixin

For dictionary parsing parameters

TrainerMixin

For training parameters

MfaWorker

For worker parameters
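The methods below are typically driven in sequence: initialize, run iterations, finalize, prune, then export. The following is a toy sketch of that driver loop using a hypothetical stand-in class (`ToyLmTrainer` is illustrative only and is not part of the MFA API):

```python
from pathlib import Path


class ToyLmTrainer:
    """Toy stand-in (not the real LmTrainerMixin) showing the typical
    lifecycle: initialize, iterate, finalize, prune, export."""

    def __init__(self, num_iterations: int = 3) -> None:
        self.num_iterations = num_iterations
        self.log: list[str] = []

    def initialize_training(self) -> None:
        self.log.append("initialize")

    def train_iteration(self) -> None:
        self.log.append("iterate")

    def finalize_training(self) -> None:
        self.log.append("finalize")

    def prune_large_language_model(self) -> None:
        self.log.append("prune")

    def export_model(self, output_model_path: Path) -> None:
        self.log.append(f"export:{output_model_path.name}")

    def train(self, output_model_path: Path) -> None:
        # Drive the full lifecycle in order.
        self.initialize_training()
        for _ in range(self.num_iterations):
            self.train_iteration()
        self.finalize_training()
        self.prune_large_language_model()
        self.export_model(output_model_path)


trainer = ToyLmTrainer(num_iterations=2)
trainer.train(Path("lm.zip"))
print(trainer.log)
```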

export_model(output_model_path)[source]#

Export the language model to the specified path

Parameters:

output_model_path (Path) – Path to export model

property far_path#

Internal temporary path to the FAR file

finalize_training()[source]#

Finalize training

initialize_training()[source]#

Initialize training

property large_arpa_path#

Internal temporary path to the large ARPA file

property medium_arpa_path#

Internal temporary path to the medium ARPA file

property mod_path#

Internal temporary path to the model file

prune_large_language_model()[source]#

Prune the large language model into small and medium versions
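The two pruning thresholds yield nested model sizes: the medium model's smaller threshold retains at least as many ngrams as the small model's larger one. The snippet below is a toy illustration of that relationship only; it is not MFA's actual pruning, which relies on relative-entropy pruning of the ngram model rather than a simple score cutoff. The `ngram_scores` values are made up for the example:

```python
# Hypothetical per-ngram scores; real pruning works on a trained model.
ngram_scores = {
    "the quick brown": 5e-07,
    "quick brown fox": 2e-07,
    "brown fox jumps": 5e-08,
}


def prune(scores: dict, thresh: float) -> dict:
    # Keep only ngrams whose score meets or exceeds the threshold.
    return {ng: s for ng, s in scores.items() if s >= thresh}


small = prune(ngram_scores, 3e-07)   # prune_thresh_small
medium = prune(ngram_scores, 1e-07)  # prune_thresh_medium

# A stricter (smaller) threshold always keeps a superset of the ngrams.
assert set(small) <= set(medium)
print(len(small), len(medium))
```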

property small_arpa_path#

Internal temporary path to the small ARPA file

train_iteration()[source]#

Run one training iteration