.. _tokenization_api:

Tokenizers
==========

Tokenizers insert spaces as word boundaries for orthographic systems that do not normally use them (e.g., Japanese, Chinese, and Thai).

.. currentmodule:: montreal_forced_aligner.models

.. autosummary::
   :toctree: generated/

   TokenizerModel

.. toctree::

   training
   tokenizer
   helper
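To illustrate the idea of inserting word boundaries into unsegmented text, here is a minimal sketch of greedy longest-match segmentation against a word list. The ``tokenize`` function and the toy lexicon are assumptions for demonstration only; they are not MFA's tokenizer implementation.

```python
# Illustrative only: a toy greedy longest-match tokenizer that inserts
# spaces as word boundaries. The lexicon and algorithm are assumptions
# for demonstration, not MFA's actual implementation.

def tokenize(text: str, lexicon: set) -> str:
    """Segment ``text`` by matching the longest known word at each position."""
    max_len = max(len(w) for w in lexicon)
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest candidate first, falling back to a single character.
        for length in range(min(max_len, len(text) - i), 0, -1):
            candidate = text[i:i + length]
            if length == 1 or candidate in lexicon:
                tokens.append(candidate)
                i += length
                break
    return " ".join(tokens)


lexicon = {"私", "は", "学生", "です"}  # toy Japanese word list (assumption)
print(tokenize("私は学生です", lexicon))  # 私 は 学生 です
```

In practice a trained :class:`TokenizerModel` replaces the hand-built lexicon above, but the output format is the same: the input text with spaces added at predicted word boundaries.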