flair.tokenization#
JapaneseTokenizer
    Tokenizer using konoha to support popular Japanese tokenizers.

SciSpacyTokenizer
    Tokenizer that uses the en_core_sci_sm spaCy model and some special heuristics.

SegtokTokenizer
    Tokenizer using segtok, a third-party library for rule-based tokenization of Indo-European languages.

SpaceTokenizer
    Tokenizer that splits on the space character only.

SpacyTokenizer
    Tokenizer using spaCy under the hood.

Tokenizer
    An abstract base class representing a tokenizer.

TokenizerWrapper
    Helper class that wraps a tokenizer function in the class-based tokenizer interface.
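The classes above all share one class-based tokenizer interface, and TokenizerWrapper adapts a plain function to it. A minimal sketch of how that pattern fits together (simplified and illustrative, not the actual flair implementation; the real classes carry additional configuration and integrate with flair's Sentence/Token objects):

```python
from abc import ABC, abstractmethod
from typing import Callable, List


class Tokenizer(ABC):
    """Abstract base class: a tokenizer turns raw text into a list of token strings."""

    @abstractmethod
    def tokenize(self, text: str) -> List[str]:
        ...


class SpaceTokenizer(Tokenizer):
    """Splits on the space character only, dropping empty tokens."""

    def tokenize(self, text: str) -> List[str]:
        return [t for t in text.split(" ") if t]


class TokenizerWrapper(Tokenizer):
    """Adapts a plain `str -> List[str]` function to the class-based interface."""

    def __init__(self, tokenizer_func: Callable[[str], List[str]]):
        self.tokenizer_func = tokenizer_func

    def tokenize(self, text: str) -> List[str]:
        return self.tokenizer_func(text)


# Usage: both objects expose the same .tokenize() method.
print(SpaceTokenizer().tokenize("Hello flair world"))        # ['Hello', 'flair', 'world']
print(TokenizerWrapper(str.split).tokenize("Hello\tworld"))  # ['Hello', 'world']
```

This is why downstream code can accept any `Tokenizer` without caring whether it is backed by spaCy, segtok, or a one-off function.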