Tokens
Updated at April 26, 2024
Neural networks work with text by representing words and sentences as tokens. Tokens are logical fragments of text or frequently used character sequences that are typical of a natural language. They help neural networks detect patterns and process natural language.
YandexGPT API uses its own tokenizer for text processing. You can estimate the size of a text field in tokens using the Tokenizer REST API methods. The token count of the same text may vary from one model to another.
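As a rough sketch of such an estimate, the snippet below sends a text to a tokenization endpoint and counts the tokens in the response. The endpoint URL, the `modelUri`/`text` payload fields, and the `tokens` response field are assumptions based on the public Foundation Models API reference; check the current API documentation before relying on them.

```python
import json
import urllib.request

# Assumed endpoint of the Tokenizer REST API; verify against the current docs.
TOKENIZE_URL = "https://llm.api.cloud.yandex.net/foundationModels/v1/tokenize"


def count_tokens(text: str, model_uri: str, api_key: str) -> int:
    """Return the number of tokens the given model's tokenizer produces for `text`."""
    payload = json.dumps({"modelUri": model_uri, "text": text}).encode("utf-8")
    request = urllib.request.Request(
        TOKENIZE_URL,
        data=payload,
        headers={
            "Authorization": f"Api-Key {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    # The response is assumed to contain a list of token objects.
    return len(body["tokens"])
```

Because tokenization differs between models, pass the URI of the exact model you plan to use; the same text can yield a different count for another model.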