Basic unit of text that AI language models process, typically representing words, word parts, or punctuation. Like the individual puzzle pieces that make up text.
For example, the word 'unhappiness' might be split into three tokens — 'un', 'happi', and 'ness' — by a language model's tokenizer.
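The splitting above can be sketched with a toy greedy longest-match tokenizer. This is only an illustration: real tokenizers use learned algorithms such as BPE or WordPiece, and the tiny vocabulary below is a hypothetical one chosen to reproduce the 'unhappiness' example.

```python
# Toy greedy longest-match subword tokenizer (illustration only).
# Real models learn their vocabularies with algorithms like BPE;
# VOCAB here is a hypothetical hand-picked set for this example.
VOCAB = {"un", "happi", "ness", "happy"}

def tokenize(word, vocab=VOCAB):
    tokens = []
    i = 0
    while i < len(word):
        # Take the longest substring starting at i that is in the vocabulary.
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            # No vocabulary match: fall back to a single-character token.
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("unhappiness"))  # ['un', 'happi', 'ness']
```

Greedy longest-match (as in WordPiece-style tokenization) is just one strategy; BPE instead merges frequent character pairs learned from a training corpus, but both produce subword tokens like those shown here.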