Tokenization

Tokenization is the process of breaking text into individual units, called tokens, such as words or phrases, which are then used for further analysis.
Example: Splitting a sentence into words for NLP tasks.
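As a minimal sketch, the splitting step can be done with a regular expression; real NLP pipelines typically use library tokenizers, which also handle punctuation, contractions, and language-specific rules:

```python
import re

def tokenize(text):
    """Split text into word tokens using a simple regex."""
    # \w+ matches runs of letters, digits, and underscores,
    # so punctuation is dropped rather than kept as tokens.
    return re.findall(r"\w+", text)

print(tokenize("Splitting a sentence into words."))
# → ['Splitting', 'a', 'sentence', 'into', 'words']
```

Note that this simple approach discards punctuation entirely; whether punctuation should be kept as separate tokens depends on the downstream task.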


Related Keywords:
Tokenization