Tokenization
Categories: Cryptocurrency
Tokenization is a method of protecting sensitive information by substituting a critical data element with a non-sensitive, unique alphanumeric identifier, called a token, that has no exploitable meaning or value to third parties. In cryptocurrency and blockchain contexts, tokenization can also refer to creating a digital token that represents a real-world asset.
Tokenization (Wikipedia)

Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization (data security), the substitution of sensitive data with non-sensitive tokens
- Word segmentation
- Tokenism of minorities