- The process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference (i.e., an identifier) that maps back to the sensitive data through a tokenization system. (Source: Wikipedia)
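A minimal sketch of the idea in Python, assuming an in-memory mapping stands in for the tokenization system; the `TokenVault` class name and the example card number are illustrative only, and real deployments use hardened vaults, access controls, and often format-preserving tokens.

```python
import secrets


class TokenVault:
    """Toy tokenization system: swap a sensitive value for a random token
    and keep the mapping in a private lookup table (the 'vault')."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so by itself it has no exploitable meaning.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can map the token back to the original.
        return self._token_to_value[token]


if __name__ == "__main__":
    vault = TokenVault()
    token = vault.tokenize("4111 1111 1111 1111")  # e.g. a card number
    print(token)                                   # safe to store or log
    print(vault.detokenize(token))                 # recovers the original
```

The point of the design is that downstream systems handle only the token; compromise of those systems reveals nothing unless the vault itself is also breached.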