Tokenization: The process of replacing sensitive data with a non-sensitive surrogate value, or token, that is random and has no exploitable meaning or mathematical relationship to the original data. The mapping between each token and its original value is held in a secure token vault, so tokens can be stored and transmitted safely while the real data stays protected. Examples: using tokenization to protect credit card numbers during online transactions, and using tokenization to protect sensitive medical records in a healthcare database.
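To make the mechanism concrete, here is a minimal, illustrative sketch in Python. The `_vault` dictionary and the `tokenize`/`detokenize` function names are hypothetical stand-ins for a hardened, access-controlled token vault service; the essential point is that the token is randomly generated and carries no mathematical relationship to the value it replaces.

```python
import secrets

# Hypothetical in-memory token vault: maps random tokens back to the
# original values. Production systems use a hardened, access-controlled
# vault service, not a plain dictionary.
_vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random, meaningless token."""
    token = secrets.token_hex(8)  # random surrogate, not derived from the value
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only the vault holds the mapping."""
    return _vault[token]

card = "4111 1111 1111 1111"
tok = tokenize(card)
print(tok)               # e.g. 'f3a9c2d41b7e0a86' - safe to store or transmit
print(detokenize(tok))   # original card number, recoverable only via the vault
```

Because the token is generated with a secure random source rather than computed from the input, stealing a token reveals nothing about the underlying data; an attacker would also need access to the vault itself.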
Categories: CC D5: Security Operations | CCSP D2: Cloud Data Security | CISM D3: Information Security Program | CISSP D3: Security Architecture and Engineering | Security+ D1: General Security Concepts | SSCP D5: Cryptography