Place Your Bets on Tokenization to Improve Cybersecurity

  • Friday, March 4, 2016 | 9:00 AM – 9:50 AM | West | Room: 2005


Tokenizing toxic and proprietary data is one of the most effective security controls available. Tokenization abstracts data so applications can still use it effectively, but without the risks associated with handling the raw values. By replacing sensitive data strings with tokens, organizations can shield themselves from the consequences of a data breach and meet data residency requirements.
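The core idea can be sketched in a few lines. The snippet below is a minimal, illustrative in-memory token vault: it swaps a sensitive value for a random token with no mathematical relation to the original, and keeps the mapping so authorized code can look the value back up. The `TokenVault` class, the `tok_` prefix, and the in-memory dictionaries are all assumptions for illustration; a production system would use a hardened, access-controlled vault service, often with format-preserving tokens.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault (not production-grade)."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Return the existing token so one input always maps to one token,
        # which lets applications join and deduplicate on the token alone.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it reveals nothing about the raw data.
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only code with access to the vault can recover the raw value.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
assert vault.detokenize(t) == "4111-1111-1111-1111"
assert vault.tokenize("4111-1111-1111-1111") == t  # stable mapping
```

Because the token carries no information about the original value, a breach of the application database exposes only tokens; the raw data lives solely in the vault, which can also be hosted in a specific jurisdiction to satisfy residency rules.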
