ISSA Chicago - Next Generation Tokenization - Ulf Mattsson, Apr 2011


Data tokenization for PCI DSS, HIPAA HITECH and US state laws

  1. Next Generation Tokenization for Compliance and Cloud Data Protection. Ulf Mattsson, CTO, Protegrity. ulf . mattsson AT protegrity . com
  2. Ulf Mattsson: 20 years with IBM Development & Global Services. Inventor of 22 patents in encryption and tokenization. Co-founder of Protegrity (data security). Research member of the International Federation for Information Processing (IFIP) WG 11.3, Data and Application Security. Member of: PCI Security Standards Council (PCI SSC), American National Standards Institute (ANSI) X9, Cloud Security Alliance (CSA), Information Systems Security Association (ISSA), and Information Systems Audit and Control Association (ISACA).
  3. (image-only slide)
  4. ISACA - Articles About Compliance
  5. PCI DSS & Visa USA - 10 Years
  6. Cloud Security
  7. No Confidence in Cloud Security (Oct 2010). CSO Magazine survey: cloud security is still a struggle for many companies. A recent article by Bill Brenner, senior editor at CSO Magazine, reveals that companies are still wary of putting critical data in the cloud. Results from the 8th Annual Global Information Security Survey, conducted by CSO along with CIO and PricewaterhouseCoopers, cite: 62% of companies have little to no confidence in their ability to secure any assets put in the cloud. Also, of the 49% of respondents who have ventured into cloud computing, 39% have major qualms about security. Source: CSO, October 2010.
  8. Risks Associated with Cloud Computing (chart, % of respondents): handing over sensitive data to a third party; threat of data breach or loss; weakening of corporate network security; uptime/business continuity; financial strength of the cloud computing provider; inability to customize applications. Source: "The evolving role of IT managers and CIOs", findings from the 2010 IBM Global IT Risk Study.
  9. Cloud Computing to Fuel Security Market (Oct 2010). 1. "Concerns about cloud security have grown in the past year." 2. "In 2009, the fear was abstract: a general concern, as there is with all new technologies when they're introduced ..." 3. "Today, however, concerns are both more specific and more weighty." 4. "We see organizations placing a lot more scrutiny on cloud providers as to their controls and security processes; and they are more likely to defer adoption because of security inadequacies than to go ahead despite them." 5. Opportunities in the cloud for vendors are data security, identity and access management, cloud governance, application security, and operational security.
  10. What Amazon AWS's PCI Compliance Means: 1. Just because AWS is certified doesn't mean you are. You still need to deploy a PCI-compliant application/service, and anything on AWS is still within your assessment scope. 2. The open question? PCI DSS 2.0 doesn't address multi-tenancy concerns. 3. AWS is certified as a service provider; that doesn't mean all cloud IaaS providers will be. 4. You can store PAN data on S3, but it still needs to be encrypted in accordance with PCI DSS requirements. 5. Amazon doesn't do this for you; it's something you need to implement yourself, including key management, rotation, logging, etc. 6. If you deploy a server instance in EC2, it still needs to be assessed by your QSA. 7. What this certification really does is eliminate any doubts that you are allowed to deploy an in-scope PCI system on AWS. 8. This is a big deal, but your organization's assessment scope isn't necessarily reduced. 9. It might be when you move to something like a tokenization service, where you reduce your handling of PAN data.
  11. Data Breaches
  12. The Changing Threat Landscape (Aug 2010). Some issues have stayed constant: 1. The threat landscape continues to gain sophistication. 2. Attackers will always be a step ahead of the defenders. Different motivation, methods and tools today: we are fighting highly organized, well-funded crime syndicates and nations. A move from detective to preventative controls is needed: several layers of security to address the most significant areas of risk.
  13. Data Breaches: six years, 900+ breaches, and over 900 million compromised records. The majority of cases have not yet been disclosed and may never be. Over half of the breaches occurred outside of the U.S. Online data is compromised most frequently. Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS.
  14. Threat Action Categories (by compromised records): 1. 90% lost in highly sophisticated attacks. 2. Hacking and malware are the most dominant categories. Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS.
  15. Patching Software vs. Locking Down Data (diagram: attacker path from the user through application, database, OS file system, storage system and backup). Key point: software patching is not enough; not a single intrusion in the study exploited a patchable vulnerability. Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS.
  16. Not Enough to Encrypt the Pipe & Files (diagram): SSL encrypts data crossing the public network (PCI DSS), but data is in clear text inside the private network, in the application, and in the database; encrypting data at rest in the OS file system and storage system (PCI DSS) still leaves those clear-text exposures to an attacker.
  17. Protecting the Data Flow - Choose Your Defenses
  18. Data Protection
  19. Data Security Today is a Catch-22. We need to protect both data and the business processes that rely on that data. Enterprises are currently on their own in deciding how to apply emerging technologies for PCI data protection. Data tokenization is an evolving technology. How do you reduce PCI audit scope and exposure of data?
  20. Evaluating Options
  21. Different Ways to Render the PAN Unreadable: two-way cryptography with associated key-management processes; one-way cryptographic hash functions; index tokens and pads; truncation.
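Two of the options above can be sketched in a few lines. A minimal Python illustration (an assumption for illustration, not any product's implementation) of truncation and a one-way hash with a per-merchant secret; the other two options, index tokens and two-way cryptography, need a token server or key management and are covered on later slides:

```python
import hashlib
import secrets

pan = "4000001234567899"  # example card number from the slides

# Truncation: keep only the first 6 digits (BIN) and last 4.
truncated = pan[:6] + "*" * (len(pan) - 10) + pan[-4:]

# One-way cryptographic hash with a per-merchant secret, so the
# digest cannot be brute-forced over the small PAN space alone.
merchant_secret = secrets.token_bytes(32)
digest = hashlib.sha256(merchant_secret + pan.encode()).hexdigest()

print(truncated)  # 400000******7899
print(digest)     # 64 hex characters; irreversible, not format-preserving
```

Note that the hash output is much longer than the PAN and not format-preserving, which is one of the trade-offs compared on the later "intrusiveness" slide.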
  22. Positioning Different Protection Options (table rating strong encryption, formatted encryption and distributed tokens from best to worst). Security criteria: high-risk data; compliance with PCI and NIST. Initial cost criteria: transparency to applications; expanded storage size; transparency to database schema; performance impact when loading data. Operational cost criteria: long life-cycle data; Unix or Windows mixed with "big iron" (EBCDIC); easy re-keying of data in a data flow; disconnected environments; distributed environments.
  23. Choose Your Defenses - Different Approaches (diagram): a web application firewall in front of applications; database activity monitoring covering database columns and log files; data loss prevention; and encryption/tokenization of data files on the database server.
  24. Choose Your Defenses - Cost-Effective PCI DSS (survey chart, % deploying): firewalls; encryption/tokenization for data at rest; anti-virus & anti-malware solutions; encryption for data in motion; access governance systems; identity & access management systems; correlation or event management systems; web application firewalls (WAF); endpoint encryption solutions; data loss prevention systems (DLP); intrusion detection or prevention systems; database scanning and monitoring (DAM); ID & credentialing systems. Source: 2009 PCI DSS Compliance Survey, Ponemon Institute.
  25. Current and Planned Use of Enabling Technologies: strong interest in database encryption, data masking and tokenization. Evaluating / current use / planned use <12 months: access controls 1% / 91% / 5%; database activity monitoring 18% / 47% / 16%; database encryption 30% / 35% / 10%; backup/archive encryption 21% / 39% / 4%; data masking 28% / 28% / 7%; application-level encryption 7% / 29% / 7%; tokenization 22% / 23% / 13%.
  26. Risk-Adjusted Data Protection. Data protection methods (rated best to worst on performance, storage, security and transparency): system without data protection; monitoring + blocking + masking; format-controlling encryption; downstream masking; strong encryption; tokenization; hashing. Protection method extensibility: data protection methods must evolve with changes in the security industry and with compliance requirements. The Protegrity Data Security Platform can be easily extended to meet security and compliance requirements.
  27. Hiding Data in Plain Sight - Data Tokenization (diagram): at data entry, the PAN 400000 123456 7899 is sent to the tokenization server, which stores the protected value (Y&SFD%))S() and returns the data token 400000 222222 7899; applications and databases store only the token.
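The flow on this slide can be sketched as a toy token server. This is a minimal illustration under stated assumptions (the names `token_vault`, `pan_index`, `tokenize` and `detokenize` are made up), not Protegrity's implementation: a random, format-preserving token replaces the middle digits, and the PAN-to-token mapping lives only inside the server:

```python
import secrets

token_vault = {}  # token -> PAN, held only inside the tokenization server
pan_index = {}    # PAN -> token, so a PAN always maps to one multi-use token

def tokenize(pan: str) -> str:
    """Return a token that keeps the first 6 and last 4 digits of the
    PAN and replaces the middle digits with random digits."""
    if pan in pan_index:
        return pan_index[pan]
    while True:
        middle = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 10))
        token = pan[:6] + middle + pan[-4:]
        # Retry on the (rare) collision with an existing token or the PAN itself.
        if token not in token_vault and token != pan:
            token_vault[token] = pan
            pan_index[pan] = token
            return token

def detokenize(token: str) -> str:
    """Authorized lookup of the original PAN from the vault."""
    return token_vault[token]
```

Downstream applications and databases store only the token; an attacker who steals the database gets values with no mathematical relationship to the PANs.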
  28. PCI Case Study - Large Chain Store (diagram): card data from the stores passes through an aggregating hub; tokens are used at the authorization servers, channel servers, settlement, loss prevention analysis (EDW) and ERP integration points.
  29. Case Study: Large Chain Store Uses Tokenization to Simplify PCI Compliance. By segmenting cardholder data with tokenization, a regional chain of 1,500 local convenience stores is reducing its PCI audit from seven to three months. "We planned on 30 days to tokenize our 30 million card numbers. With Protegrity Tokenization, the whole process took about 90 minutes."
  30. Case Study: Qualified Security Assessors had no issues with the effective segmentation provided by tokenization. "With encryption, implementations can spawn dozens of questions." "There were no such challenges with tokenization."
  31. Case Study: Faster PCI audit - half the time. Lower maintenance cost - no need to apply all 12 requirements of PCI DSS to every system. Better security - able to eliminate several business processes, such as generating daily reports for data requests and access. Strong performance - rapid processing rate for initial tokenization, sub-second transaction SLA.
  32. What Exactly Makes a "Secure Tokenization" Algorithm? Ramon Krikken: ask vendors what their token-generating algorithms are, and be sure to analyze anything other than strong random number generators for security.
  33. Best Practices for Tokenization* (token generation options): unique sequence number; one-way irreversible hash function with a secret per merchant**; randomly generated value. *: Published July 14, 2010. **: Multi-use tokens.
  34. Best Practices from Visa - token generation by token type (single-use vs. multi-use): known strong algorithm - ANSI- or ISO-approved algorithm and key; unique sequence number - OK for both types; one-way irreversible hash function - secret per transaction (single-use) or secret per merchant (multi-use); randomly generated value - OK for both types.
  35. Comments on Visa's Tokenization Best Practices: Visa's recommendation should simply have been to use a random number. You should not write your own home-grown token servers.
  36. Centralized vs. Distributed Tokenization. Large companies may need to utilize tokenization services for locations throughout the world. How do you deliver tokenization to many locations without the impact of latency?
  37. Evaluating Encryption & Tokenization Approaches (table rating database file encryption, database column encryption, centralized tokenization (old) and distributed tokenization (new) from best to worst). Availability/scalability criteria: latency; CPU consumption; data flow protection. Compliance criterion: scoping. Security criteria: key management; randomness; separation of duties.
  38. Evaluating Encryption Granularity & Tokenization (table rating database file encryption, database column encryption, dynamic tokenization (old) and static tokenization (new) from best to worst): latency; CPU consumption; data flow protection; compliance scoping; key management; randomness; separation of duties.
  39. Evaluating Field Encryption & Tokenization - intrusiveness (to applications and databases) vs. data length: hashing produces long output (!@#$%a^///&*B()..,,,gft_+!@4#$2%p^&*); strong encryption expands the data (!@#$%a^.,mhu7/////&*B()_+!@); alpha encoding via tokenizing or formatted encryption preserves format (123456 aBcdeF 1234); partial tokens (123456 777777 1234) and clear text (123456 123456 1234) keep the original length.
  40. Visa Best Practices for Tokenization, Version 1 (published July 14, 2010) - token generation by token type (single-use vs. multi-use): reversible known strong algorithm and key (NIST approved); unique sequence number; one-way irreversible hash function with a secret per transaction (single-use) or secret per merchant (multi-use); randomly generated value.
  41. Making Data Unreadable - Protection Methods (pros & cons): a matrix rating protection methods (AES/CBC and AES/CTR encryption, formatted encryption, data tokenization, hashing, data masking) against system layer (application, database, OS file system, storage system) and granularity (column/field, record, table, table space, IO block), from best to worst.
  42. Evaluating Column Encryption & Tokenization (table rating formatted encryption, strong encryption, dynamic tokenization (old) and static tokenization (new) from best to worst): latency; CPU consumption; data flow protection; compliance scoping; key management; randomness; separation of duties.
  43. Tokenization Server Location (table rating mainframe options - DB2 workload manager, separate address space - and remote options - in-house, out-sourced - from best to worst): availability (operational); latency (performance); separation and PCI DSS scope (security).
  44. Positioning Different Protection Options (table rating strong encryption, formatted encryption and distributed tokens from best to worst). Security criteria: high-risk data; compliance with PCI and NIST. Initial cost criteria: transparency to applications; expanded storage size; transparency to database schema; performance impact when loading data. Operational cost criteria: long life-cycle data; Unix or Windows mixed with "big iron" (EBCDIC); easy re-keying of data in a data flow; disconnected environments; distributed environments.
  45. Positioning Different Protection Options (summary table rating strong encryption, formatted encryption and tokens from best to worst): security & compliance; total cost of ownership; use of encoded data.
  46. Strong Column Encryption (AES/CBC) (diagram): users see the clear value 123456 123456 1234 through a crypto service at the application layer, while the databases store the protected value @#$#@$$///.
  47. Formatted Column Encryption (FCE, FPE) (diagram): the protected value 123456 999999 1234 preserves the original data format; authorized users recover the clear value 123456 123456 1234 through the crypto service.
  48. Data Tokens (diagram): applications and databases store the data token 123456 999999 1234; authorized users recover the clear value 123456 123456 1234 through the tokenization service.
  49. Issues with Traditional Tokenization Approaches: high latency; replication brings high cost, size and complexity; dynamic token tables give low performance and can produce token collisions and duplications; encryption-based tokens require key management, and the algorithm could be breached.
  50. Benefits of a Pre-generated Tokenization Approach: static token tables give high performance and low latency, with no token collisions and no duplications; no replication is needed, so cost stays low and configuration stays small and simple.
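A pre-generated (static) token table can be sketched as a random permutation built once and then pushed to every server. The function names and the 3-digit block size below are illustrative assumptions, not the product's design; the sketch only shows why a static permutation is collision-free and needs no runtime replication:

```python
import secrets

def build_static_token_table(n_digits: int = 3) -> dict:
    """Pre-generate a random permutation over all n-digit blocks.
    A permutation is collision-free by construction, so lookups need
    no runtime coordination (and no replication) between servers."""
    blocks = [f"{i:0{n_digits}d}" for i in range(10 ** n_digits)]
    shuffled = blocks[:]
    for i in range(len(shuffled) - 1, 0, -1):  # Fisher-Yates with a CSPRNG
        j = secrets.randbelow(i + 1)
        shuffled[i], shuffled[j] = shuffled[j], shuffled[i]
    return dict(zip(blocks, shuffled))

TABLE = build_static_token_table()

def tokenize_middle(pan: str) -> str:
    """Tokenize the middle six digits of a 16-digit PAN as two
    3-digit blocks via pure table lookups (no server round trip)."""
    mid = pan[6:12]
    return pan[:6] + TABLE[mid[:3]] + TABLE[mid[3:]] + pan[-4:]
```

A real deployment would use far larger and/or chained tables; this toy shows the performance and collision properties the slide claims, since every lookup is a local dictionary access.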
  51. Pre-generated Tokenization Approach (diagram): a security administrator installs and pushes static token tables out to the tokenization servers serving each group of users.
  52. Data Protection Challenges. Actual protection is not the challenge; the challenges are: management of solutions (key management, security policy, auditing, monitoring and reporting); minimizing impact on business operations (transparency, performance vs. security); minimizing the cost implications; maintaining compliance; and implementation time.
  53. Best Practices - Data Security Management (diagram): an enterprise data security administrator manages policy and the audit log for the file system protector, database protector, application protector, tokenization server and secure archive, each backed by an encryption service.
  54. Case Study: Granularity of Reporting and Separation of Duties (diagram comparing third-party database encryption, native database encryption and OS file system encryption): the approaches differ in audit-log granularity - whether the log can show which user or process read or wrote which patient health record - and in whether DBA manipulation remains possible.
  55. Choose Your Defenses - Total Cost of Ownership (chart): total cost is the sum of the cost of aversion (protection of data) and the expected losses from the risk; the optimal risk level lies between weak and strong protection, where total cost is minimized.
  56. Developing a Risk-Adjusted Data Protection Plan: 1. Know your data. 2. Find your data. 3. Understand your enemy. 4. Understand the new options in data protection. 5. Deploy defenses. 6. Crunch the numbers.
  57. Deploy Defenses - Matching Data Protection Solutions with Risk Level. Risk levels and solutions: low risk (1-5) - monitor; at risk (6-15) - monitor, mask, access control limits, format-control encryption; high risk (16-25) - replacement, strong encryption. Example field-level risk scores: credit card number 25; Social Security number 20; CVV 20; customer name 12; secret formula 10; employee name 9; employee health record 6; zip code 3.
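The risk bands on this slide can be expressed as a simple lookup. The scores and band boundaries come from the slide; the function and variable names are illustrative:

```python
def protection_for(risk_score: int) -> str:
    """Map a 1-25 field-level risk score to the protection approach
    suggested by the slide's risk bands."""
    if risk_score <= 5:    # low risk (1-5)
        return "monitor"
    if risk_score <= 15:   # at risk (6-15)
        return "monitor, mask, access control limits, format-control encryption"
    return "replacement (tokenization), strong encryption"  # high risk (16-25)

# Field-level risk scores from the slide.
field_risk = {
    "Credit Card Number": 25,
    "Social Security Number": 20,
    "CVV": 20,
    "Customer Name": 12,
    "Secret Formula": 10,
    "Employee Name": 9,
    "Employee Health Record": 6,
    "Zip Code": 3,
}

plan = {field: protection_for(score) for field, score in field_risk.items()}
```

This is step 6 of the plan ("crunch the numbers") in miniature: score each field, then let the band dictate the control.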
  58. Please contact me for more information: Ulf Mattsson, ulf . mattsson AT protegrity . com