What you need to know when implementing security controls, via tokenization or encryption, to bring Hadoop and ancillary applications into PCI compliance.
6. PCI Compliance Guideline
Transmission of identity must be encrypted
Two factor authentication
Key management
location
encryption
expiration of keys/tokens
Management of user access to resources
Services
Data
Geography
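The key-management bullets above (location, encryption, expiration of keys/tokens) can be sketched as lifecycle metadata attached to each key. This is an illustrative in-memory model, not a real KMS; the class, field names, and the 90-day rotation window are all assumptions for the example.

```python
import secrets
import time

# Assumed rotation policy for the sketch: keys expire after 90 days.
KEY_TTL_SECONDS = 90 * 24 * 3600

class ManagedKey:
    """Hypothetical key record tracking location, material, and expiry."""

    def __init__(self, key_id, location):
        self.key_id = key_id
        self.location = location                  # e.g. "hsm://rack1/slot3" (illustrative)
        self.material = secrets.token_bytes(32)   # 256-bit key material
        self.expires_at = time.time() + KEY_TTL_SECONDS

    def is_expired(self):
        # Expired keys must be rotated, not reused for new encryption.
        return time.time() >= self.expires_at

key = ManagedKey("pan-enc-key-001", "hsm://example/slot0")
```

In a real deployment this metadata would live in an HSM or key-management service rather than process memory.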
7. PCI Compliance Guideline
Strong encryption protocols at rest (AES-256, etc.) and in motion (latest TLS/SSL)
No passwords in the clear
System audit information based on resource, time, client info, userid, and function
Prove “Chain of Custody”
No sensitive data stored in logs
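The audit requirements above (resource, time, client info, userid, function, and no sensitive data in logs) can be sketched as a single structured log call with PAN redaction. The field names and the regex are illustrative assumptions, not a prescribed schema.

```python
import json
import logging
import re
import time

# Crude PAN detector for the sketch: any bare run of 13-19 digits.
PAN_RE = re.compile(r"\b\d{13,19}\b")

def audit(resource, client, userid, function, detail=""):
    """Emit one audit record; mask PAN-like sequences so no sensitive
    data reaches the logs."""
    record = {
        "resource": resource,
        "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "client": client,
        "userid": userid,
        "function": function,
        "detail": PAN_RE.sub("[REDACTED]", detail),
    }
    logging.getLogger("audit").info(json.dumps(record))
    return record

rec = audit("/data/cards", "10.0.0.5", "etl_user", "READ",
            detail="lookup for 4111111111111111")
```

An append-only, tamper-evident log store is what turns records like this into a defensible "chain of custody".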
10. What is tokenization?
Process of turning sensitive data into a value with no meaning, called a token, e.g. 1234-567890-12345 => $^hAt_786Ab}+=-12345
If a token is compromised, the underlying cardholder data is not exposed
Systems that handle only tokens are out of scope for PCI compliance
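The tokenization process above can be sketched as a vault that maps random tokens back to the original value, mirroring the slide's example of keeping the trailing digits visible. The vault, function names, and token format are assumptions for illustration; a production tokenizer would persist the mapping in a hardened, PCI-scoped store.

```python
import secrets

# Hypothetical vault: the token itself is random and meaningless; only this
# mapping (which stays inside PCI scope) can recover the original PAN.
_vault = {}

def tokenize(pan):
    # Keep the trailing digits visible, as in 1234-567890-12345 -> ...-12345,
    # and replace the rest with cryptographically random characters.
    tail = pan[-5:]
    token = secrets.token_urlsafe(12) + "-" + tail
    _vault[token] = pan
    return token

def detokenize(token):
    # Only callable inside the PCI-scoped vault service.
    return _vault[token]

t = tokenize("1234-567890-12345")
```

Because the token is generated randomly rather than derived from the PAN, it cannot be reversed without the vault.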
21. What To Watch Out For
Kerberos is a MUST
If using a tokenization app, choose one with a NoSQL backend (HBase, Redis, etc.)
No RC4 or MD5
Use TLSv1.2 or newer
Use key length greater than 128 bits
All passwords must be encrypted
No superuser access: root and hdfs must not have access to encryption keys
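Several of the transport rules above (TLSv1.2 or newer, no RC4 or MD5, CA-validated certs) can be enforced in one place when building a TLS context. This sketch uses Python's standard `ssl` module; the cipher string is an illustrative choice, not a mandated one.

```python
import ssl

# Client-side TLS context enforcing the slide's rules:
# certificate validation, TLSv1.2+, and no RC4 or MD5 cipher suites.
ctx = ssl.create_default_context()           # validates certs against the CA bundle
ctx.minimum_version = ssl.TLSVersion.TLSv1_2 # refuse TLSv1.1 and older
ctx.set_ciphers("HIGH:!aNULL:!RC4:!MD5")     # exclude RC4 and MD5-based suites
```

Pinning the minimum version in the context means every connection made with it inherits the policy, rather than relying on per-call options.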
22. What To Watch Out For
Do not delete encryption keys, including rolled-over keys
LDAPS is a MUST
If operators, not just admins, have access to machines at the OS level, LUKS won't protect the data.
Lock down permissions to OS security config files
Use CA-signed certs if possible
Only open ports you will use
Guarantee “ordered” processing from a batch source
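Guaranteeing "ordered" processing from a batch source can be sketched with sequence numbers and a small reorder buffer that releases records strictly in order, even if the batch delivers them shuffled. The record fields and function name are assumptions for the example.

```python
import heapq

def process_in_order(records):
    """Release payloads strictly by sequence number, buffering any
    record that arrives ahead of its turn."""
    heap = []
    next_seq = 0
    out = []
    for rec in records:
        heapq.heappush(heap, (rec["seq"], rec["payload"]))
        # Drain every record whose turn has come.
        while heap and heap[0][0] == next_seq:
            _, payload = heapq.heappop(heap)
            out.append(payload)
            next_seq += 1
    return out

# Batch delivered out of order:
batch = [{"seq": 2, "payload": "c"},
         {"seq": 0, "payload": "a"},
         {"seq": 1, "payload": "b"}]
ordered = process_in_order(batch)
```

For audit and chain-of-custody purposes, deterministic ordering makes replays and reconciliation reproducible.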