
HIPAA Compliance in the Cloud



  1. HIPAA Compliance in the Cloud: Christopher Crosbie & Jonathan Fritz
  2. Christopher Crosbie, MPH, MS, Healthcare and Life Science Solution Architect Manager, ccrosbie@amazon.com
  3. An Expansive Ecosystem: an industry- and world-spanning ecosystem
  4. Cloud Computing: Rx for Healthcare. ~83% of healthcare organizations use cloud services, and use is expected to grow. The most frequent uses today are hosting clinical applications and/or data, and the most common model is SaaS. Nearly all healthcare organizations presently using cloud services plan to expand that use.* (* 2014 HIMSS Analytics Cloud Survey)
  5. Collaborative Medical Research on AWS. Diagram: management application on Amazon EC2; AWS Direct Connect; S3 bucket with objects; vault; metadata on DynamoDB; metadata exposure via Amazon CloudSearch; research center data center; external researchers; RDS for data permission management; Internet gateway; analytics processing on multiple clusters; lifecycle policies
  6. Collaborative Medical Research on AWS (same architecture diagram as slide 5)
  7. Amazon EMR: Hadoop in the Cloud • Managed platform • Launch a cluster in minutes • Leverage the elasticity of the cloud • Baked-in security features • Pay by the hour and save with Spot • Flexibility to customize. HIPAA controls for Hadoop are relevant no matter which distribution or cloud vendor you choose.
  8. Why is HIPAA compliance such a hot topic with Hadoop? Because it's important, and it's hard.
  9. HIPAA 101 • It's HIPAA, not HIPPA • HIPAA stands for the Health Insurance Portability and Accountability Act • HIPAA regulation, terms you should know: Privacy Rule, Protected Health Information (PHI), Security Rule, Breach Notification Rule, Enforcement Rule • The HHS Office for Civil Rights (OCR) conducts audits • The Office of the National Coordinator for Health Information Technology (ONC) • Omnibus Rule (2013): "A data storage company that has access to protected health information (whether digital or hard copy) qualifies as a business associate, even if the entity does not view the information or only does so on a random or infrequent basis. Thus, document storage companies maintaining protected health information on behalf of covered entities are considered business associates, regardless of whether they actually view the information they hold. To help clarify this point, WE HAVE MODIFIED THE DEFINITION OF 'BUSINESS ASSOCIATE' to generally provide that a business associate includes a person who 'creates, receives, MAINTAINS, OR TRANSMITS' protected health information on behalf of a covered entity."
  10. Who is a Business Associate? • A third party that creates, receives, maintains, or transmits protected health information (PHI) on behalf of a health care provider, clearinghouse, or health plan (covered entity) • i.e., your cloud provider: https://aws.amazon.com/compliance/hipaa-compliance/ https://cloud.google.com/security/compliance https://www.microsoft.com/en-us/TrustCenter/Compliance/HIPAA
  11. Meeting BAA Requirements: example AWS HIPAA configuration requirements • Customers must encrypt ePHI in transit and at rest • Customers must use EC2 Dedicated Instances for instances processing, storing, or transmitting ePHI • Customers must record and retain activity related to use of and access to ePHI
  12. Why can this be hard to meet with Hadoop? • Secure infrastructure: relies on the traditional data-center model • Data protection: data at rest (HDFS TDE), data in transit (fragmented) • Access controls: authentication via MIT Kerberos, authorization (inconsistent) • Monitoring: multiple options (Ganglia, YARN logs, Ambari). HIPAA shouldn't mean giving up ease of use or introducing complexity.
  13. Hadoop in the cloud… • Hadoop (and its security model) was designed assuming a dedicated, long-running cluster with multi-user tenancy • vs. in the cloud, resources are ephemeral, and utilization is highest with a service- and usage-based model
  14. Security Fundamentals, Encryption, Compliance • Security fundamentals: private subnets in VPC, EC2 security groups, Identity and Access Management (IAM) policies, bucket policies, access control lists (ACLs), query string authentication, SSL endpoints • Encryption: server-side encryption (SSE-S3), server-side encryption with provided keys (SSE-C, SSE-KMS), client-side encryption • Compliance: S3 bucket access logs, lifecycle management policies, access control lists (ACLs), versioning & MFA deletes, certifications (HIPAA, PCI, SOC 1/2/3, etc.)
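One way to back the at-rest encryption controls above with policy, not just convention, is an S3 bucket policy that rejects any upload that does not request server-side encryption. A minimal sketch (the bucket name `phi-data` is hypothetical; `s3:x-amz-server-side-encryption` is the standard S3 condition key):

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyUnencryptedPuts",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::phi-data/*",
    "Condition": {
      "StringNotEquals": { "s3:x-amz-server-side-encryption": "aws:kms" }
    }
  }]
}
```

With this policy attached, a PutObject without the SSE-KMS header fails with Access Denied regardless of which IAM user or role issued it, because an explicit Deny overrides any Allow.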
  15. Data Encryption: Amazon S3, local FS, HDFS
  16. Data Encryption At-Rest: Amazon S3 and EMRFS • Server-side encryption: S3-managed keys (SSE-S3), AWS Key Management Service keys (SSE-KMS), or customer-provided keys (SSE-C); S3 client with extra metadata • Client-side encryption: customer-managed keys or AWS Key Management Service; use a custom EncryptionMaterialsProvider with the S3 encryption client • S3 uses AES-256 with envelope encryption • EMRFS makes S3 encryption transparent for applications on your cluster
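As a sketch of what "transparent for applications" looks like in practice, EMRFS client-side encryption with KMS can be turned on through an `emrfs-site` configuration classification at cluster launch. Property names are as I recall them from the EMR documentation and the key ARN is a placeholder, so verify against your EMR release:

```json
[{
  "Classification": "emrfs-site",
  "Properties": {
    "fs.s3.cse.enabled": "true",
    "fs.s3.cse.encryptionMaterialsProvider": "com.amazon.ws.emr.hadoop.fs.cse.KMSEncryptionMaterialsProvider",
    "fs.s3.cse.kms.keyId": "arn:aws:kms:us-east-1:111122223333:key/your-key-id"
  }
}]
```

Once set, Hive, Spark, and MapReduce jobs reading and writing `s3://` paths through EMRFS encrypt and decrypt without any application changes.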
  17. Data Encryption At-Rest: On Cluster • Local FS: need to encrypt scratch directories; LUKS using a random key or an AWS Key Management Service key • HDFS: need to encrypt intermediates or data stored in HDFS; HDFS transparent data encryption (HDFS-6134); use Hadoop KMS or Ranger KMS
  18. Data at Rest: HDFS TDE • HDFS encryption zones: encryption zone key (EZK) • Each file: a unique data encryption key (DEK), which is stored encrypted (EDEK) • End-to-end (at-rest and in-transit) when data is written to an encryption zone • Uses Hadoop KMS with the Java Cryptography Extension KeyStore (JCEKS)
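The EZK/DEK/EDEK relationship above is just envelope encryption, and the flow can be sketched in a few lines. This is a toy model that uses XOR as a stand-in cipher purely to show the key-wrapping structure; real HDFS TDE uses AES via the Hadoop KMS:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" standing in for AES; NOT real encryption.
    return bytes(b ^ k for b, k in zip(data, key * (len(data) // len(key) + 1)))

# Encryption zone key (EZK): held by the KMS, never stored with the data.
ezk = secrets.token_bytes(32)

# Per-file data encryption key (DEK), generated when the file is written.
dek = secrets.token_bytes(32)

# The DEK is wrapped (encrypted) with the EZK; only the wrapped EDEK is
# persisted in the file's metadata on the NameNode.
edek = xor_bytes(dek, ezk)

plaintext = b"patient record 12345"
ciphertext = xor_bytes(plaintext, dek)   # the client encrypts with the DEK

# To read: the client asks the KMS to unwrap the EDEK back into the DEK,
# then decrypts the file contents locally (hence "end-to-end").
recovered_dek = xor_bytes(edek, ezk)
assert xor_bytes(ciphertext, recovered_dek) == plaintext
```

The point of the indirection is that HDFS itself only ever sees EDEKs and ciphertext; the unwrap happens at the KMS, so a NameNode or DataNode compromise alone does not expose plaintext.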
  19. Data Encryption In-Flight • MapReduce shuffle (shuffle service): encrypted shuffle using SSL • Spark shuffle (BlockTransferService): SASL encryption (DIGEST-MD5); SSL for Akka and HTTP (for broadcast and fileServer) • HDFS data transfer: use HDFS TDE (encrypts client-side), or encrypt RPC (hadoop.rpc.protection) and data transfer (dfs.encrypt.data.transfer) • Web UIs and clients: HTTPS (if supported); use SSH tunnels and port forwarding
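The RPC and data-transfer settings named above live in the standard Hadoop config files, so they apply to any distribution. A minimal sketch of the relevant properties:

```xml
<!-- core-site.xml: protect Hadoop RPC; "privacy" adds encryption
     on top of authentication and integrity checking. -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>

<!-- hdfs-site.xml: encrypt the DataNode block-transfer protocol. -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>

<!-- mapred-site.xml: encrypt the MapReduce shuffle over SSL. -->
<property>
  <name>mapreduce.shuffle.ssl.enabled</name>
  <value>true</value>
</property>
```

Note that if you use HDFS TDE, block data is already ciphertext before it leaves the client, which is why TDE alone covers the HDFS data-transfer path.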
  20. Access Control
  21. Different permissions in a cloud environment • Who can launch a cluster? • What other cloud services can a cluster access? • What permissions do multiple users on a cluster have? • How can permissions be stateless when clusters can be transient?
  22. Control access and segregate duties everywhere • You control who can do what in your AWS environment, when, and from where • Limit permissions using IAM users and account federation with IAM roles • Fine-grained control of your AWS cloud with multi-factor authentication • Integrate with your existing Active Directory using federation and single sign-on. Diagram: AWS account owner with separate roles for network, security, server, and storage management.
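The "who can launch a cluster" question from the previous slide is answered with an IAM policy like the sketch below, attached to the data-engineering group. The account number is a placeholder; `EMR_DefaultRole` and `EMR_EC2_DefaultRole` are the default EMR role names, and constraining `iam:PassRole` to them stops a user from launching a cluster under a more privileged role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowClusterLaunch",
      "Effect": "Allow",
      "Action": [
        "elasticmapreduce:RunJobFlow",
        "elasticmapreduce:DescribeCluster",
        "elasticmapreduce:ListClusters"
      ],
      "Resource": "*"
    },
    {
      "Sid": "PassOnlyApprovedRoles",
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": [
        "arn:aws:iam::111122223333:role/EMR_DefaultRole",
        "arn:aws:iam::111122223333:role/EMR_EC2_DefaultRole"
      ]
    }
  ]
}
```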
  23. VPC private subnets to isolate the network • Use Amazon S3 endpoints for connectivity to S3 • Use managed NAT for connectivity to other services or the Internet • Control traffic using security groups: ElasticMapReduce-Master-Private, ElasticMapReduce-Slave-Private, ElasticMapReduce-ServiceAccess
  24. IAM roles limit service and cluster permissions. Diagram: service role granting access to cloud resources.
  25. Kerberos for general on-cluster authentication. Automated scripts in Apache Bigtop to enable Kerberos and create trust with AWS Directory Service or Active Directory (AWS Big Data Blog post coming soon).
  26. LDAP authentication for secure entry points: https://blogs.aws.amazon.com/bigdata/post/Tx3J2RL8V6N72G7/Using-LDAP-via-AWS-Directory-Service-to-Access-and-Administer-Your-Hadoop-Enviro • Direct integration with HiveServer2, Presto, Hue, Zeppelin (coming soon), Phoenix, and other tools • Easier to set up than Kerberos, but more limited
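For HiveServer2, the direct LDAP integration mentioned above is a few `hive-site.xml` properties. A minimal sketch; the directory URL and base DN below are placeholders for your AWS Directory Service or Active Directory values:

```xml
<!-- hive-site.xml: have HiveServer2 authenticate users against LDAP
     instead of accepting unauthenticated connections. -->
<property>
  <name>hive.server2.authentication</name>
  <value>LDAP</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.url</name>
  <value>ldap://directory.example.com:389</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.baseDN</name>
  <value>cn=Users,dc=example,dc=com</value>
</property>
```

This is the "more limited" trade-off on the slide: JDBC/ODBC logins to HiveServer2 are checked against the directory, but unlike Kerberos it does not authenticate traffic between the Hadoop daemons themselves.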
  27. Fine-Grained Access Controls / Authorization • HiveServer2: SQL standards-based authorization on Hive tables and views • HBase: cell-level access control • Ranger / Sentry + RecordService: plug-ins for a variety of Hadoop ecosystem projects; column-level control for Hive tables; Ranger bootstrap action for EMR available (AWS Big Data Blog coming soon!) • 3rd-party solutions for access control and data masking: BlueTalon, DataGuise, and more!
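The SQL standards-based authorization option for HiveServer2 is also enabled through `hive-site.xml`. A sketch of the usual property set, as described in the Hive documentation (verify class names against your Hive version):

```xml
<!-- hive-site.xml: enforce SQL-standard GRANT/REVOKE on tables and
     views for queries submitted through HiveServer2. -->
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
</property>
```

With this in place, access to a table holding PHI can be granted per role rather than relying on HDFS file permissions alone.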
  28. Monitoring and Auditing
  29. Monitoring and auditing • Interaction with the AWS environment: AWS CloudTrail records API calls and saves logs in your S3 buckets, no matter how those API calls were made • Access to objects in S3: EMR can log user-defined information in S3 audit logs to track which application accessed an object • Hadoop ecosystem audit logging: access the logs generated by each application; Ranger and Sentry also generate audit logs from activity • Ganglia and Amazon CloudWatch for general monitoring
  30. Conclusions • Security is critical • AWS has tools to make it easier • You can move fast and stay safe • Get started in minutes with EMR 4.7: Spark 1.6.1, Hadoop 2.7.2, Hive 1.0, Presto 0.147, HBase 1.2.1, Tez 0.8.3, Phoenix 4.7.0, Oozie 4.2.0, Zeppelin 0.5.6, Pig 0.14.0, Hue 3.7.1, Mahout 0.12.0, Sqoop 1.4.6, HCatalog 1.0.0, ZooKeeper 3.4.8 • Jon Fritz, jonfritz@amazon.com, Senior Product Manager • aws.amazon.com/emr
