HIPAA Compliance For Your Apache™ Hadoop®
Environment www.hortonworks.com ©2016 Hortonworks
A HORTONWORKS WHITE PAPER
AUGUST 2016
HIPAA COMPLIANCE FOR YOUR
APACHE™ HADOOP®
ENVIRONMENT
Contents
Overview
Covered Entities
Penalties For Non-Compliance
Breach Notification
Building Support For HIPAA Compliance Into Your Hadoop Environment
How Hortonworks Helps You Meet Key HIPAA Requirements
Dynamic Metadata-Based Access Policies for Real-Time Access Control
Physical Safeguards
Conclusion
OVERVIEW

In 1996, Congress enacted the Health Insurance Portability and Accountability Act (HIPAA). Under HIPAA, the U.S. Department of Health and Human Services (HHS) issued the HIPAA Privacy Rule, which addresses the use and disclosure of individuals' health information, known as protected health information (PHI). The Privacy Rule protects all "individually identifiable health information" stored or transmitted by a covered entity or its business associate, in any form or media.

PHI is a core component of HIPAA. It includes demographic data such as name, address, birthdate and Social Security number, along with any other information that can identify an individual and relate to his or her:

•	 Past, present or future physical or mental health or condition
•	 Provision of health care to the individual
•	 Payment for the provision of health care to the individual

According to the American Health Information Management Association (AHIMA), an average of 150 people have access to a patient's medical records during a hospitalization, including x-ray technicians, nursing staff and billing personnel, among others. While many of these individuals have a legitimate need to see all or part of a patient's records, prior to HIPAA and the Privacy Rule no regulation specified:

•	 Which personnel could access patient records
•	 What type of information those personnel could see
•	 What actions those personnel were permitted to perform with the information once they had access to it

The objective of the Privacy Rule is to strike a balance between protecting individuals' health information and facilitating the flow of health information required to provide and promote high-quality health care.

HIPAA and the Privacy Rule have important data security and governance implications that have led to the application of technologies to control access to personal health information. These include the authentication and authorization of individuals who access patient information and the establishment of audit trails of those accessing or modifying information at different levels.
Covered Entities

The organizations that are subject to the Privacy Rule are known as covered entities. These include:

•	 Health plans that provide or pay for the cost of medical care
•	 Health care providers, including hospitals, physicians, dentists and any person or organization that furnishes, bills, or is paid for health care services
•	 Health care clearinghouses, including billing services, re-pricing companies, community health management information systems, and value-added networks and switches
1 Covered entities are required to comply with every Security Rule provision classified as a "Standard." Certain implementation specifications within those standards are classified as "addressable," while others are classified as "required." The "required" implementation specifications must be implemented. The "addressable" designation permits a covered entity to determine whether the implementation specification is reasonable and appropriate for that covered entity.
HIPAA TECHNICAL SAFEGUARDS – REQUIREMENT SUMMARY

| Standard | Section | Implementation Specification | Required (R) or Addressable (A)1 | Hortonworks Competency |
|---|---|---|---|---|
| Access Control | § 164.312(a)(1) | 1. Unique User Identification | R | Kerberos, Knox & Ranger with LDAP/AD |
| | | 2. Automatic Logoff | A | Ranger |
| | | 3. Encryption and Decryption | A | Wire encryption in Hadoop; HDFS encryption with Ranger |
| Audit Control | § 164.312(b) | Hardware & software procedural controls | R | Falcon, Ranger, Atlas |
| Integrity | § 164.312(c)(1) | Mechanism to Authenticate Electronic Protected Health Information | A | Ranger, Kerberos & Knox |
| Person or Entity Authentication | § 164.312(d) | | R | Ranger & Knox |
| Transmission Security | § 164.312(e)(1) | 1. Integrity Controls | A | |
| | | 2. Encryption | A | Wire encryption in Hadoop; HDFS encryption with Ranger |
Penalties For Non-Compliance

The Office for Civil Rights (OCR) in HHS is responsible for implementing and enforcing the Privacy Rule. OCR may impose a penalty on a covered entity for failure to comply with the Privacy Rule. Penalties take into consideration factors such as the date of the violation, whether the covered entity knew or should have known that it was in non-compliance, and whether the covered entity's failure to comply was due to willful neglect.

In 2009 Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act. The HITECH Act strengthened HIPAA enforcement by greatly increasing the penalties for violations and authorizing state attorneys general to enforce them. The HITECH Act also established the first federal data security breach notification requirements and mandated that HHS conduct privacy and security audits.
| | For Violations Occurring Prior to 2/18/2009 | For Violations Occurring On or After 2/18/2009 |
|---|---|---|
| Penalty Amount | Up to $100 per violation | $100 to $50,000 or more per violation |
| Calendar Year Cap | $25,000 | $1,500,000 |
A person who knowingly obtains or discloses individually identifiable health information may face a criminal penalty of up to $50,000 and up to one year of imprisonment. The criminal penalties increase to $100,000 and up to five years of imprisonment if the wrongful conduct involves false pretenses, and to $250,000 and up to 10 years of imprisonment if the wrongful conduct involves the intent to sell, transfer, or use identifiable health information for commercial advantage, personal gain or malicious harm.
Breach Notification

The HIPAA Breach Notification Rule requires covered entities to notify the affected individuals, the Secretary of HHS, and in certain circumstances the media, of a breach of protected health information. A breach is defined as an impermissible use or disclosure that compromises the security or privacy of the protected health information.

A business associate must also notify the covered entity if a breach occurs at or by the business associate.
Building Support For HIPAA Compliance Into Your Hadoop Environment

The Hortonworks Data Platform (HDP®) forms the core of a modern data architecture that allows enterprises to store and process massive amounts of structured, semi-structured, and unstructured data. Trusted by more than 700 customers and proven in highly security-conscious environments, HDP provides robust security capabilities to help businesses meet HIPAA requirements.

Based on a holistic approach to protection, the Hortonworks security framework revolves around five pillars: administration, authentication/perimeter security, authorization, audit and data protection. Rather than applying protection as an afterthought, Hortonworks prevents gaps and inconsistencies through a bottom-up platform approach that makes it possible to enforce and manage security across the stack through a central point of administration and management built into the DNA of HDP. By implementing security at the platform level, Hortonworks ensures that security is consistently administered to all the applications across the stack, and makes the process of adding or retiring applications operating on top of Hadoop seamless.

The Hortonworks security architecture begins with Apache Ranger, Apache Knox and Kerberos. Ranger serves as the central interface for security administration. Users can create and update policies, which are then stored in a policy database. Ranger plugins, lightweight Java programs embedded within the processes of each cluster component, pull these policies from the central server and cache them locally in a file. When a user request comes through the component, the plugin intercepts the request and evaluates it against the security policy. Plugins also collect data from the user request and, on a separate thread, send this data back to the audit server.

This platform approach ensures that the five pillars of Hadoop security complement each other effectively to enable comprehensive protection.

Figure 1: Comprehensive security in HDP. [The figure maps the five pillars across HDP: authentication/perimeter security (Kerberos; perimeter security with Apache Knox), authorization (fine-grained access control with Apache Ranger), audit (centralized audit reporting with Apache Ranger), and data protection (wire encryption in Hadoop; HDFS encryption with Ranger KMS), all under centralized security administration with Ranger.]

ADMINISTRATION

Ranger provides a single pane of glass to define, administer and manage security policies consistently across all the components of the Hadoop stack. Ranger enhances the productivity of security administrators and reduces potential errors by empowering them to define a security policy once and apply it to all the applicable components across the Hadoop stack from a central location.
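Concretely, the policies Ranger administers centrally are JSON objects managed through its public REST API. The sketch below is illustrative only: the `/service/public/v2/api/policy` path follows Ranger's public REST API v2, while the Ranger service name, database, table, columns and group are invented placeholders.

```python
import json

# Hypothetical Ranger Admin endpoint -- adjust host, port and credentials for your cluster.
RANGER_URL = "https://ranger.example.com:6080/service/public/v2/api/policy"

# A Hive policy granting a "billing" LDAP group SELECT on two columns of one table.
policy = {
    "service": "hadoopdev_hive",          # name of the Ranger Hive service repo (assumed)
    "name": "phi-billing-read",
    "resources": {
        "database": {"values": ["claims"]},
        "table":    {"values": ["patients"]},
        "column":   {"values": ["patient_id", "billing_code"]},
    },
    "policyItems": [{
        "groups":   ["billing"],
        "accesses": [{"type": "select", "isAllowed": True}],
    }],
}

payload = json.dumps(policy)
# Against a live Ranger Admin this payload would be POSTed, e.g. with requests:
#   requests.post(RANGER_URL, data=payload, auth=("admin", "..."),
#                 headers={"Content-Type": "application/json"})
print(payload)
```

Once stored, the Ranger plugin embedded in HiveServer2 pulls this policy down and enforces it locally on every query, with no round trip to the policy server on the request path.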
AUTHENTICATION AND PERIMETER SECURITY
Apache Knox Gateway ensures perimeter security for Hortonworks customers, providing a way for users to reliably identify themselves and have that identity propagated throughout the Hadoop cluster to access resources such as files and directories and to perform tasks such as running MapReduce jobs. Hortonworks uses Kerberos, an industry standard, to authenticate users and resources within the Hadoop cluster, and has simplified Kerberos setup, configuration and maintenance through Ambari 2.0. With Knox, enterprises can confidently extend the Hadoop REST API to new users without Kerberos complexities, while also maintaining compliance with enterprise security policies. Knox provides a central gateway for Hadoop REST APIs, which otherwise offer varying degrees of authorization, authentication, SSL and SSO capability, enabling a single access point for Hadoop.
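Requests routed through Knox follow the gateway's /gateway/&lt;topology&gt;/&lt;service&gt; URL pattern. As a small sketch (the hostname is a placeholder and "default" is just one common topology name), a WebHDFS call proxied through Knox can be assembled like this:

```python
from urllib.parse import urlencode

# Hypothetical Knox gateway host and topology -- substitute your own deployment's values.
KNOX_HOST = "https://knox.example.com:8443"
TOPOLOGY = "default"

def webhdfs_url(path: str, op: str) -> str:
    """Build a WebHDFS request URL routed through the Knox gateway."""
    return f"{KNOX_HOST}/gateway/{TOPOLOGY}/webhdfs/v1{path}?{urlencode({'op': op})}"

url = webhdfs_url("/data/claims", "LISTSTATUS")
print(url)
# Against a live gateway, an LDAP-authenticated user could then call, e.g.:
#   requests.get(url, auth=("alice", "..."), verify="/path/to/gateway-ca.pem")
```

The caller authenticates to Knox (for example with LDAP credentials over HTTPS), and Knox handles the Kerberos handshake with the cluster on the user's behalf.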
AUTHORIZATION
Ranger manages fine-grained access control through a rich user interface that ensures consistent
policy administration across Hadoop data access components. Security administrators have
the flexibility to define security policies for a database, table and column or a file, and administer
permissions for specific LDAP-based groups or individual users. Rules based on dynamic
conditions such as time or geography can also be added to an existing policy rule. The Ranger
authorization model is highly pluggable and can be easily extended to any data source using
a service-based definition. Ranger works with standard authorization APIs in each Hadoop
component and is able to enforce centrally administered policies for any method of accessing the
data lake. The combination of Ranger’s rich user interface with deep audit visibility makes it highly
intuitive to use, enhancing productivity for security administrators.
AUDIT
Established by Hortonworks with Aetna, Merck, Target and SAS, the Data Governance Initiative
(DGI) introduced a common approach to Hadoop data governance into the open source
community. This initiative has since evolved into a new open source project called Apache Atlas,
a set of core foundational governance services that enables enterprises to effectively and
efficiently meet their compliance requirements within Hadoop and allows integration with the
complete enterprise data ecosystem.
Ranger also provides a centralized framework for collecting access audit history and easily
reporting on this data, including the ability to filter data based on various parameters. Together
with Apache Atlas, this makes it possible for users to gain a comprehensive view of data lineage
and access audit, with an ability to query and filter audit based on data classification, users or
groups, and other filters.
DATA PROTECTION
HDP adds a robust layer of security by making data unreadable in transit over the network or at
rest on a disk. Hortonworks has recently introduced HDFS encryption for encrypting HDFS files,
complemented with a Ranger-embedded open source Hadoop key management store (KMS).
Ranger provides security administrators with the ability to manage keys and authorization policies
for KMS. Hortonworks is also working extensively with its encryption partners to integrate HDFS
encryption with enterprise-grade key management frameworks. With Hortonworks, our customers
have the flexibility to leverage an open source key management system (KMS) or use
enterprise-wide KMS solutions provided by the partner ecosystem.
Encryption in HDFS, combined with KMS access policies maintained by Ranger, prevents rogue
Linux or Hadoop administrators from accessing data and supports segregation of duties for both
data access and encryption.
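Operationally, HDFS encryption pairs a key held in the KMS with a directory designated as an encryption zone. The dry-run sketch below shows the two commands involved; the key name and path are invented, and on a live cluster the same argument lists would be passed to subprocess.run() by a user with the appropriate privileges.

```python
# Two-step flow for an HDFS encryption zone: create a key in the (Ranger) KMS,
# then mark a directory as an encryption zone tied to that key.
def encryption_zone_commands(key_name: str, zone_path: str):
    return [
        ["hadoop", "key", "create", key_name],            # key is created in the KMS
        ["hdfs", "crypto", "-createZone",
         "-keyName", key_name, "-path", zone_path],       # zone binds the directory to the key
    ]

for cmd in encryption_zone_commands("phi-key", "/secure/phi"):
    print(" ".join(cmd))
```

Files written under the zone are then encrypted transparently, and Ranger's KMS policies decide who may use the key, which is what keeps a rogue HDFS administrator from reading the plaintext.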
How Hortonworks Helps You Meet Key HIPAA Requirements

A covered entity must maintain reasonable and appropriate technical and physical safeguards to prevent intentional or unintentional disclosure of protected health information in violation of the Privacy Rule.

Access Controls: A covered entity must implement technical policies and procedures that allow only authorized persons to access electronic protected health information (e-PHI).
How Hortonworks can help: HDP supports implementation of Kerberos for authentication within a cluster. Apache Knox provides authentication for REST-based services and can be integrated with AD/LDAP or other authentication mechanisms. Integrating Kerberos and Knox with LDAP/AD also enables organizations to identify unique users.

Audit Controls: A covered entity must implement hardware, software, and/or procedural mechanisms to record and examine access and other activity in information systems that contain or use e-PHI.
How Hortonworks can help: Ranger provides a centralized framework to collect access audit history and easily report on this data, including the ability to filter data based on various parameters. The combination of Ranger and Apache Atlas makes it possible for users to gain a comprehensive view of data lineage and access audit, with an ability to query and filter audit based on data classification, users or groups, and other filters.

Integrity Controls: A covered entity must implement policies and procedures to ensure that e-PHI is not improperly altered or destroyed. Electronic measures must be put in place to confirm that e-PHI has not been improperly altered or destroyed.
How Hortonworks can help: Apache Ranger provides fine-grained access control across HDFS, Hive, HBase, Storm, Knox, Solr, Kafka and YARN. Security administrators have the flexibility to define security policies for a database, table and column or a file, and administer permissions for specific LDAP-based groups or individual users. With the introduction of dynamic tag-based policies, enabled through the integration of Ranger with Apache Atlas (data governance solution), users now have the flexibility to enforce both role-based (RBAC) and attribute-based (ABAC) access control. For further details, please refer to the section titled Dynamic Metadata-Based Access Policies for Real-Time Access Control.

Person or Entity Authentication: Implement procedures to verify that a person or entity seeking access to electronic protected health information is the one claimed.
How Hortonworks can help: HDP supports implementation of Kerberos for authentication within a cluster. Apache Knox provides authentication for REST-based services and can be integrated with AD/LDAP or other authentication mechanisms.

Transmission Security: A covered entity must implement technical security measures that guard against unauthorized access to e-PHI that is being transmitted over an electronic network.
How Hortonworks can help: HDP supports encrypting network traffic to provide privacy for data movement, as well as encryption for data at rest. HDP supports wire encryption for all access protocols, as well as for Solr, Kafka and YARN. HDP also supports encrypting files stored in Hadoop, including a Ranger-embedded open source Hadoop key management store (KMS); Ranger provides security administrators with the ability to manage keys and authorization policies for KMS. With HDP, Hortonworks customers have the flexibility to leverage the open source KMS or use enterprise-wide KMS solutions provided by the partner ecosystem.
Dynamic Metadata-Based Access Policies for Real-Time Access Control

Apache Atlas was proposed to provide data governance capabilities in Hadoop. At its core, Atlas is designed to exchange metadata both within and outside of the Hadoop stack. By reconciling logical data models and forensic events, enriched by business taxonomy metadata, Atlas enables a scalable set of core governance services (for detailed information on Apache Atlas, download the Governance White Paper from http://hortonworks.com/hdp/governance/). These services enable enterprises to effectively and efficiently address their compliance requirements by providing:

•	 Search and lineage for datasets
•	 Metadata-driven data access control
•	 Indexed and searchable centralized audit for operational events
•	 Comprehensive data lifecycle management from ingestion to disposition
•	 Metadata interchange with other metadata tools

By integrating Ranger with Atlas, Hortonworks empowers enterprises to rationalize compliance policy at runtime. Atlas's data classification schemes can now be leveraged by Ranger to enforce flexible attribute-based policies that prevent violations from occurring, as mandated by HIPAA.

Ranger's centralized platform empowers data administrators to define a security policy once, based on Atlas metadata tags or attributes defined by a data steward or administrator, and apply this policy in real time to the entire hierarchy of assets. Data stewards can focus on discovery and tagging while another group manages compliance policy. This decoupling of explicit policy offers two important benefits:

•	 Dynamic policy enforcement: data analysis-driven tags can be enforced immediately
•	 Reusability: one policy can be applied to many assets, simplifying management

Apache Ranger enforces both role-based (RBAC) and attribute-based (ABAC) access control to create a flexible security profile that meets the needs of data-driven enterprises. The initial set of policies being constructed within the community is defined as:

•	 Attribute-based access controls: For example, a column in a particular Hive table is marked with the metadata tag "PII" or "HIPAA-Sensitive." This tag is then used to assign multiple entitlements to a group. This is an evolution from role-based entitlements, which require discrete and static one-to-one mappings.
•	 Prohibition against dataset combinations: It is possible for two data sets (for example, one consisting of account numbers and the other of customer names) to be in compliance individually, but pose a violation if combined. Administrators can apply a metadata tag to both sets to prevent them from being combined, helping avoid such violations.
•	 Time-based access policies: Administrators can use metadata to define access according to time windows in order to enforce compliance with regulations such as SOX 90-day reporting rules.
•	 Location-specific access policies: Similar to time-based access policies, administrators can define entitlements differently by geography. For example, a U.S.-based user might be granted access to data while in a domestic office. If this user travels to Switzerland and tries to access the same data, the different geographical context triggers a different set of privacy rules to be evaluated and enforced.

These policies can be used in combination to create a sophisticated security access policy for each user at a given point in time and location, which can be highly relevant for HIPAA compliance. The reach that Apache Ranger provides in terms of authorization for an ever-growing number of Hadoop ecosystem components (eight at the time of this writing) allows organizations to consistently define and apply metadata-based data access policies regardless of the route by which a user or application attempts to access the data.
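A tag-based policy of this kind is, like a resource policy, a JSON object in Ranger, except that its resource is an Atlas tag rather than a database or path. The sketch below is a hedged illustration: the field names follow Ranger's tag-policy model, while the tag service name and group are invented placeholders.

```python
import json

# Any resource Atlas has tagged "HIPAA-Sensitive" becomes readable only by the
# "compliance" group -- one policy, applied to every asset carrying the tag.
tag_policy = {
    "service": "cl1_tag",                  # Ranger tag service synced with Atlas (assumed name)
    "name": "hipaa-sensitive-read",
    "resources": {"tag": {"values": ["HIPAA-Sensitive"]}},
    "policyItems": [{
        "groups":   ["compliance"],
        "accesses": [{"type": "select", "isAllowed": True}],
    }],
}

print(json.dumps(tag_policy, indent=2))
```

When a data steward later tags a new Hive column "HIPAA-Sensitive" in Atlas, this policy applies to it immediately, with no change to the policy itself. That is the dynamic-enforcement and reusability benefit described above.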
Physical Safeguards

These HIPAA safeguards relate to establishing policies and procedures that enable organizations to respond to an emergency that might damage electronic systems containing PHI, such as fire, vandalism, system failure, or natural disaster.

Apache Falcon can play a major role in addressing HIPAA physical safeguards related to Hadoop data replication, business continuity, and lineage tracing by providing a framework for data management and processing. Falcon leverages HDP components such as Pig, HDFS, and Oozie, and enables simplified definition, deployment and management of data pipelines. Data pipelines contain:

•	 A definition of the dataset to be processed
•	 Interfaces to the Hadoop cluster where the data resides
•	 A process definition that defines how the data is consumed and invokes processing logic

Falcon provides users with a central interface to manage the data lifecycle, where they can define and manage policies and pipelines for data ingest, processing, and export. Falcon also provides audit and compliance features that enable administrators to visualize data pipeline lineage, track data pipeline audit logs, and tag data with business metadata.

Apache Falcon addresses data governance requirements and provides a wizard-like GUI that eliminates hand coding of complex data sets.
Data Backup Plan: Establish and implement procedures to create and maintain retrievable exact copies of electronic protected health information.
How Hortonworks can help: Apache Falcon provides organizations with the capability to mirror HDFS directories or Hive tables, which produces an exact copy of the data and keeps both copies synchronized. Companies can also mirror between HDFS and Amazon S3 or Microsoft Azure. Database replication can be performed with Hive.

Disaster Recovery Plan: Establish procedures to restore any loss of data.
How Hortonworks can help: In addition to data replication, Falcon provides functionality to trigger processes for retry and to handle late-data-arrival logic. Falcon can also mirror file systems or Hive and HCatalog on clusters using recipes that enable users to re-use complex workflows.

Emergency Mode Operation Plan: Establish procedures to enable continuation of critical business processes related to protecting the security of electronic protected health information while operating in emergency mode.
How Hortonworks can help: Falcon provides an alerting mechanism for a variety of events to let administrators monitor the health of their data pipelines. All events are logged, and administrators can view the events from the log or capture them using a custom interface. Each event logged provides the following information:

•	 Date: date of the action
•	 Action: event name
•	 Dimensions: list of name/value pairs of various attributes for a given action
•	 Status: result of the action
•	 Time-taken: time in nanoseconds for a given action to complete
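The datasets Falcon mirrors and retains are declared as feed entities in XML. The sketch below is a hedged illustration only: element names follow Falcon's feed schema (uri:falcon:feed:0.1), while the feed, cluster and path names are invented, and an exact backup/retention configuration would depend on the deployment.

```python
import xml.etree.ElementTree as ET

# A feed of hourly PHI extracts, retained 90 days on the source cluster and
# replicated to a second "dr-site" cluster for the data backup plan.
FEED_XML = """
<feed name="phi-extracts" description="hourly PHI extracts" xmlns="uri:falcon:feed:0.1">
  <frequency>hours(1)</frequency>
  <clusters>
    <cluster name="primary" type="source">
      <validity start="2016-08-01T00:00Z" end="2017-08-01T00:00Z"/>
      <retention limit="days(90)" action="delete"/>
    </cluster>
    <cluster name="dr-site" type="target">
      <validity start="2016-08-01T00:00Z" end="2017-08-01T00:00Z"/>
      <retention limit="days(90)" action="delete"/>
    </cluster>
  </clusters>
  <locations>
    <location type="data" path="/data/phi/${YEAR}-${MONTH}-${DAY}-${HOUR}"/>
  </locations>
  <ACL owner="etl" group="ops" permission="0750"/>
  <schema location="/none" provider="none"/>
</feed>
"""

feed = ET.fromstring(FEED_XML)      # sanity-check that the entity parses
print(feed.attrib["name"])
```

On a cluster, such an entity would be submitted and scheduled through the Falcon CLI (falcon entity -type feed -submit -file feed.xml), after which Falcon drives the retention and replication automatically.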
Conclusion

To realize the full strategic benefits of Big Data without increasing the risk of compliance violations or data breaches, companies need to ensure that their Hadoop environments meet the strict requirements laid out in the HIPAA standard. The best way to accomplish this is through a holistic approach based on a platform that provides for all five essential pillars of Hadoop security: administration, authentication/perimeter security, authorization, audit and data protection. With HDP, Hortonworks provides robust capabilities to address key HIPAA requirements and facilitate the creation and maintenance of a fully compliant modern data architecture. More than 700 customers trust HDP to power their Big Data strategy, including some of the world's largest retail, e-commerce and telecom companies. To learn more about how Hortonworks can help you address key security and data protection challenges in your organization, we welcome you to download the Hortonworks white paper "Solving Hadoop Security" or speak with a Hortonworks representative at 1-855-8-HORTON.

About Hortonworks

Hortonworks is a leading innovator at creating, distributing and supporting enterprise-ready open data platforms. Our mission is to manage the world's data. We have a single-minded focus on driving innovation in open source communities such as Apache Hadoop, NiFi, and Spark. Our open Connected Data Platforms power Modern Data Applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Along with our 1600+ partners, we provide the expertise, training and services that allow our customers to unlock the transformational value of data across any line of business. We are Powering the Future of Data™.

Contact

+1 408 675-0983
+1 855 8-HORTON
INTL: +44 (0) 20 3826 1405
For further information visit www.hortonworks.com

© 2011-2016 Hortonworks Inc. All Rights Reserved.
More Related Content

What's hot

Healthcare CyberSecurity Update: Ensuring HIPAA Compliance with Cloud Service...
Healthcare CyberSecurity Update: Ensuring HIPAA Compliance with Cloud Service...Healthcare CyberSecurity Update: Ensuring HIPAA Compliance with Cloud Service...
Healthcare CyberSecurity Update: Ensuring HIPAA Compliance with Cloud Service...eFax Corporate®
 
Guide to hipaa compliance for containers
Guide to hipaa compliance for containersGuide to hipaa compliance for containers
Guide to hipaa compliance for containersAbhishek Sood
 
Cloud Computing Under HIPAA
Cloud Computing Under HIPAACloud Computing Under HIPAA
Cloud Computing Under HIPAABrad Rostolsky
 
Application Developers Guide to HIPAA Compliance
Application Developers Guide to HIPAA ComplianceApplication Developers Guide to HIPAA Compliance
Application Developers Guide to HIPAA ComplianceTrueVault
 
HIPAA-1-_FINAL_Draft
HIPAA-1-_FINAL_DraftHIPAA-1-_FINAL_Draft
HIPAA-1-_FINAL_DraftKevin Jenkins
 
HIPAA compliance for Business Associates- The value of compliance, how to acq...
HIPAA compliance for Business Associates- The value of compliance, how to acq...HIPAA compliance for Business Associates- The value of compliance, how to acq...
HIPAA compliance for Business Associates- The value of compliance, how to acq...Compliancy Group
 
Compliance & hipaa regulations
Compliance & hipaa regulationsCompliance & hipaa regulations
Compliance & hipaa regulationsrcpopp2002
 
HIPAA compliance tuneup 2016
HIPAA compliance tuneup 2016HIPAA compliance tuneup 2016
HIPAA compliance tuneup 2016Compliancy Group
 
GDPR/CCPA Compliance and Data Governance in Hadoop
GDPR/CCPA Compliance and Data Governance in HadoopGDPR/CCPA Compliance and Data Governance in Hadoop
GDPR/CCPA Compliance and Data Governance in HadoopEyad Garelnabi
 
Drug Discovery Innovation in a Precompetitive Cloud Platform (LFS302-S) - AWS...
Drug Discovery Innovation in a Precompetitive Cloud Platform (LFS302-S) - AWS...Drug Discovery Innovation in a Precompetitive Cloud Platform (LFS302-S) - AWS...
Drug Discovery Innovation in a Precompetitive Cloud Platform (LFS302-S) - AWS...Amazon Web Services
 
HIPAA Compliance For Small Practices
HIPAA Compliance For Small PracticesHIPAA Compliance For Small Practices
HIPAA Compliance For Small PracticesNisos Health
 
Iadmdhipmkt1.0
Iadmdhipmkt1.0Iadmdhipmkt1.0
Iadmdhipmkt1.0profit10
 
A Framework for Health Information Technology and Network Security
A Framework for Health Information Technology and Network Security A Framework for Health Information Technology and Network Security
A Framework for Health Information Technology and Network Security Jeff Horsager
 
Health insurance portability and accountability act (hipaa)
Health insurance portability and accountability act (hipaa)Health insurance portability and accountability act (hipaa)
Health insurance portability and accountability act (hipaa)ZyLAB
 
Mbm Hipaa Hitech Ss Compliance Risk Assessment
Mbm Hipaa Hitech Ss Compliance Risk AssessmentMbm Hipaa Hitech Ss Compliance Risk Assessment
Mbm Hipaa Hitech Ss Compliance Risk AssessmentMBMeHealthCareSolutions
 
How to safeguard ePHIi in the cloud
How to safeguard ePHIi in the cloud How to safeguard ePHIi in the cloud
How to safeguard ePHIi in the cloud Compliancy Group
 

What's hot (20)

2010 New Guidelines Hipaa Checklist V1


•	 Personnel who could access patient records
•	 Type of information these personnel were able to see
•	 Actions these personnel were permitted to perform with the information once they had access to it

The objective of the Privacy Rule is to strike a balance between protecting individuals’ health information and facilitating the flow of health information that is required to provide and promote high-quality health care. HIPAA and the Privacy Rule have important data security and governance implications that have led to the application of technologies to control access to personal health information. These applications include the authentication and authorization of individuals who have access to patient information, and the establishment of audit trails for those accessing or modifying information at different levels.
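The three questions above — who may access a record, which parts of it, and what they may do with it — amount to a role-based access check. The following is a minimal illustrative sketch of such a check; the roles, record fields and actions are hypothetical examples, not part of any HIPAA specification or Hortonworks product.

```python
# Hypothetical role-based policy: which personnel may see which parts of a
# patient record, and what they may do with them. All names are illustrative.
POLICY = {
    "radiology_tech": {"fields": {"imaging"},               "actions": {"read"}},
    "nurse":          {"fields": {"vitals", "medications"}, "actions": {"read", "update"}},
    "billing_clerk":  {"fields": {"insurance", "charges"},  "actions": {"read"}},
}

def is_permitted(role: str, field: str, action: str) -> bool:
    """Answer the three pre-HIPAA open questions for a single request:
    may this person see this part of the record, and do this with it?"""
    rule = POLICY.get(role)
    return bool(rule) and field in rule["fields"] and action in rule["actions"]

print(is_permitted("nurse", "vitals", "update"))         # True
print(is_permitted("billing_clerk", "imaging", "read"))  # False
```

A real deployment would back such a check with a directory service and central policy store rather than an in-process dictionary.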
The organizations that are subject to the Privacy Rule are known as covered entities. These include:
•	 Health plans that provide or pay for the cost of medical care
•	 Health care providers, including hospitals, physicians, dentists and any person or organization that furnishes, bills, or is paid for health care services
•	 Health care clearinghouses, including billing services, re-pricing companies, community health management information systems, and value-added networks and switches

Covered entities are required to comply with every Security Rule standard. Certain implementation specifications within those standards are classified as “addressable,” while others are classified as “required.” The “required” implementation specifications must be implemented. The “addressable” designation permits a covered entity to determine whether the implementation specification is reasonable and appropriate for that entity.

HIPAA Technical Safeguards – Requirement Summary

Access Control (§ 164.312(a)(1))
•	 Unique User Identification — Required — Kerberos, Knox, Ranger with LDAP/AD
•	 Automatic Logoff — Addressable — Ranger
•	 Encryption and Decryption — Addressable — Wire encryption in Hadoop; HDFS encryption with Ranger

Audit Control (§ 164.312(b))
•	 Hardware, software and procedural controls — Required — Falcon, Ranger, Atlas

Integrity (§ 164.312(c)(1))
•	 Mechanism to authenticate electronic protected health information — Addressable — Ranger, Kerberos, Knox

Person or Entity Authentication (§ 164.312(d)) — Required — Ranger, Knox

Transmission Security (§ 164.312(e)(1))
•	 Integrity Controls — Addressable
•	 Encryption — Addressable — Wire encryption in Hadoop; HDFS encryption with Ranger
The Office for Civil Rights (OCR) in Health and Human Services is responsible for implementing and enforcing the Privacy Rule. OCR may impose a penalty on a covered entity for failure to comply with the Privacy Rule. Penalties take into consideration factors such as the date of the violation, whether the covered entity knew or should have known that it was in non-compliance, and whether the covered entity’s failure to comply was due to willful neglect.

In 2009, Congress passed the Health Information Technology for Economic and Clinical Health Act (HITECH). The HITECH Act strengthened HIPAA enforcement by greatly increasing the penalties for violations and authorizing states’ attorneys general to enforce them. The HITECH Act also outlined the first federal data security breach notification requirements, and mandated that HHS conduct privacy and security audits.

•	 For violations occurring prior to 2/18/2009: up to $100 per violation, with a calendar-year cap of $25,000
•	 For violations occurring on or after 2/18/2009: $100 to $50,000 or more per violation, with a calendar-year cap of $1,500,000

A person who knowingly obtains or discloses individually identifiable health information may face a criminal penalty of up to $50,000 and up to one year of imprisonment. The criminal penalties increase to $100,000 and up to five years of imprisonment if the wrongful conduct involves false pretenses, and to $250,000 and up to 10 years of imprisonment if the wrongful conduct involves the intent to sell, transfer, or use identifiable health information for commercial advantage, personal gain or malicious harm.
The HIPAA Breach Notification Rule requires covered entities to notify the affected individuals, the Secretary, and in certain circumstances the media of a breach of protected health information. A breach is defined as an impermissible use or disclosure that compromises the security or privacy of the protected health information. A business associate is also responsible for notifying the covered entity if the breach occurs at or by the business associate.
Building Support For HIPAA Compliance Into Your Hadoop Environment

The Hortonworks Data Platform (HDP®) forms the core of the modern data architecture that allows enterprises to store and process massive amounts of structured, semi-structured, and unstructured data. Trusted by more than 700 customers and proven in highly security-conscious environments, HDP provides robust security capabilities to help businesses meet HIPAA requirements. Based on a holistic approach to protection, the Hortonworks security framework revolves around five pillars: administration, authentication/perimeter security, authorization, audit and data protection.

Rather than applying protection as an afterthought, Hortonworks prevents gaps and inconsistencies through a bottom-up platform approach that makes it possible to enforce and manage security across the stack through a central point of administration and management built into the DNA of HDP. By implementing security at the platform level, Hortonworks ensures that security is consistently administered across all the applications in the stack, and makes the process of adding or retiring applications that run on top of Hadoop seamless.

The Hortonworks security architecture begins with Apache Ranger, Apache Knox and Kerberos. Ranger serves as the central interface for security administration. Users can create and update policies, which are then stored in a policy database. Ranger plugins — lightweight Java programs — are embedded within the processes of each cluster component. These plugins pull in policies from a central server and store them locally in a file. When a user request comes through the component, the plugin intercepts the request and evaluates it against the security policy. Plugins also collect data from the user request and, on a separate thread, send this data back to the audit server.
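The plugin flow described above — cached policies evaluated locally, with every decision logged for the audit server — can be sketched as follows. This is an illustrative simplification, not the actual Apache Ranger plugin API; the policy fields and group names are assumptions.

```python
# Illustrative sketch of a Ranger-style plugin evaluating a request against
# locally cached policies. Field names and groups are hypothetical.
POLICY_CACHE = [
    {"resource": "/data/phi/",    "groups": {"clinicians"},             "accesses": {"read"}},
    {"resource": "/data/public/", "groups": {"analysts", "clinicians"}, "accesses": {"read", "write"}},
]

AUDIT_LOG = []  # in Ranger, this data is shipped to the audit server on a separate thread

def authorize(user: str, groups: set, resource: str, access: str) -> bool:
    """Allow the request if any cached policy covers the resource prefix,
    one of the user's groups, and the requested access type."""
    allowed = any(
        resource.startswith(p["resource"])
        and groups & p["groups"]
        and access in p["accesses"]
        for p in POLICY_CACHE
    )
    AUDIT_LOG.append({"user": user, "resource": resource,
                      "access": access, "result": "allowed" if allowed else "denied"})
    return allowed

authorize("dr_kim", {"clinicians"}, "/data/phi/records.csv", "read")  # allowed
authorize("intern", {"analysts"},   "/data/phi/records.csv", "read")  # denied, but still audited
```

Note that denied requests are audited as well — in a HIPAA context, failed access attempts are as important to record as successful ones.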
This platform approach ensures that the five pillars of Hadoop security complement each other effectively to enable comprehensive protection.

Figure 1: Comprehensive security in HDP — the five pillars: administration (centralized security administration with Ranger); authentication/perimeter security (Kerberos, perimeter security with Apache Knox); authorization (fine-grained access control with Apache Ranger); audit (centralized audit reporting with Apache Ranger); data protection (wire encryption in Hadoop, HDFS encryption with Ranger KMS).

ADMINISTRATION
Ranger provides a single pane of glass to define, administer and manage security policies consistently across all the components of the Hadoop stack. Ranger enhances the productivity of security administrators and reduces potential errors by empowering them to define a security policy once and apply it to all the applicable components across the Hadoop stack from a central location.
AUTHENTICATION AND PERIMETER SECURITY
The Apache Knox Gateway ensures perimeter security for Hortonworks customers, providing a way for users to reliably identify themselves and then have that identity propagated throughout the Hadoop cluster to access resources such as files and directories, and to perform tasks such as running MapReduce jobs. Hortonworks uses Kerberos, an industry standard, to authenticate users and resources within a Hadoop cluster. Hortonworks has also simplified Kerberos setup, configuration and maintenance through Ambari 2.0.

With Knox, enterprises can confidently extend the Hadoop REST API to new users without Kerberos complexities, while also maintaining compliance with enterprise security policies. Knox provides a central gateway for Hadoop REST APIs that have varying degrees of authorization, authentication, SSL and SSO capabilities, enabling a single access point for Hadoop.

AUTHORIZATION
Ranger manages fine-grained access control through a rich user interface that ensures consistent policy administration across Hadoop data access components. Security administrators have the flexibility to define security policies for a database, table and column or a file, and to administer permissions for specific LDAP-based groups or individual users. Rules based on dynamic conditions such as time or geography can also be added to an existing policy rule. The Ranger authorization model is highly pluggable and can easily be extended to any data source using a service-based definition.

Ranger works with the standard authorization APIs in each Hadoop component and can enforce centrally administered policies for any method of accessing the data lake. The combination of Ranger’s rich user interface and deep audit visibility makes it highly intuitive to use, enhancing productivity for security administrators.
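Knox's single-access-point pattern means clients address one gateway URL rather than individual cluster services. Knox conventionally exposes cluster REST APIs under a `/gateway/<topology>/` path prefix; the sketch below builds a WebHDFS URL in that shape, with the host, topology name and port being deployment-specific assumptions.

```python
from urllib.parse import urlencode

def knox_webhdfs_url(host: str, topology: str, path: str, op: str,
                     port: int = 8443, **params) -> str:
    """Build a WebHDFS request URL routed through an Apache Knox gateway.
    Knox proxies cluster REST APIs under /gateway/<topology>/...;
    host, topology and port here are illustrative deployment values."""
    query = urlencode({"op": op, **params})
    return f"https://{host}:{port}/gateway/{topology}/webhdfs/v1{path}?{query}"

url = knox_webhdfs_url("knox.example.com", "default", "/data/phi", "LISTSTATUS")
print(url)  # https://knox.example.com:8443/gateway/default/webhdfs/v1/data/phi?op=LISTSTATUS
```

The client authenticates to Knox (for example, over TLS with LDAP credentials) and never needs Kerberos tickets of its own — Knox handles the Kerberos exchange with the cluster on the client's behalf.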
AUDIT
Established by Hortonworks with Aetna, Merck, Target and SAS, the Data Governance Initiative (DGI) introduced a common approach to Hadoop data governance into the open source community. This initiative has since evolved into a new open source project, Apache Atlas, a set of core foundational governance services that enables enterprises to meet their compliance requirements within Hadoop effectively and efficiently, and allows integration with the complete enterprise data ecosystem. Ranger also provides a centralized framework for collecting access audit history and easily reporting on this data, including the ability to filter data based on various parameters. Together with Apache Atlas, this gives users a comprehensive view of data lineage and access audit, with the ability to query and filter the audit based on data classification, users or groups, and other criteria.

DATA PROTECTION
HDP adds a robust layer of security by making data unreadable in transit over the network or at rest on disk. Hortonworks has introduced HDFS encryption for encrypting HDFS files, complemented by a Ranger-embedded open source Hadoop key management store (KMS). Ranger gives security administrators the ability to manage keys and authorization policies for the KMS. Hortonworks is also working extensively with its encryption partners to integrate HDFS encryption with enterprise-grade key management frameworks. With Hortonworks, customers have the flexibility to leverage the open source KMS or use enterprise-wide KMS solutions provided by the partner ecosystem. Encryption in HDFS, combined with KMS access policies maintained by Ranger, prevents rogue Linux or Hadoop administrators from accessing data and supports segregation of duties for both data access and encryption.
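To make the KMS discussion concrete, the sketch below assembles the request body used to create an encryption-zone key through the standard Hadoop KMS REST API (POST /kms/v1/keys). The key name, description and the CLI commands in the trailing comments are illustrative assumptions for a typical HDFS transparent-encryption setup, not prescribed values.

```python
import json

def kms_create_key_request(key_name, length_bits=128,
                           cipher="AES/CTR/NoPadding", description=""):
    """Build the JSON body for the Hadoop KMS key-creation endpoint."""
    return {
        "name": key_name,
        "cipher": cipher,
        "length": length_bits,
        "description": description,
    }

# Hypothetical key protecting an HDFS encryption zone for PHI.
req = kms_create_key_request(
    "phi_zone_key",
    description="Key for the /data/phi encryption zone")
print(json.dumps(req))

# With the key in place, the encryption zone itself is typically created
# from the command line (illustrative paths):
#   hadoop key create phi_zone_key          # alternative CLI route to the same key
#   hdfs crypto -createZone -keyName phi_zone_key -path /data/phi
```

Files subsequently written under the zone path are encrypted transparently; Ranger KMS policies then govern which users and services may retrieve the key material.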
How Hortonworks Helps You Meet Key HIPAA Requirements

A covered entity must maintain reasonable and appropriate technical and physical safeguards to prevent intentional or unintentional disclosure of protected health information in violation of the Privacy Rule. The requirements below map to specific HDP capabilities.

Access Controls: A covered entity must implement technical policies and procedures that allow only authorized persons to access electronic protected health information (e-PHI).
How Hortonworks can help: HDP supports implementation of Kerberos for authentication within a cluster. Apache Knox provides authentication for REST-based services and can be integrated with AD/LDAP or other authentication mechanisms. Integrating Kerberos and Knox with LDAP/AD also enables organizations to identify unique users.

Audit Controls: A covered entity must implement hardware, software, and/or procedural mechanisms to record and examine access and other activity in information systems that contain or use e-PHI.
How Hortonworks can help: Ranger provides a centralized framework to collect access audit history and easily report on this data, including the ability to filter data based on various parameters. The combination of Ranger and Apache Atlas makes it possible for users to gain a comprehensive view of data lineage and access audit, with the ability to query and filter the audit based on data classification, users or groups, and other filters.

Integrity Controls: A covered entity must implement policies and procedures to ensure that e-PHI is not improperly altered or destroyed.
Electronic measures must be put in place to confirm that e-PHI has not been improperly altered or destroyed.
How Hortonworks can help: Apache Ranger provides fine-grained access control across HDFS, Hive, HBase, Storm, Knox, Solr, Kafka and YARN. Security administrators have the flexibility to define security policies for a database, table and column or a file, and administer permissions for specific LDAP-based groups or individual users. With the introduction of dynamic tag-based policies enabled through the integration of Ranger with Apache Atlas (a data governance solution), users now have the flexibility to enforce both role-based (RBAC) and attribute-based (ABAC) access control. For further details, please refer to the section titled Dynamic Metadata-Based Access Policies for Real-Time Access Control.

Person or Entity Authentication: Implement procedures to verify that a person or entity seeking access to electronic protected health information is the one claimed.
How Hortonworks can help: HDP supports implementation of Kerberos for authentication within a cluster. Apache Knox provides authentication for REST-based services and can be integrated with AD/LDAP or other authentication mechanisms.

Transmission Security: A covered entity must implement technical security measures that guard against unauthorized access to e-PHI being transmitted over an electronic network.
How Hortonworks can help: HDP supports encrypting network traffic to provide privacy for data movement, as well as for data at rest. HDP supports wire encryption for all access protocols as well as Solr, Kafka and YARN, with dynamic attributes such as geography, time and data driving security policy decisions. HDP also supports the ability to encrypt files stored in Hadoop, including a Ranger-embedded open source Hadoop key management store (KMS). Ranger gives security administrators the ability to manage keys and authorization policies for the KMS. With HDP, Hortonworks customers have the flexibility to leverage the open source KMS or use enterprise-wide KMS solutions provided by the partner ecosystem.
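As a sketch of the wire-encryption settings referenced above, the fragment below shows two standard Hadoop configuration properties commonly used to protect data in transit. The values are illustrative; the exact settings for a cluster should follow the organization's security policy and the HDP security documentation.

```xml
<!-- core-site.xml: protect Hadoop RPC traffic.
     "privacy" means authentication + integrity + encryption. -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>

<!-- hdfs-site.xml: encrypt the DataNode block data-transfer protocol. -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
```

Comparable settings exist for the HTTP channels (SSL for the web UIs and REST endpoints) and for shuffle traffic; together they cover the transmission paths a HIPAA transmission-security assessment would examine.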
Dynamic Metadata-Based Access Policies for Real-Time Access Control

Apache Atlas was proposed to provide data governance capabilities in Hadoop. At its core, Atlas is designed to exchange metadata both within and outside of the Hadoop stack. By reconciling logical data models with forensic events, enriched by business taxonomy metadata, Atlas enables a scalable set of core governance services (for detailed information on Apache Atlas, download the Governance White Paper from http://hortonworks.com/hdp/governance/). These services enable enterprises to address their compliance requirements effectively and efficiently by providing:
• Search and lineage for datasets
• Metadata-driven data access control
• Indexed and searchable centralized audit for operational events
• Comprehensive data lifecycle management from ingestion to disposition
• Metadata interchange with other metadata tools

By integrating Ranger with Atlas, Hortonworks empowers enterprises to rationalize compliance policy at runtime. Atlas's data classification schemes can now be leveraged by Ranger to enforce flexible attribute-based policies that prevent the violations HIPAA prohibits from occurring in the first place. Ranger's centralized platform empowers data administrators to define a security policy once, based on Atlas metadata tags or attributes defined by a data steward or administrator, and apply that policy in real time to an entire hierarchy of assets. Data stewards can focus on discovery and tagging while another group manages compliance policy. This decoupling of explicit policy offers two important benefits:
• Dynamic policy enforcement: data-analysis-driven tags can be enforced immediately
• Reusability: one policy can be applied to many assets, simplifying management

Apache Ranger enforces both role-based (RBAC) and attribute-based (ABAC) access control to create a flexible security profile that meets the needs of data-driven enterprises.
The initial set of policies being constructed within the community includes:
• Attribute-based access controls: For example, a column in a particular Hive table is marked with the metadata tag "PII" or "HIPAA-Sensitive". The tag can then be used to assign entitlements to an entire group, an evolution from role-based entitlements, which require discrete and static one-to-one mappings.
• Prohibition against dataset combinations: It's possible for two datasets, for example one consisting of account numbers and the other of customer names, to be in compliance individually but pose a violation if combined. Administrators can apply a metadata tag to both sets to prevent them from being combined, helping avoid such violations.
• Time-based access policies: Administrators can use metadata to define access according to time windows in order to enforce compliance with regulations such as SOX 90-day reporting rules.
• Location-specific access policies: Similar to time-based access policies, administrators can define entitlements differently by geography. For example, a U.S.-based user might be granted access to data while in a domestic office. If this user travels to Switzerland and tries to access the same data, the different geographic context triggers a different set of privacy rules to be evaluated and enforced.

These policies can be used in combination to create a sophisticated security access policy for each user at a given time and location, which can be highly relevant for HIPAA compliance. The reach that Apache Ranger provides in terms of authorization for an ever-growing number of Hadoop ecosystem components (eight at the time of this writing) allows organizations to consistently define and apply metadata-based data access policies regardless of the route by which a user or application attempts to access the data.
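The policy types listed above can be illustrated with a toy evaluator. This is a minimal sketch only: the tag names, user attributes and deny-unless-all-allow logic are assumptions made for the example and do not reflect Ranger's actual policy engine.

```python
from datetime import time

def evaluate(request, policies):
    """Deny unless every applicable policy allows the request."""
    for policy in policies:
        if not policy(request):
            return "DENY"
    return "ALLOW"

def hipaa_tag_clinical_only(req):
    # Attribute-based rule: data tagged "HIPAA-Sensitive" is readable
    # only by the clinical group.
    return "HIPAA-Sensitive" not in req["resource_tags"] or req["group"] == "clinical"

def business_hours_only(req):
    # Time-based rule: access permitted only between 08:00 and 18:00.
    return time(8, 0) <= req["access_time"] <= time(18, 0)

def domestic_only_for_pii(req):
    # Location-based rule: data tagged "PII" is readable only from a US office.
    return req["geo"] == "US" or "PII" not in req["resource_tags"]

policies = [hipaa_tag_clinical_only, business_hours_only, domestic_only_for_pii]

# A clinician reading HIPAA-sensitive data during business hours in the US.
ok = {"group": "clinical", "resource_tags": {"HIPAA-Sensitive"},
      "access_time": time(9, 30), "geo": "US"}
# The same user trying to read PII-tagged data while traveling in Switzerland.
abroad = dict(ok, geo="CH", resource_tags={"PII"})

print(evaluate(ok, policies))      # ALLOW
print(evaluate(abroad, policies))  # DENY
```

The key design point the sketch mirrors is that policies are expressed against tags and request attributes rather than against individual users, so one rule covers every asset carrying the tag.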
Physical Safeguards

These HIPAA safeguards relate to establishing policies and procedures that enable organizations to respond to an emergency that might damage electronic systems containing PHI, such as fire, vandalism, system failure or natural disaster. Apache Falcon can play a major role in addressing HIPAA physical safeguards related to Hadoop data replication, business continuity and lineage tracing by deploying a framework for data management and processing. Falcon leverages HDP components such as Pig, HDFS and Oozie, and enables simplified definition, deployment and management of data pipelines. Data pipelines contain:
• A definition of the dataset to be processed
• Interfaces to the Hadoop cluster where the data resides
• A process definition that defines how the data is consumed and invokes processing logic

Falcon provides users with a central interface to manage the data lifecycle, where they can define and manage policies and pipelines for data ingest, processing and export. Falcon also provides audit and compliance features that enable administrators to visualize data pipeline lineage, track data pipeline audit logs, and tag data with business metadata. Falcon addresses data governance requirements with a wizard-like GUI that eliminates hand coding of complex datasets.

Data Backup Plan: Establish and implement procedures to create and maintain retrievable exact copies of electronic protected health information.
How Hortonworks can help: Apache Falcon provides organizations with the capability to mirror HDFS directories or Hive tables, which produces an exact copy of the data and keeps both copies synchronized. Companies can also mirror between HDFS and Amazon S3 or Microsoft Azure. Database replication can be performed with Hive.
Disaster Recovery Plan: Establish procedures to restore any loss of data.
How Hortonworks can help: In addition to data replication, Falcon provides functionality to trigger retries and to handle late-data-arrival logic. Falcon can also mirror file systems, or Hive and HCatalog, across clusters using recipes that enable users to re-use complex workflows.

Emergency Mode Operation Plan: Establish procedures to enable continuation of critical business processes that protect the security of electronic protected health information while operating in emergency mode.
How Hortonworks can help: Falcon provides an alerting mechanism for a variety of events so administrators can monitor the health of their data pipelines. All events are logged, and administrators can view them from the log or capture them using a custom interface. Each logged event provides the following information:
• Date: Date of the action
• Action: Event name
• Dimensions: List of name/value pairs of various attributes for the action
• Status: Result of the action
• Time-taken: Time in nanoseconds for the action to complete
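The event fields listed above can be represented as a small record type, which is how a custom monitoring interface might hold them before alerting. This is an illustrative sketch; the field names follow the list in the text, and the event values and the notion of a "SUCCEEDED" status are assumptions for the example, not Falcon's actual log format.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class PipelineEvent:
    date: str                                                 # Date of the action
    action: str                                               # Event name
    dimensions: Dict[str, str] = field(default_factory=dict)  # name/value attributes
    status: str = "SUCCEEDED"                                 # Result of the action
    time_taken_ns: int = 0                                    # Duration in nanoseconds

    def is_failure(self) -> bool:
        """Flag events an administrator would want to alert on."""
        return self.status != "SUCCEEDED"

# Hypothetical failed replication run between a primary and a DR cluster.
evt = PipelineEvent(
    date="2016-08-01T02:00Z",
    action="hdfs-mirror",
    dimensions={"source": "primary", "target": "dr-site"},
    status="FAILED",
    time_taken_ns=4_200_000_000,
)
print(evt.is_failure())  # True
```

A monitoring script consuming the Falcon event log could collect such records and page an operator whenever `is_failure()` holds, satisfying the emergency-mode monitoring requirement above.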
Conclusion

To realize the full strategic benefits of Big Data without increasing the risk of compliance violations or data breaches, companies need to ensure that their Hadoop environments meet the strict requirements laid out in the HIPAA standard. The best way to accomplish this is through a holistic approach based on a platform that provides for all five essential pillars of Hadoop security: administration, authentication/perimeter security, authorization, audit and data protection. With HDP, Hortonworks provides robust capabilities to address key HIPAA requirements and facilitates the creation and maintenance of a fully compliant modern data architecture. More than 700 customers trust HDP to power their Big Data strategy, including some of the world's largest retail, e-commerce and telecom companies.

About Hortonworks
Hortonworks is a leading innovator at creating, distributing and supporting enterprise-ready open data platforms. Our mission is to manage the world's data. We have a single-minded focus on driving innovation in open source communities such as Apache Hadoop, NiFi, and Spark. Our open Connected Data Platforms power Modern Data Applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Along with our 1600+ partners, we provide the expertise, training and services that allow our customers to unlock the transformational value of data across any line of business. We are Powering the Future of Data™.

Contact
+1 408 675-0983
+1 855 8-HORTON
INTL: +44 (0) 20 3826 1405
For further information visit www.hortonworks.com

© 2011-2016 Hortonworks Inc. All Rights Reserved.
To learn more about how Hortonworks can help you address key security and data protection challenges in your organization, we welcome you to download the Hortonworks white paper "Solving Hadoop Security" or speak with a Hortonworks representative at 1-855-8-HORTON.