With the California Consumer Privacy Act (CCPA) going into effect in 2020, organizations must comply with a new set of sweeping provisions designed to protect the privacy of consumer data. Organizations inside and outside of the state must assess their exposure to CCPA, then quickly transform how they process, share, and protect sensitive data.
Background if required:
CCPA introduces several new requirements, including five core rights for individuals and an obligation to cure any alleged violation within 30 days of notice. Compliance will require re-engineering your systems and processes, as well as your culture. The regulator can issue instructions to cease data processing, and non-compliance can lead to loss of brand reputation and trust.
Individuals have a number of rights regarding their personal data, such as:
The right to access it
The right to have it deleted
The right to know why the data is collected and to whom it is sold
The right to opt out of the sale of personal information (consumers aged 13–16 must explicitly opt in)
The right to equal service (no discrimination)
Consent should be freely given, specific, unambiguous and per purpose.
The most fundamental principle of GDPR & CCPA is the obligation to process personal data “lawfully, adequately, accurately and securely”.
Controllers of personal data must implement the principles of “Data Protection by design and by default”, which address topics like data minimization, pseudonymization, transparency, and security.
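Pseudonymization, one of the by-design techniques mentioned above, can be sketched in a few lines. This is a minimal illustration, not a production scheme: the secret, the token length, and the function name are assumptions, and a real deployment would manage the key in a vault and assess re-identification risk.

```python
import hashlib
import hmac

# Hypothetical secret kept outside the dataset; rotating or destroying it
# breaks the link between pseudonyms and the original identifiers.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    # Truncated hex token (length is an arbitrary choice for the sketch).
    return digest.hexdigest()[:16]

# The same input always yields the same token, so records can still be
# joined on the pseudonym without exposing the underlying value.
assert pseudonymize("alice@example.com") == pseudonymize("alice@example.com")
```

Because the mapping is keyed and one-way, the pseudonymized dataset supports data minimization: downstream teams can link records without ever holding the real identifier.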
We have 10 blocks of services.
While clients can choose end-to-end services, they can also select individual services; we deliver according to your needs.
For example, you might choose only the Data Discovery, Data Lifecycle, and Data Protection services.
This is the original 10 components / workstreams slide.
We recommend that the privacy program consist of these 10 components, or workstreams.
Led by the Strategy & Governance workstream.
Each workstream addresses specific requirements of a privacy program.
Data breach: breaches across the private and public sectors have cost millions in damages, and those are just the ones that made the news. Many more breaches go undiscovered for months or even years. Customer churn, fines, and reputational harm are the real dangers of a data breach.
Regulations: HIPAA and PCI-DSS in the U.S., as well as GDPR in EMEA, mandate protection and de-identification of certain kinds of data.
Let’s go into a bit more detail around how Delphix works.
Our Dynamic Data Platform installs on-premises or in the cloud and ingests data from various sources. Often this is an RDBMS such as Oracle, SQL Server, DB2, or ASE, but we can also work with data from file systems.
Delphix virtualizes that data, allowing users to create lightweight virtual copies that are space-efficient and highly portable. These are complete copies of the source data that are fully readable and writable.
We automatically identify sensitive data values and apply data masking within those virtual copies to protect sensitive information. Finally, we package those virtual copies into personal data pods that are delivered to end users in just minutes.
Data pods contain secure, virtual copies of data along with data controls that allow users to manipulate that data: users can instantly refresh, rewind, branch, or share those copies on a self-service basis.
Delphix is unique in integrating data masking with data delivery. For GDPR compliance, this means you can ensure that all of the data you’ve sent to non-prod environments has been scrubbed of sensitive information. You no longer have to worry about compliance for a large proportion of your environments.
Continuously Identify Risk
Delphix Notes: Many organizations find themselves in a state where they have limited visibility into their exposure to data risk. If they do have visibility, it may be limited to a certain set of data sources or it might reflect only a specific time period. This slide zeroes in on our ability to assess data risk at an enterprise level, and on an ongoing basis. We provide an approach for pinpointing sensitive data that scales easily, is highly automated, and can be repeated by policy.
Narrative: For many organizations, the first step in securing their data and enabling compliance is determining where their sensitive data resides. Delphix provides a built-in capability to automatically pinpoint sensitive information that might be subject to privacy laws or your own internal security standards. The same capability works across different sources, and allows teams to leverage both pre-configured profiling sets tuned for specific apps, regulations, or verticals, as well as flexible templates that allow you to discover sensitive data types specific to your business. This capability can be deployed across large data estates and be triggered by policy via API, giving you continuous, enterprise-wide visibility into data risk.
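As a rough illustration of what pattern-based sensitive-data profiling looks like, here is a minimal sketch. The pattern names, regular expressions, sample size, and threshold are assumptions chosen for the demonstration; they are not Delphix's actual profiling sets or defaults.

```python
import re

# Illustrative "profiling set": a named collection of patterns to test
# column values against (all patterns here are simplified assumptions).
PROFILING_SET = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def profile_column(name, values, sample_size=100, threshold=0.5):
    """Flag a column as sensitive if enough sampled values match a pattern."""
    sample = values[:sample_size]
    hits = {}
    for label, pattern in PROFILING_SET.items():
        matches = sum(1 for v in sample if pattern.search(str(v)))
        if sample and matches / len(sample) >= threshold:
            hits[label] = matches / len(sample)
    return hits
```

Running such a profiler per column, per source, and on a schedule is what turns one-off discovery into the continuous, policy-driven visibility described above.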
We often get questions about the masking process: how it affects the usability of the data, and how we know the data is really secure.
A masking solution transforms sensitive data values (names, email addresses, social security numbers, credit card numbers) into fictitious yet realistic values.
The key here is that we scramble the data in a way that’s irreversible, secure, and yet intelligent. The data is still usable after it’s masked.
So if you’re a developer, you often don’t need the actual information resident in the data, but you do need that data to look, feel, and operate like the real thing. That’s what a masking solution does and we offer a variety of different masking algorithms that all achieve this goal.
In this example here, Mary is masked to another name, Clara, and John is masked to Damian.
We do this very quickly and in a way that preserves the referential integrity of the data.
Again, in this example, Mary is masked to Clara consistently across the Oracle tables and the SQL Server tables. Referential integrity is a really common requirement that we encounter at Delphix.
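Conceptually, this kind of consistency can be achieved by deriving the replacement value deterministically from the original, so the same input maps to the same fictitious name in every table and every database. The sketch below is a simplified illustration only: the seed, the tiny lookup list, and the hashing choice are assumptions and do not reflect Delphix's actual masking algorithms.

```python
import hashlib

# Small illustrative lookup table; a real masking tool ships far larger sets.
FIRST_NAMES = ["Clara", "Damian", "Priya", "Mateo", "Yuki", "Amara"]

def mask_name(value: str, seed: str = "demo-seed") -> str:
    """Deterministically map a real name to a fictitious one.

    Because the output depends only on the input (and a fixed seed), the
    same source value masks identically across tables and databases,
    which is what preserves referential integrity.
    """
    digest = hashlib.sha256((seed + value).encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(FIRST_NAMES)
    return FIRST_NAMES[index]

# Consistent across runs and across data sources:
assert mask_name("Mary") == mask_name("Mary")
```

Note that with a lookup list this small, different inputs can collide on the same fictitious name; real algorithms use much larger value sets and additional logic to keep the masked data realistic.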
CHALLENGE
BECU needed to quickly roll out enhancements to its online banking systems, without interrupting services or compromising security. A key challenge faced by BECU was building an agile testing infrastructure while upholding the highest data privacy and security standards. BECU needed a secure DataOps solution that met these specific requirements:
Sensitive data discovery: BECU needed a solution that would identify sensitive data values across all of its environments and automate the process of consistently masking those values.
Masking consistency and repeatability: Applications needed to be masked the same way every time and function the same using masked and unmasked data. The solution also needed to maintain referential integrity across masked flat files and databases.
Out-of-the-box masking templates: The BECU team needed a simple solution with preset rules that the company could start using immediately.
Breadth of supported databases and flat files: BECU needed a solution that could mask Oracle, SQL Server, and more than 100 flat files across the nine key applications, including CRM, loan originations, and member portal systems.
Reporting and auditing of masked data: The credit union needed a way to track masked data across sources over time to verify that masked environments were not being polluted with unmasked confidential data.
SOLUTION
After evaluating three separate vendors, BECU selected the Delphix Dynamic Data Platform for masking data in its new testing infrastructure because:
Delphix consistently masks data across relational database platforms and flat files, even as data changes over time. Delphix also maintains the referential integrity of masked data both within and across databases and files.
Delphix addresses the first crucial step in securing sensitive data at risk: discovering where the risk lies by providing built-in data profiling.
Delphix provides pre-built masking functionality that requires no programming knowledge or administrative involvement to create custom masking rules.
Delphix is platform-agnostic, offering wide support for heterogeneous data sources, such as Oracle, SQL Server, DB2, and file systems.
RESULTS
Delphix enabled BECU to exceed its goal for data masking, helping the credit union bolster rigorous standards for protecting confidential information:
BECU masked 662 tables, 3,507 columns, and 680 million rows of data in 15 hours, far exceeding the initial requirement that masking not take more than 24 hours.
They completed the implementation process in 6 weeks, meeting compliance requirements ahead of schedule. The team estimated that competitors’ tools would have taken 18-24 weeks to install and begin masking data.
Delphix experts worked side-by-side with BECU team members to establish the foundation of a masking Center of Excellence, enabling BECU to continue with minimal support.
BECU also leverages the virtualization capabilities of the Delphix Dynamic Data Platform to reduce the overall time and effort to distribute masked data, which makes it possible for BECU to deploy products up to twice as fast.
CHALLENGE
Dentegra depends heavily on software applications to support the orchestration of core business processes such as contracts management, customer onboarding, and claims processing. Moving to the cloud is part of Dentegra’s long-term digital strategy to improve scalability and time to market across its application portfolio. However, data-related challenges stood in the way of realizing the full potential of cloud:
While Dentegra leveraged Amazon Web Services (AWS) to quickly provision compute and storage resources for dev/test environments, those environments were not complete without application data.
Initial migration efforts involved data extraction followed by the physical shipment of an appliance, a process that took 8 weeks to complete.
Dentegra needed to secure PII (personally identifiable information) and PHI (protected health information) before moving data to AWS.
Amazon Web Services allows teams to quickly spin up and tear down infrastructure at unprecedented speeds. But without a sound methodology to also deliver secure, high-quality data to that infrastructure – and at a similar, accelerated pace – Dentegra would be unable to maximize the cloud-related benefit of rapid time to market.
SOLUTION
Delphix empowers teams to stand up complete dev/test environments in a matter of minutes. Delphix non-disruptively collects data from Dentegra’s production applications and applies masking to that data to protect any confidential information. Delphix then replicates masked data to a second instance of the Dynamic Data Platform that resides in AWS. From that cloud-based instance, teams can instantly provision virtual, space-efficient data copies to dev/test environments running on AWS EC2 instances. Legacy approaches to refreshing data in AWS require teams to repeat the full, manual process of extracting, moving, and importing data to the cloud. But with a hybrid cloud architecture leveraging Delphix, Dentegra can keep on-prem and cloud environments in sync: Delphix continually gathers data from production sources. The data is then masked and replicated to AWS. With fresh, secure data available in the cloud, Dentegra can easily deliver new virtual data copies to a team of over 200 developers, in just minutes.
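The workflow described above (capture from production, mask on-premises, replicate to the cloud, then provision virtual copies) can be sketched as pseudo-workflow code. Every function name here is a hypothetical placeholder, not Delphix's actual API; the sketch exists only to show the ordering that keeps unmasked PII and PHI from ever reaching AWS.

```python
def sync_from_production(source):
    """Non-disruptively capture the latest state of a production source."""
    return {"snapshot": f"{source}@latest"}

def mask(snapshot, ruleset):
    """Apply masking rules so no confidential values leave the premises."""
    return {**snapshot, "masked_with": ruleset}

def replicate_to_cloud(masked_snapshot, target):
    """Ship only the masked snapshot to the cloud-side platform instance."""
    return {"target": target, "data": masked_snapshot}

def provision_virtual_copy(cloud_snapshot, environment):
    """Provision a lightweight virtual copy for a dev/test environment."""
    return {"env": environment, "from": cloud_snapshot["data"]["snapshot"]}

# Key design point: masking runs on-prem, BEFORE replication, so
# unmasked data never crosses into the cloud.
snap = sync_from_production("claims_db")
masked = mask(snap, ruleset="phi_pii_rules")
cloud = replicate_to_cloud(masked, target="aws_platform")
env = provision_virtual_copy(cloud, environment="dev-01")
```

Contrast this with the legacy approach, where each refresh repeats a full extract, move, and import of production data into the cloud.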
RESULTS
By leveraging Delphix and AWS solutions together, Dentegra is in the process of transforming environment setup processes that otherwise produce lead times that slow application development. Instead of having to wait for multiple, serial processes to procure and configure hardware and software, followed by a paralyzingly slow migration of production data, Dentegra can create cloud environments—infrastructure plus data—in a single motion. With Delphix, Dentegra teams have been able to:
Reduce the time it takes to move data to cloud environments from 8 weeks to hours.
Mask sensitive PII and PHI before replicating data to AWS.
Decrease storage requirements in AWS by leveraging virtual instead of physical data copies. In addition, for its 16TB on-premises claims processing database, Dentegra has realized a significant reduction in the storage it needs for non-production environments.
In concert with AWS, Delphix brings dramatically greater scalability and speed to development: Dentegra can determine requirements for a new application project on one day, then marshal the necessary data and compute resources to execute against those requirements within 24 hours.