While IT teams have automated much of the application development process, managing data has emerged as the latest constraint holding DevOps teams back and slowing the delivery of innovation. Data provisioning, versioning, and aligning database code with application code remain manual processes that impede the flow of high-quality, secure data to the teams that need it most. But it does not have to be that way.
Let Data Flow: Removing the Latest DevOps Constraints with DataOps
APPLICATION
To keep up with the nonstop pace of innovation, organizations are
modernizing application delivery to increase business value.
Automation Pipelines
Business Drivers
DEV → INT → QA → STAGE → PROD
Safely automate database schema changes
• Faster: Accelerate the application release cycle by 80%
• Safer: Decrease the number of database errors by 90%
• Smarter: Deploy application and database changes in sync

Secure, on-demand production-quality data
• Fast: Provision lightweight, virtual copies of production data in minutes
• Secure: Protect sensitive data from breach and enable regulatory compliance
• Everywhere: Move and manage data in any environment, cloud or on premises
• Reduce product lead time and eliminate wait states
• Increase quality, success, and employee satisfaction
• Minimize the risk of disruption by competitors by accelerating release cycles
• Enable self-service and eliminate manual processes
• Eliminate wait: production data delivered in minutes
• Feedback on database code in minutes
• Communicate status to reduce churn
Faster Development | Improve Quality | Faster Time to Market | Reduce Risk
• Test against production-quality data to ensure quality
• Enforce standards and best practices on database code
• Simulate changes prior to release to catch problems before they reach production
• Eliminate errors due to manual configuration and delivery
• Reduce the surface area of risk and enable compliance by using only masked data in non-production environments
• Dry-run the production push in lower environments
Editor's Notes
ADAM
The DevOps space is rife with tools, with new ones emerging every day. I am sure you are familiar with, and perhaps use, several of these tools yourself.
ADAM
Here is the pipeline view of that job.
The basic flow is as follows:
Refresh the ETM with the latest masked copy of production.
Next, generate the subset exports. In this case, I am doing simple “percentage-based” subsetting using an open-source tool called Jailer.
I generate multiple data subsets between 1 percent and 25 percent.
After each subset is generated, a snapshot is instantly created, tagged, and made available for use in the Delphix Dynamic Data Platform.
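That flow can be sketched as a short orchestration script. This is a minimal sketch: the `refresh_masked_copy`, `export_subset`, and `snapshot` callables are hypothetical stand-ins for the actual Jailer and Delphix invocations, not their real APIs.

```python
# Sketch of the subset-and-snapshot flow described above.
# The three callables are hypothetical placeholders for the real
# Jailer CLI and Delphix API calls.

SUBSET_PERCENTAGES = [1, 5, 10, 25]  # demo sizes between 1% and 25%

def run_subset_pipeline(refresh_masked_copy, export_subset, snapshot):
    """Refresh the masked copy, export each subset, then snapshot and tag it."""
    refresh_masked_copy()  # step 1: latest masked copy of production
    tags = []
    for pct in SUBSET_PERCENTAGES:
        export_subset(pct)  # step 2: percentage-based subset (e.g., via Jailer)
        tags.append(snapshot(tag=f"subset-{pct}pct"))  # step 3: snapshot + tag
    return tags
```

With the real integrations wired in, each returned tag would correspond to an entry made available in the Delphix platform.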
ADAM
Now, I will onboard myself into production as a new employee (number 1).
Now I am in the self-service portal as the “Dev” user.
Here I can see all of the Test Data Catalog items that I am permitted to view and use.
You can probably tell by the names what the different squares represent.
ADAM
But even with hundreds of items in the Test Data Catalog, it is easy to find what I need.
Here I easily search for all the twenty-five percent subsets for my application.
ADAM
Or how about all of the data sets that are available for my application from June sixteenth?
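Searches like these boil down to filtering the catalog on a few attributes. Here is an illustrative sketch with a made-up `CatalogItem` shape and application name; the real Delphix catalog objects differ.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shape for a Test Data Catalog entry; illustrative only.
@dataclass
class CatalogItem:
    app: str          # owning application
    subset_pct: int   # subset size, e.g. 1 or 25
    created: date     # when the snapshot was taken

def find_items(catalog, app, subset_pct=None, created=None):
    """Return catalog items matching the application and optional filters."""
    return [
        item for item in catalog
        if item.app == app
        and (subset_pct is None or item.subset_pct == subset_pct)
        and (created is None or item.created == created)
    ]
```

The twenty-five percent search becomes `find_items(catalog, "employee-app", subset_pct=25)`, and the June sixteenth search becomes `find_items(catalog, "employee-app", created=date(2018, 6, 16))` (with `"employee-app"` standing in for whatever the application is actually called).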
ADAM
I’ll choose the one percent subset from June sixteenth.
Since we are going to use this to do some new feature work,
I’ll tell Delphix to use the data to create a new data branch in the Delphix portal that will match my code branch.
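The idea of keeping a data branch aligned with a code branch can be sketched like this; the `delphix` client and its `create_branch` method are hypothetical stand-ins, not the actual Delphix API.

```python
def data_branch_for(code_branch, delphix, container):
    """Create a data branch named after the git code branch so that
    code and data stay aligned (hypothetical client API)."""
    name = f"data/{code_branch}"
    delphix.create_branch(container=container, name=name)
    return name
```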
ADAM
In just a few minutes, Delphix has completed my request.
ADAM
The Deploy job failed.
R2
A closer inspection of the failure shows that Datical caught a major defect in my SQL code: a column name contains a special character.
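This defect is the kind of thing a simple identifier rule catches. As an illustration only (this is not Datical's actual rule engine), a check for unquoted-identifier validity looks like:

```python
import re

# Standard unquoted SQL identifiers: a letter or underscore, then
# letters, digits, or underscores. Special characters require quoting
# or, better, a rename.
IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def check_column_names(columns):
    """Return the column names that violate the identifier rule."""
    return [name for name in columns if not IDENTIFIER_RE.match(name)]
```

For example, `check_column_names(["employee_id", "twitter-handle"])` flags `"twitter-handle"` because of the hyphen.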
ADAM
I go back into my code, correct the SQL, and then commit and push the changes.
ADAM
Once again, the Deploy to Dev job is initiated. And this time the Deploy job completes successfully.
ADAM
Now when I load the Dev instance of my employee application, we see that my record is still there and so is our new Twitter feature.
Let’s ship it!
R2
And push the changes upstream
ADAM
Since I am pushing to master, this triggers a job in Jenkins that runs a battery of tests against the application via Selenium.
ADAM
The basic flow is this:
Refresh the QA Data Pod from the assigned data set in the Test Data Catalog, to ensure we are working with pristine data
Build and deploy the application changes, via Maven, to the QA Apache Tomcat Server
Package and deploy the database object changes, via Datical, to the QA Data Pod
Run the Selenium tests against the application
If the tests fail, open a ticket in Bugzilla with the details and bookmark and tag the test data in Delphix
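The failure path in that last step can be sketched as follows; the `bugzilla` and `delphix` clients are hypothetical stubs standing in for the real Bugzilla and Delphix APIs.

```python
def handle_test_result(passed, details, bugzilla, delphix, build_id):
    """On failure, open a Bugzilla ticket and bookmark the Delphix data
    so the exact failing state can be reproduced later."""
    if passed:
        return None
    ticket = bugzilla.open_ticket(
        summary=f"QA failure in build {build_id}",
        description=details,
    )
    delphix.bookmark(tag=f"bug-{ticket}")  # preserve the test data as-is
    return ticket
```

Bookmarking the data alongside the ticket is what makes the bug reproducible: the developer can rewind to exactly the data state the failing test saw.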