In an era of unprecedented data accumulation and rapid technological evolution, organizations face the challenge of harnessing data scattered across diverse IT environments.
As executives advocate for cloud migration and adoption, the demand for seamless data integration becomes paramount. Yet, the question remains: how can IT teams not only bridge data silos but also ensure the data's real-time accuracy and consistency across platforms?
Join our 30-minute session as we dive into the transformative potential of a "streaming-first" mindset, demonstrating how it enables organizations to maximize the value of their existing data assets and infrastructure.
During this session, we will also explore:
Why "streaming-first" is a critical approach to how organizations interact with their dataCustomer stories showing the benefits of using real-time over batch processesHow Precisely enables organizations to stream data they can trust
2. Housekeeping
Webinar Audio
• Today’s webcast audio is streamed through your computer speakers
• If you need technical assistance with the web interface or audio, please reach out to us using the Q&A box
Questions Welcome
• Submit your questions at any time during the presentation using the Q&A box. If we don't get to your question, we will follow up via email
Recording and Slides
• This webinar is being recorded. You will receive an email following the webinar with a link to the recording and slides
4. “85% of organizations will embrace a cloud-first principle by 2025” – Gartner
“55% of leaders cite data modernization as the reason for their shift to cloud” – Deloitte
“60% of tech leaders say that integrating multiple data sources is their biggest hurdle to accessing more real-time data” – Confluent
As technology advances, customer expectations become more demanding while their patience diminishes
5. Today’s customers expect data in real-time
Initiate an action → Instant confirmation
~100 ms in latency can cost you… 20% of digital traffic … $400M in revenue
Source: https://www.gigaspaces.com/blog/amazon-found-every-100ms-of-latency-cost-them-1-in-sales
6. Traditional pipelines can’t scale to address customer needs
• Data owned by IT
• Massive, loosely integrated solutions
• Slow, batch ETL processes
• Resource-intensive data processing
• Monolithic design
• Separate business and IT metadata
7. Traditional pipelines can’t scale to address customer needs
You’ve struggled with traditional solutions. We believe there’s a better way.
• Data access owned by IT → Collaboration between IT and business data users
• Massive, loosely integrated solutions → Just the scalable, interoperable capabilities you need
• Data must be brought to the solution → Workflows designed for the cloud that run alongside data
• Slow, batch ETL processes → Streaming data pipelines to the cloud
• Separate business and IT metadata → Scalable, shared catalog of business & technical metadata
• Rules-based data management → AI-driven quality rules, alerts, and data enrichment
8. Streaming data pipelines close the gap
• Declarative language to specify logic
• CDC instead of ETL
• Agile engineering
9. Benefits of real-time data streaming
• Improved operational efficiency
• Better customer engagement / retention
• Faster / better decision making
• Faster time to market
• Identify new business opportunities
• Increased revenue
• Reduced risk
10. Positioning IT Teams for Success
• Systems are current and accurate
• Data is democratized
• Maximizing value of current architecture
• Setting foundation for AI use cases
12. A single solution for all your business demands
1. Resilient & Fault-Tolerant Replication
2. Scalability & Performance
3. Design Once, Deploy Anywhere
4. Data When & Where You Need It
13. Precisely's single solution for all your business demands
Consistency • Accuracy • Context
Modules: Data Integration, Data Observability, Data Quality, Geo Addressing, Spatial Analytics, Data Governance, Data Enrichment
Data Integrity Foundation
15. Canadian Life Insurance Company: Data Replication for Db2 and IMS to Confluent Cloud on AWS
OBJECTIVE
Integrate data from mission-critical mainframes into an API platform to reduce the time and cost of delivering applications. Stream audio files to S3 through the call center application, and leverage a sink connector to move the data into an analytics workspace in AWS.
CHALLENGES
• Project delivery costs increased by as much as 6x over 10+ years
• Batch or manual connectivity between systems and partners
• Increasing demand for new products and services
• Inconsistent and fragmented experience across channels
BENEFITS
• Near real-time data availability
• Reduced change time from 30 days to 30 minutes
• Demonstrated lower MIPS usage than the existing solution
• Gave development teams full control of data
16. Turnkey solution to address many challenges
[Architecture diagram: (1) Change Data Capture via Connect → (2) Confluent Cloud → (3) Domain Services → (4) Domain API → (5) Downstream Channels and Systems → (6) External Providers]
But first, we want to take a minute to discuss the importance of the cloud and how it is driving businesses. It is clear that moving to the cloud, and doing so quickly, is a priority.
Research shows that:
85% of organizations will embrace a cloud-first principle by 2025
55% of leaders cite data modernization as the reason for their shift to cloud
Approximately $100 billion of wasted migration spend is expected over the next three years
So despite the excitement around getting to the cloud, organizations need to be careful…
You’ve likely been working on this for a long time, but legacy solutions aren’t serving you today.
We have a vision for delivering data with integrity to your business.
Streaming is a modern approach to building data pipelines that allows teams to share real-time data in many different contexts through a decoupled architecture rather than integrating it in a centralized data warehouse. It uses change data capture (CDC) capabilities to continuously intercept changes to databases as streams—combining, enriching, and analyzing them in motion even before they reach at-rest systems such as the database or data warehouse.
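To make that concrete, here is a minimal sketch of the consuming side of such a pipeline, assuming a Debezium-style CDC feed landing in Kafka. The topic name, broker address, and event field layout are illustrative assumptions, not details from this webinar:
```python
# Minimal sketch: reacting to CDC change events in motion, before
# they land in an at-rest system. Assumes a Debezium-style envelope
# ({"op": ..., "before": ..., "after": ...}) on an illustrative
# topic "customers.cdc".
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "cdc-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customers.cdc"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        event = json.loads(msg.value())
        # React to inserts/updates as they happen, rather than
        # waiting for a nightly batch ETL job.
        if event.get("op") in ("c", "u"):  # create / update
            print("change captured:", event.get("after"))
finally:
    consumer.close()
```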
Unlike traditional pipelines, streaming data pipelines can be designed using declarative languages such as SQL to specify the logic of what needs to happen while abstracting away the low-level operational details. This approach helps to maintain the delicate balance between centralized continuous data observability, security, policy management, and compliance standards and the need to make data easily searchable and discoverable so that developers and engineers can innovate faster.
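As one illustration of the declarative style, engines such as ksqlDB let you state the pipeline's logic in SQL and submit it over a REST API. ksqlDB is used here purely as an example engine; the server URL, stream, and column names below are assumptions:
```python
# Sketch: declaring streaming logic in SQL and submitting it to a
# ksqlDB server via its REST API. URL, stream, and columns are
# illustrative assumptions.
import requests

DDL = """
CREATE STREAM high_value_orders AS
  SELECT order_id, customer_id, amount
  FROM orders_stream
  WHERE amount > 1000
  EMIT CHANGES;
"""

resp = requests.post(
    "http://localhost:8088/ksql",  # default ksqlDB REST endpoint
    json={"ksql": DDL, "streamsProperties": {}},
)
resp.raise_for_status()
print(resp.json())
```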
In addition, streaming data pipelines invite IT organizations to apply Agile engineering practices in order to build modular, reusable data flows that can be tested and debugged using version control and CI/CD systems. This characteristic makes streaming data pipelines easier to evolve and maintain—and reduces their total cost of ownership compared to traditional approaches.
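Because the transformation logic lives in small, pure functions, it can be unit-tested in CI/CD like any other code. A hypothetical example (the enrich_order function and its fields are invented for illustration):
```python
# Sketch: a modular, reusable transformation verified in CI/CD.
# The enrich_order function and its field names are hypothetical.
def enrich_order(event: dict, fx_rates: dict) -> dict:
    """Convert an order amount to USD using a lookup table."""
    rate = fx_rates[event["currency"]]
    return {**event, "amount_usd": round(event["amount"] * rate, 2)}

def test_enrich_order_converts_currency():
    event = {"order_id": 1, "amount": 100.0, "currency": "EUR"}
    result = enrich_order(event, {"EUR": 1.10})
    assert result["amount_usd"] == 110.0
```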
That is why you would use a technique like change data capture.
For a cloud-native solution that addresses all your business demands, we have the Data Integrity Suite. The modular, interoperable Precisely Data Integrity Suite contains everything you need to deliver accurate, consistent, contextual data to your business, wherever and whenever it's needed.
The seven modules of the Data Integrity Suite are built on proven Precisely technology.
Not only do the Suite’s modules work seamlessly together, they also work alongside the portfolio of Precisely products, enabling you to easily adopt Suite capabilities for new use cases whenever you choose.
All elements sit on a common foundation that’s modular, interoperable, intelligent, and business friendly. The foundation provides a range of services, including a metadata management engine that shares data between the modules and third-party programs to deliver incremental value.
The first module is Data Integration. Typically, in any major data initiative, you first need to connect to sources, and sometimes move or replicate data to another environment.
With Data Integration, you can easily create streaming data pipelines that integrate data from core environments such as relational databases and, of course, mainframe and IBM i, with modern cloud-based data platforms like Snowflake to drive analytics and innovation and extend the value of your mission-critical systems.
We understand that pipelines must scale for your needs today and extend for tomorrow.
In the demo in a few minutes, you’ll see how easy it is to build powerful pipelines fast. And with our open microservices architecture, you’ll be able to easily integrate your data pipelines into larger data transformation initiatives.
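Precisely's Data Integration builds these pipelines through its own interface, but as a generic illustration of the underlying pattern (not Precisely's API), a Kafka-to-Snowflake sink can be registered against a Kafka Connect cluster like this; every name, credential, and URL below is a placeholder:
```python
# Generic sketch: registering a Snowflake sink connector with a
# Kafka Connect cluster via its REST interface. All connection
# details are placeholders.
import requests

connector = {
    "name": "orders-to-snowflake",
    "config": {
        "connector.class":
            "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "orders_stream",
        "snowflake.url.name": "myaccount.snowflakecomputing.com",
        "snowflake.user.name": "PIPELINE_USER",
        "snowflake.private.key": "<private-key>",
        "snowflake.database.name": "ANALYTICS",
        "snowflake.schema.name": "PUBLIC",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
```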
Benefits:
Build new revenue-driving applications by integrating data from disparate systems with modern cloud-based platforms
Extend the value of your mission-critical systems – everything from open systems databases to mainframe and IBM i
Democratize your data by making it accessible to more areas of the business
Future-proof your architecture to remain flexible when onboarding new applications and use cases
Integrate complex data formats into projects with no coding or specialized skills necessary
1. Change Data Capture (CDC) replicates data from core systems in near real time and populates the Kafka data backbone
- Removes the need to work directly with legacy systems, which is costly and time-consuming
2. The Kafka data backbone delivers high-throughput, low-latency, accurate access to consolidated data and minimizes transactions hitting core systems
- A transformational enabler for moving systems from batch integration to real time
3. Domain services normalize raw CDC data from Kafka into generic, "business-friendly" domain events and publish them back to Kafka (see the sketch after this list)
4. A Kafka sink connector replicates domain data to the Domain API's database. APIs are created on the API Platform, which implements non-functional features out of the box and facilitates API design, development, deployment, and testing using DevOps and a Kubernetes runtime environment
- Accelerates time to market for APIs, reducing delivery costs
5. Real-time data is delivered to downstream systems and channels via the Domain API. This fosters accurate business insight, enables new services, and allows Sun Life to respond to events more quickly and in a highly adaptive manner
- Real-time data drives improved client experience with timely and accurate data
6. External providers can access real-time data through APIs to create partner ecosystems supporting end-to-end business transactions
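To make step 3 concrete, here is a minimal sketch of a domain service that turns raw CDC envelopes into business-friendly domain events and republishes them to Kafka. The topic names and event shape are illustrative assumptions, not details of the Sun Life implementation:
```python
# Sketch of step 3: a domain service consumes raw CDC events,
# normalizes them into a business-friendly domain event, and
# publishes the result back to Kafka. Topic names and the event
# shape are illustrative.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "policy-domain-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["raw.cdc.policies"])  # hypothetical raw CDC topic
producer = Producer({"bootstrap.servers": "localhost:9092"})

def to_domain_event(cdc: dict) -> dict:
    """Map a CDC row image to a generic PolicyUpdated domain event."""
    row = cdc["after"]
    return {
        "type": "PolicyUpdated",
        "policy_id": row["POLICY_ID"],
        "status": row["STATUS"],
    }

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = to_domain_event(json.loads(msg.value()))
        producer.produce("domain.policy.events", json.dumps(event))
        producer.poll(0)  # serve delivery callbacks
finally:
    producer.flush()
    consumer.close()
```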