Watch full webinar here: https://buff.ly/49FKgdM
Join us for an exciting webinar that delves into the world of data sharing and its pivotal role in accelerating data-driven decisions. In an era where every second counts, we’ll showcase how data virtualization acts as the indispensable bridge between disparate data sources and swift consumption.
During this webinar, you’ll see:
- The Great Data Race: Imagine a scenario where time is of the essence, and two data experts compete head-to-head to connect with as many data sources as possible. Witness the electrifying race as they navigate through a multitude of data repositories, showcasing their prowess in sourcing valuable information.
- Data Fusion in Real-Time: Once the data sources are harnessed, our experts will demonstrate how to seamlessly blend these disparate datasets into a coherent and insightful data product. Witness firsthand the lightning-fast transformation of raw data into a valuable asset that can drive informed decisions.
- Unleashing Data Across Ecosystems: In today's interconnected world, the real power of data lies in its versatility. Our experts will illustrate how a well-structured data product can be quickly integrated into numerous consuming applications. Discover the sheer speed at which data can be disseminated across your organization's ecosystem.
- Data Virtualization, the Essential Enabler: We will emphasize the crucial role of data virtualization in making this entire process possible. Data virtualization acts as the linchpin, seamlessly connecting various data sources, transforming them into a cohesive unit, and facilitating rapid distribution to consuming applications. Learn how it empowers organizations to harness data at the speed of thought.
Don’t miss this unique webinar as we break down the barriers to data sharing and empower you to unlock the true potential of your data. Register now and embark on a journey towards data-driven excellence.
MasterClass Series: Unlocking Data Sharing Velocity with Data Virtualization
1. Unlocking Data Sharing Velocity with Data Virtualization
BITanium MasterClass Series
Kevin McKerr, BITanium Consulting
Vincent Gaorekwe, BITanium Consulting
Chris Fitzpatrick, Denodo
2. Agenda
1. The Data Sharing Challenge
2. How does Data Virtualisation enable Data Sharing?
• Data abstraction
• Zero replication, zero relocation
• Real time information
• Self-service data services
• Centralized metadata, security & governance
• Location agnostic architecture
3. The three foundation capabilities
• Connect to any data
• Combine any data
• Consume any data
4. Q&A
4. Business Challenges and Needs
• Need for faster, more accurate decision making
▪ Significant increase in business speed & complexity of requirements → IT struggles to deliver in a timely fashion
• Ensure business continuity amidst technology evolution
▪ Migration of legacy systems to cloud, modernization of data and applications
• Increased risk from regulations, compliance, data privacy and security
▪ Exponential increase in regulations affecting data across geographies, departments and industries
5. Business – IT Dilemma
IT Architecture is Unmanageable & Brittle because:
• Business wants all enterprise data, integrated, and up-to-date
• IT responds by loosely stitching together disparate data sources
• Business wants all of the data, now – so IT creates 100s to 1000s of brittle direct connections and replicates large volumes of data
[Diagram: point-to-point connections, via ETL and direct links, between data sources and consuming applications]
• Data sources: Inventory System (MS SQL Server), Product Catalog (SOAP web service), Log files (.txt/.log files), CRM (MySQL), Billing System (REST web service), Customer Voice (Internet, unstructured)
• Consumers: BI/Reporting (JDBC, ODBC, ADO.NET), Web/Mobile (WS – REST: JSON, XML, HTML, RSS), Portals (JSR 168/286, MS Web Parts), SOA/Middleware/Enterprise Apps (WS – SOAP, Java API)
7. Objectives for the Modern Data-Driven Business
1. Single entry-point to explore and query ALL data
• Users don’t want to waste time searching across different data sources
• IT doesn’t want their users having access to all their production systems
2. Create a self-service culture for data consumers
• Users don’t want to have to learn to code (SQL, Python, Java, etc.)
• They want to use the tools they’re most comfortable with
3. Implement security & governance across multiple systems
• Leadership wants to reduce the amount of data that’s copied across the org
• Minimize the risk of a data breach & avoid creating multiple versions of truth
9. Six Essential Capabilities of Data Virtualization
1. Data abstraction
2. Zero replication, zero relocation
3. Real-time information
4. Self-service data services
5. Centralized metadata, security & governance
6. Location-agnostic architecture for multi-cloud, hybrid acceleration
10. 1. Data abstraction
Abstracts access to disparate data sources. Acts as a single virtual repository. Abstracts data complexities like location, format, and protocols.
…hides data complexity for ease of data access by business
“Enterprise architects must revise their data architecture to meet the demand for fast data.” – Create a Road Map For A Real-time, Agile, Self-Service Data Platform, Forrester Research
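The abstraction idea can be sketched in a few lines of Python. This is a minimal illustration, not Denodo's actual API: two "sources" with different shapes (a hypothetical CRM table and a hypothetical billing JSON feed) are exposed through one virtual view, so consumers never deal with each source's location or format.

```python
# Minimal sketch of data abstraction (illustrative only, not Denodo's API).
# The source names and schemas below are hypothetical.

crm_rows = [  # e.g. rows from a MySQL CRM
    {"cust_id": 1, "name": "Acme"},
    {"cust_id": 2, "name": "Globex"},
]

billing_json = [  # e.g. payloads from a REST billing service
    {"customerId": 1, "balance": 120.0},
    {"customerId": 2, "balance": 75.5},
]

def customer_view():
    """Virtual view: maps both sources onto one common schema at query time."""
    balances = {r["customerId"]: r["balance"] for r in billing_json}
    for row in crm_rows:
        yield {"id": row["cust_id"], "name": row["name"],
               "balance": balances.get(row["cust_id"])}

print(list(customer_view()))
```

The consumer only ever sees the unified `customer_view` schema; where each field physically lives is the layer's concern.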
11. 2. Zero replication, zero relocation
Leaves the data at its source; extracts only what is needed, on demand. Diminishes the need for effort-intensive ETL processes. Eliminates unnecessary data redundancy.
…reduces development time and overall TCO
“The Denodo Platform enables us to build and deliver data services, to our internal and external consumers, within a day instead of the 1–2 weeks it would take with ETL.” – Manager, Enervus
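A toy Python sketch of the zero-replication principle (the table and filter are hypothetical): nothing is copied into a warehouse up front; a query is answered by pulling only the matching rows from the source, on demand.

```python
# Sketch of "zero replication, zero relocation" (illustrative only).
# A 1,000-row "source" stays where it is; a filter is pushed down to it,
# so only the matching rows ever leave the source.

SOURCE = [{"order_id": i, "amount": i * 10} for i in range(1, 1001)]

def scan_source(predicate):
    """Simulates predicate pushdown: yields only rows the query needs,
    instead of replicating the full table first."""
    for row in SOURCE:
        if predicate(row):
            yield row

big_orders = list(scan_source(lambda r: r["amount"] > 9900))
print(len(big_orders))  # only the needed rows were extracted
```

Contrast with ETL, where all 1,000 rows would be copied and stored before the first question could be asked.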
12. 3. Real-time information
Provisions data in real time to consumers. Creates real-time logical views of data across many data sources. Supports transformations and quality functions without the latency, redundancy, and rigidity of legacy approaches.
…enables timely decision-making
“Denodo’s data fabric design relies on data virtualization to provide integrated data quickly to business users to effect faster outcomes.” – Gartner Magic Quadrant for Data Integration Tools, 18 August 2020
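The real-time property follows directly from the view holding no data of its own. A minimal Python sketch (the inventory source is hypothetical): because the logical view is evaluated against the live source on every call, a change in the source is visible on the very next query, with no refresh or reload step.

```python
# Sketch of real-time provisioning (illustrative only).
# The view stores nothing; it is re-evaluated against the source each call.

inventory = {"widget": 5}  # hypothetical live source

def stock_view(item):
    """Logical view: delegates to the source at query time."""
    return inventory[item]

before = stock_view("widget")
inventory["widget"] = 3        # the source changes...
after = stock_view("widget")   # ...and the view reflects it immediately
print(before, after)
```

A replicated copy, by contrast, would keep answering 5 until the next batch load ran.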
13. 4. Self-service data services
Facilitates access to all data, both internal and external. Enables creation of universal semantic models reflecting business taxonomy. Connects data silos to provide the best available information to drive business decisions.
…enables information discovery and self-service
“Impressively quick turnaround time to ‘unlock’ data from additional silos and from legacy systems. Few vendors (if any) can compete with Denodo’s support of the RESTful/OData standard – both to provide data (northbound) and to access data from the sources (southbound).” – Business Analyst, Swiss Re
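What a northbound REST/OData-style data service looks like to a consumer can be sketched in Python. The endpoint name, schema, and OData-style `value` envelope here are hypothetical; the point is that a consumer filters a published view with a query parameter instead of writing SQL against the sources.

```python
import json

# Sketch of a self-service data service (illustrative only).
# Simulates GET /views/customers?country=... returning JSON.

CUSTOMER_VIEW = [  # hypothetical published view
    {"id": 1, "name": "Acme", "country": "ZA"},
    {"id": 2, "name": "Globex", "country": "UK"},
]

def get_customers(country=None):
    """Answers the 'request' from the view; consumers never touch sources."""
    rows = [r for r in CUSTOMER_VIEW
            if country is None or r["country"] == country]
    return json.dumps({"value": rows})  # OData-style response envelope

print(get_customers(country="ZA"))
```

Any tool that can call an HTTP endpoint and parse JSON (a BI tool, a spreadsheet, a notebook) can consume such a service without code.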
14. 5. Centralized metadata, security & governance
Abstracts data source security models and enables single-point security and governance. Extends single-point control across cloud and on-premises architectures. Provides multiple forms of metadata (technical, business, operational) to facilitate understanding of data.
…simplifies data security, privacy, audit
“Our Denodo rollout was one of the easiest and most successful rollouts of critical enterprise software I have seen. It was successful in handling our initial security use case immediately, and has since shown a strong ability to cover additional use cases, in particular acting as a Data Abstraction Layer via its web service functionality.” – Enterprise Architect, Asurion
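Single-point security can be illustrated with a toy Python policy layer (the roles, field names, and masking rule are hypothetical, not Denodo's mechanism): one policy defined in the virtual layer governs every consumer, regardless of which underlying source holds the data.

```python
# Sketch of centralized, single-point security (illustrative only).
# One masking policy is applied in the virtual layer for all sources.

POLICY = {  # hypothetical role-based column masking
    "analyst": {"mask": ["ssn"]},
    "admin":   {"mask": []},
}

ROWS = [{"name": "Acme", "ssn": "123-45-6789"}]  # from any source

def query_as(role):
    """Every query passes through the same policy check."""
    masked = POLICY[role]["mask"]
    return [{k: ("***" if k in masked else v) for k, v in row.items()}
            for row in ROWS]

print(query_as("analyst"))
```

Because the rule lives in one place, adding a new source or a new consuming application does not require re-implementing security in each of them.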
15. 6. Location-agnostic architecture for multi-cloud, hybrid acceleration
Optimizes costs by migrating data, applications, and analytics workloads to the cloud without impacting the business. Enables creation of a hub architecture to support integration of data across mixed workloads. End-to-end management of migrations/promotions and continuous delivery processes.
…enables cloud adoption
16. Data Virtualization: Unified Data Integration and Delivery
• Data Abstraction: decoupling applications/data usage from data sources
• Data Integration without replication or relocation of physical data
• Easy Access to Any Data, high-performance and real-time/right-time
• Data Catalog for self-service data services and easy discovery
• Unified metadata, security & governance across all data assets
• Data Delivery in any format, with intelligent query optimization that leverages new and existing physical data platforms
A logical data layer – a “logical data fabric” – that provides high-performance, real-time, and secure access to integrated business views of disparate data across the enterprise.