Presentation on the "First Time Right Approach to Data Virtualization" by Bart De Groeve (InfoRoad), at the BI & Data Analytics Summit on June 13th, 2019 in Diegem (Belgium)
Improving Passenger Experience at Brussels Airport through (real-time) Analyt... - Patrick Van Renterghem
Presentation on the "Use of Analytics at Brussels Airport to Improve Passenger Experience" by Stefan Kennis (Brussels Airport), at the BI & Data Analytics Summit on June 13th, 2019 in Diegem (Belgium)
Data Virtualization at UMC Utrecht: Don't Collect, Connect! by Erik Fransen (... - Patrick Van Renterghem
Presentation on "Data Virtualization at UMC Utrecht: What, Why and How" by Erik Fransen (connecteddatagroup), at the BI & Data Analytics Summit on June 13th, 2019 in Diegem (Belgium)
A Logical Architecture is Always a Flexible Architecture (ASEAN) - Denodo
Watch full webinar here: https://bit.ly/3joZa0a
The current data landscape is fragmented, not just in location but also in processing paradigms: data lakes, IoT architectures, NoSQL and graph data stores, SaaS applications, and more coexist with relational databases to fuel the needs of modern analytics, ML, and AI. The physical consolidation of enterprise data into a central repository, although possible, is both expensive and time-consuming. A logical data warehouse is a modern data architecture that allows organizations to leverage all of their data irrespective of where it is stored, what format it is stored in, and what technologies or protocols are used to store and access it.
Watch this session to understand:
- What is a logical data warehouse and how to architect one
- The benefits of logical data warehouse – speed with agility
- Customer use case depicting logical architecture implementation
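The "connect, don't collect" idea behind a logical data warehouse can be sketched in a few lines: join data from two heterogeneous sources at query time instead of copying them into one store. This is a toy illustration only (the sources, tables, and field names are invented); real platforms such as the Denodo Platform expose such joins declaratively through virtual views, not hand-written glue code.

```python
import sqlite3

# Two "sources": an in-memory relational DB and a plain Python list
# standing in for a SaaS API response. A logical data warehouse exposes
# both through one virtual view instead of consolidating them physically.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, 120.0), (2, 75.5), (1, 30.0)])

# Hypothetical second source: customer records from a SaaS application.
saas_customers = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]

def virtual_view():
    """Join both sources at query time -- no physical consolidation."""
    names = {c["customer_id"]: c["name"] for c in saas_customers}
    rows = db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
    )
    return {names[cid]: total for cid, total in rows}

print(virtual_view())  # totals per customer, joined across both sources
```

The consumer sees one result set; neither source needs to know about the other, which is the agility the abstracts above describe.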
Accelerate Cloud Modernization using Data Virtualization - Denodo
Watch full webinar here: https://bit.ly/3jxGhIm
Many companies have been modernizing their data infrastructure from legacy on-premises to modern cloud systems. But such a transition is not easy: many companies have had to re-architect their IT landscape to fit the new technology model, disrupting the business. Data virtualization provides a layer of abstraction that lets IT transform its systems while business users continue their operations without disruption. In this session, Paul Moxon, SVP Data Architecture and Chief Evangelist at Denodo, will discuss how some of Denodo’s largest customers have successfully modernized their IT infrastructure using data virtualization as the data abstraction layer.
Logical Data Warehouse: The Foundation of Modern Data and Analytics - Denodo
Watch full webinar here: https://buff.ly/2Vhew78
According to a leading analyst firm, total spend on data and analytics is expected to reach $104 billion in 2019! Companies are investing in data warehouse modernization and data lake projects for descriptive and advanced analytics; however, for the analysis to be holistic, today’s architects weave disparate data streams together, not only from these analytical sources but also from operational, third-party, and streaming data sources. The logical data warehouse is a modern architectural methodology that virtually combines all the data across the enterprise and makes it available to analytical and visualization tools, facilitating timely, insightful, and impactful decisions throughout the enterprise.
In this session, you will learn:
* What is a logical data warehouse and how to architect one
* The benefits of logical data warehouse – speed with agility
* The Insights Development Workbench – a framework in action
Supporting Data Services Marketplace using Data Virtualization - Denodo
Data is truly treated as an asset at Guardian Life. We have created a Data Services Marketplace that contains valuable data from the underlying sources and is used by business users for day-to-day operations. In this presentation, you will see how Data Virtualization can be used to support the marketplace with real-time data services, provision non-real-time data into Hadoop, and swap underlying sources without affecting business users.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/PZ2uFj.
Data Virtualization for Compliance – Creating a Controlled Data Environment - Denodo
CIT modernized its data architecture in response to intense regulatory scrutiny. In this presentation, they present how data virtualization is being used to drive standardization, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/CCqUeT.
Analyst Keynote: TDWI: Data Virtualization as a Data Management Strategy for ... - Denodo
Watch full webinar here: https://bit.ly/3rnxYzr
In this presentation, a TDWI analyst will describe data virtualization as an appropriate data management strategy when advanced analytics applications demand very fresh data, or when analytics data is distributed across multiple data platforms in a hybrid data architecture.
A Successful Data Strategy for Insurers in Volatile Times (ASEAN) - Denodo
Watch full webinar here: https://bit.ly/3rpr4La
Data is an insurer’s most valuable asset. Capitalizing on all of that stored and incoming data to draw valuable insights for business decisions is what ultimately makes a competitive difference.
But insurers face challenges when it comes to modernizing and digitizing their data architectures. Most organizations rely on traditional systems and data integration processes that are time-consuming and slow. In addition, as many adopt cloud strategies, these traditional approaches fill the cloud modernization process with downtime and end-user frustration.
This is why insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization but also through future infrastructure changes and innovations, adding agility, flexibility, and efficiency to data architectures. Data virtualization can help insurance companies create 360° views of deals and claims processes, as well as quickly gather social media or sensor data for on-the-go risk profiling.
Join this on-demand webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
Modernizing Data Architecture using Data Virtualization for Agile Data Delivery - Denodo
In this presentation, Dave Kay, Data Consultant within the Analytics and Architecture group at Zurich Insurance, explains how Zurich is modernizing their data infrastructure using data virtualization to accelerate delivery of mortgage insurance and intra-day operational reports to business analysts, salespeople, underwriters, managers, and actuarial staff.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/GLPPg2.
Cloud Migration headache? Ease the pain with Data Virtualization! (EMEA) - Denodo
Watch full webinar here: https://bit.ly/3CWIBzd
Moving data to the Cloud is a priority for many organizations. Benefits in terms of flexibility, agility, and cost savings are driving Cloud adoption. But this journey is not easy: moving applications and data to the Cloud can be challenging and, when not carefully managed, can disrupt the business.
When systems are being migrated, the resulting hybrid (or even multi-) Cloud architecture is, by definition, more complex, making it harder and more costly to retrieve the data we need.
Data Virtualization can help organizations at all stages of a Cloud journey, during migration as well as in our “new hybrid multi-Cloud reality”.
Watch this on-demand webinar to learn how Data Virtualization can:
- Help organizations manage risk and minimize the disruption caused as systems are moved to the Cloud
- Provide a single point of access for data that is both on-premise and in the Cloud, making it easier for users to find and access the data that they need
- Provide a secure layer to protect and manage data when it's distributed across hybrid or multi-Cloud architectures
… and watch a live demo on how to ease the migration.
Reinventing and Simplifying Data Management for a Successful Hybrid and Multi... - Denodo
Watch full webinar here: https://bit.ly/3AdAzkW
Hybrid cloud has become the standard for businesses. A successful move will involve using an intelligent and scalable architecture and seeking the right help. At the same time, multi-cloud strategies are on the rise. More enterprise organizations than ever before are analyzing their current technology portfolio and defining a cloud strategy that encompasses multiple cloud platforms to suit specific app workloads, and move those workloads as they see fit. Learn the key challenges in multi-hybrid data management, and how you can accelerate your digital transformation journey in a multi-cloud environment with data virtualization.
A Successful Data Strategy for Insurers in Volatile Times (EMEA) - Denodo
Watch full webinar here: https://bit.ly/34gVVzH
To capitalize on all their data, insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization, but also on their future infrastructure changes and innovations, adding agility, flexibility and efficiency to data architectures.
Join this webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
Data Virtualization enabled Data Fabric: Operationalize the Data Lake (APAC) - Denodo
Watch full webinar here: https://bit.ly/3aIofv9
While big data initiatives have become necessary for any business to generate actionable insights, a big data fabric has become a necessity for any successful big data initiative. Best-of-breed big data fabrics should deliver actionable insights to business users with minimal effort, provide end-to-end security for the entire enterprise data platform, and provide real-time data integration, all while delivering a self-service data platform to business users.
Attend this session to learn how big data fabric enabled by data virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance and data privacy
- Fulfills the promise of data lakes to provide actionable insights
Watch full webinar here: https://bit.ly/2Y0vudM
What is Data Virtualization, and why should you care? In this webinar we help you understand not only what Data Virtualization is, but why it is a critical component of any organization's data fabric and how it fits in. We show how data virtualization liberates and empowers your business users, from data discovery and data wrangling to the generation of reusable reporting objects and data services. Digital transformation demands that we empower all consumers of data within the organization, and it demands agility too. Data Virtualization gives you meaningful access to information that can be shared by a myriad of consumers.
Register to attend this session to learn:
- What is Data Virtualization?
- Why do I need Data Virtualization in my organization?
- How do I implement Data Virtualization in my enterprise?
Logical Data Warehouse: The Foundation of Modern Data and Analytics (APAC) - Denodo
Watch full webinar here: https://bit.ly/3bBArAc
Companies are investing in data warehouse modernization and data lake projects for descriptive and advanced analytics; however, for the analysis to be holistic, today’s architects weave disparate data streams together, not only from these analytical sources, but also from operational, third party, and streaming data sources. Logical data warehouse is a modern architectural methodology that virtually combines all the data across the enterprise and makes it available to analytical and visualization tools that facilitate timely, insightful, and impactful decisions throughout the enterprise.
In this session, you will learn:
- What is logical data warehouse and how to architect one
- The benefits of logical data warehouse – speed with agility
Presentation by Bart Gielen (DataSense) at the Data Vault Modelling and Data ... - Patrick Van Renterghem
DataSense, together with the internal team, is implementing a new Enterprise Data Warehouse (EDW) at Bank Degroof Petercam. The goal of the EDW is to integrate and historicize all data changes of the Belgian and Luxembourg entities in the medium term and, in the long term, those of the other entities as well. All of this while complying with regulations (legal reporting) and service obligations (tax certificates, reports, taxes on funds, return on portfolio, ...). The implementation follows the principles of Data Vault 2.0, with automation of the Raw Data Vault layer. For this purpose they use Talend (enterprise open-source data integration platform), Vaultspeed (Data Vault automation), and Oracle (database platform).
Denodo DataFest 2016: Enterprise View of Data with Semantic Data Layer - Denodo
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/kPmzWU
Gaining an enterprise view of the data across different independent lines of businesses is difficult when the operations, systems, and data are inherently siloed. VSP Global is a conglomerate operating different businesses across eyewear insurance, manufacturing, and retail. They are integrating the silos using a semantic data layer.
In this presentation, Tim Fredricks, Enterprise Data Architect at VSP Global, will present:
• The challenges associated with data siloed across different LOBs
• How to build a semantic data layer using data virtualization
• Centralizing business rules in the data virtualization layer
This session also includes a panel discussion with:
• Tim Fredricks, Enterprise Data Architect at VSP Global
• Rick Hart, Director of Global Technology Solutions at BioStorage Technologies
• Jeff Veis, VP Big Data Platform Marketing at HPE
• Mike Litzkow, Sales Director at Denodo (as moderator)
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
Analyst Keynote: Forrester: Data Fabric Strategy is Vital for Business Innova... - Denodo
Watch full webinar here: https://bit.ly/36GEuJO
Traditional data integration is falling short of meeting new business requirements: real-time connected data, self-service, automation, speed, and intelligence. A Forrester analyst will explain how data fabric is emerging as a hot new market for an intelligent and unified platform.
Multi-Cloud Data Integration with Data Virtualization (APAC) - Denodo
Watch full webinar here: https://bit.ly/3cnw5MW
More and more organizations are adopting multi-cloud strategies to gain greater flexibility, cost savings, and performance optimization. Even when organizations commit to a single cloud provider, they often have data and applications spread across different cloud regions to support different business units or geographies. The result is a highly distributed infrastructure that makes finding and accessing the data needed for reporting and analytics even more challenging.
The Denodo Platform Multi-Location Architecture provides quick and easy managed access to data while still providing local control to the 'data owners' and complying with local privacy and data protection regulations (think GDPR and CCPA!).
In this on-demand session, you will learn about:
- The challenges facing organizations as they adopt multi-cloud data strategies
- How the Denodo Platform provides a managed data access layer across the organization
- The different multi-location architectures that can maximize local control over data while still making it readily available
- How organizations have benefited from using the Denodo Platform as a multi-cloud data access layer
Data Virtualization – Gateway to a Digital Business - Barry Devlin - Denodo
Next-Generation Data Management Afternoon with InfoRoad and Denodo. Presentation by Dr Barry Devlin, Founder and Principal of 9sight Consulting, on data virtualization.
Product Keynote: Denodo 8.0 - A Logical Data Fabric for the Intelligent Enter... - Denodo
Watch full webinar here: https://bit.ly/2O9gcBT
Denodo 8 expands data integration and management to a data fabric with advanced data virtualization capabilities. What are they? Denodo CTO Alberto Pan will touch upon the key Denodo 8 capabilities.
Centralize Security and Governance with Data Virtualization - Denodo
This webinar is part of the series: Data Virtualization Packed Lunch Webinars: https://goo.gl/W1BeCb
Security, data privacy, and data protection represent concerns for organizations that must comply with policies and regulations that can vary across regions, data assets, and personas.
Attend this session to learn how to employ data virtualization for:
- Customizing security policies in the data abstraction layer
- Centralizing security when data is spread across multiple systems residing both on-premises and in the cloud
- Controlling and auditing data access across different regions
Agenda:
DV for Security and Governance
Product Demonstration
Summary & Next Steps
Q&A
Watch entire webinar on demand here: https://goo.gl/ipOQmW
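The idea of centralizing security in the virtualization layer, described above, can be sketched as a single policy check applied at query time, whatever backend holds the data. This is a toy illustration only: the roles, columns, and masking rule are invented, and real platforms express such policies declaratively rather than in application code.

```python
# A row as it might come back from any underlying source.
ROW = {"name": "Jane Doe", "ssn": "123-45-6789", "region": "EU"}

# One central policy table: role -> columns that must be masked
# before results leave the virtualization layer. (Hypothetical roles.)
POLICIES = {
    "analyst": {"ssn"},
    "auditor": set(),
}

def apply_policy(row, role):
    """Mask restricted columns for the caller's role; unknown roles see nothing."""
    masked = dict(row)
    for col in POLICIES.get(role, set(row)):
        masked[col] = "***"
    return masked

print(apply_policy(ROW, "analyst"))  # ssn masked
print(apply_policy(ROW, "auditor"))  # full row
```

Because every consumer goes through the same layer, the policy is defined once and audited in one place, instead of being re-implemented per source system.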
A Successful Data Strategy for Insurers in Volatile Times (ASEAN)Denodo
Watch full webinar here: https://bit.ly/3rpr4La
Data is an insurer’s most valuable asset. Capitalizing on all of that stored and incoming data to draw valuable insights for business decisions is what ultimately makes a competitive difference.
But, insurers face challenges when it comes to modernizing and digitizing their data architectures. Most organizations rely on traditional systems and data integration processes that are time consuming and slow. In addition, as many adopt cloud strategies, these traditional approaches fill the cloud modernization process with downtime and end user frustration.
This is why insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization, but also on their future infrastructure changes and innovations, adding agility, flexibility and efficiency to data architectures. Data virtualization can help insurance companies create 360° views of deals and claims processes as well as gather quick social media or sensor data for on-the-go risk profiling.
Join this on-demand webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
Modernizing Data Architecture using Data Virtualization for Agile Data DeliveryDenodo
In this presentation, Dave Kay, Data Consultant within the Analytics and Architecture group at Zurich Insurance, explains how Zurich is modernizing their data infrastructure using data virtualization to accelerate delivery of mortgage insurance and intra-day operational reports to business analysts, salespeople, underwriters, managers, and actuarial staff.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/GLPPg2.
Cloud Migration headache? Ease the pain with Data Virtualization! (EMEA)Denodo
Watch full webinar here: https://bit.ly/3CWIBzd
Moving data to the Cloud is a priority for many organizations. Benefits - in terms of flexibility, agility, and cost savings - are driving Cloud adoption. This journey to the Cloud is not easy: moving application(s) and data to the Cloud can be challenging and entails disruption of business, when not carefully managed.
When systems are being migrated, the resultant hybrid (or even multi-) Cloud architecture is, by definition, more complex AND making it harder/more costly to retrieve the data we need.
Data Virtualization can help organizations at all stages of a Cloud journey - during migration as well as in our “new hybrid multi-Cloud reality”
Watch on-demand this webinar to learn how Data Virtualization can:
- Help organizations manage risk and minimize the disruption caused as systems are moved to the Cloud
- Provide a single point of access for data that is both on-premise and in the Cloud, making it easier for users to find and access the data that they need
- Provide a secure layer to protect and manage data when it's distributed across hybrid or multi-Cloud architectures
… watch a live demo about how to ease the migration.
Reinventing and Simplifying Data Management for a Successful Hybrid and Multi...Denodo
Watch full webinar here: https://bit.ly/3AdAzkW
Hybrid cloud has become the standard for businesses. A successful move will involve using an intelligent and scalable architecture and seeking the right help. At the same time, multi-cloud strategies are on the rise. More enterprise organizations than ever before are analyzing their current technology portfolio and defining a cloud strategy that encompasses multiple cloud platforms to suit specific app workloads, and move those workloads as they see fit. Learn the key challenges in multi-hybrid data management, and how you can accelerate your digital transformation journey in a multi-cloud environment with data virtualization.
A Successful Data Strategy for Insurers in Volatile Times (EMEA)Denodo
Watch full webinar here: https://bit.ly/34gVVzH
To capitalize on all their data, insurers need a flexible and easily adaptable data integration technology that allows them to keep up with the ever-changing and growing data environment.
Data virtualization is that modern data integration technology. It can support insurers not only on their journey to digitization, but also on their future infrastructure changes and innovations, adding agility, flexibility and efficiency to data architectures.
Join this webinar to:
- Find out why data virtualization should be a part of your enterprise data strategy
- See how this technology can help you capitalize on your data
- Hear how many of your peers are already leveraging the Denodo Platform for Data Virtualization and the benefits they’re observing
Data Virtualization enabled Data Fabric: Operationalize the Data Lake (APAC)Denodo
Watch full webinar here: https://bit.ly/3aIofv9
The best of breed big data fabrics should deliver actionable insights to the business users with minimal effort, provide end-to-end security to the entire enterprise data platform and provide real-time data integration, while delivering self-service data platform to business users.
While big data initiatives have become necessary for any business to generate actionable insights, big data fabric has become a necessity for any successful big data initiative. The best of breed big data fabrics should deliver actionable insights to the business users with minimal effort, provide end-to-end security to the entire enterprise data platform and provide real-time data integration, while delivering self-service data platform to business users.
Attend this session to learn how big data fabric enabled by data virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance and data privacy
- Fulfills the promise of data lakes to provide actionable insights
Watch full webinar here: https://bit.ly/2Y0vudM
What is Data Virtualization and why do I care? In this webinar we intend to help you understand not only what Data Virtualization is but why it's a critical component of any organization's data fabric and how it fits. How data virtualization liberates and empowers your business users via data discovery, data wrangling to generation of reusable reporting objects and data services. Digital transformation demands that we empower all consumers of data within the organization, it also demands agility too. Data Virtualization gives you meaningful access to information that can be shared by a myriad of consumers.
Register to attend this session to learn:
- What is Data Virtualization?
- Why do I need Data Virtualization in my organization?
- How do I implement Data Virtualization in my enterprise?
Logical Data Warehouse: The Foundation of Modern Data and Analytics (APAC)Denodo
Watch full webinar here: https://bit.ly/3bBArAc
Companies are investing in data warehouse modernization and data lake projects for descriptive and advanced analytics; however, for the analysis to be holistic, today’s architects weave disparate data streams together, not only from these analytical sources, but also from operational, third party, and streaming data sources. Logical data warehouse is a modern architectural methodology that virtually combines all the data across the enterprise and makes it available to analytical and visualization tools that facilitate timely, insightful, and impactful decisions throughout the enterprise.
In this session, you will learn:
- What is logical data warehouse and how to architect one
- The benefits of logical data warehouse – speed with agility
Presentation by Bart Gielen (DataSense) at the Data Vault Modelling and Data ...Patrick Van Renterghem
DataSense is, together with the internal team, implementing a new Enterprise Data Warehouse (EDW) at Bank Degroof Petercam. The goal of the EDW is to integrate and historicize all changes of data of the Belgian and Luxembourg entities in the medium term and also those of the for the other entities in the long term. All of this complying with and respecting regulations (legal reporting) and service obligations (tax certificates, reports, taxes on funds, return on portfolio, ...). The implementation is done according to the principles of Data Vault 2.0 and automation of the RAW Data Vault layer. For this purpose they use Talend (Enterprise Open Source data integration platform), Vaultspeed (Data Vault Automation) and Oracle (db platform).
Denodo DataFest 2016: Enterprise View of Data with Semantic Data LayerDenodo
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/kPmzWU
Gaining an enterprise view of the data across different independent lines of businesses is difficult when the operations, systems, and data are inherently siloed. VSP Global is a conglomerate operating different businesses across eyewear insurance, manufacturing, and retail. They are integrating the silos using a semantic data layer.
In this presentation, Tim Fredricks, Enterprise Data Architect at VSP Global, will present:
• The challenges associated with data siloed across different LOBs
• How to build a semantic data layer using data virtualization
• Centralizing business rules in the data virtualization layer
This session also includes a panel discussion with:
• Tim Fredricks, Enterprise Data Architect at VSP Global
• Rick Hart, Director of Global Technology Solutions at BioStorage Technologies
• Jeff Veis, VP Big Data Platform Marketing at HPE
• Mike Litzkow, Sales Director at Denodo (as moderator)
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
Analyst Keynote: Forrester: Data Fabric Strategy is Vital for Business Innova...Denodo
Watch full webinar here: https://bit.ly/36GEuJO
Traditional data integration is falling short of new business requirements: real-time connected data, self-service, automation, speed, and intelligence. A Forrester analyst will explain how the data fabric is emerging as a hot new market for an intelligent and unified platform.
Multi-Cloud Data Integration with Data Virtualization (APAC)Denodo
Watch full webinar here: https://bit.ly/3cnw5MW
More and more organizations are adopting multi-cloud strategies to provide greater flexibility, cost savings, and performance optimization. Even when organizations commit to a single cloud provider, they often have data and applications spread across different cloud regions to support different business units or geographies. The result is a highly distributed infrastructure that makes finding and accessing the data needed for reporting and analytics even more challenging.
The Denodo Platform Multi-Location Architecture provides quick and easy managed access to data while still providing local control to the 'data owners' and complying with local privacy and data protection regulations (think GDPR and CCPA!).
In this on-demand session, you will learn about:
- The challenges facing organizations as they adopt multi-cloud data strategies
- How the Denodo Platform provides a managed data access layer across the organization
- The different multi-location architectures that can maximize local control over data while still making it readily available
- How organizations have benefited from using the Denodo Platform as a multi-cloud data access layer
Data Virtualization – Gateway to a Digital Business - Barry DevlinDenodo
Next-Generation Data Management Afternoon with InfoRoad and Denodo. Presentation by Dr Barry Devlin, Founder and Principal of 9sight Consulting, on data virtualization.
Product Keynote: Denodo 8.0 - A Logical Data Fabric for the Intelligent Enter...Denodo
Watch full webinar here: https://bit.ly/2O9gcBT
Denodo 8 expands data integration and management to a data fabric with advanced data virtualization capabilities. What are they? Denodo CTO Alberto Pan will touch upon the key Denodo 8 capabilities.
Centralize Security and Governance with Data VirtualizationDenodo
This webinar is part of the series: Data Virtualization Packed Lunch Webinars: https://goo.gl/W1BeCb
Security, data privacy, and data protection represent concerns for organizations that must comply with policies and regulations that can vary across regions, data assets, and personas.
Attend this session to learn how to employ data virtualization for:
- Customizing security policies in the data abstraction layer
- Centralizing security when data is spread across multiple systems residing both on-premises and in the cloud
- Controlling and auditing data access across different regions
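As a rough illustration of what "centralizing security in the data abstraction layer" means, the sketch below applies one role-based, row-level policy to every result set, regardless of which backend produced the rows. The roles and predicates are invented for the example and do not reflect Denodo's actual policy API.

```python
# Illustrative sketch (not a real product API): when security lives in
# the virtualization layer, one central policy filters every query
# result, no matter which source system holds the data.

POLICIES = {
    # role -> predicate applied to each row returned to that role
    "eu_analyst": lambda row: row["region"] == "EU",
    "admin":      lambda row: True,
}

def secured_query(role, rows):
    """Filter rows through the central policy before delivery."""
    allow = POLICIES.get(role, lambda row: False)  # deny by default
    return [r for r in rows if allow(r)]

rows = [{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]
print(secured_query("eu_analyst", rows))  # only the EU row
```

Because each source system no longer needs its own copy of the policy, auditing also becomes a single-point concern.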
Agenda:
DV for Security and Governance
Product Demonstration
Summary & Next Steps
Q&A
Watch entire webinar on demand here: https://goo.gl/ipOQmW
Real time insights for better products, customer experience and resilient pla...Balvinder Hira
Businesses are building digital platforms with modern architecture principles such as domain-driven design, microservices, and event-driven design. These platforms are becoming ever more modular, flexible, and complex.
While they are built on principles such as loose coupling, independent scaling, and plug-and-play components, regulations and security considerations on data add complexity that leads to many unknowns and grey areas across the architecture. Details of how the different components of this complex architecture interact with each other are lost. Generating insights becomes a multi-team, multi-stage, and hence multi-day activity.
Multiple users and stakeholders of the platform want different and timely insights to take both corrective and preventive actions. Business teams want to know how the business is doing in every corner of the country in near real time, at zip-code granularity. Tech teams want to correlate flow changes with system health, including downstream stability, as it happens. Knowing these details also helps in feeding back to the platform itself, to make it more efficient, and to the underlying business process.
In this talk we share how we made all the business and technical insights of a complicated platform available in real time, with limited incremental effort and constant validation of the ideas and slices with business teams. Since the client was a bank, we also touch on handling financial data in a secure way while still enabling insights for a large group of stakeholders.
We kept the self-service aspect at the center of our solution, to accommodate increasing components in the source platform and evolving requirements, and even to support new platforms altogether. Configurability and scalability were key here; it was important that all the data collected from the source platform was discoverable and presentable. This also led to evolving the solution along the lines of domain data products, where the data is generated and consumed by those who understand it best.
Cloud Modernization and Data as a Service OptionDenodo
Watch: https://bit.ly/2E99UNO
The current data landscape is fragmented, not just in location but also in terms of shape and processing paradigms. Cloud has become a key component of modern architecture design. Data lakes, IoT, NoSQL, SaaS, etc. coexist with relational databases to fuel the needs of modern analytics, ML and AI. Exploring and understanding the data available within your organization is a time-consuming task. And all of this without even knowing if that data will be useful at all.
Attend this session to learn:
- How dynamic data challenges and the speed of change require a new approach to data architecture
- How a logical data architecture can enable organizations to transition data to the cloud faster, with zero downtime
- How data as a service and other API management capabilities are a must in a hybrid cloud environment
Agile Mumbai 2022
Real-Time Insights and AI for better Products, Customer experience and Resilient Platform
Balvinder Kaur
Principal Consultant, Thoughtworks
Sushant Joshi
Product Manager, Thoughtworks
Contexti / Oracle - Big Data : From Pilot to ProductionContexti
Big Data is moving from hype to reality for many organisations. The value proposition is clear and sponsorship is high, but how do organisations execute?
Join Oracle and Contexti to discuss the typical journey of a big data project from concept to pilot to production.
• Discuss our experience with a regional Telco
• Common Use Cases across key verticals
• Defining and prioritising use cases
• The challenge of moving from Pilot to Production
• Common Operating Models for Big Data
• Funding a Big Data Capability going forward
• Pilots - common mistakes; challenges; success criteria
CSC - Presentation at Hortonworks Booth - Strata 2014Hortonworks
Come hear how companies are kick-starting their big data projects without having to find and hire good people and get IT to prioritize the work. Remove risk from your project, ensure scalability, and pay for just the nodes you use in a monthly utility pricing model. Worried about data governance or security? Want it in the cloud, or can’t have it in the cloud? Eliminate the hurdles with a fully managed service backed by CSC. Get your modern data architecture up and running in as little as 30 days with the Big Data Platform as a Service offering from CSC. Computer Sciences Corporation is a Certified Technology Partner of Hortonworks and a global system integrator with over 80,000 employees.
Key Considerations While Rolling Out Denodo PlatformDenodo
Watch full webinar here: https://bit.ly/3zaPGLO
Our approach for data virtualization advisory takes the following 3 dimensions/areas into consideration:
- Technology / Architecture
- Business User Groups (your clients)
- IT Organization
To deliver quick results, Q-PERIOR uses a multitude of accelerators for predefined topics within these three dimensions. In our presentation we will elaborate, using client examples, on why such an exercise makes sense before rolling out Denodo and what kinds of risks you can avoid by doing so.
Denodo Partner Connect: Business Value Demo with Denodo Demo LiteDenodo
Watch full webinar here: https://buff.ly/3OCQvGk
In this session, Denodo Sales Engineer, Yik Chuan Tan, will guide you through the art of delivering a compelling demo of the Denodo Platform with Denodo Demo Lite. Watch to uncover the significant functionalities that set Denodo apart and learn how to effectively win over potential customers.
In this session, we will cover:
Understanding the Denodo Platform & Tailoring Your Demo to Prospect Needs: By gaining a comprehensive understanding of the Denodo Platform, its architecture, and how it addresses data management challenges, you can customize your demo to align with the specific needs and pain points of your prospects, including:
- seamless data integration with real-time access
- data security and governance
- self-service data discovery
- advanced analytics and reporting
- performance optimization, scalability, and deployment
Watch this Denodo demo session and acquire the skills and knowledge necessary to captivate your prospects. Whether you're a seasoned technical professional or new to the field, this session will equip you with the skills to deliver compelling demos that lead to successful conversions.
SOA with Data Virtualization (session 4 from Packed Lunch Webinar Series)Denodo
A robust SOA infrastructure is the lifeblood of application and process integration within the organization. However, the SOA stack (ESB, BPM, CEP, and so on) has often not mixed well with the traditional data integration stack; in most cases, data integration has been the ‘poor cousin’ in this relationship. Data Virtualization allows you to easily and quickly create a virtual data services layer that integrates cleanly into your SOA infrastructure, and also supports new initiatives such as mobile and cloud applications.
More information and FREE registrations for this webinar: http://goo.gl/apGLPt
Landing page for the entire Packed Lunch webinar series: http://goo.gl/NATMHw .
Attend & get unique insights into:
- How Data Virtualization enables a more agile data architecture that better aligns with your SOA infrastructure.
- How to easily and quickly create data services to expose your data sources in a SOA-friendly way.
- Denodo’s unique linked RESTful data services that simplify building mobile and web applications.
- Case studies that demonstrate how Data Virtualization has enhanced existing SOA and BPM systems.
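To make the idea of a virtual data services layer concrete, here is a hypothetical sketch of routing REST-style paths onto virtual views. Real platforms such as Denodo generate such endpoints automatically; the view names and handler below are invented purely for illustration.

```python
# Minimal sketch of exposing virtual views as a REST-style data
# service layer. Illustrative only: the views and routing are
# invented, not any platform's generated endpoints.
import json

VIEWS = {
    # view name -> callable that evaluates the virtual view on demand
    "customers": lambda: [{"id": 1, "name": "Acme"}],
}

def handle_request(path):
    """Map /views/<name> to a JSON payload, as a data service would."""
    _, _, view_name = path.rpartition("/")
    view = VIEWS.get(view_name)
    if view is None:
        return 404, json.dumps({"error": "unknown view"})
    return 200, json.dumps(view())

status, body = handle_request("/views/customers")
print(status, body)
```

The same view definition can then back SOAP, OData, or linked RESTful services without duplicating integration logic.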
Watch full webinar here: https://bit.ly/2xc6IO0
To solve these challenges, according to Gartner "through 2022, 60% of all organizations will implement data virtualization as one key delivery style in their data integration architecture". It is clear that data virtualization has become a driving force for companies to implement agile, real-time and flexible enterprise data architecture.
In this session we will look at the data integration challenges solved by data virtualization and the main use cases, and examine why this technology is growing so fast. You will learn:
- What data virtualization really is
- How it differs from other enterprise data integration technologies
- Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations
Ethical AI at VDAB, presented by Vincent Buekenhout (Ethical AI Lead, VDAB) a...Patrick Van Renterghem
Vincent Buekenhout presented the various AI initiatives at VDAB, its AI4Good strategy, the way applications are designed, and most of all, the way ethics, measurement through KPIs, explainability and fairness play a role in this. Vincent also explained how ethics-by-design works at VDAB.
Implementing error-proof, business-critical Machine Learning, presentation by...Patrick Van Renterghem
This presentation by Deevid De Meyer outlines how Brainjar uses human-centric design and explainability to create machine learning systems that work together with humans to improve efficiency while reducing error rate.
Building Trust and Explainability into Chatbots: the Partena Ziekenfonds Busi...Patrick Van Renterghem
Chatbots and conversational interfaces are taking over customer service departments by storm. In many companies, they provide first-line support to customers. Based on the Partena Ziekenfonds business case, Karel Kremer shares a few critical success factors...
AI & Ethics: The Belgian Industry Vision & Initiatives, presentation by Jelle...Patrick Van Renterghem
Jelle Hoedemaekers (Agoria) explains why Belgian companies are working on ethical AI, and provides an overview of Belgian and European AI Initiatives with a focus on ethics.
Responsible AI: An Example AI Development Process with Focus on Risks and Con...Patrick Van Renterghem
Organisations need to make sure that they use AI in an appropriate way. Martijn and Hugo explain how to ensure that the developments are ethically sound and comply with regulations, how to have end-to-end governance, and how to address bias and fairness, interpretability and explainability, and robustness and security.
During the conference, we looked at an example AI development process, focusing on the risks to be managed and the controls that can be established.
Fairness and Transparency: Algorithmic Explainability, some Legal and Ethical...Patrick Van Renterghem
In this presentation, Nazanin Gifani discussed some of the ethical and legal issues of automated decision making, including algorithmic fairness, transparency and explainability. The big question here is: can AI help us to make fairer decisions ?
How obedient digital twins and intelligent beings contribute to ethics and ex...Patrick Van Renterghem
Paul Valckenaers explains how intelligence is added to a corresponding reality without introducing limitations into a world-of-interest. The outcome is obedience: a conflict with an obedient digital twin is a conflict with its real-world counterpart. Illustrated by healthcare examples.
He Said, She Said: Finding and Fixing Bias in NLP (Natural Language Processin...Patrick Van Renterghem
Yves Peirsman presents several instances where bias has posed a risk to the successful adoption of NLP systems, and discusses what techniques exist to discover these biases before the systems are put in production.
Introduction to Bias in Machine Learning, presented by Matthias Feys, CTO @ M...Patrick Van Renterghem
In this talk, Matthias Feys explains what bias in Machine Learning models actually means. You will get insights in the complexity of the problem and learn realistic ways to reduce bias.
Business Case: Ozitem Groupe, where 80% of the company is working remotely. R...Patrick Van Renterghem
Roxane Pasina (Ozitem's Chief Marketing and Communication Officer) explains and shows how Ozitem Groupe went in one year from an old intranet to an interactive digital workplace, allowing them to overcome their communication challenges using the Jamespot digital workplace tools.
Digital Workplace Case Study: How the Municipality of Duffel successfully swi...Patrick Van Renterghem
In six months' time, the Gemeente/Municipality of Duffel has come quite close to transforming into a forceful, digital local government, thanks to the help of Synergics.
Unleashing the Full Potential of People, Teams and SOLVAY, presented by Bruce...Patrick Van Renterghem
Bruce Fecheyr-Lippens (then SVP, Global Head Agile Working, Digital HR, People Analytics, and HR Director Excellence Center at Solvay) presented the digital workplace environment of Solvay #DWA19 #presentation #digitalworkplace #huapii
The Building Blocks of a Digital Workplace, presented by Sam Marshall at the ...Patrick Van Renterghem
Sam Marshall, manager of Clearbox Consulting, presented the key building blocks to fulfil the purpose of a digital workplace: to optimise the employee experience #DWA19 #presentation #digitalworkplace #DEX
Engie's Digital Workplace and "Connecting the company" business case, present...Patrick Van Renterghem
Jan Vanoudendycke (Director of Knowledge Management at Engie) presented the vision, roll-out and adoption process of the massive Engie Digital Workplace effort to connect everyone in the 150,000-person company #DWA19 #presentation #engie
Face your communication challenges when implementing a digital workplace, bas...Patrick Van Renterghem
Ellen Geens (ChangeLab) described the communication challenges, and gave tips and tricks for change communication when implementing a digital workplace at their RIZIV and TVH customers
The first steps in Recticel's Digital Workplace program by Kenneth Meuleman (...Patrick Van Renterghem
Kenneth and Serge presented the first steps in Recticel's digital workplace program, and the managed Microsoft Teams and OneDrive for Business rollout #DWA19 #presentation #recticel
Presentation by Dave Geentjens at the "Successful Digital Workplace Adoption"...Patrick Van Renterghem
Dave Geentjens described the evolution of the Digital Workplace at the Flemish Government / Vlaamse Overheid: the challenges, the opportunities and the realisations so far #DWA19 #presentation #vlaamseoverheid
The central information provision layer is Argenta's name for its central data hub, based on a near-real-time Data Vault, which on the one hand answers the information needs of the bank and on the other feeds applications such as MIFID. This layer is also the base from which data governance is enforced. For this purpose they use Oracle Enterprise Metadata Manager and Collibra.
Presentation by Ivan Schotsmans (DV Community) at the Data Vault Modelling an...Patrick Van Renterghem
The start of GDPR implementations in Europe was, for most organizations, also the start of rethinking their Data Warehouse strategy. The experience of past implementations gave a better view on the do's and don'ts. One of the important lessons learned was the approach of handling information quality. It's not something you handle on top of your data warehouse. To be successful, information quality goes hand in hand with your data warehouse implementation.
Presentation by Luc Delanglez (DataLumen) at the Data Vault Modelling and Dat...Patrick Van Renterghem
During this session, Luc Delanglez provides some practical insights to get you started the right way on Traceability, Roles & responsibilities, Data stewardship, Data lineage and impact analysis, Critical functional components, Business Glossary, Data Catalog, Data Quality, Master/Reference Data Management, Compliancy & privacy, all very important aspects of data governance.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. I have also often seen developers implement front-end features just by following the standard rules of a framework, thinking that this is enough to successfully launch the project, and then the project fails. How can you prevent this, and what approach should you choose? I have launched dozens of complex projects, and during the talk we will analyze which approaches have worked for me and which have not.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I wondered, as an “infrastructure container Kubernetes guy”, how this fancy AI technology gets managed from an infrastructure operations view. Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and give you a short journey through existing deployment models and use cases for AI software. With practical examples, we discuss what cloud/on-premises strategy we may need to apply it to our own infrastructure and make it work from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already gotten working in practice.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Search and Society: Reimagining Information Access for Radical FuturesBhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs, while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse, explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio, using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
5. A Modern Data Virtualization Architecture
[Architecture diagram: disparate data sources feed the data virtualization layer, which serves data consumers]
Consumer access: SQL queries (JDBC, ODBC, ADO.NET), web services (SOAP, REST, OData), web-based catalog & search, secure delivery (SSL/TLS)
Platform components: execution engine & optimizer, MPP processing, relational cache, metadata repository, corporate security, monitoring & auditing
6. Unified Data Integration & Delivery Platform for Business
1. Single access point to data – Digital Marketplace
2. Data in business-friendly form – Semantic Layer
3. Data adapted for the needs of each LOB, type of user and application
4. Access with any tool / protocol – data service / API layer
5. Centralized metadata, security & governance
6. 90% time-to-market reductions and cost savings
8. InfoRoad in a nutshell
BI service provider with a focus on the data virtualization concept
Founded in 2016, part of the Group
& Reseller
9. InfoRoad services / Denodo tool consultancy
Denodo professional services, based on proven best practices.
Introduction & deployment of the Denodo platform, starting from a greenfield situation or enhancing your existing solutions.
From detailed tool expertise to overall project ownership.
Denodo Tool consultancy
Proof-Of-Concepts
Architecture Advice
Denodo adoption service
Denodo reseller
10. InfoRoad services / Proof-of-Concepts
A guided introduction to Denodo and data virtualization to prove the added value, with your data and/or infrastructure.
Recommended in case you want to try out DV first to see, feel and experience the benefits.
3 steps:
– Use case definition & agreement
– Installation & PoC execution
– Evaluation
11. InfoRoad services / Architecture Advice
We investigate and advise on how to fit data virtualization into the customer’s architecture, infrastructure and application landscape.
Denodo platform architecture deployment assistance
12. InfoRoad services / Denodo adoption service
This service defines and delivers the Denodo implementation strategy, based on best practices.
The end goal is to achieve enterprise adoption of DV within the company, including successful solution delivery.
The reference DV Implementation Strategy is based on three concurrent tracks:
– DV Platform Capability
– Business Delivery
– Communication program
13. InfoRoad services / Denodo reseller
Identify use cases for data virtualization
Define the business case for data virtualization
Denodo license model advice
ROI calculation
Act as your local product reseller
15. How to start smoothly with Data Virtualization
InfoRoad / Denodo “Adoption Service”
16. Reference Implementation Strategy (Adoption Service)
This service defines and delivers the Denodo implementation strategy, based on best practices. The end goal is to achieve enterprise adoption of DV within the company, including successful solution delivery and management buy-in.
The reference DV Implementation Strategy is based on three concurrent tracks:
– Continuous value delivery by means of projects
– Manage the user community, and communicate and review DV successes at executive level
– Denodo tool adoption in the existing application landscape
17. Take-aways & advice
Define upfront the company reference architectures + DV positioning
– Company wide access layer?
– Source of your ETL process?
– Operational and/or analytical usage?
Define for which use cases you will (not) use DV
– ETL / DV decision tree
Implement tool governance in the tool from day one
– Technical governance & functional governance
– Security, versioning, …
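The "ETL / DV decision tree" mentioned above can be imagined as a small rule set. The questions and thresholds below are purely illustrative assumptions, not a formal InfoRoad methodology.

```python
# Hypothetical ETL-vs-DV decision tree, sketched from the take-aways
# above. The rules and the 500 GB threshold are invented examples,
# not an actual methodology.

def choose_integration_style(needs_history, data_volume_gb, needs_real_time):
    """Return 'ETL' or 'DV' for a candidate use case."""
    if needs_history:
        return "ETL"  # historization favours a persisted warehouse
    if needs_real_time:
        return "DV"   # virtual access avoids batch latency
    if data_volume_gb > 500:
        return "ETL"  # very large scans are cheaper when materialized
    return "DV"       # default to the lighter-weight option

print(choose_integration_style(False, 10, True))  # DV
```

Writing the tree down like this, however crude, forces the "for which use cases will we (not) use DV" decision to be made explicitly rather than case by case.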
18. Take-aways & advice
Apply best practices from the very first implementation
– Organisation of views (“view layering”)
– InfoRoad adoption service
Choose the right deployment pattern and growth model for your DV tool
– On-premises / cloud?
– High availability considerations
– Geographical considerations
– Nature of the use cases (operational / analytical)
– Workload & performance expectations
19. Take-aways & advice
Do a Proof Of Concept (to get over the performance concern)
– Define use cases upfront
– Define evaluation criteria upfront
Realize quick wins and make these visible to your stakeholders
The data virtualization concept enhances and extends your existing BI architecture. It doesn’t destroy it ☺
InfoRoad doesn’t have disappointed customers…
20. Gartner Gives DV its Highest Maturity Rating
“Data Virtualization can be deployed with low risk and effort to achieve maximum value.”