Watch the full webinar here: https://buff.ly/2KBKzLJ
Digital transformation, as cliché as it sounds, is at the top of every decision maker’s strategic initiative list. And at the heart of any digital transformation, no matter the industry or the size of the company, there is an application programming interface (API) strategy. While API platforms enable companies to manage large numbers of APIs working in tandem, monitor their usage, and establish security between them, they are not optimized for data integration, so they cannot easily or quickly integrate large volumes of data between different systems. Data virtualization, however, can greatly enhance the capabilities of an API platform, increasing the benefits of an API-based architecture. With data virtualization as part of an API strategy, companies can streamline digital transformations of any size and scope.
Join us for this webinar to see these technologies in action in a demo and to get the answers to the following questions:
* How can data virtualization enhance the deployment and exposure of APIs?
* How does data virtualization work as a service container, as a source for microservices, and as an API gateway?
* How can data virtualization create managed data services ecosystems in a thriving API economy?
* How are GetSmarter and others leveraging data virtualization to facilitate API-based initiatives?
Enabling Data as a Service with the JBoss Enterprise Data Services Platform (prajods)
This presentation was given at JUDCon 2013, January 17-18, in Bangalore, by Prajod Vettiyattil and Gnanaguru Sattanathan. It deals with the why, what, and how of Data Services and Data Services Platforms, and explains the features of the JBoss Enterprise Data Services Platform.
The need for Data Services is illustrated with three business use cases:
1. Post-purchase customer experience improvement for an auto manufacturer
2. Enterprise Data Access Layer
3. Data Services for regulatory reporting requirements such as Dodd-Frank
Many companies today move mountains of data using ETL (extract, transform, load) technology. But data volumes are growing too large to move, customers are now expecting real-time data, and ETL costs now account for 10-15% of computing capacity. In this slide presentation, you can see how data virtualization enables data structures that were designed independently to be leveraged together, in real time, and without data movement, reducing complexity, lowering IT costs, and minimizing risk.
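The contrast with ETL can be sketched in a few lines of Python. The sources and names below are invented for illustration: instead of copying rows into a warehouse ahead of time, a "virtual view" resolves the join against the live sources at the moment it is queried, so no data moves in advance.

```python
import sqlite3

# Source 1: an operational database (simulated with in-memory SQLite).
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
orders_db.executemany("INSERT INTO orders VALUES (?, ?)",
                      [(1, 120.0), (1, 80.0), (2, 42.5)])

# Source 2: a CRM system (simulated with a dict keyed by customer id).
crm = {1: "Acme Corp", 2: "Globex"}

def customer_spend():
    """A 'virtual view': joins both sources at query time, copying nothing."""
    totals = orders_db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")
    return {crm[cid]: total for cid, total in totals}

print(customer_spend())  # {'Acme Corp': 200.0, 'Globex': 42.5}
```

Real data virtualization platforms do this with query planning and pushdown rather than application code, but the principle is the same: the integration happens per request, against current data.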
An Introduction to Data Virtualization in 2018 (Denodo)
Watch full webinar on demand here: https://goo.gl/Rdrc1w
"Through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration" according to Gartner. It is clear that data virtualization has become a driving force for companies to implement an agile, real-time and flexible enterprise data architecture.
Attend this session to learn:
• What data virtualization actually means and how it differs from traditional data integration approaches
• The all-important use cases and key patterns of data virtualization
• What to expect in the upcoming sessions in the Packed Lunch Webinar Series, which will take a deeper dive into various challenges solved by data virtualization in big data analytics, cloud migration and various other scenarios
Agenda:
• Introduction & benefits of DV
• Summary & next steps
• Q&A
Data virtualization, Data Federation & IaaS with JBoss Teiid (Anil Allewar)
Enterprises have always grappled with information silos that had to be merged, using multiple data warehouses (DWs) and business intelligence (BI) tools, so that this disparate data could be mined for business decisions and strategy. Traditionally this data integration was done with ETL, consolidating multiple DBMSs into a single data storage facility.
Data virtualization enables abstraction, transformation, federation, and delivery of data drawn from a variety of heterogeneous data sources as if it were a single virtual data source, without the need to physically copy the data for integration. It allows consuming applications or users to access data from these various sources through a single access point, delivering information-as-a-service (IaaS).
In this presentation, we will explore what data virtualization is and how it differs from traditional data integration architecture. We'll also validate the data virtualization and federation concepts by working through an example (see the videos at the GitHub repo) that federates data across two heterogeneous data sources, MySQL and MongoDB, using the JBoss Teiid data virtualization platform.
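Conceptually, what Teiid presents to its clients can be approximated in plain Python, with SQLite standing in for the relational source and a list of dicts standing in for MongoDB documents; the tables and fields here are invented for this sketch. In Teiid itself the federation is defined in a virtual database (VDB) and queried with ordinary SQL over JDBC, not in application code.

```python
import sqlite3

# Stand-in for the MySQL source: customer master data.
mysql_like = sqlite3.connect(":memory:")
mysql_like.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
mysql_like.executemany("INSERT INTO customers VALUES (?, ?)",
                       [(1, "Alice"), (2, "Bob")])

# Stand-in for the MongoDB source: order documents.
mongo_like = [
    {"customer_id": 1, "item": "laptop"},
    {"customer_id": 2, "item": "phone"},
    {"customer_id": 1, "item": "mouse"},
]

def federated_orders():
    """Present the relational and document sources as one joined result."""
    names = dict(mysql_like.execute("SELECT id, name FROM customers"))
    return [(names[doc["customer_id"]], doc["item"]) for doc in mongo_like]

print(federated_orders())
# [('Alice', 'laptop'), ('Bob', 'phone'), ('Alice', 'mouse')]
```

The point of the sketch is the consumer's view: one logical "table" spanning two systems with very different data models, resolved at request time.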
Simplifying Cloud Architectures with Data Virtualization (Denodo)
Watch here: https://bit.ly/2yxLo6f
Moving applications and data to the Cloud is a priority for many organizations. The benefits - in terms of flexibility, agility, and cost savings - are driving Cloud adoption. However, the journey to the Cloud is not as easy as many people think. The process of moving applications and data to the Cloud is challenging and can entail widespread disruption across the organization if not carefully managed. Even when systems have been migrated to the Cloud, the resultant hybrid or multi-Cloud architecture is more complex for users to navigate, making it harder for them to get the data they need to do their jobs.
Data Virtualization can help organizations at all stages of their journey to the Cloud - during migration and also in the resultant hybrid or multi-Cloud architectures. Attend this webinar to learn how Data Virtualization can:
- Help organizations manage risk and minimize the disruption caused as systems are moved to the Cloud
- Provide a single point of access for data that is both on-premise and in the Cloud, making it easier for users to find and access the data that they need
- Provide a security layer to protect and manage your data when it's distributed across hybrid or multi-Cloud architectures
Solution architecture for big data projects
Artificial intelligence and machine learning are currently all the rage. Every organisation is trying to jump on this bandwagon and cash in on their data reserves. At ThoughtWorks, we’d agree that this tech has huge potential — but as with all things, realising value depends on understanding how best to use it.
Why Data Virtualization? An Introduction by Denodo (Justo Hidalgo)
Data Virtualization means Real-time Data Access and Integration. But why do I need it? This presentation tries to answer that question in a simple yet clear way.
By Alberto Pan, CTO of Denodo, and Justo Hidalgo, VP Product Management.
Big Data Fabric for At-Scale Real-Time Analysis by Edwin Robbins (Data Con LA)
Abstract: Companies are adopting big data to perform high-velocity, real-time analytics on very large volumes of data, enabling rapid, self-service analysis for business users and never-before-realized use cases. However, such projects have yielded limited value because these big data systems have become siloed from the rest of the enterprise systems holding critical business operational data. Big Data Fabric is a modern data architecture combining data virtualization, data prep, and lineage capabilities to seamlessly integrate, at scale, these huge, siloed volumes of structured and unstructured data with other enterprise data assets. This presentation uses proven customer case studies in big data and IoT to demonstrate the value of using a big data fabric as a logical data lake for big data analytics.
Modern Data Management for Federal Modernization (Denodo)
Watch full webinar here: https://bit.ly/2QaVfE7
Faster, more agile data management is at the heart of government modernization. However, traditional data delivery systems fall short of realizing a modernized, future-proof data architecture.
This webinar will address how data virtualization can modernize existing systems and enable new data strategies. Join this session to learn how government agencies can use data virtualization to:
- Enable governed, inter-agency data sharing
- Simplify data acquisition, search and tagging
- Streamline data delivery for transition to cloud, data science initiatives, and more
Where does Fast Data Strategy Fit within IT Projects (Denodo)
A Fast Data Strategy is a must for organizations to become and remain competitive. There are four use cases where Fast Data Strategy fits within IT projects: Agile BI, Big Data/Cloud, Data Services, and Single View. In this presentation, you will discover how four customers applied data virtualization and a Fast Data Strategy to these use cases.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/UxHMuJ.
An introduction to data virtualization in business intelligence (David Walker)
A brief description of what Data Virtualisation is and how it can be used to support business intelligence applications and development. Originally presented at the ETIS Conference in Riga, Latvia, in October 2013.
Creating a Modern Data Architecture for Digital Transformation (MongoDB)
By managing Data in Motion, Data at Rest, and Data in Use differently, modern Information Management Solutions are enabling a whole range of architecture and design patterns that allow enterprises to fully harness the value in data flowing through their systems. In this session we explored some of the patterns (e.g. operational data lakes, CQRS, microservices and containerisation) that enable CIOs, CDOs and senior architects to tame the data challenge, and start to use data as a cross-enterprise asset.
Big data insights with Red Hat JBoss Data Virtualization (Kenneth Peeples)
You’re hearing a lot about big data these days. And big data and the technologies that store and process it, like Hadoop, aren’t just new data silos. You might be looking to integrate big data with existing enterprise information systems to gain better understanding of your business. You want to take informed action.
During this session, we’ll demonstrate how Red Hat JBoss Data Virtualization can integrate with Hadoop through Hive and provide users easy access to data. You’ll learn how Red Hat JBoss Data Virtualization:
Can help you integrate your existing and growing data infrastructure.
Integrates big data with your existing enterprise data infrastructure.
Lets non-technical users access big data result sets.
We’ll also provide typical use cases and examples, and a demonstration of the integration of Hadoop sentiment analysis with sales data.
Denodo 6.0: Self Service Search, Discovery & Governance using a Universal Se... (Denodo)
Presentation slides taken from Fast Data Strategy Roadshow San Francisco Bay Area.
For more Denodo 6.0 demos, please follow this link: https://goo.gl/XkxJjX
Partner Enablement: Key Differentiators of Denodo Platform 6.0 for the Field (Denodo)
If you’re a Denodo Partner, this presentation is for you. Learn how to gain a competitive edge in the marketplace with Denodo Platform 6.0, and leverage its new features and functionality.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/Qh8MeX.
Augmentation, Collaboration, Governance: Defining the Future of Self-Service BI (Denodo)
Watch full webinar here: https://bit.ly/3zVJRRf
According to Dresner Advisory’s 2020 Self-Service Business Intelligence Market Study, 62% of responding organizations say self-service BI is critical for their business. Today’s self-service BI goes beyond IT enabling a few executives and business users to build dashboards or generate reports: predictive analytics, self-service data preparation, and collaborative data exploration are all facets of the new generation of self-service BI. And while the democratization of data for self-service BI holds many benefits, it makes strict data governance increasingly important.
In this session we will discuss:
- The latest trends and scopes of self-service BI
- The role of logical data fabric in self-service BI
- How Denodo enables self-service BI for a wide range of users
- Customer case study on self-service BI
Parallel In-Memory Processing and Data Virtualization Redefine Analytics Arch... (Denodo)
To watch full webinar, follow this link: https://goo.gl/3s9hRG
The tide is changing for analytics architectures. Traditional approaches, from the data warehouse to the data lake, implicitly assume that all relevant data can be stored in a single, centralized repository. But this approach is slow and expensive, and sometimes not even feasible: some data sources are too big to be replicated, and data is often too distributed, as with cloud data sources, for a “full centralization” strategy to succeed.
Attend this webinar to learn:
• Why Logical architectures are the best option when integrating Big Data.
• How Denodo’s parallel in-memory capabilities with dynamic query optimization redefine analytics architectures.
• How IT can meet business demands for data much faster with Data Virtualization.
Agenda:
• Challenges with traditional approaches for analytics architectures.
• Overview of Denodo's parallel in-memory capabilities.
• Product Demo of parallel in-memory capabilities accelerating analytics performance.
• Q&A.
To watch all webinars in Denodo's Packed Lunch Webinar Series, follow this link: https://goo.gl/4xL9wM
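One of the optimizations behind this kind of logical architecture, partial-aggregation pushdown, can be illustrated in a few lines of Python (the regions and values here are invented): each source aggregates its own rows locally, and only the small partial results travel to the in-memory layer to be combined.

```python
# Rows held in two independent systems; neither full set is ever moved.
source_a = [("US", 10), ("EU", 5), ("US", 7)]
source_b = [("EU", 2), ("US", 1)]

def partial_sums(rows):
    """Aggregation 'pushed down' to a source: sum per region locally."""
    out = {}
    for region, value in rows:
        out[region] = out.get(region, 0) + value
    return out  # the small partial result is all that gets shipped

# The virtualization layer combines the partials in memory.
combined = {}
for partial in (partial_sums(source_a), partial_sums(source_b)):
    for region, value in partial.items():
        combined[region] = combined.get(region, 0) + value

print(combined)  # {'US': 18, 'EU': 7}
```

Shipping two tiny dictionaries instead of every raw row is what makes federated aggregation over large sources practical; a real optimizer decides automatically when such a rewrite is safe.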
The global need to securely derive instant insights has motivated data architectures from distributed storage to data lakes, data warehouses, and lakehouses. In this talk we describe Tag.bio, a next-generation data mesh platform that embeds vital elements such as domain centricity and ownership, data as products, and self-serve architecture, with a federated computational layer. Tag.bio data products combine data sets, smart APIs, and statistical and machine learning algorithms into decentralized data products through which users discover insights using FAIR principles. Researchers can use its point-and-click (no-code) system to instantly perform analysis and share versioned, reproducible results. The platform combines a dynamic cohort builder with analysis protocols and applications (low-code) to drive complex analysis workflows. Applications within data products are fully customizable via R and Python plugins (pro-code), and the platform supports notebook-based developer environments with individual workspaces.
Join us for a talk and demo session on the Tag.bio data mesh platform and learn how major pharmaceutical companies and university health systems are using this technology to promote value-based healthcare and precision healthcare, find cures for diseases, and promote collaboration (without explicitly moving data around). The talk also outlines Tag.bio's secure data exchange features for real-world evidence datasets, privacy-centric data products (confidential computing), and integration with cloud services.
Data Lakes: 8 Enterprise Data Management Requirements (SnapLogic)
2016 is the year of the data lake. As you consider adopting an enterprise data lake strategy to manage more dynamic, poly-structured data, your data integration strategy must also evolve to handle the new requirements. Thinking you can simply hire more developers to write code, or rely on your legacy rows-and-columns-centric tools, is a recipe for sinking in a data swamp instead of swimming in a data lake.
In this presentation, you'll learn about eight enterprise data management requirements that must be addressed in order to get maximum value from your big data technology investments.
To learn more, visit: https://www.snaplogic.com/big-data
Yelp has operated a connector ecosystem that feeds vital data to domain-specific teams and data stores. We share some of our learnings and experiences from operating such a system, and touch on the next phase of its evolution.
Data Mesh in Practice: How Europe’s Leading Online Platform for Fashion Goes ... (Databricks)
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
The Role of Data Virtualization in an API Economy (Denodo)
You can watch the webinar on demand here: https://buff.ly/2RQltuF
Digital transformation, even though a cliché, is definitely at the top of every CEO's strategic initiative list. At the heart of any digital transformation, no matter the industry or the size of the company, there is an API strategy. Application programming interfaces (APIs) are the connection points between one application and another, and as such, they enable applications to build on each other, extend each other, and work with each other. Taken together, APIs represent a thriving ecosystem of developers that is showing no sign of slowing down.
Attend this webinar to learn:
• How data virtualization greatly enhances the capabilities of an API
• How data virtualization works as a service container, as a source for microservices and as an API gateway
• How data virtualization can create managed data services ecosystems in a thriving API economy
Denodo as the Core Pillar of your API Strategy (Denodo)
Watch full webinar here: https://buff.ly/2KTz2IB
Most people associate data virtualization with BI and analytics. However, one of the core ideas behind data virtualization is the decoupling of the consumption method from the data model. Why should the need for data requests in JSON over HTTP require extra development? Denodo provides immediate access to its datasets via REST, OData 4, GeoJSON and other protocols, with no coding involved. Easy to scale, cloud friendly and ready to integrate with API management tools, Denodo can be the perfect tool to fulfill your API strategy!
Attend this session to learn:
- What’s the role of Denodo in an API strategy
- Integration between Denodo and other elements of the API stack, like API management tools
- How easy it is to access Denodo as a RESTful endpoint
- Advanced options of Denodo web services: OAuth, OpenAPI, geographical capabilities, etc.
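As a rough sketch of what consuming such a REST endpoint might look like, the snippet below builds a query URL in an OData-like style and parses a canned JSON payload. The host, view name, query parameters, and payload shape are all assumptions made for illustration, not Denodo's documented format.

```python
import json

# Hypothetical REST data service exposing a virtual view as JSON.
base = "https://denodo.example.com/server/customer360/views/customer"
query = base + "?$filter=country eq 'US'&$select=name,revenue"

# In a live setup this payload would come from an HTTP GET against `query`;
# here it is canned so the sketch is self-contained.
canned_response = json.loads("""
{"name": "customer",
 "elements": [{"name": "Acme", "revenue": 1200.0},
              {"name": "Initech", "revenue": 300.5}]}
""")

rows = canned_response["elements"]
total = sum(r["revenue"] for r in rows)
print(len(rows), total)  # 2 1500.5
```

The appeal described in the session is that no service code has to be written to get this far: the view already exists in the virtual layer, and the REST representation comes for free.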
Why Data Virtualization? An Introduction by DenodoJusto Hidalgo
Data Virtualization means Real-time Data Access and Integration. But why do I need it? This presentation tries to answer it in a simple yet clear way.
By Alberto Pan, CTO of Denodo, and Justo Hidalgo, VP Product Management.
Big Data Fabric for At-Scale Real-Time Analysis by Edwin RobbinsData Con LA
Abstract:- Companies are adopting big data for performing high-velocity real-time analytics on very large volumes of data to enable rapid analysis for business users using self-service and never-before-realized use cases. However, such projects have yielded limited value because these big data systems have become siloed from the rest of the enterprise systems holding critical business operational data. Big Data Fabric is a modern data architecture combining data virtualization, data prep, and lineage capabilities to seamlessly integrate at scale these huge, siloed volumes of structured and unstructured data with other enterprise data assets. This presentation will demonstrate with proven customer case studies in big data and IoT about the value of using big data fabric as a logical data lake for big data analytics.
Modern Data Management for Federal ModernizationDenodo
Watch full webinar here: https://bit.ly/2QaVfE7
Faster, more agile data management is at the heart of government modernization. However, Traditional data delivery systems are limited in realizing a modernized and future-proof data architecture.
This webinar will address how data virtualization can modernize existing systems and enable new data strategies. Join this session to learn how government agencies can use data virtualization to:
- Enable governed, inter-agency data sharing
- Simplify data acquisition, search and tagging
- Streamline data delivery for transition to cloud, data science initiatives, and more
Where does Fast Data Strategy Fit within IT ProjectsDenodo
Fast Data Strategy is a must for organizations to become and be competitive. There are four use cases where Fast Data Strategy fits within IT Projects - Agile BI, Big Data/ Cloud, Data Services, and Single View. In this presentation, you will discover how four customers used data virtualization and Fast Data Strategy for these use cases.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/UxHMuJ.
An introduction to data virtualization in business intelligenceDavid Walker
A brief description of what Data Virtualisation is and how it can be used to support business intelligence applications and development. Originally presented to the ETIS Conference in Riga, Latvia in October 2013
Creating a Modern Data Architecture for Digital TransformationMongoDB
By managing Data in Motion, Data at Rest, and Data in Use differently, modern Information Management Solutions are enabling a whole range of architecture and design patterns that allow enterprises to fully harness the value in data flowing through their systems. In this session we explored some of the patterns (e.g. operational data lakes, CQRS, microservices and containerisation) that enable CIOs, CDOs and senior architects to tame the data challenge, and start to use data as a cross-enterprise asset.
Big data insights with Red Hat JBoss Data VirtualizationKenneth Peeples
You’re hearing a lot about big data these days. And big data and the technologies that store and process it, like Hadoop, aren’t just new data silos. You might be looking to integrate big data with existing enterprise information systems to gain better understanding of your business. You want to take informed action.
During this session, we’ll demonstrate how Red Hat JBoss Data Virtualization can integrate with Hadoop through Hive and provide users easy access to data. You’ll learn how Red Hat JBoss Data Virtualization:
Can help you integrate your existing and growing data infrastructure.
Integrates big data with your existing enterprise data infrastructure.
Lets non-technical users access big data result sets.
We’ll also provide typical uses cases and examples and a demonstration of the integration of Hadoop sentiment analysis with sales data.
Denodo 6.0: Self Service Search, Discovery & Governance using an Universal Se...Denodo
Presentation slides taken from Fast Data Strategy Roadshow San Francisco Bay Area.
For more Denodo 6-0 demos, please follow this link:https://goo.gl/XkxJjX
Partner Enablement: Key Differentiators of Denodo Platform 6.0 for the FieldDenodo
If you’re a Denodo Partner, this presentation is for you. Learn how to gain a competitive edge in the marketplace with Denodo Platform 6.0, and leverage off the new features and functionality.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/Qh8MeX.
Augmentation, Collaboration, Governance: Defining the Future of Self-Service BIDenodo
Watch full webinar here: https://bit.ly/3zVJRRf
According to Dresner Advisory’s 2020 Self-Service Business Intelligence Market Study, 62% of the responding organizations say self-service BI is critical for their business. If we look deeper into the need for today’s self-service BI, it’s beyond some Executives and Business Users being enabled by IT for self-service dashboarding or report generation. Predictive analytics, self-service data preparation, collaborative data exploration are all different facets of new generation self-service BI. While democratization of data for self-service BI holds many benefits, strict data governance becomes increasingly important alongside.
In this session we will discuss:
- The latest trends and scopes of self-service BI
- The role of logical data fabric in self-service BI
- How Denodo enables self-service BI for a wide range of users - Customer case study on self-service BI
Parallel In-Memory Processing and Data Virtualization Redefine Analytics Arch...Denodo
To watch full webinar, follow this link: https://goo.gl/3s9hRG
The tide is changing for analytics architectures. Traditional approaches, from the data warehouse to the data lake, implicitly assume that all relevant data can be stored in a single, centralized repository. But this approach is slow and expensive, and sometimes not even feasible, because some data sources are too big to be replicated, and data is often too distributed such as those found in cloud data sources to make a “full centralization” strategy successful.
Attend this webinar to learn:
• Why Logical architectures are the best option when integrating Big Data.
• How Denodo’s parallel in-memory capabilities with dynamic query optimization redefine analytics architectures.
• How IT can meet business demands for data much faster with Data Virtualization.
Agenda:
• Challenges with traditional approaches for analytics architectures.
• Overview of Denodo's parallel in-memory capabilities.
• Product Demo of parallel in-memory capabilities accelerating analytics performance.
• Q&A.
To watch all webinars in Denodo's Packed Lunch Webinar Series, follow this link: https://goo.gl/4xL9wM
The global need to securely derive (instant) insights, have motivated data architectures from distributed storage, to data lakes, data warehouses and lake-houses. In this talk we describe Tag.bio, a next generation data mesh platform that embeds vital elements such as domain centricity/ownership, Data as Products, Self-serve architecture, with a federated computational layer. Tag.bio data products combine data sets, smart APIs, statistical and machine learning algorithms into decentralized data products for users to discover insights using FAIR Principles. Researchers can use its point and click (no-code) system to instantly perform analysis and share versioned, reproducible results. The platform combines a dynamic cohort builder with analysis protocols and applications (low-code) to drive complex analysis workflows. Applications within data products are fully customizable via R and Python plugins (pro-code), and the platform supports notebook based developer environments with individual workspaces.
Join us for a talk/demo session on Tag.bio data mesh platform and learn how major pharma industries and university health systems are using this technology to promote value based healthcare, precision healthcare, find cures for disease, and promote collaboration (without explicitly moving data around). The talk also outlines Tag.bio secure data exchange features for real world evidence datasets, privacy centric data products (confidential computing) as well as integration with cloud services
Data Lakes: 8 Enterprise Data Management Requirements (SnapLogic)
2016 is the year of the data lake. As you consider adopting an enterprise data lake strategy to manage more dynamic, poly-structured data, your data integration strategy must also evolve to handle new requirements. Thinking you can simply hire more developers to write code or rely on your legacy rows-and-columns centric tools is a recipe to sink in a data swamp instead of swimming in a data lake.
In this presentation, you'll learn about eight enterprise data management requirements that must be addressed in order to get maximum value from your big data technology investments.
To learn more, visit: https://www.snaplogic.com/big-data
Yelp has operated a connector ecosystem to feed vital data to domain-specific teams and data stores. We share some of our learnings and experiences from operating such a system, and touch on the next phase of its evolution.
Data Mesh in Practice: How Europe’s Leading Online Platform for Fashion Goes ... (Databricks)
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
The Role of Data Virtualization in an API Economy (Denodo)
You can watch the webinar on demand here: https://buff.ly/2RQltuF
Digital transformation, even though a cliché, is definitely on top of every CEO's strategic initiative list. At the heart of any digital transformation, no matter the industry or the size of the company, there is an API strategy. Application programming interfaces (APIs) are the connection points between one application and another, and as such, they enable applications to build on each other, extend each other, and work with each other. Taken together, APIs represent a thriving ecosystem of developers that is showing no sign of slowing down.
Attend this webinar to learn:
• How data virtualization greatly enhances the capabilities of an API platform
• How data virtualization works as a service container, as a source for microservices and as an API gateway
• How data virtualization can create managed data services ecosystems in a thriving API economy
Denodo as the Core Pillar of your API Strategy (Denodo)
Watch full webinar here: https://buff.ly/2KTz2IB
Most people associate data virtualization with BI and analytics. However, one of the core ideas behind data virtualization is the decoupling of the consumption method from the data model. Why should the need for data requests in JSON over HTTP require extra development? Denodo provides immediate access to its datasets via REST, OData 4, GeoJSON and other protocols, with no coding involved. Easy to scale, cloud friendly and ready to integrate with API management tools, Denodo can be the perfect tool to fulfill your API strategy!
Attend this session to learn:
- What’s the role of Denodo in an API strategy
- Integration between Denodo and other elements of the API stack, like API management tools
- How easy it is to access Denodo as a RESTful endpoint
- Advanced options of Denodo web services: OAuth, OpenAPI, geographical capabilities, etc.
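As a rough sketch of the no-code, RESTful access pattern described above, a consumer could reach a published view with nothing more than an HTTP request carrying OData-style query options. The host, database name, and view name below are hypothetical placeholders, not actual Denodo endpoints:

```python
from urllib.parse import urlencode

# Hypothetical base URL for a published view; the host, database
# ("sales_db"), and view ("customer_view") are placeholders.
BASE = "https://denodo.example.com/server/sales_db/views/customer_view"

# OData-style query options: project two columns, filter rows, and page.
params = {
    "$select": "customer_id,customer_name",
    "$filter": "country eq 'DE'",
    "$top": "25",
}

url = f"{BASE}?{urlencode(params)}"
print(url)
# A real client would now issue this request with any HTTP library,
# supplying HTTP Basic or OAuth credentials, and parse the JSON response.
```

Because the consumption method is decoupled from the data model, the same view could equally be requested as GeoJSON or through OData 4 service documents without extra server-side development.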
Data Services and the Modern Data Ecosystem (ASEAN) (Denodo)
Watch full webinar here: https://bit.ly/2YdstdU
Digital transformation has changed the way IT delivers information services. The pace of business engagement and the rise of Digital IT (formerly known as “Shadow IT”) have also increased demands on IT, especially in the area of data management.
Data services exploit widely adopted interoperability standards, providing a strong framework for information exchange. Combined with data virtualization, they have also enabled the growth of robust systems of engagement that can exploit information that was previously locked away in internal silos.
We will discuss how a business can easily support and manage a data services platform, providing a more flexible approach to information sharing that supports an ever more diverse community of consumers.
Watch this on-demand webinar as we cover:
- Why Data Services are a critical part of a modern data ecosystem
- How IT teams can manage Data Services and the increasing demand by businesses
- How Digital IT can benefit from Data Services and how this can support the need for rapid prototyping allowing businesses to experiment with data and fail fast where necessary
- How a good Data Virtualization platform can encourage a culture of Data amongst business consumers (internally and externally)
Cloud Modernization and Data as a Service Option (Denodo)
Watch: https://bit.ly/2E99UNO
The current data landscape is fragmented, not just in location but also in terms of shape and processing paradigms. Cloud has become a key component of modern architecture design. Data lakes, IoT, NoSQL, SaaS, etc. coexist with relational databases to fuel the needs of modern analytics, ML and AI. Exploring and understanding the data available within your organization is a time-consuming task. And all of this without even knowing if that data will be useful at all.
Attend this session to learn:
- How dynamic data challenges and the speed of change require a new approach to data architecture.
- How a logical data architecture can enable organizations to transition data to the cloud faster, with zero downtime.
- How data as a service and other API management capabilities are a must in a hybrid cloud environment.
How to Manage APIs in your Enterprise for Maximum Reusability and Governance (HARMAN Services)
Trying to form an API/service strategy to keep pace with the IoT revolution? Learn about the issues and challenges your enterprise might face while implementing it, and how you can address them.
This webinar also explains how WSO2 API Manager and WSO2 Governance Registry have helped enterprises overcome the following challenges:
1. How the number of services and their users affects service discoverability, cataloging, and reusability
2. Mistrust between producers and consumers
3. Reliability, stability, and availability of services
4. How externally built common and reusable services meet requirements (anti-patterns: NIH)
Watch here: https://bit.ly/3i2iJbu
You will often hear that "data is the new gold". In this context, data management is one of the areas that has received the most attention from the software community in recent years. From artificial intelligence and machine learning to new ways to store and process data, the landscape for data management is in constant evolution. From the privileged perspective of an enterprise middleware platform, we at Denodo have the advantage of seeing many of these changes happen.
Join us for an exciting session that will cover:
- The most interesting trends in data management.
- Our predictions on how those trends will change the data management world.
- How these trends are shaping the future of data virtualization and our own software.
Toyota Financial Services Digital Transformation - Think 2019 (Slobodan Sipcic)
Toyota Financial Services (TFS) and IBM partnered to develop the Data & Integration Platform (D&IP) to be the hub around which all current and future TFS data sources, services, and processes interact. To that end, IBM architected and deployed a FOAK event-based data stream processing and streaming integration platform. The main components of the architecture include Kubernetes, Apache NiFi, Apache Kafka, Schema Registry, Jenkins, S3, and MongoDB. The platform is essential for realizing TFS's strategic data stream processing and integration needs.
API and App Ecosystems - Build The Best: a deep dive (Cisco DevNet)
A session in the DevNet Zone at Cisco Live, Berlin. This presentation offers our perspective and guidance on full life-cycle management and governance of APIs: defining them with the customer in mind, building them, publishing them on a single platform, and supporting and retiring them for the business outcomes you're driving.
API enablement on mainframes: how to API-enable mainframe applications and services, and how to integrate mainframe services and applications with mobile, cloud, and external apps. This white paper covers a couple of patterns for API-enabling mainframe-based applications and services.
Watch full webinar here: https://bit.ly/3mdj9i7
You will often hear that "data is the new gold". In this context, data management is one of the areas that has received the most attention from the software community in recent years. From artificial intelligence and machine learning to new ways to store and process data, the landscape for data management is in constant evolution. From the privileged perspective of an enterprise middleware platform, we at Denodo have the advantage of seeing many of these changes happen.
In this webinar, we will discuss the technology trends that will drive the enterprise data strategies in the years to come. Don't miss it if you want to keep yourself informed about how to convert your data to strategic assets in order to complete the data-driven transformation in your company.
Watch this on-demand webinar as we cover:
- The most interesting trends in data management
- How to build a data fabric architecture
- How to manage your data integration strategy in the new hybrid world
- Our predictions on how those trends will change the data management world
- How companies can monetize their data through a data-as-a-service infrastructure
- The role of voice computing in future data analytics
OPEN'17_4_Postgres: The Centerpiece for Modernising IT Infrastructures (Kangaroot)
Postgres is the leading open source database management system, developed by a very active community for more than 15 years.
Gaby Schilders, Sales Engineer at EnterpriseDB (supplier of the EDB Postgres data platform), explains why companies take open source as the centerpiece for modernising their IT infrastructure, increasing their scalability and taking full advantage of what today's technologies offer.
Webinar presented live on April 4, 2017
The Cloud Standards Customer Council has published an API Management reference architecture. APIs allow companies to open up data and services to external third party developers, business partners, and internal departments within the company to create innovative channel applications and new business opportunities. An effective API management platform provides a layer of controlled and secure self-service access to core business assets for reuse.
In this webinar, the authors of the reference architecture will cover the architectural components and capabilities that make up a superior API Management Platform and will also cover important runtime characteristics and deployment considerations.
Read the CSCC's paper here: http://www.cloud-council.org/deliverables/cloud-customer-architecture-for-api-management.htm
App modernization projects are hard. Enterprises are looking to cloud-native platforms like Pivotal Cloud Foundry to run their applications, but they’re worried about the risks inherent to any replatforming effort.
Fortunately, several repeatable patterns of successful incremental migration have emerged.
In this webcast, Google Cloud’s Prithpal Bhogill and Pivotal’s Shaun Anderson will discuss best practices for app modernization and securely and seamlessly routing traffic between legacy stacks and Pivotal Cloud Foundry.
How do you reinvent your organization in an iterative and pragmatic way? That is the result of using our digital toolbox. It allows you to transform your business model and expand your ecosystem by setting up your digital platform. This reinvention is also supported by adapting your governance, allowing you to innovate while guaranteeing the performance of your organization. For any information / suggestion / collaboration - william.poos@nrb.be
SA 2014 - Integrating the heterogeneous enterprise (David Graham)
MuleSoft Connect content presented in South Africa in November 2014, unpacking the trends that are influencing the connected renaissance across systems, whether they are on-premise, off-site, or in the cloud.
Enterprise Monitoring and Auditing in Denodo (Denodo)
Watch full webinar here: https://buff.ly/3P3l4oK
Proper monitoring of an enterprise system is critical to understanding its capacity and growth, anticipating potential issues, and even understanding key ROI metrics. This also facilitates the implementation of policies and user access audits which are key to optimizing the resource utilization in an organization. Do you want to learn more about the new Denodo features for monitoring, auditing, and visualizing enterprise monitoring data?
Join us for this session with Vijayalakshmi Mani, Data Engineer at Denodo, to understand how the new features and components help monitor your Denodo servers and their resource utilization, and how to extract the most out of the logs that the Denodo Platform generates, including FinOps information.
Watch on-demand and Learn:
- What is a Denodo Monitor and what’s new in it?
- How to visualize the Denodo Monitor information using the Diagnostics & Monitoring Tool
- Introduction to the new Denodo Dashboard
- Demonstration on the Denodo Dashboard
Lunch and Learn ANZ: Mastering Cloud Data Cost Control: A FinOps Approach (Denodo)
Watch full webinar here: https://buff.ly/4bYOOgb
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture.
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from.
- What actions and controls the Denodo Platform offers to keep costs at bay.
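To make the "understand where costs come from" idea concrete, here is a minimal, hypothetical sketch of the kind of aggregation a FinOps view performs over query-log records. The field names and figures are invented for illustration and are not the Denodo Platform's actual monitoring schema:

```python
from collections import defaultdict

# Hypothetical query-log records: (data_source, rows_scanned, cost_usd).
# Values are invented; a real FinOps view would be fed by the
# platform's actual monitoring output.
query_log = [
    ("cloud_dw",     1_200_000, 4.80),
    ("cloud_dw",       300_000, 1.20),
    ("object_store",   900_000, 0.45),
    ("on_prem_db",     150_000, 0.00),
]

def cost_by_source(records):
    """Aggregate spend per data source to show where costs originate."""
    totals = defaultdict(float)
    for source, _rows, cost in records:
        totals[source] += cost
    return dict(totals)

print(cost_by_source(query_log))
```

Once costs are attributed per source, the obvious next actions (caching hot queries, routing workloads to cheaper sources) follow from the same numbers.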
Achieving Self-Service Analytics with a Governed Data Services Layer (Denodo)
Watch full webinar here: https://buff.ly/3wBhxYb
In an increasingly distributed and complex data landscape, it is becoming ever more difficult to govern and secure data effectively throughout the enterprise. Whether it be securing data across different repositories or monitoring access across different business units, the proliferation of data technologies and repositories, both on-premises and in the cloud, is making the task unmanageable. The challenge is only made greater by the ongoing pressure to offer self-service data access to business users.
Watch on-demand and learn:
- How to use a logical data fabric to build an enterprise-wide data access role model.
- Centralise security when data is spread across multiple systems residing both on-premises and in the cloud.
- Control and audit data access across different regions.
What You Need to Know About Generative AI and Data Management (Denodo)
Watch full webinar here: https://buff.ly/3UXy0A2
It should be no surprise that Generative AI will have a profound impact on data management in the years to come. Much like in other areas of the technology sector, the opportunities presented by GenAI will accelerate our efforts around all aspects of data management, including self-service, automation, data governance, and security. On the other hand, it is also becoming clearer that to unleash the true potential of AI assistants powered by GenAI, we need novel implementation strategies and a reimagined data architecture. This presents an exhilarating yet challenging future, demanding innovative thinking and methodologies in data management.
Join us for this webinar to learn about:
- The opportunities and challenges presented by GenAI today.
- Exploiting GenAI to democratize data management.
- How to augment GenAI applications with corporate data and knowledge.
- How to get started.
Mastering Data Compliance in a Dynamic Business Landscape (Denodo)
Watch full webinar here: https://buff.ly/48rpLQ3
Join us for an enlightening webinar, "Mastering Data Compliance in a Dynamic Business Landscape," presented by Denodo Technologies and W5 Consulting. This session is tailored for business leaders and decision-makers who are navigating the complexities of data compliance in an ever-evolving business environment.
This webinar will focus on why data compliance is crucial for your business. Discover how to turn compliance into a competitive advantage, enhancing operational efficiency and market trust. We'll also address the risks of non-compliance, including financial penalties and the loss of customer trust, and provide strategies to proactively overcome these challenges.
Key Takeaways:
- How can your business leverage data management practices to stay agile and compliant in a rapidly changing regulatory landscape?
- Keys to balancing data accessibility with security and privacy in today's data-driven environment.
- What are the common pitfalls in achieving compliance with regulations like GDPR, CCPA, and HIPAA, and how can your business avoid them?
We will go beyond the technical aspects and delve into how you can strategically position your organization in the realm of data management and compliance. Learn how to craft a data compliance strategy that aligns with your business goals, enhances operational efficiency, and builds stakeholder trust.
Denodo Partner Connect: Business Value Demo with Denodo Demo Lite (Denodo)
Watch full webinar here: https://buff.ly/3OCQvGk
In this session, Denodo Sales Engineer, Yik Chuan Tan, will guide you through the art of delivering a compelling demo of the Denodo Platform with Denodo Demo Lite. Watch to uncover the significant functionalities that set Denodo apart and learn how to effectively win over potential customers.
In this session, we will cover:
Understanding the Denodo Platform & Tailoring Your Demo to Prospect Needs: By gaining a comprehensive understanding of the Denodo Platform, its architecture, and how it addresses data management challenges, you can customize your demo to align with the specific needs and pain points of your prospects, including:
- seamless data integration with real-time access
- data security and governance
- self-service data discovery
- advanced analytics and reporting
- performance optimization, scalability, and deployment
Watch this Denodo demo session and acquire the skills and knowledge necessary to captivate your prospects. Whether you're a seasoned technical professional or new to the field, this session will equip you with the skills to deliver compelling demos that lead to successful conversions.
Expert Panel: Overcoming Challenges with Distributed Data to Maximize Busines... (Denodo)
Watch full webinar here: https://buff.ly/3wdI1il
As organizations compete in new markets and new channels, business data requirements include new data platforms and applications. Migration to the cloud typically adds more distributed data when operations set up their own data platforms. This spreads important data across on-premises and cloud-based data platforms. As a result, data silos proliferate and become difficult to access, integrate, manage, and govern. Many organizations are using cloud data platforms to consolidate data, but distributed environments are unlikely to go away.
Organizations need holistic data strategies for unifying distributed data environments to improve data access and data governance, optimize costs and performance, and take advantage of modern technologies as they arrive. This TDWI Expert Panel will focus on overcoming challenges with distributed data to maximize business value.
Key topics this panel will address include:
- Developing the right strategy for your use cases and workloads in distributed data environments, such as data fabrics, data virtualization, and data mesh
- Deciding whether to consolidate data silos or bridge them with distributed data technologies
- Enabling easier self-service access and analytics across a distributed data environment
- Maximizing the value of data catalogs and other data intelligence technologies for distributed data environments
- Monitoring and data observability for spotting problems and ensuring business satisfaction
Watch full webinar here: https://buff.ly/3UE5K5l
The ability to recognize and flag sensitive information within corporate datasets is essential for compliance with emerging privacy laws, for completing a privacy impact assessment (PIA) or data subject access request (DSAR), and also for cyber-insurance compliance. During this session, we will discuss data privacy laws, the challenges they present, and how they can be applied with modern tools.
Join us for the session driven by Mark Rowan, CEO at Data Sentinel, and Bhavita Jaiswal, SE at Denodo, who will show how a data classification engine augments Data Catalog to support data governance and compliance objectives.
Watch on-demand & Learn:
- Changing landscape of data privacy laws and compliance requirements
- How to create a data classification framework
- How Data Sentinel classifies data and how this can be integrated into Denodo
- Using the enhanced data classifications via consuming tools such as Data Catalog and Power BI
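As an illustration of what a data classification framework can look like at its simplest, the sketch below labels columns by pattern-matching sample values. The rules and labels are invented for this example and are not Data Sentinel's actual engine, which applies far richer techniques:

```python
import re

# Illustrative sensitivity rules: label -> regex over sample values.
# Simplified examples, not production-grade detectors. Rule order
# matters: the strict SSN pattern is checked before the looser
# phone pattern, which would also match SSN-shaped strings.
RULES = {
    "EMAIL": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "US_SSN": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "PHONE": re.compile(r"^\+?\d[\d\s-]{7,14}$"),
}

def classify_column(values, threshold=0.8):
    """Return the label whose pattern matches at least `threshold`
    of the non-empty sample values, or None if nothing qualifies."""
    samples = [v for v in values if v]
    if not samples:
        return None
    for label, pattern in RULES.items():
        hits = sum(1 for v in samples if pattern.match(v))
        if hits / len(samples) >= threshold:
            return label
    return None

print(classify_column(["alice@example.com", "bob@test.org"]))  # EMAIL
print(classify_column(["hello", "world"]))                     # None
```

A classification like this, attached as metadata to catalog entries, is what downstream tools such as Data Catalog or Power BI can then surface and enforce.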
An Introduction to Data Virtualization for Data Professionals (Denodo)
Watch full webinar here: https://buff.ly/3OETC08
According to the analyst firm Gartner, "by 2022, 60% of enterprises will include data virtualization as a key data delivery method in their integration architecture." Gartner also named Denodo a Leader in the 2020 Magic Quadrant for Data Integration Tools.
In this 1.5-hour session, you will learn how data virtualization is revolutionizing the way business and IT access, deliver, consume, manage, and secure data, regardless of the age of your technology or the format and location of your data. This mature technology bridges the gap between IT and business users and delivers significant cost and time savings.
**FORMAT
An online workshop of 1 hour 30 minutes.
Thanks to the recording, you can complete the exercises at your own pace.
**WHO IS THIS WORKSHOP FOR?
IT managers / architects
Data scientists / analysts
CDOs
**CONTENT
The program includes an introduction to the essence of data virtualization, use cases, real-world customer examples, and a demonstration of the capabilities of the Denodo Platform:
Integrate and deliver data quickly and easily with Denodo Platform 8.0
The Denodo query optimizer delivers data in real time, on demand, even for very large data sets
Expose data as "data services" for consumption by different users and tools
Data Catalog: discover and document data with our Data Catalog, a workspace for self-service data access
Data virtualization plays a key role in governing and securing data across your organization
**AGENDA
Introduction to data virtualization
Use cases and customer case studies
Architecture: governance and security
Performance
Demo
Next steps: how to test and adopt the platform yourself
Interactive Q&A session
Data Democratization: A Secret Sauce to Say Goodbye to Data Fragmentation (Denodo)
Watch full webinar here: https://buff.ly/41Zf31D
Despite recent and evolving technological advances, the vast amounts of data that exist in a typical enterprise are not always available to all stakeholders when they need them. In modern enterprises, there are broad sets of users, with varying levels of skill, who strive to make data-driven decisions daily but struggle to gain access to the data they need in a timely manner.
Join our webinar to learn how to:
- Unlock the Power of Your Data: Discover how data democratization can transform your organization by giving every user access to the data they need, when they need it.
- Say 'Goodbye' to Data Fragmentation: Learn practical strategies to break down data silos and foster a more collaborative and efficient data environment.
- Realize the Full Potential of Your Data: Hear success stories about industry leaders who have embraced data democratization and witnessed tangible results.
Denodo Partner Connect - Technical Webinar - Ask Me Anything (Denodo)
Watch full webinar here: https://buff.ly/48ZpEf1
In this session, we will cover a deeper dive into the Denodo Platform 8.0 Certified Architect Associate (DEN80EDUCAA) exam by answering any questions that have developed since the previous session.
Additionally, we invite partners to bring any general questions related to Denodo, the Denodo Platform, or data management.
Lunch and Learn ANZ: Key Takeaways for 2023! (Denodo)
Watch full webinar here: https://buff.ly/3SnH5QY
As 2023 comes to an end, organisations' dependency on trusted, accurate, secure, and contextual data only grows more challenging. The perpetual search for new architectures, processes, and organisational team structures to "get the business their data" and reduce operating costs continues unabated, while business confidence in the "value" being derived, or to be delivered, from these investments in data is heavily scrutinised. 2023 also saw significant new releases from vendors focusing on the data fabric.
In this session we will look at these topics and key takeaways for 2023, including:
- Data management and data integration market highlights for 2023
- Key achievements for Denodo in their journey as a leader in this market
- A few case studies from Australian organisations showing how they deliver strategic business value through Denodo's Data Fabric platform and what they have been doing differently
It’s a Wrap! 2023 – A Groundbreaking Year for AI and The Way Forward (Denodo)
Watch full webinar here: https://buff.ly/3S4Y49o
A little over a year ago, we would not have expected the disruptions caused by the rise of Generative AI. If 2023 was a groundbreaking year for AI, what will 2024 bring? More importantly, what can you do now to take advantage of these trends and ensure you are future-proof?
For example:
- Generative AI will become more powerful and user-friendly, enabling novel and realistic content creation and automation.
- Data Architectures will need to adapt to feed these powerful new models.
- Data ecosystems are moving to the cloud, but there is a growing need to maintain control of costs and optimize workloads better.
Join us for a discussion on the most significant trends in the Data & AI space, and how you can prepare to ride this wave!
What Are the Key Success Factors for Applying the GDPR to Your... (Denodo)
Watch full webinar here: https://buff.ly/3O7rd2R
To comply with the GDPR, companies need a comprehensive view of all their data and must establish security controls across the entire infrastructure. Denodo's data virtualization brings multiple data sources together, makes them accessible from a single layer, and offers monitoring capabilities to track changes.
To this end, Square IT Services developed, for one of its prestigious major French clients in the luxury sector, an ergonomic user interface that allows the client to consult its customers' personal information, check their eligibility to exercise their right to be forgotten, and deactivate their various notification channels. It also includes an audit feature that traces the history of operations performed, making it possible to retrieve, among other things, the date on which a person was anonymized.
All of the information surfaced in the application is retrieved from the REST APIs exposed by Denodo.
In this webinar, we will walk through all the features of the DPO-Cockpit application around a demo, explaining at each step the central role Denodo plays in simplifying GDPR management while remaining compliant.
Key points covered:
- The client's context and the challenges of the GDPR
- Difficulties and challenges encountered
- Options considered and the choice made (Denodo)
- Approach: architecture of the proposed solution
- Tool demo: main features
Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se... (Denodo)
Watch full webinar here: https://buff.ly/48zzN2h
In an increasingly distributed and complex data landscape, it is becoming ever more difficult to govern and secure data effectively throughout the enterprise. Whether it be securing data across different repositories or monitoring access across different business units, the proliferation of data technologies and repositories, both on-premises and in the cloud, is making the task unmanageable. The challenge is only made greater by the ongoing pressure to offer self-service data access to business users.
Tune in and learn:
- How to use a logical data fabric to build an enterprise-wide data access role model.
- Centralise security when data is spread across multiple systems residing both on-premises and in the cloud.
- Control and audit data access across different regions.
How to Build Your Data Marketplace with Data Virtualization? (Denodo)
Watch full webinar here: https://buff.ly/4aAi0cS
Organizations continue to collect mounds of data, spread across different locations and in different formats. The challenge is navigating the vastness and complexity of the modern data ecosystem to find the right data for your specific business purpose. Data is an important corporate asset, and it needs to be leveraged but also protected.
By adopting an alternative approach to data management based on a logical data architecture, data can be democratized while providing centralized control within a distributed data landscape. The web-based Data Catalog tool acts as a single access point for secure enterprise-wide data access and governance. This corporate data marketplace provides visibility into your data ecosystem and allows data to be shared without compromising data security policies.
Watch this webinar to understand how this approach can transform how you leverage data across the business:
- Empower the knowledge worker with data and increase productivity
- Promote data accuracy and trust to encourage re-use of important data assets
- Apply consistent security and governance policies across the enterprise data landscape
Webinar #2 - Transforming Challenges into Opportunities for Credit Unions (Denodo)
Watch full webinar here: https://buff.ly/3vhzqL5
Join our exclusive webinar series designed to empower credit unions with transformative insights into the untapped potential of data. Explore how data can be a strategic asset, enabling credit unions to overcome challenges and foster substantial growth.
This webinar will delve into how data can serve as a catalyst for addressing key challenges faced by credit unions, propelling them towards a future of enhanced efficiency and growth.
Enabling Data Catalog users with advanced usabilityDenodo
Watch full webinar here: https://buff.ly/48A4Yu1
Data catalogs are increasingly important in any modern data-driven organization. They are essential to manage and make the most of the huge amount of data that any organization uses. As this information is continuously growing in size and complexity, data catalogs are key to providing Data Discovery, Data Governance, and Data Lineage capabilities.
Join us for the session driven by David Fernandez, Senior Technical Account Manager at Denodo, to review the latest features aimed at improving the usability of the Denodo Data Catalog.
Watch on-demand & Learn:
- Enhanced search capabilities using multiple terms.
- How to create workflows to manage internal requests.
- How to leverage the AI capabilities of Data Catalog to generate SQL queries from natural language.
Watch full webinar here: https://buff.ly/3vjrn0s
The purpose of the Denodo Platform 8.0 Certified Architect Associate (DEN80EDUCAA) exam is to provide organizations that use Denodo Platform 8.0 with a means of identifying suitably qualified data architects who understand the role and position of the Denodo Platform within their broader information architecture.
This exam covers the following technical topics and subject areas:
- Denodo Platform functionality, including:
  - Governance and metadata management
  - Security
  - Performance optimization
  - Caching
- Defining Denodo Platform use scenarios
A Denodo Sales Engineer will walk you through the exam topics, along with some sample questions, to help you ace the exam.
Join us now to start your journey toward becoming a Certified Denodo Architect Associate!
GenAI and the Future of Data Management: Myths and RealitiesDenodo
Watch full webinar here: https://buff.ly/3NLMSNM
Generative AI and Large Language Models (LLMs), led by OpenAI's GPT, have brought about the biggest revolution in computing in recent years. But how do they really affect data management? Will LLMs replace the data management professional? How much is myth, and how much is reality?
In this session we review:
- What generative AI is and why it matters for data management
- The present and future of genAI applications in the data world
- How to prepare your organization for genAI adoption
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components and processes them in topological order, one level at a time. This enables the calculation of ranks in a distributed fashion without per-iteration communication, unlike the standard method, where all vertices are processed in each iteration. It does, however, come with a precondition: the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by a large submission of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
3. Agenda
1. Digital Transformation and the API Economy
2. Current Trends with APIs
3. Data Virtualization in the API Ecosystem
4. Demo
5. Q&A
6. Next Steps
5. 5
Digital Transformation
• Digital transformation is a strategic initiative for most
organizations
• The concept reflects technology’s role in strategic decision-
making, with its ability to automate and simplify business
processes, improve customer relationships, enhance
productivity, and reduce costs
• Driven from CEO’s office: Highest level of visibility & fully
funded
‒ Gartner – 28% of CIO budget in 2018
‒ IDC – 2/3 of CEOs in global 2000 have digital transformation in the center
of their corporate strategy
• Seen as do-or-die initiative
‒ “If you don’t, someone else will”
6. 6
APIs – Building Blocks of Digital Transformation
• API stands for Application Programming Interface
• APIs are the foundations of digital transformation
‒ Enable the integration of diverse IT systems, building
more collaborative and self-service IT environments
‒ Exposing data and processes as APIs allows creation of
new products and business models
‒ Create revenues from existing IT assets
• These initiatives have created an API Economy
7. 7
Example from the API Economy – Uber
Google Maps
MapKit (iOS)
Twilio
SendGrid
• APIs have allowed Uber to focus
on its core business
• Connecting drivers with riders
• Critical services are provided by
technology partners via APIs
• Braintree for payments, Google
Maps for driver tracking, etc.
• Uber has also exposed APIs to
partners
• Uber has become a platform
8. 8
The Value of Platforms
Driven by APIs
• Five most valuable* companies in 2018 were platform providers:
Apple $890B
Google $768B
Microsoft $680B
Amazon $592B
Facebook $545B
* Valuation based on market capitalization
Compare to the 2008 top 5:
Exxon $492B
General Electric $358B
Microsoft $313B
AT&T $238B
Procter & Gamble $236B
Source: Bloomberg, Google
9. 9
• Google has exposed Maps functionality
through APIs
• Many companies use these APIs to show
locations
• e.g. Uber and FedEx, for tracking
• e.g. retailers – such as Starbucks – for store
locations
• e.g. companies – such as Denodo – for
office locations
API Example – Google Maps
11. 11
API Protocols
• The 90’s: CORBA
‒ Common Object Request Broker Architecture
‒ Available since 1991
‒ Programming specification for distributed applications in heterogeneous environments
‒ Relatively abstract and complex model; difficult to implement
• The 2000’s: SOAP
‒ Simple Object Access Protocol
‒ Available since 1999
‒ Industry standard of the World Wide Web Consortium (W3C)
‒ Application protocol: SOAP over HTTP
‒ Payload: XML; service interfaces described in the Web Services Description Language (WSDL)
‒ Complex and heavyweight/verbose protocol, slow parsing
• The 2010’s: REST
‒ Became the de facto standard for web services
12. 12
RESTful API
The de facto standard for web services
• Not a detailed specification, but a programming paradigm
‒ 6 architectural constraints: client/server, stateless, uniform interface, cacheable, layered system, code on demand (optional)
• HTTP as the application protocol
‒ The HTTP methods (GET, POST, PUT, ...) indicate which operation a service should perform
‒ Light-weight payloads in HTML, JSON or XML
‒ Services are addressed via URL
‒ Examples from Salesforce
▪ https://salesforce.../Query/?q=SELECT+Name+From+Account+WHERE+ShippingCity='San+Francisco'
▪ https://salesforce.../Account/001D000000INjVe?fields=AccountNumber,BillingPostalCode
• Many other protocols have been created to facilitate this approach
‒ Consumption Protocol: OData, GData
‒ Authentication: OAuth, SAML
‒ Interface specifications: WADL, OpenAPI (formerly Swagger)
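The RESTful addressing scheme described above can be sketched in a few lines of Python: the resource is named in the URL path, and query options travel as URL parameters. The base URL below is a hypothetical stand-in for a real Salesforce endpoint, not an actual address.

```python
from urllib.parse import urlencode

def build_query_url(base, resource, **params):
    """Compose a RESTful query URL: the resource is addressed by the path,
    and query options are passed as URL-encoded parameters."""
    qs = urlencode(params)  # spaces become '+', quotes and '=' are escaped
    return f"{base}/{resource}?{qs}" if qs else f"{base}/{resource}"

# Mirrors the Salesforce-style example above: a query passed as the `q` parameter.
url = build_query_url(
    "https://example.my.salesforce.com/services/data",
    "query",
    q="SELECT Name FROM Account WHERE ShippingCity = 'San Francisco'",
)
```

Because the service is addressed entirely through the URL and the HTTP method, any HTTP client can consume it without protocol-specific tooling.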
13. 13
Microservices
Evolving software-driven, cloud-native service delivery infrastructures
• Microservices philosophy
‒ Applications should be built from small, modular, lightweight, and
independently deployable components (called microservices)
• Microservices are reusable and easily scalable
• Microservices are independently replaceable and
upgradeable
• Microservices can use different languages and technologies
• They are typically exposed as RESTful Web Services
14. 14
API Management and Microservices
• Gateway: network routing, throttling and security
• Publishing: deploy APIs, versioning and coordinate
the overall API lifecycle
• Portal: community site for API users, information
and functionality including documentation
• Reporting and analytics: functionality to monitor
API usage and load
• Monetization: metering and chargeback based on
usage
Gartner: “Full life cycle application programming interface (API) management is about the planning, design, implementation, testing, publication, operation, consumption, maintenance, versioning and retirement of APIs.”
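Gateway-side throttling, one of the capabilities listed above, is commonly implemented as a token bucket. The sketch below is a generic illustration of the idea, not any particular API management product's implementation.

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, as an API gateway might apply
    per client or per API key: tokens refill at a steady rate, and each
    request spends one token or is rejected."""

    def __init__(self, rate_per_sec, capacity):
        self.rate = rate_per_sec      # refill rate (tokens per second)
        self.capacity = capacity      # burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        """Return True if the request is within the rate limit."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A gateway would typically keep one bucket per API key, returning HTTP 429 when `allow()` is False.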
Example: Azure API Management
16. 16
Data Virtualization in the API Ecosystem
• Data virtualization platforms like Denodo
can play a significant role in an API
ecosystem
• Let’s review three common architectures:
1. Data Virtualization as a Data Service provider
2. Data Virtualization as an integration layer for
Microservices
3. Data Virtualization as a Data Services Layer for
Microservices
17. 17
Denodo as a Data Service Provider
API GATEWAY
DATA CONSUMERS
MICROSERVICES
DISPARATE DATA SOURCES
18. 18
Denodo as a Data Service Provider
• CONNECT: connect disparate data from any source (enterprise, Big Data, cloud, Excel) or
location
• COMBINE: define data transformations and combinations that meet your business needs
• PUBLISH: one-click REST web services deployment, establishing the REST practices
‒ Protocol: Denodo’s REST format, OData 2.0 and OData 4.0
‒ Payload: XML, JSON, RSS and HTML
‒ Authentication: Basic HTTP, SPNEGO (Kerberos), OAuth, SAML
‒ Documentation: OpenAPI (formerly Swagger)
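Since OData is one of the publication protocols listed above, here is a small sketch of how an OData-style query URL is composed, with entities addressed by path and $-prefixed system query options in the query string. The server address and entity name are hypothetical placeholders.

```python
from urllib.parse import urlencode, quote

def odata_url(base, entity, select=None, filter_expr=None, top=None):
    """Build an OData-style query URL using the $select, $filter and $top
    system query options."""
    opts = {}
    if select:
        opts["$select"] = ",".join(select)
    if filter_expr:
        opts["$filter"] = filter_expr
    if top is not None:
        opts["$top"] = str(top)
    # quote (not quote_plus) so spaces become %20, per URL-path-style encoding.
    qs = urlencode(opts, quote_via=quote)
    return f"{base}/{entity}" + (f"?{qs}" if qs else "")

url = odata_url(
    "https://denodo.example.com/odata.svc",
    "customer",
    select=["name", "city"],
    filter_expr="city eq 'Boston'",
    top=10,
)
```

The same view published this way is also consumable as plain REST/JSON, so consumers can pick whichever protocol their tooling supports.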
19. 19
Denodo as an Integration Layer for Microservices
API GATEWAY
DATA CONSUMERS
MICROSERVICES
DISPARATE DATA SOURCES
20. 20
Denodo as Integration Layer for Microservices
• Keep the Microservice lightweight
‒ Leverage Denodo’s data management services
▪ caching, security, auditing, data cleansing, resource
management, …
• Combine Data
‒ Transform and combine data from different
sources
• Better Usability
‒ Single point of entry
‒ Same protocol
‒ Global Data Catalog
21. 21
Fundamental Data APIs (Read-only)
Business Logic APIs Business Process APIs Data Aggregation APIs
Denodo as a Data Services Layer for Microservices
Price Promo Customer Location Tax Order Item Worker Vendor
MICROSERVICES
22. 22
Business Process APIs
Business Process Microservices
Business Logic APIs
Business Logic Microservices
MICROSERVICES
Fundamental Data APIs (Read-only)
Data Aggregation APIs
Denodo as a Data Services Layer for Microservices
Price Promo Customer Location Tax Order Item Worker Vendor
Fundamental Data APIs (Read-only)
Data Aggregation APIs
23. 24
Denodo as a Data Services Layer for Microservices
• Keep the Microservice lightweight
‒ Avoid programming data access patterns into
microservice logic (e.g. SQL queries)
• Keep the Microservice database independent
‒ No direct access to databases – data accessed from
Data Services Layer
‒ Easier deployment & migration
‒ True Microservice scalability
• Leverage advanced optimizations of Denodo
Platform
‒ Avoid having to tune queries in Microservices layer
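The point above, keeping SQL out of the microservice and reaching data only through the data services layer, can be sketched as follows. The endpoint path is hypothetical, and the HTTP fetcher is injected so the sketch stays self-contained and database-independent.

```python
import json

def get_customer(fetch, base_url, customer_id):
    """Retrieve a customer entity from the data services layer over REST.
    `fetch` is any callable taking a URL and returning a JSON string, so the
    microservice carries no SQL and no database driver."""
    body = fetch(f"{base_url}/customer/{customer_id}")
    return json.loads(body)

# A stubbed fetcher standing in for a real HTTP client call:
def fake_fetch(url):
    assert url.endswith("/customer/42")
    return '{"id": 42, "name": "Acme Corp"}'

customer = get_customer(fake_fetch, "https://denodo.example.com/rest", 42)
```

If the underlying database is later migrated, only the data services layer changes; the microservice keeps calling the same REST endpoint.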
24. 25
Business Process Microservices
Business Logic Microservices
MICROSERVICES
Fundamental Data APIs (Read-only)
Data Aggregation APIs
Denodo as a Data Services Layer for Microservices – CQRS
Price Promo Customer Location Tax Order Item Worker Vendor
Fundamental Data APIs (Read-only)
Data Aggregation APIs
Write APIs
CQRS = Command Query Responsibility Segregation
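The CQRS split on the slide, write APIs on one side and read-only data APIs on the other, can be sketched as separate command and query objects over a shared store. All names here are illustrative.

```python
class OrderCommands:
    """Write side: accepts commands and mutates state (the 'Write APIs')."""

    def __init__(self, store):
        self.store = store

    def place_order(self, order_id, item, qty):
        self.store[order_id] = {"item": item, "qty": qty}

class OrderQueries:
    """Read side: answers queries and never mutates state
    (the read-only 'Fundamental Data APIs')."""

    def __init__(self, store):
        self.store = store

    def get_order(self, order_id):
        order = self.store.get(order_id)
        return dict(order) if order else None  # copy out, never expose internals

store = {}
commands = OrderCommands(store)
queries = OrderQueries(store)
commands.place_order("o1", "burrito", 2)
```

Separating the two sides lets the read path (here, the data virtualization layer) be cached and scaled independently of the write path.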
26. 27
Case Study – Leading Fast Food Restaurant
• Leading ‘fast food’ restaurant chain in North America
• Over 2,300 locations
• Built smart phone app – used by millions of customers
• Order food for pick up, find locations, review nutrition information, etc.
• Menu (main, sides, drinks) can vary by location
• Menu details and food nutrition information retrieved by app based on user location
• Data retrieved via REST APIs served by Denodo Platform
• Menu data in databases, files, Excel spreadsheets
• Read by Denodo Platform, exposed to app via REST APIs
28. 29
Case Study – GetSmarter, Leader in Online Education
• GetSmarter has implemented a microservice
architecture that works in tandem with the Denodo
platform to simplify data access throughout its
organization.
• With the Denodo Platform, GetSmarter provides data to
stakeholders from any type of system (microservice
or legacy) in real time, without affecting
business flows.
• Not only does the new architecture provide a single
view of core business entities, but it also allows
GetSmarter to migrate data from legacy systems in
the background, without users noticing the change.
More info: http://www.datavirtualizationblog.com/getsmarter-accelerated-business-decisions-single-view-across-rapidly-evolving-infrastructure/
32. 33
Use Data Virtualization to simplify and accelerate your
API initiatives…
• Connect to all your data
• Central point for data transformation, combination,
security and governance
• One-click REST Web Services deployment
API-tize Your Data