Myth Busters II: BI Tools and Data Virtualization are Interchangeable (Denodo)
Watch Here: https://bit.ly/2NcqU6F
We take on the second myth about data virtualization: the suggestion that a BI tool can substitute for data virtualization software.
You might be thinking: If I can run multi-source queries and define a logical model in my reporting tool, why would I need data virtualization software?
Reporting tools, while important and necessary, focus on the visualization of data and its presentation to the business user. Data virtualization is a governed data access layer designed to connect to and provide transparency into all enterprise data.
Yet the myth suggests that these technologies are interchangeable. So we’re going to take it on!
Watch this webinar as we compare and contrast BI tools and data virtualization to draw a final conclusion.
Enabling Data as a Service with the JBoss Enterprise Data Services Platform (prajods)
This presentation was given at JUDCon 2013, January 17–18, in Bangalore, by Prajod Vettiyattil and Gnanaguru Sattanathan. It covers the Why, What, and How of data services and data services platforms, and it also explains the features of the JBoss Enterprise Data Services Platform.
The need for data services is illustrated with three business use cases:
1. Post-purchase customer experience improvement for an auto manufacturer
2. An enterprise data access layer
3. Data services for regulatory reporting requirements such as Dodd-Frank
Enabling Digital Transformation: API Ecosystems and Data Virtualization (Denodo)
Watch the full webinar here: https://buff.ly/2KBKzLJ
Digital transformation, as cliché as it sounds, is on top of every decision maker’s strategic initiative list. And at the heart of any digital transformation, no matter the industry or the size of the company, there is an application programming interface (API) strategy. While API platforms enable companies to manage large numbers of APIs working in tandem, monitor their usage, and establish security between them, they are not optimized for data integration, so they cannot easily or quickly integrate large volumes of data between different systems. Data virtualization, however, can greatly enhance the capabilities of an API platform, increasing the benefits of an API-based architecture. With data virtualization as part of an API strategy, companies can streamline digital transformations of any size and scope.
Join us for this webinar to see these technologies in action in a demo and to get the answers to the following questions:
- How can data virtualization enhance the deployment and exposure of APIs?
- How does data virtualization work as a service container, as a source for microservices, and as an API gateway?
- How can data virtualization create managed data services ecosystems in a thriving API economy?
- How are GetSmarter and others leveraging data virtualization to facilitate API-based initiatives?
Data Virtualization, Data Federation & IaaS with JBoss Teiid (Anil Allewar)
Enterprises have always grappled with information silos that needed to be merged using multiple data warehouses (DWs) and business intelligence (BI) tools, so that this disparate data could be mined for business decisions and strategy. Traditionally, this data integration was done with ETL, consolidating multiple DBMSs into a single data storage facility.
Data virtualization enables abstraction, transformation, federation, and delivery of data taken from a variety of heterogeneous data sources as if it were a single virtual data source, without the need to physically copy the data for integration. It allows consuming applications or users to access data from these various sources via a request to a single access point, delivering information-as-a-service (IaaS).
In this presentation, we will explore what data virtualization is and how it differs from traditional data integration architecture. We will also validate the data virtualization and federation concepts by working through an example (see the videos in the GitHub repo) that federates data across two heterogeneous data sources, MySQL and MongoDB, using the JBoss Teiid data virtualization platform.
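The federation idea described above can be sketched in a few lines. The snippet below is a conceptual illustration only (with hypothetical data): it joins rows from a relational-style source with documents from a NoSQL-style source at query time, without persisting a combined copy. It is not the Teiid API itself, which would expose the federated view through SQL over JDBC/ODBC.

```python
# Rows as they might come back from a MySQL "customers" table (hypothetical).
mysql_customers = [
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex"},
]

# Documents as they might come back from a MongoDB "orders" collection (hypothetical).
mongo_orders = [
    {"customer_id": 1, "total": 250.0},
    {"customer_id": 1, "total": 99.5},
    {"customer_id": 2, "total": 10.0},
]

def federated_orders_view(customers, orders):
    """Return a joined, virtual 'view' over both sources without copying them."""
    by_id = {c["customer_id"]: c["name"] for c in customers}
    return [
        {"customer": by_id[o["customer_id"]], "total": o["total"]}
        for o in orders
        if o["customer_id"] in by_id
    ]

view = federated_orders_view(mysql_customers, mongo_orders)
print(view[0])  # → {'customer': 'Acme Corp', 'total': 250.0}
```

A real virtualization engine does this join lazily and pushes predicates down to each source; the point here is only that the combined result exists as a view, not as stored data.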
Informatica Solution for SWIFT Integration (Kim Loughead)
Overview of Informatica's solution for financial services organizations that need to exchange payment messages, such as SWIFT, NACHA, SEPA, and FIX, with other financial institutions.
Denodo as the Core Pillar of Your API Strategy (Denodo)
Watch full webinar here: https://buff.ly/2KTz2IB
Most people associate data virtualization with BI and analytics. However, one of the core ideas behind data virtualization is decoupling the consumption method from the data model. Why should a request for data as JSON over HTTP require extra development? Denodo provides immediate access to its datasets via REST, OData 4, GeoJSON, and other protocols, with no coding involved. Easy to scale, cloud-friendly, and ready to integrate with API management tools, Denodo can be the perfect tool to fulfill your API strategy!
Attend this session to learn:
- The role of Denodo in an API strategy
- How Denodo integrates with other elements of the API stack, such as API management tools
- How easy it is to access Denodo as a RESTful endpoint
- Advanced options of Denodo web services: OAuth, OpenAPI, geographical capabilities, etc.
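To make the "RESTful endpoint" idea concrete, here is a small sketch of composing an OData-style query URL against a published view. The host, view name, and path layout are assumptions modeled on common REST/OData conventions, not a verified Denodo endpoint; the `$select`/`$filter`/`$top` parameters follow the OData query-option convention.

```python
from urllib.parse import urlencode

def build_view_url(base, view, select=None, flt=None, top=None):
    """Compose an OData-style query URL for a published view (illustrative)."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # columns to project
    if flt:
        params["$filter"] = flt                # row-level predicate
    if top is not None:
        params["$top"] = str(top)              # limit result size
    query = urlencode(params)                  # percent-encodes keys and values
    return f"{base}/{view}" + (f"?{query}" if query else "")

url = build_view_url(
    "https://denodo.example.com/server/odata",  # hypothetical host and path
    "customer_view",                            # hypothetical published view
    select=["name", "region"],
    flt="region eq 'EMEA'",
    top=10,
)
print(url)
```

The point of the no-coding claim in the text is that the server side of such an endpoint is generated from the view definition; consumers only ever construct URLs like this one.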
Do you lose precious time due to data quality problems?
Do you need to integrate data from multiple sources and provide an integrated view of your customer or product attributes to other systems?
SQL Server 2016 Data Quality and Master Data Services can help you.
Many companies today move mountains of data using ETL (extract, transform, load) technology. But data volumes are growing too large to move, customers are now expecting real-time data, and ETL costs now account for 10-15% of computing capacity. In this slide presentation, you can see how data virtualization enables data structures that were designed independently to be leveraged together, in real time, and without data movement, reducing complexity, lowering IT costs, and minimizing risk.
Data Integration through Data Virtualization (SQL Server Konferenz 2019) (Cathrine Wilhelmsen)
Data Integration through Data Virtualization - PolyBase and new SQL Server 2019 Features (Presented at SQL Server Konferenz 2019 on February 21st, 2019)
Webinar: How MongoDB is Used to Manage Reference Data - May 2014 (MongoDB)
Managing and distributing reference data globally has always been a challenge for financial institutions. Managing and maintaining database schemas while integrating and replicating that data across geographies is costly and time-consuming. MongoDB's native replication capabilities and partitioned architecture make it simple to distribute and synchronize data efficiently across the globe. MongoDB's dynamic schema dramatically reduces database maintenance for schema migrations: data structure changes can be applied with no downtime and with no impact to existing applications. For example, by migrating its reference data management application to MongoDB, a Tier 1 bank dramatically reduced the license and hardware costs associated with the proprietary relational database it previously ran.
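The "dynamic schema" claim above can be illustrated briefly. In a document store, documents in one collection need not share identical fields, so a structure change ships with new writes instead of a schema migration. Plain dicts stand in for a live MongoDB collection here to keep the sketch self-contained; with pymongo, the writes would look the same via `collection.insert_one(doc)`. The field names are hypothetical.

```python
reference_data = []  # stands in for a MongoDB collection

# Original document shape for a security reference record.
reference_data.append({"isin": "US0378331005", "name": "Apple Inc."})

# Later documents simply add a field; no ALTER TABLE, no downtime,
# and existing readers are unaffected.
reference_data.append(
    {"isin": "US5949181045", "name": "Microsoft Corp.", "sector": "Technology"}
)

# Readers tolerate both shapes by treating the new field as optional.
sectors = [doc.get("sector", "UNKNOWN") for doc in reference_data]
print(sectors)  # → ['UNKNOWN', 'Technology']
```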
“A broad category of applications and technologies for gathering, storing, analyzing, sharing and providing access to data to help enterprise users make better business decisions” -Gartner
The business world changes rapidly, and people's needs change day by day. As technology evolves, people adopt new technologies and devices, and since the introduction of mobile phones they increasingly carry out their day-to-day activities on mobile devices. To serve this data, vendors developed mobile applications that needed a back-end database server; but because mobile devices have limited memory, power, processing capacity, and connection types, a conventional database cannot be used, so vendors introduced mobile database tools.
Microsoft Master Data Services (MDS) Overview (Eugene Zozulya)
Master data management (MDM) is a technology discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets.
Master data management tools can be used to support master data management by removing duplicates, standardizing data (mass maintaining), and incorporating rules to eliminate incorrect data from entering the system in order to create an authoritative source of master data.
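A minimal sketch of the two chores just described, standardizing records and removing duplicates to build a single authoritative list, might look like this. The field names and rules are hypothetical, not taken from any specific MDM product.

```python
def standardize(record):
    """Apply simple standardization rules to a raw customer record."""
    return {
        "name": record["name"].strip().title(),      # normalize casing/whitespace
        "country": record["country"].strip().upper() # normalize country codes
    }

def deduplicate(records):
    """Keep the first occurrence of each standardized (name, country) pair."""
    seen, master = set(), []
    for rec in map(standardize, records):
        key = (rec["name"], rec["country"])
        if key not in seen:          # rule: reject exact duplicates
            seen.add(key)
            master.append(rec)
    return master

raw = [
    {"name": "  acme corp ", "country": "us"},
    {"name": "ACME CORP", "country": "US "},  # same entity, different casing
    {"name": "Globex", "country": "de"},
]
master_list = deduplicate(raw)
print(master_list)
```

Real MDM tools add fuzzy matching, survivorship rules, and workflow on top of this, but the pipeline shape (standardize, then match, then keep one golden record) is the same.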
Microsoft Master Data Services (MDS) is the SQL Server solution for master data management. Master data management (MDM) describes the efforts made by an organization to discover and define non-transactional lists of data, with the goal of compiling maintainable master lists. An MDM project generally includes an evaluation and restructuring of internal business processes along with the implementation of MDM technology. The result of a successful MDM solution is reliable, centralized data that can be analyzed, resulting in better business decisions.
Other Master Data Services features include hierarchies, granular security, transactions, data versioning, and business rules.
Master Data Services includes the following components and tools:
- Master Data Services Configuration Manager, a tool you use to create and configure Master Data Services databases and web applications.
- Master Data Manager, a web application you use to perform administrative tasks (like creating a model or business rule), and that users access to update data.
- MDSModelDeploy.exe, a tool you use to create packages of your model objects and data so you can deploy them to other environments.
- Master Data Services web service, which developers can use to extend or develop custom solutions for Master Data Services.
Self-Service Analytics and a Modern Data Architecture with Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/32TT2Uu
Data virtualization is not just for self-service; it is also a first-class citizen in modern data platform architectures. Technology has forced many businesses to rethink their delivery models. Startups such as Amazon and Lyft emerged, leveraging the internet and mobile technology to better meet customer needs, disrupting entire categories of business and growing to dominate them.
Schedule a complimentary Data Virtualization Discovery Session with g2o.
Traditional companies are still struggling to meet rising customer expectations. During this webinar with the experts from g2o and Denodo we covered the following:
- How modern data platforms enable businesses to address these new customer expectations
- How you can drive value from your investment in a data platform now
- How you can use data virtualization to enable multi-cloud strategies
Leveraging the strategy insights of g2o and the power of the Denodo platform, companies do not need to undergo the costly removal and replacement of legacy systems to modernize their systems. g2o and Denodo can provide a strategy to create a modern data architecture within a company’s existing infrastructure.
Integration Intervention: Get Your Apps and Data Up to Speed (Kenneth Peeples)
SOA has been the de facto methodology for enterprise application and process integration, because loosely coupled components and composite applications are more agile and efficient. The perfect solution? Not quite.
The data’s always been the problem. The most efficient and agile applications and services can be dragged down by the point-to-point data connections of a traditional data integration stack. Virtualized data services can eliminate the friction and get your applications up to speed.
In this webinar we'll show you how to (replay at http://www.redhat.com/en/about/events/integration-intervention-get-your-apps-and-data-speed):
-Quickly and easily create a virtual data services layer to plug data into your SOA infrastructure for an agile and efficient solution
-Derive more business value from your services.
Enterprise data is often the backbone of timely and effective information for any enterprise application landscape, yet we struggle to integrate our vital and often diverse sources of information into our applications in a timely and effective manner.
No more.
Whether you are a Data Analyst, Business Analyst or in IT Strategy, this webinar will illustrate how easy it is to integrate disparate data spread across your organization when modeling and automating your business processes with modern BPM tools.
We will take you through an in-depth sample solution that simulates a travel agency, with examples of some of the complexities you will encounter:
• Real-time disparate data integration
• Rule-based data validation
• Rule-based fraud detection for payment processing
• Service integration
You will receive an advanced overview of the capabilities of both Red Hat JBoss Data Virtualization and Red Hat JBoss BPM Suite, and you will be left with a sample project to evaluate the solution.
SAP Analytics Cloud: Do You Already Have Live Access to All Your Data Sources? (Denodo)
Watch full webinar here: https://bit.ly/3hfEO6d
SAP Analytics Cloud ("SAC" for short) is a cloud service that provides users with comprehensive analytics functionality in a single product. As always with SAP, the SAC is technologically well integrated into the world of SAP systems.
But the data that companies want to analyze today very often resides in a wide variety of data sources: in relational databases, in data lakes, in web services, in files, in NoSQL databases, and more. This inevitably raises the question of how you can connect, transform, and combine all of that data from within the SAC, ideally live, that is, with queries against real-time data. This is where data virtualization comes into play: it gives applications (including the SAC) unified, integrated, high-performance access to both SAP and non-SAP data.
In this webcast you will learn:
- How data virtualization works (in a nutshell)
- How you can access all of your data in real time from within the SAC (the so-called "Live Data Connection")
- How data virtualization optimizes performance, even for queries over large data volumes
Denodo: Enabling a Data Mesh Architecture and Data Sharing Culture at Landsbankinn (Denodo)
Sylvain Dutilh, Information Intelligence Specialist, Landsbankinn
Traditional data processing leaves large pools of replicated, unsynchronized data sets behind. In an era when data grows exponentially and is disconnected and spread across silos, there has never been less need to replicate data. In this session, Sylvain from Landsbankinn walks us through his organization's journey of implementing a Logical Data Warehouse and a data-sharing program by leveraging data virtualization, which allowed it to build a central, secure business rules repository and an agile, modern data mesh architecture.
Data API as a Foundation for Systems of Engagement (Victor Olex)
From the creators of SlashDB (http://www.slashdb.com).
Enterprises' evolution to Systems of Engagement will only succeed if they can leverage their existing Systems of Record: databases. But database content can be difficult to discover and share.
We are introducing the idea of Resource Oriented Architecture (ROA) as a foundation for building enterprise systems of engagement. ROA is a data abstraction layer (API) that uses URLs as references to the data at its source (the database).
- Triumph over data silos
- Enable data science and self-service reporting
- Develop enterprise mobile applications
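The "URLs as references to data" idea above can be sketched as a tiny resolver: a path addresses a table, a column, and a value, and the resolver turns it into a lookup. The path scheme and names here are hypothetical illustrations modeled loosely on SlashDB-style URLs, not its documented API.

```python
def resolve(path, database):
    """Resolve a resource path like /db/Customer/CustomerId/42 against a database."""
    _, root, table, column, value = path.split("/")
    if root != "db":
        raise ValueError("paths must start with /db")
    rows = database[table]
    # Compare as strings, since URL segments are always text.
    return [row for row in rows if str(row.get(column)) == value]

# In-memory stand-in for a System of Record (hypothetical data).
database = {
    "Customer": [
        {"CustomerId": 42, "Name": "Acme Corp"},
        {"CustomerId": 7, "Name": "Globex"},
    ]
}

matches = resolve("/db/Customer/CustomerId/42", database)
print(matches)  # → [{'CustomerId': 42, 'Name': 'Acme Corp'}]
```

Because every record gets a stable URL, data becomes linkable and discoverable the same way web pages are, which is the core of the ROA pitch.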
Data Con LA 2018 - A Tale of Two BI Standards: Data Warehouses and Data Lakes (Data Con LA)
A tale of two BI standards: Data warehouses and data lakes by Shant Hovsepian, Co-Founder and CTO, Arcadia Data
Data lakes as part of the logical data warehouse (LDW) have entered the trough of disillusionment. Some failures are due to lack of value from businesses focusing on the big data challenges and not the big analytics opportunity. After all, data is just data until you analyze it. While the data management aspect has been fairly well understood over the years, the success of business intelligence (BI) and analytics on data lakes lags behind. In fact, data lakes often fail because they are only accessible by highly skilled data scientists and not by business users. But BI tools have been able to access data warehouses for years, so what gives? Shant Hovsepian explains why existing BI tools are architected well for data warehouses but not data lakes, the pros and cons of each architecture, and why every organization should have two BI standards: one for data warehouses and one for data lakes.
How to Place Data at the Center of Digital Transformation in BFSI (Denodo)
Watch full webinar here: https://bit.ly/3j7E9Jo
Consumers are increasingly using digital banking tools and insurance models, and these numbers will only continue to grow. Financial and insurance organizations have to adapt to the new and always changing situation while complying with new regulations, such as IFRS17, and embracing ESG criteria.
At the heart of any digital transformation is data. Therefore, it is not a stretch to say that data management and analytics strategies differentiate many of the leaders from the laggards in the banking, financial services and insurance (BFSI) industry. BFSI organizations still relying on slow, traditional systems and data management processes will find themselves falling behind their competition. In addition, as many adopt cloud strategies, these traditional approaches fill the cloud modernization process with downtime and end user frustration. In fact, according to a McKinsey article, cloud combined with distributed data infrastructure will define how consumers and providers adopt digital insurance models for the next decade.
Hear how the BFSI industry is leveraging data virtualization to deploy data fabric or data mesh architectures for enterprise-wide digital transformation.
Join this webinar to learn:
- The latest trends in BFSI for 2023 and how data and analytics is reshaping the industry
- How a logical data architecture can help you capitalize on your data
- How Denodo customers digitally transformed themselves using the Denodo Platform
Data Ninja Webinar Series: Realizing the Promise of Data Lakes (Denodo)
Watch the full webinar: Data Ninja Webinar Series by Denodo: https://goo.gl/QDVCjV
The expanding volume and variety of data originating from sources that are both internal and external to the enterprise are challenging businesses in harnessing their big data for actionable insights. In their attempts to overcome big data challenges, organizations are exploring data lakes as consolidated repositories of massive volumes of raw, detailed data of various types and formats. But creating a physical data lake presents its own hurdles.
Attend this session to learn how to effectively manage data lakes for improved agility in data access and enhanced governance.
This is session 5 of the Data Ninja Webinar Series organized by Denodo. If you want to learn more about some of the solutions enabled by data virtualization, click here to watch the entire series: https://goo.gl/8XFd1O
Modern Data Management for Federal Modernization (Denodo)
Watch full webinar here: https://bit.ly/2QaVfE7
Faster, more agile data management is at the heart of government modernization. However, traditional data delivery systems are limited in their ability to realize a modernized, future-proof data architecture.
This webinar will address how data virtualization can modernize existing systems and enable new data strategies. Join this session to learn how government agencies can use data virtualization to:
- Enable governed, inter-agency data sharing
- Simplify data acquisition, search and tagging
- Streamline data delivery for transition to cloud, data science initiatives, and more
Denodo Partner Connect: A Review of the Top 5 Differentiated Use Cases for th...Denodo
Watch full webinar here: https://buff.ly/46pRfV7
This Denodo session explores the power of data virtualization, shedding light on its architecture, customer value, and a diverse range of use cases. Attendees will discover how the Denodo Platform enables seamless connectivity to various data sources while effortlessly combining, cleansing, and delivering data through 5 differentiated use cases.
Architecture: Delve into the core architecture of the Denodo Platform and learn how it empowers organizations to create a unified virtual data layer. Understand how data is accessed, integrated, and delivered in a real-time, agile manner.
Value for the Customer: Explore the tangible benefits that Denodo offers to its customers. From cost savings to improved decision-making, discover how the Denodo Platform helps organizations derive maximum value from their data assets.
Five Different Use Cases: Uncover five real-world use cases where Denodo's data virtualization platform has made a significant impact. From data governance to analytics, Denodo proves its versatility across a variety of domains.
- Logical Data Fabric
- Self Service Analytics
- Data Governance
- 360-Degree View of Entities
- Hybrid/Multi-Cloud Integration
Watch this illuminating session to gain insights into the transformative capabilities of the Denodo Platform.
Cloud Modernization and Data as a Service OptionDenodo
Watch here: https://bit.ly/36tEThx
The current data landscape is fragmented, not just in location but also in terms of shape and processing paradigms. Cloud has become a key component of modern architecture design. Data lakes, IoT, NoSQL, SaaS, etc. coexist with relational databases to fuel the needs of modern analytics, ML and AI. Exploring and understanding the data available within your organization is a time-consuming task. Dealing with bureaucracy, different languages and protocols, and the definition of ingestion pipelines to load that data into your data lake can be complex. And all of this without even knowing if that data will be useful at all.
Attend this session to learn:
- How dynamic data challenges and the speed of change require a new approach to data architecture – one that is real-time, agile, and doesn’t rely on physical data movement.
- How a logical data architecture can enable organizations to move data to the cloud faster, with zero downtime, and ultimately deliver faster time to insight.
- How data as a service and other API management capabilities are a must in a hybrid cloud environment.
Data - and the things we want to do with data - exist in many different forms. Getting those formats and tasks to play nicely together can sometimes be a painstaking grind. The difficulty escalates if we need to switch between specialized tools, designed to address only a small subset of what we need to accomplish.
Enter, the Composable DataFlow.
Composable DataFlows are event-driven pipelines that consist of functional modules, strung together to form full analytical workflows. For developers, DataFlows can represent independently-deployable microservices, and can be used as part of a broader Microservice Architecture.
In this session, we will use a Composable DataFlow to extract data via API, transform JSON into a tabular structure, and load that data into a database of our own creation (using Composable DataPortal). We will also explore the DataFlow's Module Library to see what other options we have to help make our data... flow.
Meetup: https://www.meetup.com/boston-data-engineering/events/289525162/
Myth Busters IV: I Access My Data Through APIs–Data Virtualization Can't Do ThisDenodo
Watch full webinar here: https://bit.ly/3frkTmj
When you hear data virtualization, do you think BI and analytics? If so, you’re misinformed and missing out on a whole set of possibilities and capabilities of this technology. It is probably also why you think data virtualization is of no use to you if you need to access your data through APIs.
This is why we’re back with another episode of Myth Busters!
We enter the world, or ecosystem, of APIs and API-based architectures to investigate whether data virtualization plays a role in it. Maybe we’ll even learn that it can enhance API capabilities and increase their benefits?
Here’s what we’ll be exploring:
- Is there a place for data virtualization in an API strategy?
- Can data virtualization enhance the deployment and exposure of APIs?
- Can data virtualization work as a service container or as an API gateway?
- Data virtualization and GraphQL...are they really like oil and water?
Big Data LDN 2018: A TALE OF TWO BI STANDARDS: DATA WAREHOUSES AND DATA LAKESMatt Stubbs
Date: 13th November 2018
Location: Self-Service Analytics Theatre
Time: 14:30 - 15:00
Speaker: Zaf Khan
Organisation: Arcadia Data
About: The use of data lakes continues to grow, and a recent survey by Eckerson Group shows that organizations are getting real value from their deployments. However, there’s still a lot of room for improvement when it comes to giving business users access to the wealth of potential insights in the data lake.
While the data management aspect has been fairly well understood over the years, the success of business intelligence (BI) and analytics on data lakes lags behind. In fact, organizations often struggle with data lakes because they are only accessible by highly-skilled data scientists and not by business users. But BI tools have been able to access data warehouses for years, so what gives?
In this talk, we’ll discuss:
• Why traditional BI tools are architected well for data warehouses, but not data lakes.
• Why every organization should have two BI standards: one for data warehouses and one for data lakes.
• Innovative capabilities provided by BI for data lakes
Cross Domain Solutions for SolarWinds from Sterling ComputersDLT Solutions
Ed Bender, Senior Federal SE Manager, SolarWinds, and Ben Chernicoff, Software Architect, Sterling Computers, present how Sterling’s technology partnership with SolarWinds addresses issues through specialized cross domain products and solutions that enable users to access information across multiple domains, while still ensuring fidelity of information within the appropriate classification level.
Don Maclean, Chief Cybersecurity Technologist, DLT Solutions, and Mav Turner, IT Security Business Unit, SolarWinds, share the most important things you can do to keep your networks and data safe, and what tools are available to help.
Symantec and ForeScout Delivering a Unified Cyber Security SolutionDLT Solutions
Tom Blauvelt from Symantec and Sean Telles and Chris Dullea from ForeScout share how both companies together can deliver a unified cyber security solution.
Deploying and Managing Red Hat Enterprise Linux in Amazon Web ServicesDLT Solutions
The Federal Cloud First policy mandates that agencies take full advantage of cloud computing benefits to maximize capacity utilization, improve IT flexibility and responsiveness, and minimize cost. But how can you safely and reliably begin to deploy and manage your Red Hat instances at cloud scale? With IT automation, you can more easily deploy and manage your Red Hat instances in the Amazon Web Services (AWS) public cloud.
In this webinar, we’ll demonstrate how to:
- Automate the creation of Red Hat Enterprise Linux-based AWS instances
- Apply a security baseline to the instances
- Deploy and manage an application
Regardless of where you are in the cloud adoption process, leveraging IT automation can help smooth the transition to the cloud. Join the webinar to learn how.
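The three steps above could be sketched as an Ansible playbook (an illustrative fragment only — the instance name and AMI ID are placeholders, and the webinar's actual tooling may differ; the `amazon.aws.ec2_instance` module is from the Ansible AWS collection):

```yaml
# Sketch: provision a RHEL instance in AWS with Ansible.
# Assumes AWS credentials in the environment and the amazon.aws collection installed.
- name: Create a RHEL instance in AWS
  hosts: localhost
  connection: local
  tasks:
    - name: Launch the instance
      amazon.aws.ec2_instance:
        name: rhel-demo                  # hypothetical instance name
        image_id: ami-0123456789abcdef0  # placeholder RHEL AMI ID
        instance_type: t3.micro
        region: us-east-1
        state: running
    # Subsequent plays would apply the security baseline and deploy the
    # application onto the new instance (e.g. via a dynamic inventory).
```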
T.J. Meehan, AIA, LEED AP / Vice President of Professional Services, shares how to implement BIM for owners from the Government Solutions Breakfast hosted by DLT Solutions and CADD Microsystems.
Autodesk Infrastructure Solutions for Government AgenciesDLT Solutions
Donnie Gladfelter, Technical Product and Online Manager, shares Autodesk infrastructure solutions for government agencies from the Government Solutions Breakfast hosted by DLT Solutions and CADD Microsystems.
Federated data organizations in the public sector face more challenges today than ever before. As discovered via research performed by North Highland Consulting, these are the top issues you are most likely experiencing:
• Knowing what data is available to support programs and other business functions
• Data is more difficult to access
• Without insight into the lineage of data, it is risky to use as the basis for critical decisions
• Analyzing data and extracting insights to influence outcomes is difficult at best
The solution to these challenges lies in creating a holistic enterprise data governance program and enforcing it with a full-featured enterprise data management platform. Kreig Fields, Principal, Public Sector Data and Analytics, from North Highland Consulting and Rob Karel, Vice President, Product Strategy and Product Marketing, MDM, from Informatica will walk through a pragmatic, “How To” approach, full of useful information on how you can improve your agency’s data governance initiatives.
Learn how to kick-start your data governance initiatives and how an enterprise data management platform can help you:
• Innovate and expose hidden opportunities
• Break down data access barriers and ensure data is trusted
• Provide actionable information at the speed of business
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERRORTier1 app
Although ‘java.lang.OutOfMemoryError’ appears on the surface to be one single error, there are actually 9 distinct types of OutOfMemoryError. Each type has different causes, diagnosis approaches, and solutions. This session equips you with the knowledge, tools, and techniques needed to troubleshoot and conquer OutOfMemoryError in all its forms, ensuring smoother, more efficient Java applications.
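As a concrete example, the most common of the nine types, `java.lang.OutOfMemoryError: Java heap space`, can be reproduced with a deliberate allocation loop (a minimal sketch; the 64 MB chunk size is arbitrary):

```java
import java.util.ArrayList;
import java.util.List;

public class HeapOomDemo {
    /** Allocates 64 MB chunks until the heap is exhausted, then reports the error type. */
    static String triggerHeapOom() {
        List<byte[]> hog = new ArrayList<>();
        try {
            while (true) {
                hog.add(new byte[64 * 1024 * 1024]); // keep references so GC can't reclaim them
            }
        } catch (OutOfMemoryError e) {
            hog.clear(); // release the memory so the program can continue normally
            return e.getClass().getName();
        }
    }

    public static void main(String[] args) {
        System.out.println("Caught: " + triggerHeapOom()); // prints "Caught: java.lang.OutOfMemoryError"
    }
}
```

The other eight types (GC overhead limit exceeded, Metaspace, unable to create new native thread, and so on) each need a different trigger and a different fix, which is the point of the session.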
Check out the webinar slides to learn more about how XfilesPro transforms Salesforce document management by leveraging its world-class applications. For more details, please connect with sales@xfilespro.com
If you want to watch the on-demand webinar, please click here: https://www.xfilespro.com/webinars/salesforce-document-management-2-0-smarter-faster-better/
Cyaniclab : Software Development Agency Portfolio.pdfCyanic lab
CyanicLab, an offshore custom software development company with teams in Sweden, India, and Finland, is your go-to partner for startup development and innovative web design solutions. Our expert team specializes in crafting cutting-edge software tailored to meet the unique needs of startups and established enterprises alike. From conceptualization to execution, we offer comprehensive services including web and mobile app development, UI/UX design, and ongoing software maintenance. Ready to elevate your business? Contact CyanicLab today and let us propel your vision to success with our top-notch IT solutions.
First Steps with Globus Compute Multi-User EndpointsGlobus
In this presentation we will share our experiences around getting started with the Globus Compute multi-user endpoint. Working with the Pharmacology group at the University of Auckland, we have previously written an application using Globus Compute that can offload computationally expensive steps in the researcher's workflows, which they wish to manage from their familiar Windows environments, onto the NeSI (New Zealand eScience Infrastructure) cluster. Some of the challenges we have encountered were that each researcher had to set up and manage their own single-user globus compute endpoint and that the workloads had varying resource requirements (CPUs, memory and wall time) between different runs. We hope that the multi-user endpoint will help to address these challenges and share an update on our progress here.
Unleash Unlimited Potential with One-Time Purchase
BoxLang is more than just a language; it's a community. By choosing a Visionary License, you're not just investing in your success, you're actively contributing to the ongoing development and support of BoxLang.
Quarkus Hidden and Forbidden ExtensionsMax Andersen
Quarkus has a vast extension ecosystem and is known for its supersonic, subatomic feature set. Some of these features are not as well known, and some extensions are less talked about, but that does not make them less interesting - quite the opposite.
Come join this talk to see some tips and tricks for using Quarkus and some of the lesser known features, extensions and development techniques.
Large Language Models and the End of ProgrammingMatt Welsh
Talk by Matt Welsh at Craft Conference 2024 on the impact that Large Language Models will have on the future of software development. In this talk, I discuss the ways in which LLMs will impact the software industry, from replacing human software developers with AI, to replacing conventional software with models that perform reasoning, computation, and problem-solving.
Software Engineering, Software Consulting, Tech Lead.
Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Security,
Spring Transaction, Spring MVC,
Log4j, REST/SOAP WEB-SERVICES.
Prosigns: Transforming Business with Tailored Technology SolutionsProsigns
Unlocking Business Potential: Tailored Technology Solutions by Prosigns
Discover how Prosigns, a leading technology solutions provider, partners with businesses to drive innovation and success. Our presentation showcases our comprehensive range of services, including custom software development, web and mobile app development, AI & ML solutions, blockchain integration, DevOps services, and Microsoft Dynamics 365 support.
Custom Software Development: Prosigns specializes in creating bespoke software solutions that cater to your unique business needs. Our team of experts works closely with you to understand your requirements and deliver tailor-made software that enhances efficiency and drives growth.
Web and Mobile App Development: From responsive websites to intuitive mobile applications, Prosigns develops cutting-edge solutions that engage users and deliver seamless experiences across devices.
AI & ML Solutions: Harnessing the power of Artificial Intelligence and Machine Learning, Prosigns provides smart solutions that automate processes, provide valuable insights, and drive informed decision-making.
Blockchain Integration: Prosigns offers comprehensive blockchain solutions, including development, integration, and consulting services, enabling businesses to leverage blockchain technology for enhanced security, transparency, and efficiency.
DevOps Services: Prosigns' DevOps services streamline development and operations processes, ensuring faster and more reliable software delivery through automation and continuous integration.
Microsoft Dynamics 365 Support: Prosigns provides comprehensive support and maintenance services for Microsoft Dynamics 365, ensuring your system is always up-to-date, secure, and running smoothly.
Learn how our collaborative approach and dedication to excellence help businesses achieve their goals and stay ahead in today's digital landscape. From concept to deployment, Prosigns is your trusted partner for transforming ideas into reality and unlocking the full potential of your business.
Join us on a journey of innovation and growth. Let's partner for success with Prosigns.
Understanding Globus Data Transfers with NetSageGlobus
NetSage is an open privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage traditionally has used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks world wide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for Flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several different example use cases that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to the Globus users?
Code reviews are vital for ensuring good code quality. They serve as one of our last lines of defense against bugs and subpar code reaching production.
Yet, they often turn into annoying tasks riddled with frustration, hostility, unclear feedback and lack of standards. How can we improve this crucial process?
In this session we will cover:
- The Art of Effective Code Reviews
- Streamlining the Review Process
- Elevating Reviews with Automated Tools
By the end of this presentation, you'll know how to organize and improve your code review process.
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
Providing Globus Services to Users of JASMIN for Environmental Data AnalysisGlobus
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
How Recreation Management Software Can Streamline Your Operations.pptxwottaspaceseo
Recreation management software streamlines operations by automating key tasks such as scheduling, registration, and payment processing, reducing manual workload and errors. It provides centralized management of facilities, classes, and events, ensuring efficient resource allocation and facility usage. The software offers user-friendly online portals for easy access to bookings and program information, enhancing customer experience. Real-time reporting and data analytics deliver insights into attendance and preferences, aiding in strategic decision-making. Additionally, effective communication tools keep participants and staff informed with timely updates. Overall, recreation management software enhances efficiency, improves service delivery, and boosts customer satisfaction.
top nidhi software solution freedownloadvrstrong314
This presentation emphasizes the importance of data security and legal compliance for Nidhi companies in India. It highlights how online Nidhi software solutions, like Vector Nidhi Software, offer advanced features tailored to these needs. Key aspects include encryption, access controls, and audit trails to ensure data security. The software complies with regulatory guidelines from the MCA and RBI and adheres to Nidhi Rules, 2014. With customizable, user-friendly interfaces and real-time features, these Nidhi software solutions enhance efficiency, support growth, and provide exceptional member services. The presentation concludes with contact information for further inquiries.
How Does XfilesPro Ensure Security While Sharing Documents in Salesforce?XfilesPro
Worried about document security while sharing them in Salesforce? Fret no more! Here are the top-notch security standards XfilesPro upholds to ensure strong security for your Salesforce documents while sharing with internal or external people.
To learn more, read the blog: https://www.xfilespro.com/how-does-xfilespro-make-document-sharing-secure-and-seamless-in-salesforce/
Multiple Your Crypto Portfolio with the Innovative Features of Advanced Crypt...Hivelance Technology
Cryptocurrency trading bots are computer programs designed to automate buying, selling, and managing cryptocurrency transactions. These bots utilize advanced algorithms and machine learning techniques to analyze market data, identify trading opportunities, and execute trades on behalf of their users. By automating the decision-making process, crypto trading bots can react to market changes faster than human traders.
Hivelance, a leading provider of cryptocurrency trading bot development services, stands out as the premier choice for crypto traders and developers. Hivelance boasts a team of seasoned cryptocurrency experts and software engineers who deeply understand the crypto market and the latest trends in automated trading. Hivelance leverages the latest technologies and tools in the industry, including advanced AI and machine learning algorithms, to create highly efficient and adaptable crypto trading bots.
A Comprehensive Look at Generative AI in Retail App Testing.pdfkalichargn70th171
Traditional software testing methods are being challenged in retail, where customer expectations and technological advancements continually shape the landscape. Enter generative AI—a transformative subset of artificial intelligence technologies poised to revolutionize software testing.
In software engineering, the right architecture is essential for robust, scalable platforms. Wix has undergone a pivotal shift from event sourcing to a CRUD-based model for its microservices. This talk will chart the course of this pivotal journey.
Event sourcing, which records state changes as immutable events, provided robust auditing and "time travel" debugging for Wix Stores' microservices. Despite its benefits, the complexity it introduced in state management slowed development. Wix responded by adopting a simpler, unified CRUD model. This talk will explore the challenges of event sourcing and the advantages of Wix's new "CRUD on steroids" approach, which streamlines API integration and domain event management while preserving data integrity and system resilience.
Participants will gain valuable insights into Wix's strategies for ensuring atomicity in database updates and event production, as well as caching, materialization, and performance optimization techniques within a distributed system.
Join us to discover how Wix has mastered the art of balancing simplicity and extensibility, and learn how the re-adoption of the modest CRUD has turbocharged their development velocity, resilience, and scalability in a high-growth environment.
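The trade-off at the heart of the talk can be illustrated with a toy example (a hedged sketch, not Wix's actual code): under event sourcing, current state is derived by replaying immutable events, whereas CRUD stores the current row directly.

```java
import java.util.List;

public class EventSourcingSketch {
    // Immutable stock events, as in an event-sourced inventory service.
    sealed interface StockEvent permits Added, Removed {}
    record Added(int qty) implements StockEvent {}
    record Removed(int qty) implements StockEvent {}

    /** Event sourcing: the current quantity is a fold over the event log. */
    static int replay(List<StockEvent> log) {
        int qty = 0;
        for (StockEvent e : log) {
            qty += (e instanceof Added a) ? a.qty() : -((Removed) e).qty();
        }
        return qty;
    }

    public static void main(String[] args) {
        List<StockEvent> log = List.of(new Added(10), new Removed(3), new Added(5));
        // A CRUD model would simply store 12 in a row and update it in place;
        // event sourcing recomputes it from history (and keeps the full audit trail).
        System.out.println(replay(log)); // prints 12
    }
}
```

The replay gives auditing and "time travel" for free, but every read path must either replay or maintain a projection, which is the state-management complexity the talk describes Wix moving away from.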
3. “Bad Company”
7/19/16 DLT Solutions LLC – Proprietary & Confidential
[Diagram: example records – “Kiss”, “Whitesnake”, “Poison” – shown across a Data Warehouse and a Data Virtualization Server]
4. What does Data Virtualization software do?
[Diagram: Data virtualization software sits between siloed, complex data sources (SAP, Salesforce.com, Oracle DW, XML/CSV & Excel files) and data consumers (BI reports, applications). It connects to the sources, composes them into a virtual consolidated data source – virtualize, abstract, federate – and lets consumers consume it for easy, real-time information access.]
7. Data Challenges Getting Bigger
[Diagram: A growing set of data consumers (BI reports, operational reports, enterprise applications, cloud-native applications, mobile applications) faces a growing set of siloed data sources (Hadoop, NoSQL, cloud apps, data warehouses & databases, mainframe, XML/CSV & Excel files, enterprise apps). Consumption and creation keep expanding and integration complexity grows – how to integrate?]
8. Improve Access to Your Data
[Diagram: The consumers (BI reports, operational reports, enterprise applications, cloud-native applications, mobile applications) and sources (Hadoop, NoSQL, cloud apps, data warehouses & databases, mainframe, XML/CSV & Excel files, enterprise apps) are now mediated by a Data Virtualization Server – broad & streamlined, adaptable & secure, federated & meaningful.]
9. Simplify Access to Your Data
[Diagram: The Data Virtualization Server exposes standard interfaces (ODBC/SQL, JDBC/SQL, XML/SOAP, REST/JSON, OData, SQL) to consumers such as websites, an ESB, analytics & reporting, mobile apps, and internal portal dashboards, while reaching sources – streaming databases, production databases and applications, social media data, big data stores, unstructured data, data warehouses & data marts, external data, private data – over protocols including JMS, SQL, JDBC, OData, Hive, RSS, Excel, JSON, REST, and SOAP (e.g. a JMS message, SQL statement, or SOAP message).]
10. Turn Siloed Data into Actionable Information
[Diagram: JBoss Data Virtualization follows a connect–compose–consume model. Connect: native data connectivity to siloed, complex sources (Hadoop, NoSQL, cloud apps, data warehouses & databases, mainframe, XML/CSV & Excel files, enterprise apps). Compose: virtualize, transform, and federate them into a unified virtual database / common data model with data transformations. Consume: standards-based data provisioning (JDBC, ODBC, SOAP, REST, OData) for data consumers such as BI reports & analytics, mobile applications, applications & portals, and ESB/ETL. Supporting capabilities include design tools, dashboards, optimization, caching, security, and metadata – for easy, real-time information access.]
11. Supported Data Sources
- Enterprise RDBMS: Oracle, IBM DB2, Microsoft SQL Server, Sybase ASE, MySQL, MariaDB, PostgreSQL, Ingres
- Enterprise EDW: Teradata, Netezza, Greenplum
- Search: Apache SOLR
- Hadoop: Apache, HortonWorks, Cloudera, more coming…
- Office Productivity: Microsoft Excel, Microsoft Access, Google Spreadsheets
- Specialty Data Sources: ModeShape Repository, Mondrian, MetaMatrix, LDAP, Apache POI for Excel
- NoSQL: JBoss Data Grid, MongoDB, Cassandra, more coming…
- Enterprise & Cloud Applications: Salesforce.com, SAP
- Technology Connectors: Flat Files, XML Files, XML over HTTP, SOAP Web Services, REST Web Services, OData Services
12. Data As A Service
- Contextual view of disparate source data
- Single point of access
- Standards-based interfaces
- Shareable integration and transformation logic
- Reusable data services
But you cannot achieve this by writing more application code…
[Diagram: JBoss Data Virtualization sits between the sources (Hadoop, NoSQL, cloud apps, data warehouses & databases, mainframe, XML/CSV & Excel files, enterprise apps) and consumers (BI dashboards & reports, analytical applications, ESB/SOA integration, BPM applications, mobile applications), turning REST requests into SQL statements and SOAP messages against the sources and returning JSON and SQL results.]
16. Tooling | VirtualDB | Engine | Server
Users create data models based on metadata:
- Imported from data sources
- Supplied via DDL
- Provided by the Engine
- Specified by the user
Models are packaged in a Virtual Database (VDB): physical models representing actual data sources, plus logical models.
17.
- Build XML Document models from XML Schemas
- Map XML Document models to other data models
- Enable data access via XML
18.
Virtual Databases (VDBs) are deployment archives, similar to .WAR files. VDBs contain:
- Source metadata and models
- View metadata and models
- System metadata
- Connection information, which is bound to sources at deployment time
VDBs are deployed to the query engine.
VDB internals: source models, connector binding properties, view models, and manifest info.
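To make those contents concrete, here is an illustrative `vdb.xml` deployment descriptor in the style Teiid uses — the VDB name, model names, and JNDI binding are invented for the example:

```xml
<vdb name="Portfolio" version="1">
  <!-- Source model: metadata imported from a physical data source -->
  <model name="Accounts">
    <source name="accounts-source"
            translator-name="oracle"
            connection-jndi-name="java:/AccountsDS"/>
  </model>
  <!-- View (logical) model: defined over the source model in DDL -->
  <model name="Customers" type="VIRTUAL">
    <metadata type="DDL"><![CDATA[
      CREATE VIEW CustomerView AS
        SELECT id, name FROM Accounts.CUSTOMER;
    ]]></metadata>
  </model>
</vdb>
```

The source model carries the connection binding resolved at deployment time, while the virtual model carries only view definitions — matching the source/view metadata split listed above.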
19.
JBoss Data Virtualization can offer fine-grained security control:
- Authentication: Kerberos, LDAP, WS-UsernameToken, HTTP Basic, SAML
- Authorization: virtual data views, role-based access control
- Administration: centralized management of Virtual DB privileges
- Audit: centralized audit logging and dashboard
- Protection: row and column masking, SSL encryption (ODBC and JDBC)
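Role-based access control of the kind listed above is declared, in Teiid-style VDBs, with data roles inside the `vdb.xml`; an illustrative fragment (the role and resource names are invented):

```xml
<data-role name="ReadOnlyAnalyst" any-authenticated="true">
  <description>Read-only access to the customer view</description>
  <permission>
    <resource-name>Customers.CustomerView</resource-name>
    <allow-read>true</allow-read>
  </permission>
</data-role>
```

Because the permission targets a virtual view rather than the underlying tables, the same physical data can be exposed differently to different roles without touching the sources.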
20.
[Diagram: Data consumer apps call the Query Engine through its JDBC API; the engine hosts a VDB whose connector bindings (1) and (2) connect to an Oracle database and a SQL Server database.]
The Query Engine is the core data virtualization functionality: a federating relational query engine with a rule- and cost-based optimizer, an advanced query planner, caching, and hint processing. It hosts VDBs, binds to data sources, and performs query execution and results processing.
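Consumers reach the engine over a standard JDBC URL. A hedged sketch (the VDB name, host, and credentials are assumptions; the actual connection attempt is guarded by an environment variable so the snippet runs without a server):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TeiidJdbcSketch {
    /** Builds a Teiid-style JDBC URL: jdbc:teiid:<vdb>@mm://<host>:<port> */
    static String teiidUrl(String vdb, String host, int port) {
        return "jdbc:teiid:" + vdb + "@mm://" + host + ":" + port;
    }

    public static void main(String[] args) throws Exception {
        // 31000 is the conventional Teiid JDBC port; "Portfolio" is a hypothetical VDB.
        String url = teiidUrl("Portfolio", "localhost", 31000);
        System.out.println(url);
        if (System.getenv("TEIID_HOST") != null) { // only connect when a server is configured
            try (Connection c = DriverManager.getConnection(url, "user", "pass");
                 Statement s = c.createStatement();
                 ResultSet rs = s.executeQuery("SELECT * FROM CustomerView")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}
```

From the application's point of view this is ordinary JDBC; the federation across Oracle, SQL Server, and the rest happens inside the engine.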
21.
The Teiid query engine is hosted in JBoss EAP and uses key container-provided services:
- Transaction manager
- JAAS security framework
- Container-managed data sources
- EAP management infrastructure
- EAP deployment
The Server exposes views/services to consumers and manages connections and connection pools for data sources.
[Diagram: Inside JBoss EAP, the JDV runtime engine (buffer manager, threading, local caches, etc.) hosts VDBs. JDBC, ODBC, and admin socket transports serve JDBC/ODBC clients and Admin/AdminShell/JON tooling via the profile service; JCA translators bind the engine to container-managed data sources (xxx-ds.xml, yyy-ds.xml, zzz-ds.xml, embedded DS), with JAAS security and the transaction manager provided by the container.]
22.
CACHING & MATERIALIZATION
Multiple levels of caching to meet performance requirements and manage load on source systems:
- Materialized views: external or internal materialized views; ability to override the use of materialized views
- Result set caching: applied to results returned from user queries and virtual procedure calls; configurable time-to-live and maximum number of entries
- Code table caching: suited for integrating reference data (e.g. country codes, state codes) with transactional/operational data

QUERY PERFORMANCE OPTIMIZATION
- Access patterns: criteria requirements on pushdown queries
- Pushdown: decompose the user query into source queries – projection minimization to remove unused select items, decomposing aggregates over joins/unions, generating SQL matching Teiid system functions
- Dependent joins (can use hints): feed equi-join values from one side of the join to the other
- Partition-aware aggregation and joins
- Copy criteria: uses criteria transitivity to minimize join tuples
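An internal materialized view of the kind described above is declared, in Teiid-style DDL, via view options; a hedged sketch (the view, schema, and column names are invented for the example):

```sql
-- Illustrative Teiid-style DDL: cache this view's result set internally,
-- so repeated queries hit the cache instead of the source systems.
CREATE VIEW CustomerSummary
  OPTIONS (MATERIALIZED 'TRUE')
AS
  SELECT c.region, COUNT(*) AS customers
  FROM Accounts.CUSTOMER c
  GROUP BY c.region;
```

External materialization works the same way at the view level but stores the result in a target table in a real database, trading freshness control for scale.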
24. Bring It All Together
[Diagram: Structured, streaming, and semi-structured data are captured and processed in Hadoop and Red Hat Storage, then integrated and analyzed: JBoss Data Virtualization provides data integration, JBoss Data Grid provides an in-memory cache, and JBoss A-MQ and JBoss BRMS provide messaging and event processing – feeding BI analytics (historical, operational, predictive) and composite applications.]
28. JBoss Data Virtualization – Use Cases
Self-Service
Business
Intelligence
The virtual, reusable data model provides a business-friendly representation of data,
allowing users to interact with their data without having to know the complexities of the
underlying databases or where the data is stored, and allowing multiple BI tools to acquire
data from a centralized data layer. Gain better insights from big data by using JBoss Data
Virtualization to integrate it with existing information sources.
360°
Unified
View
Deliver a complete view of master and transactional data in real time. The virtual data layer
serves as a unified, enterprise-wide view of business information that improves users' ability
to understand and leverage enterprise data.
Agile SOA
Data
Services
A data virtualization layer delivers the missing data services layer to SOA applications. JBoss
Data Virtualization increases agility and loose coupling through virtual data stores that
require no changes to underlying sources, and through data services that encapsulate data
access logic, allowing multiple business services to acquire data from a centralized data
layer.
A data virtualization layer delivers data firewall functionality. JBoss Data Virtualization
improves data quality via centralized access control, a robust security infrastructure, and a
reduction in physical copies of data, thus reducing risk. Furthermore, the metadata
repository catalogs enterprise data locations and the relationships between the data in
various data stores, enabling transparency and visibility.
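The 360° unified view use case boils down to joining records from independent sources on a shared key at query time, without copying either source. A minimal sketch under that assumption (field names here are hypothetical):

```python
def unified_customer_view(crm_rows, billing_rows):
    """Sketch of a federated 360-degree view: combine customer records
    from two independent sources into one logical row per customer."""
    billing_by_id = {r["customer_id"]: r for r in billing_rows}
    view = []
    for c in crm_rows:
        b = billing_by_id.get(c["customer_id"], {})  # left outer join
        view.append({"customer_id": c["customer_id"],
                     "name": c["name"],
                     "balance": b.get("balance")})
    return view

# Each list stands in for a separate physical source (CRM, billing).
crm = [{"customer_id": 7, "name": "Ada"}]
billing = [{"customer_id": 7, "balance": 42.0}]
assert unified_customer_view(crm, billing) == [
    {"customer_id": 7, "name": "Ada", "balance": 42.0}]
```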
JBoss Data Virtualization
Leveraged a TPC-H-like schema, data, and queries
Used 4 different commercial enterprise RDBMSs
Each database held 1 TB of data, representing
•150 million customers, with over
•600 million order records, and
•6 billion order line items
•4 TB of data in total
Findings:
•No measurable JDV query overhead vs. direct queries
•Queries federating data from four data sources ran
61.7 percent faster vs. baseline
•Scaling the query workload by 2x resulted in <10% impact
on response time
Download Benchmark Study @ http://www.redhat.com/en/resources/jboss-data-virtualization-query-performance-benchmark-study