The innovations keep coming! Discover what’s new on the Adabas & Natural product roadmap that can help you optimize, modernize & transform your systems. Hear how customers successfully embraced digital transformation using APIs and data integration. Get tips from our services and solutions experts on how to address staffing challenges, end of maintenance, and demand for data for analytics.
Join your peers and experts from Software AG to explore:
• Adabas & Natural 2050+ innovations & roadmap
• Mainframe modernization and cross-agency data sharing at DELJIS
• Bi-directional API implementation at TRS
• Options to train new talent and address staffing gaps
• End of support considerations for Natural 8 on z/OS
• How to liberate data for modern data analytics
• Adabas & Natural for z/OS License Key Management
To learn more about Software AG Adabas & Natural, please visit www.adabasnatural.com
My Talk at GCPUG-Taiwan on 2015/5/8.
You use BigQuery with SQL, but BigQuery's internals are very different from the traditional relational database systems you may be familiar with.
One way to understand how BigQuery works is to look at it through what you pay for it. Knowing how to save money while using BigQuery is, to some extent, knowing how BigQuery works.
In this session, let’s talk about practical knowledge (saving money) and exciting technology (how BigQuery works)!
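Since the session links cost to internals: BigQuery's on-demand model bills by the bytes scanned in the columns a query references, which is why narrowing a SELECT saves money. A minimal sketch of that cost model, assuming the historical $5/TB on-demand rate and hypothetical column sizes:

```python
# Hypothetical illustration: BigQuery's on-demand model charges for the
# bytes scanned in each column a query references, not the rows returned,
# so selecting fewer columns costs less. The $5/TB rate and the column
# sizes below are assumptions for the sketch, not real figures.

PRICE_PER_TB = 5.0  # USD, assumed on-demand rate

def query_cost(column_bytes, selected_columns):
    """Estimate cost: the full size of every referenced column is scanned."""
    scanned = sum(column_bytes[c] for c in selected_columns)
    return scanned / 10**12 * PRICE_PER_TB

# A hypothetical 1 TB table split across three columns.
cols = {"user_id": 200 * 10**9, "event": 300 * 10**9, "payload": 500 * 10**9}

full_scan = query_cost(cols, cols)      # SELECT * touches every column
narrow = query_cost(cols, ["user_id"])  # one narrow column scans far less
print(f"SELECT *: ${full_scan:.2f}, one column: ${narrow:.2f}")
```

The sketch is the whole money-saving lesson in miniature: a `LIMIT` clause does not reduce the bill, but dropping `payload` from the select list does.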
In recent years, we have seen an overwhelming number of TV commercials that promise that the Cloud can help with many problems, including some family issues. What stands behind the terms “Cloud” and “Cloud Computing,” and what can we actually expect from this phenomenon? A group of students of the Computer Systems Technology department and Dr. T. Malyuta, who has been working with Cloud technologies since their early days, will provide an overview of the business and technological aspects of the Cloud.
Presented at Percona Live Amsterdam 2016, this is an in-depth look at MariaDB Server right up to MariaDB Server 10.1. Learn the differences. See what's already in MySQL. And so on.
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
• History of Data Management
• Business Drivers for implementation of data governance
• Building Data Strategy & Governance Framework
• Data Management Maturity Models
• Data Quality Management
• Metadata and Governance
• Metadata Management
• Data Governance Stakeholder Communication Strategy
Streaming Real-time Data to Azure Data Lake Storage Gen 2 - Carole Gunst
Check out this presentation to learn the basics of using Attunity Replicate to stream real-time data to Azure Data Lake Storage Gen2 for analytics projects.
Building an Effective Data Warehouse Architecture - James Serra
Why use a data warehouse? What is the best methodology to use when creating a data warehouse? Should I use a normalized or dimensional approach? What is the difference between the Kimball and Inmon methodologies? Does the new Tabular model in SQL Server 2012 change things? What is the difference between a data warehouse and a data mart? Is there hardware that is optimized for a data warehouse? What if I have a ton of data? During this session James will help you to answer these questions.
Modern Data Warehousing with the Microsoft Analytics Platform System - James Serra
The traditional data warehouse has served us well for many years, but new trends are causing it to break in four different ways: data growth, fast query expectations from users, non-relational/unstructured data, and cloud-born data. How can you prevent this from happening? Enter the modern data warehouse, which is able to handle and excel with these new trends. It handles all types of data (Hadoop), provides a way to easily interface with all these types of data (PolyBase), and can handle “big data” and provide fast queries. Is there one appliance that can support this modern data warehouse? Yes! It is the Analytics Platform System (APS) from Microsoft (formerly called Parallel Data Warehouse, or PDW), which is a Massively Parallel Processing (MPP) appliance that has been recently updated (v2 AU1). In this session I will dig into the details of the modern data warehouse and APS. I will give an overview of the APS hardware and software architecture, identify what makes APS different, and demonstrate the increased performance. In addition I will discuss how Hadoop, HDInsight, and PolyBase fit into this new modern data warehouse.
High Performance Data Streaming with Amazon Kinesis: Best Practices (ANT322-R... - Amazon Web Services
Amazon Kinesis makes it easy to collect, process, and analyze real-time, streaming data so you can get timely insights and react quickly to new information. In this session, we dive deep into best practices for Kinesis Data Streams and Kinesis Data Firehose to get the most performance out of your data streaming applications. Comcast uses Amazon Kinesis Data Streams to build a Streaming Data Platform that centralizes data exchanges. It is foundational to the way our data analysts and data scientists derive real-time insights from the data. In the second part of this talk, Comcast zooms into how to properly scale a Kinesis stream. We first list the factors to consider to avoid scaling issues with standard Kinesis stream consumption, and then we see how the new fan-out feature changes these scaling considerations.
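As background for the scaling discussion above: a standard Kinesis Data Streams shard accepts up to 1 MB/s or 1,000 records/s on ingest, so a first-pass shard count is the maximum of the two ratios. A hedged back-of-envelope sketch (the traffic figures are hypothetical, not Comcast's):

```python
import math

# Back-of-envelope shard sizing for a Kinesis data stream, based on the
# documented per-shard ingest limits: 1 MB/s of data or 1,000 records/s.
# Whichever limit binds first determines the shard count.

def shards_needed(mb_per_sec: float, records_per_sec: float) -> int:
    by_bytes = math.ceil(mb_per_sec / 1.0)          # 1 MB/s write per shard
    by_records = math.ceil(records_per_sec / 1000)  # 1,000 records/s per shard
    return max(by_bytes, by_records, 1)

# Hypothetical workload: 10 MB/s of small records -> bandwidth-bound.
print(shards_needed(10.0, 4000))
```

Note this sizes only the write side; the talk's fan-out discussion matters because the read side has its own per-shard egress limit shared by all standard consumers.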
Integrating data across systems has been a perpetual challenge. Unfortunately, current technology-focused solutions have not helped IT improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects – approximately one-third are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of various approaches. It turns out that proper analysis at this stage makes actual technology selection far more accurate. Only when these are accomplished can proper matching between problem and capabilities be achieved as the third step and true business value be delivered. This webinar will illustrate that good systems development more often depends on at least three data management disciplines in order to provide a solid foundation.
Find more Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Cassandra concepts, patterns and anti-patterns - Dave Gardner
An introduction to the fundamental concepts behind Apache Cassandra. This talk explains the engineering principles that make Cassandra such an attractive choice for building highly resilient and available systems and then goes on to explain how to use it - covering basic data modelling patterns and anti-patterns.
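One anti-pattern such introductions typically cover is the unbounded partition. A common remedy is to add a time bucket to the partition key so no single partition grows without limit; a sketch of that pattern (the key layout is illustrative, not taken from the talk):

```python
from datetime import datetime, timezone

# Illustrative Cassandra modelling pattern: bucket time-series rows by day
# so a single partition cannot grow without bound (the unbounded-partition
# anti-pattern). The composite key (sensor_id, day) is a hypothetical
# layout chosen for the sketch.

def partition_key(sensor_id: str, ts: datetime) -> tuple:
    # Composite partition key: the day bucket keeps each partition bounded
    # to one day of readings per sensor.
    return (sensor_id, ts.strftime("%Y-%m-%d"))

k1 = partition_key("sensor-1", datetime(2016, 3, 1, 9, 30, tzinfo=timezone.utc))
k2 = partition_key("sensor-1", datetime(2016, 3, 2, 0, 5, tzinfo=timezone.utc))
print(k1, k2)  # same sensor, two different daily partitions
```

The trade-off the talk's "patterns" side addresses: queries spanning many days must now fan out across buckets, so the bucket size should match the dominant read pattern.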
Metadata management is critical for organizations looking to understand the context, definition and lineage of key data assets. Data models play a key role in metadata management, as many of the key structural and business definitions are stored within the models themselves. Can data models replace traditional metadata solutions? Or should they integrate with larger metadata management tools & initiatives?
Join this webinar to discuss opportunities and challenges around:
How data modeling fits within a larger metadata management landscape
When can data modeling provide “just enough” metadata management
Key data modeling artifacts for metadata
Organization, Roles & Implementation Considerations
Enabling a Data Mesh Architecture with Data Virtualization - Denodo
Watch full webinar here: https://bit.ly/3rwWhyv
The Data Mesh architectural design was first proposed in 2019 by Zhamak Dehghani, principal technology consultant at Thoughtworks, a technology company closely associated with the development of distributed agile methodology. A data mesh is a distributed, decentralized data infrastructure in which multiple autonomous domains manage and expose their own data, called “data products,” to the rest of the organization.
Organizations adopt a data mesh architecture when they experience shortcomings in highly centralized architectures, such as the lack of domain-specific expertise in data teams, the inflexibility of centralized data repositories in meeting the specific needs of different departments within large organizations, and the slowness of centralized data infrastructures in provisioning data and responding to changes.
In this session, Pablo Alvarez, Global Director of Product Management at Denodo, explains how data virtualization is your best bet for implementing an effective data mesh architecture.
You will learn:
- How data mesh architecture not only enables better performance and agility, but also self-service data access
- The requirements for “data products” in the data mesh world, and how data virtualization supports them
- How data virtualization enables domains in a data mesh to be truly autonomous
- Why a data lake is not automatically a data mesh
- How to implement a simple, functional data mesh architecture using data virtualization
Big data architectures and the data lake - James Serra
With so many new technologies, it can be confusing to pick the best approach to building a big data architecture. The data lake is a great new concept, usually built in Hadoop, but what exactly is it and how does it fit in? In this presentation I'll discuss the four most common patterns in big data production implementations, the top-down vs. bottom-up approach to analytics, and how you can use a data lake and an RDBMS data warehouse together. We will go into detail on the characteristics of a data lake and its benefits, and how you still need to perform the same data governance tasks in a data lake as you do in a data warehouse. Come to this presentation to make sure your data lake does not turn into a data swamp!
This talk discusses how we structure our analytics information at Adjust. The analytics environment consists of 20+ 20TB databases and many smaller systems for a total of more than 400 TB of data. See how we make it work, from structuring and modelling the data through moving data around between systems.
Practical Partitioning in Production with Postgres - EDB
Has your table become too large to handle? Have you thought about chopping it up into smaller pieces that are easier to query and maintain? What if it's in constant use? This is an introduction to the problems that can arise and how PostgreSQL's partitioning features can help. We will look at the problems caused by very large tables and how declarative table partitioning in Postgres addresses them, cover dimensioning both before and after creating huge tables, partition key selection, and the importance of upgrading to get the latest Postgres features, and finally dive into a real-world scenario of partitioning an existing huge table in use on a live production system.
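To make the mechanics concrete: with Postgres `PARTITION BY RANGE`, each partition owns a half-open [lower, upper) slice of the key space, and a row is routed by comparing its key against the partition bounds. A small Python sketch of that routing logic (the monthly bounds and partition names are hypothetical):

```python
import bisect
from datetime import date

# Sketch of range-partition routing as Postgres does it for
# PARTITION BY RANGE: each partition covers a half-open [lower, upper)
# interval of the key. The monthly bounds below are hypothetical.

bounds = [date(2021, m, 1) for m in (1, 2, 3, 4)]  # lower bounds + final upper
names = ["p2021_01", "p2021_02", "p2021_03"]       # one partition per interval

def route(key: date) -> str:
    # Find the last lower bound <= key; keys at a bound belong to the
    # partition that starts there (half-open intervals).
    i = bisect.bisect_right(bounds, key) - 1
    if i < 0 or i >= len(names):
        # Mirrors Postgres' "no partition of relation found for row" error.
        raise ValueError("no partition for key")
    return names[i]

print(route(date(2021, 2, 15)))  # lands in the February partition
```

This is also why partition key selection matters so much in the talk: every query that cannot supply the key must scan all partitions instead of being routed to one.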
From: DataWorks Summit 2017 - Munich - 20170406
HBase has established itself as the backend for many operational and interactive use cases, powering well-known services that support millions of users and thousands of concurrent requests. In terms of features, HBase has come a long way, offering advanced options such as multi-level caching on- and off-heap, pluggable request handling, fast recovery options such as region replicas, table snapshots for data governance, tuneable write-ahead logging, and so on. This talk is based on the research for the upcoming second edition of the speaker's HBase book, combined with practical experience in medium to large HBase projects around the world. You will learn how to plan for HBase, starting with the selection of matching use cases, to determining the number of servers needed, leading into performance tuning options. There is no reason to be afraid of using HBase, but knowing its basic premises and technical choices will make using it much more successful. You will also learn about many of the new features of HBase up to version 1.3, and where they are applicable.
NA Adabas & Natural User Group Meeting April 2023 - Software AG
Join us as we explore:
• Adabas & Natural 2050+ commitment to innovation and roadmap
• How technology health assessments are helping identify potential risks and opportunities
• Modernize by unlocking legacy and mainframe data to leverage Snowflake on AWS
• New features for Adabas & Natural on Linux and the cloud
• Enhanced security and administration features
• How to access SQL Server from Natural
• Options to skill up your staff
To learn more about Software AG Adabas & Natural, please visit www.adabasnatural.com
Apache Spark and Apache Ignite: Where Fast Data Meets the IoT - Denis Magda
It is not enough to build a mesh of sensors or embedded devices to obtain more insights about the surrounding environment and optimize your production systems. Usually, your IoT solution needs to be capable of transferring enormous amounts of data to storage or the cloud, where the data has to be processed further. Quite often, the processing of these endless streams of data has to be done in real time so that you can react to the IoT subsystem's state accordingly.
This session will show attendees how to build a Fast Data solution that will receive endless streams from the IoT side and will be capable of processing the streams in real-time using Apache Ignite's cluster resources.
In this presentation, we explore the options that enterprises have for deploying and managing cloud databases, including:
Data and application migration strategies.
New private, public, and hybrid cloud deployment models.
Enterprise tools for data management and resiliency.
Database workflow automation with virtual machines and containers.
About the presenter:
Jamie Watt is EDB's Vice President of Global Support Services and leads technical support, customer care, remoteDBA services, and EDB's online community platform, PostgresRocks.com.
How to Run Containerized Enterprise SQL Applications in the Cloud with NuoDB ... - MayaData Inc
Deploying an enterprise SQL database across geographically located OpenShift or Kubernetes clusters can be challenging. These deployments often require zero-downtime, ANSI standard SQL, ACID-compliant transactions, seamless day-2 operations, and highly performant and durable persistent storage systems. How can your organization easily deploy container-native storage with a distributed SQL database to deliver containerized apps in the cloud?
In this webinar, NuoDB and OpenEBS (MayaData) guide you as you build containerized apps that check these critical boxes:
[✓] Always on
[✓] At scale
[✓] High-performance persistent storage
Developing and Deploying Microservices to IBM Cloud Private - Shikha Srivastava
IBM Cloud Private (ICP) is a Kubernetes-based environment that hosts a variety of workloads and helps developers create secure and highly available services for their cloud environment. Developers will experience a catalog of enterprise software that is deployed and managed as containers and run a complete microservices-based application in ICP. Join us to get hands-on experience using the Stock Trader sample (https://github.com/IBMStockTrader) running on IBM Cloud Private. Run the app and see it talk to Db2, MQ, and Redis, all also running in IBM Cloud Private. The app also talks to API Connect running in the public IBM Cloud. Developers will also experience how to author and deploy a microservice in ICP. Experience both the IBM Cloud Private web console and the kubectl command line interface to see how things are running, and to perform problem determination. You'll also learn some tips and tricks that arose from this sample.
NuoDB + MayaData: How to Run Containerized Enterprise SQL Applications in the... - NuoDB
Deploying an enterprise SQL database across geographically located OpenShift or Kubernetes clusters can be challenging. These deployments often require zero-downtime, ANSI standard SQL, ACID compliant transactions, seamless day-2 operations, and highly performant and durable persistent storage systems. How can your organization easily deploy container-native storage with a distributed SQL database to deliver containerized apps in the cloud?
In this webinar, NuoDB and MayaData guide you as you build containerized apps that check these critical boxes:
[✓] Always on
[✓] At scale
[✓] High performance persistent storage
---
Resources:
NuoDB & OpenEBS Solution Guide
https://mayadata.io/assets/pdf/nuodb-openebs-solution-docs.pdf
OpenEBS Documentation:
https://docs.openebs.io/docs/next/nuodb.html
OpenEBS Getting Started Workshop
https://www.katacoda.com/openebs/scenarios/openebs-intro
https://github.com/openebs/community/tree/master/workshop
OpenEBS & Litmus Repositories
https://github.com/openebs/openebs
https://github.com/openebs/litmus
NuoDB Documentation:
http://doc.nuodb.com/Latest/Default.htm
NuoDB CE Download:
https://www.nuodb.com/download
Which PostgreSQL is right for your multi cloud strategy? P2 - Ashnikbiz
The adoption of PostgreSQL in enterprises is becoming a strategic choice, all the more so as multi-cloud becomes a need for enterprise deployment. This availability creates multiple combinations of deployment options, so it is important to identify the right strategy for your organization's needs.
PostgreSQL is versatile and used for a wide range of applications and use cases in the enterprise. It is more than just database technology, it is an accelerator for innovation. Much innovation today is happening in new application development, application modernization, and re-platforming to the cloud across the information architecture landscape. In this webinar, you will learn how EDB supercharges PostgreSQL to re-platform to cloud and containers more efficiently and develop new applications that are more scalable and secure.
The combination of StackPointCloud with NetApp creates NetApp Kubernetes Service, the industry’s first complete Kubernetes platform for multi-cloud deployments and a complete cloud-based stack for Azure, Google Cloud, AWS, and NetApp HCI. Further, Trident is a fully supported open source project maintained by NetApp, designed from the ground up to help meet the sophisticated persistence demands of containerized applications.
Openbar Kontich // Google Cloud: past, present and the (oh so sweet) future b... - Openbar
Although a giant player in anything software-related, Google Cloud still feels a bit underappreciated. How did it get where it is now? What are its core strengths? Most of all, we want to provide a glimpse of the future by identifying major shifts in Cloud computing. Every company is a data company, but their data often remains under-utilised due to a lack of execution power; let's find this power.
Due to Cloud pricing models, efficient software engineering is gaining in importance; let's unlock this efficiency. Hybrid and multi-Cloud is easily one of the largest investment domains in the Cloud world. Let's find out why and see how we can stay as vendor-neutral as possible.
Discover the benefits of Kubernetes to host a SaaS solutionScaleway
What you can take away from this presentation:
- What a SaaS solution is
- Key figures on the SaaS market
- Advantages of Kubernetes Kapsule for SaaS
- How to optimize your costs and loads while maintaining stability
- How to guarantee the security of your infrastructures
- The difference between a multi-instance and a multi-tenant architecture
Accelerate Digital Transformation with IBM Cloud PrivateMichael Elder
Accelerate the journey to cloud-native, refactor existing mission-critical workloads, and catalyze enterprise digital transformations.
How do you ensure the success of your enterprise in highly competitive market landscapes? How will you deliver new cloud-native workloads, modernize existing estates, and drive integration between them?
Companies create IoT proof of concepts (PoCs) or small tests to fine-tune IoT designs before deploying new technology across a plant or plants. Large-scale deployments present challenges that might not be uncovered during the PoC stage. In this session, we cover the most common challenges companies fall victim to when they move from testing to deployment and how AWS IoT services give customers flexible and scalable solutions that help them scale to meet their IoT needs regardless of the number of devices connected.
Adabas & Natural World: Strategic Vision and Directions for Adabas and NaturalSoftware AG
Innovation World presentation.
Learn about the enhanced ease-of-use, openness and high-performance capabilities of Adabas-Natural. Get insights into the strategic vision and innovations roadmap for these products—and how to use them in your transformation into a Digital Enterprise. Topics include Adabas and big data analytics, mobile access to Natural applications and our cloud strategy as well as application modernization advances and success stories.
Speaker:
Guido Falkenberg
SVP, Adabas-Natural Product Marketing, Software AG
GCP Meetup #3 - Approaches to Cloud Native Architecturesnine
Talk by Daniel Leahy and Nic Gibson, given at the Google Cloud Meetup on March 3, 2020, hosted by Nine Internet Solutions AG - Your Swiss Managed Cloud Service Provider.
Similar to Adabas & Natural Virtual User Group Meeting NAM 2022
Modernization - Capabilities of NaturalONE, Mainframe data integration, and Cloud/hybrid cloud architectures | DevOps deployments, cloud architectures, and application/data integration best practices.
One Path to a Successful Implementation of NaturalONESoftware AG
One path to a successful implementation of NaturalONE | Software AG
Join the Natural Administration team from the Texas Comptroller of Public Accounts and discover how they overcame programmer resistance to successfully implement and thrive using NaturalONE and DevOps. Get tips and techniques as well as real-world samples of architecture, configuration and implementations.
The Texas Comptroller of Public Accounts successfully implemented NaturalONE in the spring of 2019, deploying the NaturalONE client to 40+ Windows 10 laptops, and upgraded to mainframe Natural V9 a few months later. We had a rocky start and a lot of resistance from senior programmers, but we survived and are thriving – even the programmer with Natural 1.2 mainframe editing experience has made the leap and is editing Natural code in NaturalONE.
Join us as we share our experienced-based insights on the following topics:
- How to get your programming staff to accept the change to NaturalONE
- Overview of TX CPA NDV Architecture for Application Development Life Cycle
- Sample an NDV configuration reference guide provided to NaturalONE users
- Discuss differences between configuration files for NDV batch server and NDV server with the CICS adapter
- How to set up NDV Monitor (NATMOPI)
- Review prerequisites/restrictions to adhere to for the NDV CICS Adapter
- External Security Configuration requirements you won’t want to miss
- How do I DEBUG code in NaturalONE? (Just an overview reference)
- Lessons learned from issues we encountered, so you can have a smoother implementation
- Tips and techniques for using NaturalONE features that highlight the power of the NaturalONE IDE
To learn more about Software AG’s NaturalONE, please visit https://www.softwareag.com/en_corporate/platform/adabas-natural/devops.html
Apama, Terracotta, webMethods Upgrade: Avoiding Common Pitfalls Software AG
Get some valuable tips and techniques to optimize your upgrade process, including:
• The single most commonly overlooked source of upgrade information (and where to find the rest)
• Highlights of the upgrade guide (including a new section on databases)
• Supported upgrade paths and the optimal sequence of events for a smooth upgrade transition
• Tips on database migration
• When to install fixes
• Managing widely dispersed information
Innovation World 2015 General Session - Dr. Wolfram JostSoftware AG
Software AG's Chief Technology Officer, Dr. Wolfram Jost's General Session Presentation from Innovation World 2015.
https://www.youtube.com/watch?v=6aZsRW5I_t4
In-Memory Data Management Goes Mainstream - OpenSlava 2015Software AG
Manish Devgan's presentation from the OpenSlava 2015 Conference. The presentation will cover Ehcache and Terracotta Server, its recent milestones, and how it continues to help developers easily leverage in-memory storage for current and emerging workloads.
Watch the full presentation here: http://bit.ly/1MGwGUv
Thingalytics, a composite of “Things” and “Analytics,” shows organizations how to use real-time analytics and algorithms to seize the opportunities that flow from IoT while simultaneously minimizing threats.
The 7 Pillars of Market Surveillance 2.0Software AG
Software AG explores the Seven Pillars of Market Surveillance 2.0 that will lead you to the next generation of bigger and better market surveillance, leaving the fines and prison sentences behind.
Top 10 Manufacturing and Supply Chain 2015 TrendsSoftware AG
Software AG has released the company’s top ten predictions for the next year in the manufacturing and supply chain industry, calling 2015 the tipping point for supply chain sentience.
Next year, innovative manufacturers will leap ahead by deploying more and better sensors, adding more comprehensive automation, and increasingly choosing local fabrication.
Sean Riley, Global Manufacturing & Supply Chain Solutions Director, Software AG noted: “In 2015, we will see manufacturers and their partners accelerate the implementation of initiatives that will deliver on the promise of the Internet of Things. For some companies, this will be a challenge, but for many others it will be a year of great opportunity leading to significant competitive advantage.”
Into the Box Keynote Day 2: Unveiling amazing updates and announcements for modern CFML developers! Get ready for exciting releases and updates on Ortus tools and products. Stay tuned for cutting-edge innovations designed to boost your productivity.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. G...Globus
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy-driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivery, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership is the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed, and report on relevant project progress.
Check out the webinar slides to learn more about how XfilesPro transforms Salesforce document management by leveraging its world-class applications. For more details, please connect with sales@xfilespro.com
If you want to watch the on-demand webinar, please click here: https://www.xfilespro.com/webinars/salesforce-document-management-2-0-smarter-faster-better/
Modern design is crucial in today's digital environment, and this is especially true for SharePoint intranets. The design of these digital hubs is critical to user engagement and productivity enhancement. They are the cornerstone of internal collaboration and interaction within enterprises.
Gamify Your Mind; The Secret Sauce to Delivering Success, Continuously Improv...Shahin Sheidaei
Games are powerful teaching tools, fostering hands-on engagement and fun. But they require careful consideration to succeed. Join me to explore factors in running and selecting games, ensuring they serve as effective teaching tools. Learn to maintain focus on learning objectives while playing, and how to measure the ROI of gaming in education. Discover strategies for pitching gaming to leadership. This session offers insights, tips, and examples for coaches, team leads, and enterprise leaders seeking to teach from simple to complex concepts.
Developing Distributed High-performance Computing Capabilities of an Open Sci...Globus
COVID-19 had an unprecedented impact on scientific collaboration. The pandemic and its broad response from the scientific community has forged new relationships among public health practitioners, mathematical modelers, and scientific computing specialists, while revealing critical gaps in exploiting advanced computing systems to support urgent decision making. Informed by our team’s work in applying high-performance computing in support of public health decision makers during the COVID-19 pandemic, we present how Globus technologies are enabling the development of an open science platform for robust epidemic analysis, with the goal of collaborative, secure, distributed, on-demand, and fast time-to-solution analyses to support public health.
How to Position Your Globus Data Portal for Success Ten Good PracticesGlobus
Science gateways allow science and engineering communities to access shared data, software, computing services, and instruments. Science gateways have gained a lot of traction in the last twenty years, as evidenced by projects such as the Science Gateways Community Institute (SGCI) and the Center of Excellence on Science Gateways (SGX3) in the US, The Australian Research Data Commons (ARDC) and its platforms in Australia, and the projects around Virtual Research Environments in Europe. A few mature frameworks have evolved with their different strengths and foci and have been taken up by a larger community such as the Globus Data Portal, Hubzero, Tapis, and Galaxy. However, even when gateways are built on successful frameworks, they continue to face the challenges of ongoing maintenance costs and how to meet the ever-expanding needs of the community they serve with enhanced features. It is not uncommon that gateways with compelling use cases are nonetheless unable to get past the prototype phase and become a full production service, or if they do, they don't survive more than a couple of years. While there is no guaranteed pathway to success, it seems likely that for any gateway there is a need for a strong community and/or solid funding streams to create and sustain its success. With over twenty years of examples to draw from, this presentation goes into detail for ten factors common to successful and enduring gateways that effectively serve as best practices for any new or developing gateway.
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
Unleash Unlimited Potential with One-Time Purchase
BoxLang is more than just a language; it's a community. By choosing a Visionary License, you're not just investing in your success, you're actively contributing to the ongoing development and support of BoxLang.
How Recreation Management Software Can Streamline Your Operations.pptxwottaspaceseo
Recreation management software streamlines operations by automating key tasks such as scheduling, registration, and payment processing, reducing manual workload and errors. It provides centralized management of facilities, classes, and events, ensuring efficient resource allocation and facility usage. The software offers user-friendly online portals for easy access to bookings and program information, enhancing customer experience. Real-time reporting and data analytics deliver insights into attendance and preferences, aiding in strategic decision-making. Additionally, effective communication tools keep participants and staff informed with timely updates. Overall, recreation management software enhances efficiency, improves service delivery, and boosts customer satisfaction.
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoamtakuyayamamoto1800
In this slide, we show the simulation example and the way to compile this solver.
In this solver, the Helmholtz equation can be solved by helmholtzFoam. Also, the Helmholtz equation with uniformly dispersed bubbles can be simulated by helmholtzBubbleFoam.
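For reference, the equation both solvers discretize is the Helmholtz equation, with unknown field u, wavenumber k, and source term f:

```latex
\nabla^2 u + k^2 u = f
```

helmholtzBubbleFoam additionally accounts for a uniformly dispersed bubble phase, which modifies the effective wavenumber in the medium.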
Cyaniclab : Software Development Agency Portfolio.pdfCyanic lab
CyanicLab, an offshore custom software development company based in Sweden, India, and Finland, is your go-to partner for startup development and innovative web design solutions. Our expert team specializes in crafting cutting-edge software tailored to meet the unique needs of startups and established enterprises alike. From conceptualization to execution, we offer comprehensive services including web and mobile app development, UI/UX design, and ongoing software maintenance. Ready to elevate your business? Contact CyanicLab today and let us propel your vision to success with our top-notch IT solutions.
Prosigns: Transforming Business with Tailored Technology SolutionsProsigns
Unlocking Business Potential: Tailored Technology Solutions by Prosigns
Discover how Prosigns, a leading technology solutions provider, partners with businesses to drive innovation and success. Our presentation showcases our comprehensive range of services, including custom software development, web and mobile app development, AI & ML solutions, blockchain integration, DevOps services, and Microsoft Dynamics 365 support.
Custom Software Development: Prosigns specializes in creating bespoke software solutions that cater to your unique business needs. Our team of experts works closely with you to understand your requirements and deliver tailor-made software that enhances efficiency and drives growth.
Web and Mobile App Development: From responsive websites to intuitive mobile applications, Prosigns develops cutting-edge solutions that engage users and deliver seamless experiences across devices.
AI & ML Solutions: Harnessing the power of Artificial Intelligence and Machine Learning, Prosigns provides smart solutions that automate processes, provide valuable insights, and drive informed decision-making.
Blockchain Integration: Prosigns offers comprehensive blockchain solutions, including development, integration, and consulting services, enabling businesses to leverage blockchain technology for enhanced security, transparency, and efficiency.
DevOps Services: Prosigns' DevOps services streamline development and operations processes, ensuring faster and more reliable software delivery through automation and continuous integration.
Microsoft Dynamics 365 Support: Prosigns provides comprehensive support and maintenance services for Microsoft Dynamics 365, ensuring your system is always up-to-date, secure, and running smoothly.
Learn how our collaborative approach and dedication to excellence help businesses achieve their goals and stay ahead in today's digital landscape. From concept to deployment, Prosigns is your trusted partner for transforming ideas into reality and unlocking the full potential of your business.
Join us on a journey of innovation and growth. Let's partner for success with Prosigns.
How Does XfilesPro Ensure Security While Sharing Documents in Salesforce?XfilesPro
Worried about document security while sharing them in Salesforce? Fret no more! Here are the top-notch security standards XfilesPro upholds to ensure strong security for your Salesforce documents while sharing with internal or external people.
To learn more, read the blog: https://www.xfilespro.com/how-does-xfilespro-make-document-sharing-secure-and-seamless-in-salesforce/
Designing for Privacy in Amazon Web ServicesKrzysztofKkol1
Data privacy is one of the most critical issues that businesses face. This presentation shares insights on the principles and best practices for ensuring the resilience and security of your workload.
Drawing on a real-life project from the HR industry, the various challenges will be demonstrated: data protection, self-healing, business continuity, security, and transparency of data processing. This systematized approach made it possible to create a secure AWS cloud infrastructure that not only met strict compliance rules but also exceeded the client's expectations.
Your Digital Assistant.
Making a complex approach simple. A straightforward process saves time. No more waiting to connect with the people who matter to you. Safety first is not a cliché: securely protect information in cloud storage to prevent any third party from accessing your data.
Would you rather make your visitors feel burdened by making them wait, or choose VizMan for a stress-free experience? VizMan is an automated visitor management system that works for any industry, not limited to factories, societies, government institutes, and warehouses. It is a new-age, contactless way of logging information about visitors, employees, packages, and vehicles. As a digital logbook, VizMan deters unnecessary use of paper and space, since there is no need for bundles of registers left to collect dust in a corner of a room. It records visitors' essential details, helps in scheduling meetings for visitors and employees, and assists in supervising the attendance of employees. With VizMan, visitors don't need to wait for hours in long queues. VizMan handles visitors with the value they deserve, because we know time is important to you.
Feasible Features
One Subscription, Four Modules (Admin, Employee, Receptionist, and Gatekeeper) ensures confidentiality and prevents data from being manipulated
User Friendly – can be easily used on Android, iOS, and Web Interface
Multiple Accessibility – Log in through any device from any place at any time
One app for all industries – a Visitor Management System that works for any organisation.
Stress-free Sign-up
Visitor is registered and checked-in by the Receptionist
Host gets a notification, where they opt to Approve the meeting
Host notifies the Receptionist of the end of the meeting
Visitor is checked-out by the Receptionist
Host enters notes and remarks of the meeting
Customizable Components
Scheduling Meetings – Host can invite visitors for meetings and also approve, reject and reschedule meetings
Single/Bulk invites – Invitations can be sent individually to a visitor or collectively to many visitors
VIP Visitors – Additional security of data for VIP visitors to avoid misuse of information
Courier Management – Keeps a check on deliveries like commodities being delivered in and out of establishments
Alerts & Notifications – Get notified on SMS, email, and application
Parking Management – Manage availability of parking space
Individual log-in – Every user has their own log-in id
Visitor/Meeting Analytics – Evaluate notes and remarks of the meeting stored in the system
Visitor Management System is a secure and user friendly database manager that records, filters, tracks the visitors to your organization.
"Secure Your Premises with VizMan (VMS) – Get It Now"
Experience our free, in-depth three-part Tendenci Platform Corporate Membership Management workshop series! In Session 1 on May 14th, 2024, we began with an Introduction and Setup, mastering the configuration of your Corporate Membership Module settings to establish membership types, applications, and more. Then, on May 16th, 2024, in Session 2, we focused on binding individual members to a Corporate Membership and Corporate Reps, teaching you how to add individual members and assign Corporate Representatives to manage dues, renewals, and associated members. Finally, on May 28th, 2024, in Session 3, we covered questions and concerns, addressing any queries or issues you may have.
For more Tendenci AMS events, check out www.tendenci.com/events
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERRORTier1 app
Even though at surface level ‘java.lang.OutOfMemoryError’ appears to be one single error, there are underneath it 9 distinct types of OutOfMemoryError. Each type has different causes, diagnosis approaches, and solutions. This session equips you with the knowledge, tools, and techniques needed to troubleshoot and conquer OutOfMemoryError in all its forms, ensuring smoother, more efficient Java applications.
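To make the "one surface error, many types" point concrete: the subtype is distinguishable only by the message text HotSpot attaches to the error. A minimal sketch of a message-based classifier — the helper and category names are illustrative, while the quoted message strings are the ones HotSpot emits:

```java
// Illustrative classifier: the OutOfMemoryError subtype is only visible
// in the message text that HotSpot attaches to the error.
public class OomClassifier {
    public static String classify(OutOfMemoryError e) {
        String m = e.getMessage();
        if (m == null) return "unknown";
        if (m.contains("Java heap space")) return "heap";                    // objects exhausted -Xmx
        if (m.contains("GC overhead limit exceeded")) return "gc-overhead";  // GC runs constantly, reclaims little
        if (m.contains("Metaspace")) return "metaspace";                     // class metadata exhausted
        if (m.contains("Requested array size exceeds VM limit")) return "array-limit";
        if (m.contains("unable to create")) return "native-thread";          // thread creation failed
        return "other: " + m;
    }

    public static void main(String[] args) {
        System.out.println(classify(new OutOfMemoryError("Java heap space"))); // heap
        System.out.println(classify(new OutOfMemoryError("Metaspace")));       // metaspace
    }
}
```

In practice the same distinction drives the fix: a "heap" error points at `-Xmx` or a leak, while "native-thread" points at thread counts and OS limits rather than the Java heap.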
Quarkus Hidden and Forbidden ExtensionsMax Andersen
Quarkus has a vast extension ecosystem and is known for its subsonic and subatomic feature set. Some of these features are not as well known, and some extensions are less talked about, but that does not make them less interesting - quite the opposite.
Come join this talk to see some tips and tricks for using Quarkus and some of the lesser known features, extensions and development techniques.
2. Agenda
▪ Adabas & Natural 2050+ Innovations and Roadmap
▪ Mainframe modernization and cross-agency data sharing at Delaware Justice Information Systems (DELJIS)
▪ Bi-directional API implementation at the Teacher Retirement System of Texas
▪ Bridging the skills gap
▪ End of support considerations for Natural 8 on z/OS
▪ How to liberate your data for modern analytics
▪ Adabas & Natural for z/OS License Key Management
38. EntireX Broker (diagram)
A REST service consumer calls the webMethods Integration Server via the EntireX Adapter; the request flows as RPC through the EntireX Broker to an RPC Server, whose stub and RPC runtime dispatch to Subprograms 1 through n.
RPC = Remote Procedure Call
REST = Representational State Transfer
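Once a Natural subprogram is exposed as a REST service through the Integration Server, any HTTP client can call it. A minimal sketch — the host, URL layout, and JSON body below are assumptions for illustration, not the product's actual API:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Hypothetical client for a Natural subprogram exposed as a REST service.
public class SubprogramRestClient {
    public static HttpRequest buildRequest(String host, String service, String json) {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://" + host + "/rest/" + service)) // assumed URL layout
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildRequest("integration.example.com", "getMember",
                "{\"memberId\":\"12345\"}");
        System.out.println(req.method() + " " + req.uri());
        // To actually send it:
        // HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString());
    }
}
```

The point of the architecture is exactly this: the caller only sees plain HTTP and JSON, while the adapter and broker handle the RPC hop to the mainframe subprogram.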
41. Use Cases: Teacher Retirement System of Texas
1. Phased low-code application modernization requirements
- Health Insurance Line of Business (LOB)
- Annuity Payroll
- Web Self Service
- Pension LOB Member & Retiree Updates
- Retirement Application
- Death Claims
2. APIs for handheld devices and a portal for integration between Payroll and Medical Provider
3. What was easy: Web service development and testing; SAG customer support with quick turnaround to resolve issues
4. What was hard: Learning the tools when unfamiliar with Eclipse; setting up multiple highly complex TRS test environments for code versioning; close coordination required between developers and infrastructure support
5. Benefits to business: Immediate data updates and real-time decision making; replaced batch LOB-to-Legacy bridging with real-time communication; increased productivity, with departments no longer waiting for overnight processing; sharing data and communicating issues between Legacy and TRS Line of Business applications; sharing demographic and payment data with the Web Self Service application for Health Insurance and Pension LOB
6. Focus on REST: Initial development used SOAP; current development uses REST
7. Future use: The plan is to retire the mainframe, TBD
8. Who was involved? Legacy developers, middleware infrastructure, database, Java development, business users
42. Use Cases, Teacher Retirement System of Texas: Inbound and Outbound Requests (diagram)
43. Line of Business RTC with Legacy Applications: Pension LOB Member & Retiree Update Process (diagram)
44. Use Cases - General
1. Customer Service Portals
2. Mainframe processes needing data from SaaS environments (e.g., email lookup, SSA verification, billing address)
3. Phased low-code application modernization requirements
4. Retirement setup, Annuity Payroll, and Death Claims
5. Member, Retiree, and Beneficiary synchronization
6. Eligibility for Health Insurance Coverage
7. Mainframe business tax integration requirements
45. Benefits
1. Easy-to-use Eclipse-based design tool
2. Very intuitive, with a single UI for all development processes
3. Very easy to learn
4. Faster development time
5. DevOps integration
52. Application and Database Managed Services - Overview
Managed Services
▪ Long-term contract, usually 3-5 years with options to extend
▪ Defined and agreed-upon detailed scope
▪ Fixed monthly fee
▪ Service Level Agreements
▪ All technical resources are Software AG selected and owned by Software AG (both employees and subcontractors)
▪ Customer is on active Software AG maintenance and supported versions of software
▪ Remote with occasional onsite visits
Staff Augmentation
▪ Contract term is usually a minimum of 1 year
▪ Customer manages all activities of the resources
▪ Time and materials or fixed monthly fee
▪ Customer is on active Software AG maintenance
▪ Remote or onsite
Support Services
▪ Contract term is usually a minimum of 1 year, or a defined period of time for fixed-price projects
▪ Performed during standard working hours for non-production activities
▪ After-hours support for production
▪ Provides additional support for customer resources
▪ Customer has a stable but critical production environment
▪ Defined and agreed-upon scope
▪ On-call fee
▪ Fixed price or fixed monthly fee based on number of hours (additional hours are charged on a T&M basis)
▪ Customer is on active Software AG maintenance
▪ Remote with occasional onsite visits as requested
68. Benefits
NaturalONE
NaturalONE is an Eclipse™-based Integrated Development Environment (IDE) that lets developers code, test and maintain applications, expose Natural objects as services, create rich internet applications and Web interfaces, and manage the development life cycle. From one environment, you can use all your mainframe tools and Eclipse-based plugins to immediately satisfy users.
69. NaturalONE
Editing Capabilities
• Object-based code templates for new Natural objects
• Code templates based on content assist
• Toggle source lines into comment lines and vice versa
• Context-sensitive help
• Object dependency view
• Outline view
• Supports full Natural syntax (Mainframe, Linux, UNIX, Windows)
• Reliable, instant parsing
• Hovering error messages
• Folding of source code
• Upper/lower case translation
• Context-focused code completion
70. Upgrade Today!
NaturalONE
• Get ready for the future of the Natural environment with a range of new capabilities and features, while empowering your team with the product support and coverage you need!
• The time is now to update to avoid any critical lapses in coverage, compliance, and support for previous versions.
• Update today for the smoothest transition during the switch, and to get the maximum benefit from Software AG's easy-to-use suite of tutorials.
78. Liberate Data for Cloud Analytics
A new approach to data integration with Software AG CONNX + StreamSets
Nicole Ritchie, Director Product Marketing, Software AG
79. Converging market trends demand a new approach to data integration
• 71% of executives say mainframe-based applications are central to their business strategy
• Enterprises prefer deploying data integration in the cloud over on-premises by 2X
• 60% of corporate data is stored in the cloud, as cloud adoption has doubled in the past 7 years
80. StreamSets acquisition completes the Software AG portfolio
Software AG now has a complete iPaaS offering covering all aspects of integration
A SOFTWARE AG COMPANY
81. Cloud data platforms revolutionizing business & analytics
Industry landscape:
• Computing power & scale to handle extremely large volumes of data
• Business modernization: customer engagement, data applications, channels to market
• Cloud analytics platforms: test ideas fast / fail fast, scale on demand
• Rising importance of core enterprise data: custom apps, legacy data stores, mainframes
82. Data analytics initiatives focus on business outcomes
Gartner analysts predict that by 2025, 80% will consider data analytics an essential business capability.
Business functions (diagram): Customer Service, R&D, Operations, Sales & Marketing, HR, Finance
• Business optimization: product or vendor portfolio; inventory & asset management; manufacturing & logistics
• Risk management: fraud detection; regulatory reporting; dashboards
• Growth: customer segmentation; single view of customer; capitalize on historic trends
• Differentiation: customer service; built-to-order; pricing
83. How to liberate the volume & variety of data for analytics: current approach (diagram)
• Cloud data platforms, SaaS applications & cloud services: 100s of prebuilt connectors; wizard-driven (set it/forget it); point & click
• Enterprise data stores (VSAM, Adabas, Db2®, IMS): bespoke; traditional tools
84. Data integration for the modern data ecosystem (diagram)
More sources, more platforms, more patterns, more consumers
• Data sources: databases, {API}s
• Data consumers: data lakes, data warehouses, object stores/landing areas, business intelligence, data science, AI/ML, {API} analytics, messaging/event hubs, real-time apps
• Changes along the way: schema change; platform & version change; consumer request change; storage layer change; format change; semantic change; source & API change; connection change
Data drift: unexpected, unannounced and unending changes to data structure, infrastructure & semantics
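One simple guard against the structural side of data drift is to diff each incoming record's field set against the schema the pipeline was designed for. A minimal sketch, with illustrative field names:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of a structural drift guard for records flowing through a pipeline.
public class DriftCheck {
    // Returns the symmetric difference: fields that appeared unannounced
    // plus fields that silently vanished. Empty set = no structural drift.
    public static Set<String> structuralDrift(Set<String> expected, Set<String> incoming) {
        Set<String> added = new HashSet<>(incoming);
        added.removeAll(expected);
        Set<String> removed = new HashSet<>(expected);
        removed.removeAll(incoming);
        added.addAll(removed);
        return added;
    }

    public static void main(String[] args) {
        Set<String> schema = Set.of("member_id", "amount", "pay_date");
        Set<String> record = Set.of("member_id", "amount", "pay_date", "currency");
        System.out.println(structuralDrift(schema, record)); // [currency]
    }
}
```

A drift-aware pipeline would route such records to a review path (or adapt its mappings) instead of failing, which is the behavior the "adaptable data pipelines" claim refers to.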
86. StreamSets DataOps Platform
Centralized deployment and monitoring for hybrid environments (diagram)
• Control plane: Control Hub (build, operate, monitor)
• Customer data plane: data sources and {API}s feed streaming, CDC, batch, ETL and ELT pipelines into raw (object stores, landing areas), curated (data lakes) and conformed (data warehouses) tiers, plus analytics-enabled applications (reverse ETL, microservices, BI, analytics, AI/ML)
Independent control & data planes:
• Design & control (control plane) are fully separated from physical data execution (data plane)
• Engines execute how and where best for performance and security, and can operate independently of the control plane
87. StreamSets DataOps Platform
CONNX + StreamSets liberates mainframe data (diagram)
• Control plane: Control Hub (build, operate, monitor)
• Customer data plane: the Mainframe Collector reads data sources (Db2, IMS DB, IDMS, Adabas, VSAM, QSAM, etc.) and delivers to cloud data platforms
• Connect, translate, secure: connect to files & databases; translate EBCDIC to ASCII; fit in and extend security (SSL); extensible, 7 access levels
• 70+ predefined processors: shape data; lookups; rename, replace; parse, merge
• Present, search, deliver: relational format; SQL virtualized view; move in batch or CDC
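The "translate EBCDIC to ASCII" step can be illustrated with the JDK alone, since it ships EBCDIC charsets. CONNX performs this conversion internally; the sketch below only shows the character-set translation itself, using code page 037 (a common US/Canada EBCDIC variant):

```java
import java.nio.charset.Charset;

// Decode an EBCDIC (code page 037) byte stream into a Java String,
// which can then be re-encoded as ASCII/UTF-8 for downstream systems.
public class EbcdicDemo {
    public static String ebcdicToString(byte[] ebcdic) {
        return new String(ebcdic, Charset.forName("IBM037"));
    }

    public static void main(String[] args) {
        // "HELLO" in EBCDIC code page 037: C8 C5 D3 D3 D6
        byte[] raw = {(byte) 0xC8, (byte) 0xC5, (byte) 0xD3, (byte) 0xD3, (byte) 0xD6};
        System.out.println(ebcdicToString(raw)); // HELLO
    }
}
```

The hard part in real mainframe extracts is not the letters but packed-decimal and binary fields, which is why copybook-aware tooling sits in front of the plain charset conversion.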
88. Single Design Experience
Access and move IBM mainframe VSAM data via CONNX and StreamSets to Snowflake
89. Why Software AG CONNX + StreamSets: Modern Data Integration
• Deepest DB coverage: access more mainframe & legacy data sources and deploy across more hybrid & multi-cloud destinations
• Adaptable data pipelines: deliver data continuously even when data drifts (i.e., schema, location, or infrastructure changes)
• All-in-one solution: single design experience for all patterns including batch, streaming, CDC, ETL, ELT and machine learning
90. CONNX Data Integration
Unlock legacy data value without disrupting business continuity (diagram)
• Sources: 150+ databases on IBM®, legacy, UNIX | Windows®, OpenVMS and AS/400 platforms, including RDBMS, data warehouses, data marts and other databases
• Consumers: APIs & microservices, data-driven applications, business intelligence, advanced data analytics, cloud DBs, data lakes
• Access: 150+ databases on any platform with SQL
• Virtualization: data from legacy to cloud in a single, real-time view
• Movement: to hybrid cloud, reusing legacy without disruption
• Capabilities: DB adapters, metadata management, real-time CDC, ELT/ETL, security, replication